Organic synapses give insect robots night vision and pattern recognition


Jun 11, 2025

An organic synapse array enables night vision and pattern recognition in insect robots by detecting near-infrared light and triggering real-time motor responses.

(Nanowerk Spotlight) Insect-scale robots promise new capabilities for environmental exploration, disaster response, and surveillance in spaces too small or dangerous for humans or larger machines. These systems combine lightweight mechanical frames with flexible electronics to replicate key features of insect behavior. But despite advances in micromechanics and soft actuators, one critical component remains a limiting factor: the visual system. Without effective vision, these robots cannot recognize threats, navigate in cluttered environments, or respond dynamically to changing conditions.

For autonomous function, visual input must be processed and interpreted rapidly and efficiently—ideally on the device itself and under low-light conditions. Existing artificial vision systems typically combine cameras, memory, and processors into rigid modules, which require high power and offer limited integration with mobile platforms. While suitable for larger devices, these architectures are ill-suited for insect-scale robots.

Over the past decade, researchers have explored neuromorphic approaches that mimic how biological organisms process visual information. Rather than separating sensing and computation, neuromorphic devices integrate these steps, performing preprocessing, learning, and memory within light-responsive circuits that emulate the function of synapses and neurons. A key enabling development has been the emergence of organic electronic materials that are both flexible and tunable. In particular, bulk heterojunction (BHJ) organic semiconductors—materials that combine electron donors and acceptors to enhance charge separation—have shown potential for broadband photodetection with low power consumption. However, most demonstrations to date have used complex device architectures or narrow spectral windows, limiting their ability to function in dynamic, naturalistic conditions.

A new study published in Advanced Materials (“A Polychromatic Neuromorphic Visual System Inspired by Biomimetics for Miniature Insect Robots”) presents a neuromorphic visual system that addresses these limitations. Developed by researchers at Shanghai University and The Hong Kong Polytechnic University, the system integrates a compact, low-energy photosynaptic device with a microcontroller to enable adaptive image recognition and motion control in a live insect platform. The work demonstrates a full hardware implementation capable of detecting broadband light, including near-infrared (NIR), and converting it into motor commands for behavioral response.

At the core of the system is a two-terminal artificial synapse made from a PM6:L8-BO blend—a donor-acceptor pair commonly used in organic solar cells. This active layer is sandwiched between buffer layers of CuSCN and PFN-Br, which improve charge trapping and light absorption. The resulting BHJ-OPS (bulk heterojunction organic photosynapse) exhibits sensitivity across a broad spectral range from near-ultraviolet (365 nm) to NIR (1060 nm). Unlike conventional photodetectors, it also displays memory-like behaviors: its electrical output increases with repeated light pulses, simulating short-term and long-term potentiation observed in biological synapses. These dynamic electrical responses are attributed to trapped charge carriers at the interface layers, which accumulate and release over time.
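To picture how trapped charge gives rise to potentiation, consider a minimal first-order trap model: each light pulse deposits trapped charge that raises conductance, and the traps drain exponentially between pulses. The Python sketch below is illustrative only; the class, parameter names, and values are assumptions, not the authors' device model.

```python
import math

class PhotoSynapse:
    """Toy first-order charge-trap model of a photosynapse (illustrative only).

    Conductance rises with each light pulse as carriers are trapped at the
    interface layers, then relaxes toward baseline as the trapped charge is
    released. Parameters are assumptions, not fitted to the paper's device.
    """

    def __init__(self, g_base=1.0, tau_ms=50.0, gain=0.2):
        self.g_base = g_base    # baseline conductance (arbitrary units)
        self.tau_ms = tau_ms    # trap release time constant (assumed)
        self.gain = gain        # conductance gained per unit pulse dose
        self.trapped = 0.0      # trapped-charge contribution to conductance

    def pulse(self, intensity, width_ms):
        """A light pulse traps charge in proportion to its dose."""
        self.trapped += self.gain * intensity * width_ms

    def relax(self, dt_ms):
        """Trapped charge is released exponentially between pulses."""
        self.trapped *= math.exp(-dt_ms / self.tau_ms)

    @property
    def conductance(self):
        return self.g_base + self.trapped

# Ten closely spaced pulses: conductance builds because charge is trapped
# faster than it is released, the signature of potentiation. Longer gaps
# between pulses would let the response decay back toward baseline.
syn = PhotoSynapse()
for _ in range(10):
    syn.pulse(intensity=1.0, width_ms=10)
    syn.relax(dt_ms=20)
    print(f"conductance after pulse: {syn.conductance:.3f}")
```

In this kind of model, pulse width, intensity, and spacing jointly decide whether the gain is fleeting or persistent, which is the qualitative behavior the next section describes.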
The system reproduces key neural phenomena such as paired-pulse facilitation (where successive stimuli produce stronger responses), the transition from short-term to long-term memory (modulated by pulse width, intensity, and frequency), and multibit memory storage (enabling graded responses to light). When illuminated with 808 nm NIR light, the device achieves the strongest and most persistent signals, making it particularly effective for low-light vision applications.

The researchers demonstrated that the BHJ-OPS array could also simulate long-term depression (LTD)—a mechanism by which synaptic strength decreases. Using 1060 nm light, the device’s response decreased over time, mimicking the inhibitory processes found in neural circuits. The combination of potentiation and depression allows for fine-tuned learning and memory behaviors, which are essential for image classification and adaptive control. Importantly, these responses were achieved with energy consumption as low as 0.2 femtojoules per synaptic event—significantly lower than the energy used by most biological synapses and existing artificial devices.

To evaluate the system in a real-world application, the team built a neuromorphic vision array consisting of 676 BHJ-OPS synapses arranged in a 26-by-26 grid. This was integrated with an ATmega328P microcontroller and mounted on a custom circuit board, forming a lightweight “electronic backpack” that could be affixed to a live rhinoceros beetle. The microcontroller processed input from the synapse array and sent stimulation signals to the beetle’s leg muscles, triggering motion in response to visual cues.

The artificial vision system was tested using projected optical masks shaped like bats (representing predators) and apples (representing food). When a bat pattern was projected using 808 nm light, the system recognized the shape, triggered a motor command, and caused the beetle to turn left—an escape response. Conversely, an apple pattern prompted the beetle to move forward, simulating foraging behavior. These responses occurred autonomously and in real time, demonstrating that the system could process and act on visual input under low-light conditions without external computation.

Figure: Natural and artificial visual-motor pathways in the rhinoceros beetle. (a) The beetle’s native escape response is triggered when visual input is processed by interneurons in the prothoracic ganglion and relayed to motor neurons. (b) The artificial visual system detects near-infrared signals using an organic synapse array and converts them into electrical stimulation patterns that control leg movement, enabling predator evasion in low-light conditions. (Image: reprinted with permission from Wiley-VCH Verlag)

Image classification was performed using a simple artificial neural network with 676 input neurons (one for each synapse), 300 hidden neurons, and 2 output neurons. After training on 400 labeled images (200 bats and 200 apples), the system achieved a classification accuracy of 90%. Weight updates in the network were driven entirely by optically induced changes in synaptic conductance, eliminating the need for electrical rewiring or manual calibration. The system’s learning behavior—its ability to strengthen or weaken responses based on optical stimulation—was repeatable and stable over multiple cycles.
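To make the classifier concrete, the following numpy sketch reproduces the reported 676-300-2 topology. It is a software stand-in under stated assumptions: the inputs are random placeholders for the synapse-array readouts, the hyperparameters are invented, and the real system derives its weight updates from optically induced conductance changes rather than software backpropagation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Topology reported in the study: one input per synapse in the 26x26
# array, 300 hidden neurons, 2 outputs (bat vs. apple).
N_IN, N_HID, N_OUT = 26 * 26, 300, 2

W1 = rng.normal(0, 0.05, (N_IN, N_HID))
W2 = rng.normal(0, 0.05, (N_HID, N_OUT))

def forward(x):
    h = np.tanh(x @ W1)                    # hidden layer
    z = h @ W2
    z -= z.max(axis=1, keepdims=True)      # numerically stable softmax
    p = np.exp(z)
    return h, p / p.sum(axis=1, keepdims=True)

def train_step(x, y, lr=0.1):
    """One backprop step; y is one-hot (bat = [1, 0], apple = [0, 1])."""
    global W1, W2
    h, p = forward(x)
    dz = (p - y) / len(x)                  # softmax cross-entropy gradient
    dW2 = h.T @ dz
    dh = (dz @ W2.T) * (1 - h ** 2)        # tanh derivative
    dW1 = x.T @ dh
    W1 -= lr * dW1
    W2 -= lr * dW2

# Synthetic stand-ins for the 400 labeled images (200 bats, 200 apples);
# the real inputs would be conductance maps read from the array.
X = rng.random((400, N_IN))
X[:200, :300] += 0.5                       # crude, artificial class separation
Y = np.zeros((400, N_OUT))
Y[:200, 0] = 1
Y[200:, 1] = 1

for _ in range(200):
    train_step(X, Y)

acc = (forward(X)[1].argmax(axis=1) == Y.argmax(axis=1)).mean()
print(f"training accuracy: {acc:.0%}")
```

Run as-is, the toy network separates the two synthetic classes easily; the 90% figure reported in the paper, by contrast, reflects real optical inputs and hardware-driven weight updates.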
The ability to detect and process multispectral information was also validated using full-color images. By illuminating images with 380 nm (near-UV), 525 nm (green), and 808 nm (NIR) pulses, the researchers showed that different features could be selectively enhanced or suppressed, enabling basic image filtering and preprocessing functions. This capability allows the system to isolate relevant visual details from cluttered scenes and could be extended to more complex visual tasks with further training (a toy sketch of this channel weighting appears at the end of the article).

Much of the device’s performance can be traced to its interfacial engineering. The CuSCN buffer layer, in particular, provides a high energy barrier for electron transport, encouraging charge accumulation and slow release—key factors for synaptic emulation. Electrochemical impedance spectroscopy and capacitance-voltage measurements confirmed that devices with this buffer had higher trap densities and slower charge-transfer times, both of which improve signal retention and response tunability. Devices with MoO₃ buffer layers performed poorly due to higher surface roughness and faster carrier recombination, underscoring the importance of material choice in interface design.

The research presents a fully integrated system in which broadband visual sensing, memory, and decision-making are achieved within a compact organic electronic platform. While demonstrated here on an insect, the underlying principles could be adapted to other platforms where size and power constraints are critical. The work shows that neuromorphic vision systems based on organic materials can match—and in some respects exceed—the performance of traditional electronic vision systems while operating with far lower energy budgets.

By mimicking the information processing strategies of biological systems and embedding them directly into flexible, low-cost hardware, this technology opens the door to new forms of autonomous robotics. Future work may explore scaling the arrays, expanding classification capabilities, or interfacing with other biological organisms. For now, the study provides a compelling demonstration of how artificial visual intelligence can be distributed, miniaturized, and trained to act directly in the physical world.
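As a closing illustration of the multispectral filtering described above, the short sketch below combines hypothetical per-wavelength responses of a 26-by-26 array using signed weights, loosely mirroring how potentiating and depressing pulses enhance or suppress features. Every array, weight, and value here is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-pixel responses of a 26x26 array under three
# illumination wavelengths; the values are synthetic stand-ins.
responses = {
    380: rng.random((26, 26)),   # near-UV channel
    525: rng.random((26, 26)),   # green channel
    808: rng.random((26, 26)),   # NIR channel
}

# Signed channel weights emulate selective enhancement (positive) and
# suppression (negative) of features; the numbers are assumed.
weights = {380: -0.5, 525: 0.3, 808: 1.0}

filtered = sum(w * responses[nm] for nm, w in weights.items())
filtered = np.clip(filtered, 0.0, None)   # keep the output non-negative

print(filtered.shape, round(float(filtered.max()), 3))
```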


By Michael Berger – Michael is author of three books by the Royal Society of Chemistry:
Nano-Society: Pushing the Boundaries of Technology,
Nanotechnology: The Future is Tiny, and
Nanoengineering: The Skills and Tools Making Technology Invisible

Copyright © Nanowerk LLC
