Cyborg cockroaches guided by ultraviolet light and motion feedback navigate obstacles autonomously, showing how noninvasive control can coordinate biological movement with electronic sensing.
(Nanowerk Spotlight) Small robots often fail where insects succeed. A machine the size of a matchbox can carry a camera or sensor, but once it tries to crawl through debris or climb an uneven surface, movement becomes erratic. Power drains fast, sensors lose accuracy, and mechanical parts jam. Yet a cockroach performs these same tasks effortlessly, crossing cluttered spaces with precise control and minimal energy.
Scientists studying miniature robotics have long recognized this gap between mechanical design and biological coordination. They can copy an insect’s shape but not its integration of sensing and motion.
One response has been to link live insects to tiny electronic devices, creating hybrids that combine biological movement with artificial guidance. The idea is straightforward: the animal provides locomotion and stability while the electronics supply direction and sensing. Such biohybrid systems could eventually reach areas too small, dangerous, or unstable for traditional robots.
The obstacle has always been control. Many previous designs used implanted electrodes to deliver small electric shocks that forced the insect to walk or turn. The method worked temporarily but caused injury, reduced responsiveness, and required constant power input. Animals adapted to the stimuli or stopped moving altogether.
A recent study in Advanced Intelligent Systems (DOI: 10.1002/aisy.202400838) takes a different route, relying on a simple instinct: cockroaches avoid ultraviolet illumination, a behavior called negative phototaxis. When light reaches one eye, the cockroach naturally turns the opposite way. Instead of overriding the nervous system, the device activates the insect’s own sensory pathway, achieving directional control without pain or fatigue.
Overview of bio-intelligent cyborg insects (BCI) concept. a) Diagram of the setup where UV LEDs stimulate the cockroach’s compound eyes. A 3D-printed frame holds the UV LEDs, targeting the left and right compound eyes (LCE and RCE) to influence directional movement. b,c) Close-up views of the UV helmet mounted on the cockroach’s head, showing the UV LED arrangement for accurate stimulation. d) An illustration of the cyborg insect equipped with a backpack containing a battery and the UV helmet shows how stimulation to the LCE or RCE induces corresponding turning responses. e) The feedback control flowchart outlines the UV stimulation-based system. The system uses IMU data to monitor the cockroach’s movement. When no movement is detected, UV stimulation is activated to induce motion. If movement is detected, the system stops stimulation. This feedback method ensures that UV stimulation is only applied when necessary, utilizing the insect’s natural behavior while reducing the number of stimulations that affect the habituation response. f) Schematic diagram illustrates the interaction between bio-intelligence and artificial intelligence in cyborg insects. The bio-intelligence system (green) consists of receptors, sensory neurons, and brain/ganglia, enabling natural decision-making and movement control. The electrical stimulation cyborg insects (red) implement direct control by bypassing natural decision-making through external stimulation of motor neurons. The BCI system (blue) employs sensory control, influencing movement by sensory input rather than directly controlling motor neurons. The IMU monitors movement and determines whether stimulation is required. (Image: Reprinted from DOI:10.1002/aisy.202400838, CC BY)
The system consists of a small headpiece fitted with two ultraviolet light-emitting diodes placed near the compound eyes. Each diode sits about two to three millimeters from the eye surface. Light on the left eye causes a right turn; light on the right eye causes a left turn. The diodes emit at 395 nanometers, a wavelength strongly detected by the insect’s visual receptors. Three-second flashes produced no measurable heating, confirming that the reaction depends on vision, not temperature.
A compact backpack supplies power and sensing. It measures roughly 30 × 20 millimeters and weighs five grams. It holds a 40 milliamp-hour lithium-polymer battery, a Bluetooth Low Energy microcontroller, an inertial sensor that records acceleration and rotation, three time-of-flight sensors that measure distances between two and 120 centimeters, and a temperature and humidity sensor. The controller modulates the flash frequency between one and 100 hertz using square-wave signals. Average power use is about one-quarter watt, which allows approximately 20 minutes of operation.
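The square-wave modulation can be pictured as a simple on/off gate evaluated over time. The sketch below is illustrative only: the sampling rate, 50 percent duty cycle, and function names are assumptions, not details from the paper.

```python
def square_wave(t, freq_hz, duty=0.5):
    """Return 1 (LED on) or 0 (LED off) for a square wave at time t seconds.

    freq_hz: flash frequency, 1-100 Hz as in the backpack controller.
    duty: fraction of each period the LED is on (the duty cycle is an
    assumption; the article does not state it).
    """
    phase = (t * freq_hz) % 1.0
    return 1 if phase < duty else 0

# One three-second stimulation, sampled at 1 kHz, flashing at 20 Hz:
samples = [square_wave(i / 1000.0, 20) for i in range(3000)]
on_fraction = sum(samples) / len(samples)  # ~0.5 for a 50% duty cycle
```

Raising `freq_hz` packs more on/off cycles into the same three-second window, which is the knob the controller turns to vary stimulation strength.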
Before testing full navigation, the team examined how different light frequencies affected turning behavior. Each flash lasted three seconds, and motion capture tracked head orientation at 100 frames per second. As frequency increased, turn angles grew. Light on the left eye produced right turns of about ten degrees at 20 hertz and 25 degrees at 100 hertz. Light on the right eye produced left turns up to 50 degrees. The relationship was stable across individuals, providing dependable steering control.
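Given the reported calibration (roughly 10 degrees at 20 hertz and 25 degrees at 100 hertz for left-eye stimulation), a controller could estimate the expected turn at any frequency by linear interpolation. The helper below is a hypothetical sketch; the paper does not specify an interpolation scheme.

```python
def expected_turn_deg(freq_hz, cal_points):
    """Linearly interpolate an expected turn angle from calibration data.

    cal_points: list of (frequency_hz, turn_deg) pairs, ascending by
    frequency. Frequencies outside the calibrated range are clamped.
    """
    if freq_hz <= cal_points[0][0]:
        return cal_points[0][1]
    if freq_hz >= cal_points[-1][0]:
        return cal_points[-1][1]
    for (f0, a0), (f1, a1) in zip(cal_points, cal_points[1:]):
        if f0 <= freq_hz <= f1:
            return a0 + (a1 - a0) * (freq_hz - f0) / (f1 - f0)

# Left-eye calibration points quoted in the article (right turns):
left_eye = [(20, 10.0), (100, 25.0)]
expected_turn_deg(60, left_eye)  # midpoint of the range -> 17.5 degrees
```

A per-insect table of `cal_points` would also absorb the individual variation in UV sensitivity noted later in the article.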
To assess long-term reliability, the researchers compared ultraviolet guidance with traditional electrical stimulation. In the optical trials, three insects received up to 150 light flashes each. In the electrical trials, silver-wire electrodes delivered 3.3-volt square waves at 50 hertz for one second to the antenna and thorax.
Under ultraviolet control, turn strength stayed steady. Under electrical control, reactions weakened, and two of three insects ceased responding after roughly sixty trials. The optical approach avoided habituation and prevented tissue damage.
Next, the team tested autonomous navigation in a controlled environment that simulated cluttered terrain. The arena contained sand, small rocks, a ten-centimeter wall, dark shelters, and food that attracted the insects. Without feedback, most insects failed to find an exit. About 64 percent became trapped near obstacles, 12 percent did not move, and only 24 percent escaped within three minutes. Constant manual stimulation could keep them moving but consumed power and produced erratic paths. The researchers therefore introduced a feedback algorithm that activates only when the insect stops.
The closed-loop controller tracks forward acceleration. When acceleration remains below about one centimeter per second squared, the system assumes the insect is stationary. It then checks the distance sensors. If an obstacle appears on the left, the left light flashes to trigger a right turn. If an obstacle appears on the right, the right light activates. When no obstacle is detected, the system alternates sides to encourage exploration. Once movement resumes, the light switches off. This selective approach allows natural locomotion while providing quick cues when motion stalls.
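The selection logic above can be expressed as a short decision function. This is a sketch, not the authors’ code: the article gives the roughly 1 cm/s² stationarity cutoff, but the obstacle-distance threshold and all names here are illustrative assumptions.

```python
ACCEL_THRESHOLD = 1.0  # cm/s^2; below this the insect is treated as stationary
OBSTACLE_RANGE = 10.0  # cm; hypothetical cutoff for "obstacle nearby"

def choose_stimulus(forward_accel, left_dist, right_dist, last_side):
    """Decide which UV LED (if any) to flash, following the feedback rules:

    - insect moving        -> no stimulation;
    - obstacle on the left -> flash left LED (insect turns right);
    - obstacle on the right-> flash right LED (insect turns left);
    - stalled in the open  -> alternate sides to encourage exploration.
    Returns (led_side_or_None, updated_last_side).
    """
    if forward_accel >= ACCEL_THRESHOLD:
        return None, last_side            # moving: lights stay off
    if left_dist < OBSTACLE_RANGE:
        return "left", "left"             # obstacle left -> right turn
    if right_dist < OBSTACLE_RANGE:
        return "right", "right"           # obstacle right -> left turn
    side = "right" if last_side == "left" else "left"
    return side, side                     # no obstacle: alternate sides

# Stalled with a wall close on the left -> flash the left LED:
choose_stimulus(0.2, left_dist=5.0, right_dist=50.0, last_side="right")
```

Calling this in a loop with fresh IMU and time-of-flight readings reproduces the behavior described above: stimulation only when motion stalls, with the side chosen by obstacle geometry.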
Performance improved significantly. Across 50 closed-loop trials using five insects, 94 percent reached the exit and six percent remained stuck. Most light flashes occurred near corners and food sources, the areas where insects tended to stop. On average, each insect moved freely for about 80 percent of the session and received light cues for about 20 percent. Mean path length was around 170 centimeters, and average speed was about four centimeters per second. The low frequency of cues reduced power consumption and preserved natural movement.
The backpack and helmet maintained full sensory function. The five-gram payload and 0.1 gram headpiece left antennae unobstructed. Beeswax mounting allowed removal without harm. A brief three-minute exposure to carbon dioxide calmed insects during setup, and recovery required about three hours. Energy use averaged 15 joules per minute, consistent with power readings and the stated endurance. The hardware relies on standard components and can be reproduced with common laboratory tools.
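The quoted figures are easy to cross-check. Assuming a typical 3.7-volt lithium-polymer cell (the article states capacity, not voltage), the arithmetic looks like this:

```python
# Energy bookkeeping from the figures quoted in the article.
avg_power_w = 0.25                # average draw (~one-quarter watt)
joules_per_minute = avg_power_w * 60.0        # 1 W = 1 J/s -> 15 J/min

# Nominal battery energy; the 3.7 V cell voltage is an assumption:
battery_j = 0.040 * 3.7 * 3600.0  # 40 mAh -> amp-hours x volts x s/h
ideal_minutes = battery_j / joules_per_minute # ~35 min at nominal capacity
```

The stated endurance of roughly 20 minutes sits below this ideal bound, as one would expect once conversion losses and usable-capacity limits are taken into account.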
This work departs from earlier cyborg insect designs by cooperating with the animal’s own sensory control instead of overriding it. The electronics provide minimal input, only stepping in when the insect hesitates. The organism itself performs obstacle detection, balance, and fine motor adjustment. The result is efficient navigation and steady responsiveness over extended testing.
Certain limits remain. Sensitivity to ultraviolet light varies among individuals, requiring calibration of brightness. The battery restricts operation to short sessions, and variable environmental conditions such as dust or light changes could alter sensor readings. Even with these constraints, the findings show that guiding natural reflexes can produce reliable control without invasive methods or heavy computation.
The study also points to a larger insight about biohybrid systems. Living nervous tissue performs real-time computation that even the smallest processors struggle to match. By aligning electronic signals with these biological mechanisms, engineers can draw on the organism’s existing processing power instead of replacing it.
The ultraviolet-guided cockroach demonstrates how cooperation between biological and artificial elements can achieve stable, low-energy navigation in conditions where conventional robots fail. As microelectronics continue to shrink, this sensory-based design could support future devices that explore confined or hazardous environments using living mobility and artificial intelligence together.