A soft skin-like patch uses iontronic sensing, vibration feedback and synthetic-data learning to send and read all ASCII characters through touch, enabling a complete two-way tactile communication method.
(Nanowerk Spotlight) Human touch can express subtle patterns of pressure, timing and movement. Digital devices capture only simple taps and swipes, leaving much of that detail unused.
This gap between what the skin can sense and what devices can interpret has motivated ongoing efforts to create richer touch-based tools. Scientists have explored gloves with dense sensors, wearable bands that monitor subtle pressure changes and thin surfaces that generate patterned vibration. These prototypes show potential but rarely achieve the flexibility, comfort or precision needed for everyday use. Some are too rigid to conform to the body. Others can detect only simple gestures or cannot provide meaningful feedback. Even more restrictive is the fact that most do not communicate in the same structured language computers use.
Standard digital text relies on ASCII, a seven-bit code that defines 128 characters including letters, digits, punctuation marks and control symbols. Matching this expressive range through touch alone remains a difficult challenge.
Advances in soft materials and artificial intelligence are beginning to change expectations. Stretchable circuits can move with the skin. Gel-based pressure sensors can register minute forces. Small motors can deliver distinct vibration patterns. Learning algorithms can classify complex time-varying signals with speed and accuracy. Together these developments support a different vision for interaction, one in which the skin is not only a point of contact but also a channel through which information flows in both directions.
A study in Advanced Functional Materials (“A Fully Integrated Patch for Real‐Time AI‐Enhanced Haptic Closed‐Loop Interaction of Complete 128 ASCII Codes”) builds on this idea by presenting a soft, skin-like patch that converts touch into text and returns text-based feedback through the skin. The device links iontronic sensors, flexible circuits, compact vibration modules and an artificial-intelligence model trained to recognize pressing patterns. These elements form a complete, two-way communication loop that represents all 128 ASCII characters through touch alone.
Fully integrated stretchable patch and haptic closures. a) Traditional haptic-visual and novel haptic-haptic human–computer interaction. b) Haptic sensors, haptic actuators and PC-based AI model work together. c) Spatial distribution of individual components in a fully integrated patch, including BLE, DDS, amplifiers, sensors, actuators and electronics. d) Block diagram of data flow in a fully integrated patch. e) Finite element analysis and experiments are conducted to evaluate the mechanical performance of serpentine stretchable interconnects under (i) 20% uniaxial strain, (ii) 5 cm bending load, and (iii) 90° torsion load. (Image: Reproduced with permission from Wiley-VCH Verlag)
The platform begins with a stretchable circuit made from copper traces patterned in serpentine shapes on polyimide. This structure lets the traces elongate, twist and bend without breaking. A silicone layer encapsulates the system and preserves softness. Tests show that the traces withstand 20 % strain, 5 cm bending and 90° twisting while maintaining a resistance near 1.5 Ω. The full patch has a modulus of about 435.1 kPa, close to that of human skin, which supports conformability. A silicone adhesive allows users to apply and remove the device comfortably.
The core sensing element is an iontronic array. Iontronic sensing uses ionic motion in a soft gel to produce changes in capacitance under pressure. The researchers create the sensor from a stack of polyimide, commercial rice paper and a PVA H₃PO₄ gel. The rice paper provides a fibrous network, and the gel coats this network to form a thin ionic layer. A copper electrode patterned above the layer changes its effective contact area when pressed, producing measurable capacitive signals.
Performance tests show a sensitivity of 997.2 kPa⁻¹, a minimum detectable pressure of 0.8 Pa and a resolution of 200 Pa at 10 kPa. The sensor responds in 34 ms and recovers in 27 ms. It remains stable across 800 cycles at 20 kPa and functions reliably for at least 7 days. Cytotoxicity tests indicate low toxicity, and the use of rice paper suggests scalable manufacturing.
To provide tactile output, the patch includes vibration actuators known as eccentric-rotating-mass motors. These contain a rotating weight that produces vibration when powered. The patch controls vibration strength through pulse-width modulation, which adjusts how long the motor receives power during each cycle. Duty cycles from 30 % to 90 % produce seven clear vibration levels from 0.43 kPa to 1.57 kPa. Pulse durations of 400 ms generate strong but concise signals. Human tests show that volunteers identify which actuator is active with about 91 % accuracy and distinguish vibration intensity with about 92.5 % accuracy.
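As a rough sketch of how such an intensity scheme works: the article reports seven distinguishable levels spanning duty cycles of 30% to 90%, and the evenly spaced 10% steps below are an assumption for illustration, not a detail from the paper.

```python
def duty_for_level(level: int) -> int:
    """Return an assumed PWM duty cycle (%) for vibration level 0..6.

    Seven levels spanning 30%-90% duty, evenly spaced (assumed spacing).
    A higher duty cycle powers the eccentric-rotating-mass motor for a
    larger fraction of each cycle, producing a stronger vibration.
    """
    if not 0 <= level <= 6:
        raise ValueError("level must be between 0 and 6")
    return 30 + 10 * level  # 30%, 40%, ..., 90%
```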
Text input is encoded by dividing the seven-bit ASCII value of a character into four segments. Each sensor corresponds to a two-bit segment. The number of presses on a sensor within a short window indicates the value of that segment. After decoding, the system returns the same information through vibration pulses: each actuator vibrates a number of times that reflects its segment's value. This creates a tactile communication scheme aligned directly with ASCII.
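The segmentation can be sketched in a few lines of Python. Padding the seven-bit value to eight bits and ordering segments from most to least significant are assumptions made here for illustration; the paper's exact bit ordering may differ.

```python
def encode(ch: str) -> list[int]:
    """Split a character's ASCII value into four 2-bit segments.

    Each segment (0..3) would correspond to the press count on one of the
    four sensors. Bit order (most significant segment first) is assumed.
    """
    code = ord(ch)
    if not 0 <= code < 128:
        raise ValueError("only 7-bit ASCII characters are supported")
    return [(code >> shift) & 0b11 for shift in (6, 4, 2, 0)]

def decode(segments: list[int]) -> str:
    """Reassemble four 2-bit segments back into a character."""
    code = 0
    for seg in segments:
        code = (code << 2) | seg
    return chr(code)
```

For example, 'A' (ASCII 65, binary 01000001) splits into the segments [1, 0, 0, 1], i.e. one press on the first sensor, none on the middle two, and one on the last.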
Training a learning model to classify all characters usually requires large datasets. Instead, the study builds a mathematical model of pressing behavior. Each press has four phases that describe how the signal rises, peaks, falls and returns to baseline. The number of presses, the pressing force and the duration vary within defined ranges. Random sampling within these ranges generates synthetic time series that resemble real sensor data.
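A minimal sketch of this kind of synthetic-press generator is shown below. The four-phase structure (rise, peak, fall, return to baseline) follows the article's description, but all numeric ranges here are illustrative placeholders, not the study's fitted parameters.

```python
import random

def synthetic_press(peak_kpa=(5.0, 20.0), rise=(5, 15), hold=(10, 30),
                    fall=(5, 15), rest=(20, 60)) -> list[float]:
    """Generate one synthetic press as a list of pressure samples (kPa).

    Phase durations (in samples) and peak force are drawn uniformly from
    the given ranges; the ranges themselves are assumptions for this sketch.
    """
    peak = random.uniform(*peak_kpa)
    r, h, f, q = (random.randint(*p) for p in (rise, hold, fall, rest))
    trace = [peak * i / r for i in range(1, r + 1)]         # phase 1: rise
    trace += [peak] * h                                      # phase 2: peak
    trace += [peak * (1 - i / f) for i in range(1, f + 1)]   # phase 3: fall
    trace += [0.0] * q                                       # phase 4: baseline
    return trace
```

Sampling many such traces, with varying press counts concatenated per sensor channel, would yield the kind of labeled synthetic dataset the model is trained on.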
These synthetic sequences are compared to measured sequences and show similar shapes and similar frequency content. The researchers train a convolutional-neural-network model on large synthetic datasets. The model examines windows of 1000 data points from the four sensor channels. After training, it separates character classes clearly and reaches perfect accuracy when characters are grouped into uppercase letters, lowercase letters, digits, punctuation marks and communication symbols. Reducing the synthetic dataset size lowers accuracy, which confirms the value of synthetic-data generation.
Two demonstrations show how the patch can be used. In the first, a user enters the text “Go!” through a sequence of presses. The computer decodes the characters and returns tactile confirmation, allowing the interaction to occur without visual cues. In the second, the patch controls a racing game. Presses steer the virtual car while vibration intensity conveys distance to nearby vehicles. Distances of 80, 60, 40 and 20 units map to pressures of 0.43 kPa, 0.78 kPa, 1.23 kPa and 1.57 kPa. When another car approaches from one side, the actuator on that side vibrates more strongly.
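The game's distance-to-intensity mapping uses the four pairs reported in the article; how the system would treat intermediate distances is not specified, so the nearest-bin lookup below is an assumption.

```python
# Distance (game units) -> vibration pressure (kPa), values from the article.
DISTANCE_TO_PRESSURE = {80: 0.43, 60: 0.78, 40: 1.23, 20: 1.57}

def feedback_pressure(distance: float) -> float:
    """Return the vibration pressure for the closest listed distance.

    Nearest-bin behaviour for unlisted distances is an assumption.
    """
    nearest = min(DISTANCE_TO_PRESSURE, key=lambda d: abs(d - distance))
    return DISTANCE_TO_PRESSURE[nearest]
```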
The study shows that a soft, skin-like patch can support two-way communication entirely through touch. It brings together sensitive iontronic sensing, flexible circuitry, structured vibration feedback and synthetic-data-driven learning. With further refinement in breathability and onboard computation, this approach could extend digital interaction into contexts where screens and sound are limited or unavailable.