In‐Sensor Computing with Visual‐Tactile Perception Enabled by Mechano‐Optical Artificial Synapse

DOI: 10.1002/adma.202419405 Publication Date: 2025-03-11T22:47:18Z
ABSTRACT
The in-sensor computing paradigm holds promise for rapid, low-power signal processing. Constructing crossmodal in-sensor computing systems that emulate human sensory and recognition capabilities has been a persistent pursuit in the development of humanoid robotics. Here, an artificial mechano-optical synapse is reported that implements in-sensor dynamic computing with visual-tactile perception. By employing a mechanoluminescence (ML) material, mechanical signals are converted directly into light emission, which is transported to an adjacent photostimulated luminescence (PSL) layer without pre- or post-irradiation. The PSL layer acts both as a photon reservoir and as a processing unit for in-memory computing. This approach, based on ML coupled with a PSL material, differs from traditional circuit-constrained methods, enabling remote operation and easy accessibility. Individual and synergistic plasticity are investigated in detail under force and light pulses, including paired-pulse facilitation, learning behavior, and short-term and long-term memory. A multisensory neural network is built to process handwritten patterns obtained with a tablet consisting of the device, achieving a recognition accuracy of up to 92.5%. Moreover, material identification based on visual-tactile sensing is explored, reaching an accuracy of 98.6%. This work provides a promising strategy for constructing in-sensor computing systems with crossmodal integration and recognition.
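The abstract describes a multisensory neural network that fuses visual and tactile signals for recognition, but does not specify its architecture. The sketch below is a hypothetical, minimal illustration of feature-level crossmodal fusion: each modality passes through its own small encoder, the resulting features are concatenated, and a softmax layer classifies the fused representation. All layer sizes, weights, and function names here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the paper does not report its network sizes).
N_VISUAL, N_TACTILE, N_HIDDEN, N_CLASSES = 64, 16, 32, 10

def relu(x):
    # Elementwise rectified linear activation.
    return np.maximum(x, 0.0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Separate (randomly initialized) encoders for each modality,
# followed by a classifier over the concatenated features.
W_visual = rng.normal(scale=0.1, size=(N_VISUAL, N_HIDDEN))
W_tactile = rng.normal(scale=0.1, size=(N_TACTILE, N_HIDDEN))
W_out = rng.normal(scale=0.1, size=(2 * N_HIDDEN, N_CLASSES))

def fuse_and_classify(visual, tactile):
    """Encode each modality, concatenate (feature-level fusion), classify."""
    h_v = relu(visual @ W_visual)
    h_t = relu(tactile @ W_tactile)
    fused = np.concatenate([h_v, h_t], axis=-1)
    return softmax(fused @ W_out)

# One synthetic visual-tactile sample.
probs = fuse_and_classify(rng.normal(size=N_VISUAL), rng.normal(size=N_TACTILE))
print(probs.shape, float(probs.sum()))
```

In practice such a network would be trained (e.g. by backpropagation) on recorded device responses; the point of the sketch is only the fusion topology, in which both sensory streams contribute features to a single decision.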