Neuromorphic electronics mimicking the sensory cue integration in the brain

Inspired by multisensory cue integration in the mammalian brain for spatial perception, a motion-cognition artificial nerve was developed using neuromorphic devices, and its perceptual performance closely matches biological principles.
Bumblebees can recognize objects across visual and tactile modalities. Star-nosed moles perceive their surroundings through touch and smell in lightless underground environments. Many animal species, especially mammals, demonstrate a multisensory integration ability that allows them to combine multiple sensory signals. Such sensory cue integration in the brain helps animals complete complex tasks and improves their perceptual performance.

Is it possible to implement the brain's sensory cue integration in an electronic device? Neuromorphic electronics may shed light on this question. Recent research has reported artificial synapses and artificial sensory nerves that emulate the basic working principles of biological neurons, in which information is processed in a parallel, asynchronous, event-driven manner. However, the fundamental principles related to multisensory integration in the brain, namely spatiotemporal recognition, perceptual enhancement, inverse effectiveness, and perceptual weighting, have not been sufficiently investigated. Moreover, cognitive intelligence needs to be combined with multisensory integration to enable applications in wearable electronics and advanced robotics, where real-time recognition and energy-efficient sensing are required.
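To give a feel for the perceptual-weighting principle mentioned above, here is a toy sketch (not code from the paper) of the classic maximum-likelihood model of cue integration: two noisy estimates of the same quantity, say heading from vision and from the vestibular sense, are fused by inverse-variance weighting, and the fused estimate is less variable than either cue alone. All numbers below are assumed values for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
true_heading = 10.0                    # degrees, ground truth (assumed)
sigma_vis, sigma_vest = 2.0, 3.0       # per-cue noise levels (assumed)

# Draw many noisy unimodal estimates of the same heading
n = 100_000
vis = true_heading + rng.normal(0.0, sigma_vis, n)
vest = true_heading + rng.normal(0.0, sigma_vest, n)

# Fuse by reliability (inverse-variance) weighting
w_vis = (1 / sigma_vis**2) / (1 / sigma_vis**2 + 1 / sigma_vest**2)
fused = w_vis * vis + (1 - w_vis) * vest

# Theory predicts sigma_fused = sqrt(s1^2 * s2^2 / (s1^2 + s2^2))
sigma_pred = (sigma_vis**2 * sigma_vest**2 / (sigma_vis**2 + sigma_vest**2)) ** 0.5
print(f"vis {vis.std():.2f}, vest {vest.std():.2f}, "
      f"fused {fused.std():.2f} (predicted {sigma_pred:.2f})")
```

The fused standard deviation falls below that of either cue, a simple formalization of the "perceptual enhancement" that multisensory integration provides.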

In our recent work published in Nature Communications, a motion-cognition neuromorphic nerve was developed to achieve brain-like multisensory integration. In the macaque monkey, self-motion through the environment evokes inertial stimuli in the vestibular organs of the inner ear and visual stimuli in the retina (Figure 1). The vestibular-inertia and visual-speed information is converted into spike trains carrying different spatiotemporal patterns and then processed in neural and synaptic networks during sensory perception. Integrating information from these two sensory modalities enhances neural and behavioral responses related to motion and spatial perception. Analogous to the macaque's ocular-vestibular system, acceleration and angular-speed signals are obtained by an accelerometer and a gyroscope in our system (Figure 1). The motion signals are first converted into two temporally correlated spike trains and then processed by a multi-input synaptic device. The temporal pattern of the two spike trains determines the output of the synaptic device. Motion information is recognized by averaging the spike firing rate and reading the device output in an event-based manner. As a result, neuromorphic perception of motion information through multisensory cue integration is achieved.
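A minimal sketch of this signal path, with made-up parameters and a simple two-input leaky integrator standing in for the multi-input synaptic device (the actual device physics is far richer), might look like the following: two toy motion channels are rate-coded into spike trains, the spikes drive a decaying state variable, and the combined firing rate is read out over a time window.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, T = 1e-3, 1.0                      # 1 ms steps, 1 s window (assumed)
t = np.arange(0.0, T, dt)

accel = np.abs(np.sin(2 * np.pi * 2 * t))   # toy acceleration magnitude
gyro = np.abs(np.cos(2 * np.pi * 2 * t))    # toy angular-speed magnitude

def rate_code(signal, max_rate=100.0):
    """Poisson-style rate coding: spike probability per step tracks amplitude."""
    p = np.clip(signal / signal.max(), 0.0, 1.0) * max_rate * dt
    return (rng.random(signal.shape) < p).astype(float)

spikes_a = rate_code(accel)
spikes_g = rate_code(gyro)

# Two-input leaky integrator as a stand-in for the synaptic device:
# each incoming spike increments the state, which decays exponentially,
# so the temporal pattern of the two spike trains shapes the output.
tau, w_a, w_g = 50e-3, 1.0, 1.0        # assumed time constant and weights
state = np.zeros_like(t)
for i in range(1, len(t)):
    state[i] = state[i - 1] * np.exp(-dt / tau) + w_a * spikes_a[i] + w_g * spikes_g[i]

mean_rate = (spikes_a.sum() + spikes_g.sum()) / T   # combined input rate (Hz)
print(f"mean input rate ~ {mean_rate:.0f} Hz, peak device state ~ {state.max():.2f}")
```

Reading `state` only when spikes arrive would give the event-based readout described above; a classifier on the averaged firing rate would then complete the recognition step.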

Figure 1. Bioinspired motion-cognition neuromorphic nerve compared with the ocular-vestibular cross-modal sensory nerve of macaques.

As an extension, we investigated multisensory integration of cues obtained from entirely different sensors. Implementing the sensing modules with optical-flow, vibrotactile, and inertial sensors allows the detection of multimodal signals corresponding to visual, tactile, and vestibular cues. Thanks to its flexible and portable design, our system can be readily attached to human skin or wirelessly linked to an aerial robot to complete motion-perception tasks, including human activity recognition and drone flight-mode classification. Compared with unimodal conditions, in which each sensory cue is processed separately, bimodal integration of different cues improves recognition accuracy and realizes perceptual enhancement.
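A toy example (not the paper's task, data, or classifier) of why bimodal integration can beat unimodal sensing: two classes whose unimodal feature distributions overlap completely become separable in the joint feature space. The prototypes, noise level, and 1-nearest-neighbour readout below are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Class 0 clusters around (0,0) and (1,1); class 1 around (0,1) and (1,0).
# Projected onto either single modality, the two classes look identical.
protos = np.array([[0, 0], [1, 1], [0, 1], [1, 0]], dtype=float)
labels = np.array([0, 0, 1, 1])
n_per = 50
X = np.concatenate([p + rng.normal(0.0, 0.1, size=(n_per, 2)) for p in protos])
y = np.repeat(labels, n_per)

def nn_accuracy(features):
    """Leave-one-out 1-nearest-neighbour accuracy."""
    correct = 0
    for i in range(len(features)):
        d = np.linalg.norm(features - features[i], axis=-1)
        d[i] = np.inf                  # exclude the query point itself
        correct += y[d.argmin()] == y[i]
    return correct / len(features)

acc_uni1 = nn_accuracy(X[:, :1])       # modality 1 alone
acc_uni2 = nn_accuracy(X[:, 1:])       # modality 2 alone
acc_bi = nn_accuracy(X)                # both modalities jointly
print(f"unimodal: {acc_uni1:.2f}, {acc_uni2:.2f}; bimodal: {acc_bi:.2f}")
```

Each unimodal readout hovers near chance because both classes share the same one-dimensional mixture, while the bimodal readout separates the clusters almost perfectly, a caricature of the recognition-accuracy gain our system obtains from cue integration.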

Our system serves as a general neuromorphic platform for emulating multisensory neural processing in the brain. It is biologically plausible in that it emulates sensory cue integration, realizes cognitive functions, and shows perceptual performance consistent with the biological principle of perceptual enhancement through multisensory integration. We envision that our system offers a new paradigm for combining cognitive neuromorphic intelligence with multisensory perception, with applications in sensory robotics, smart wearables, and human-interactive devices.

The original article can be found here:

Jiang, C. et al. Mammalian-brain-inspired neuromorphic motion-cognition nerve achieves cross-modal perceptual enhancement. Nat. Commun. 14, 1344 (2023).

https://doi.org/10.1038/s41467-023-36935-w
