Looking for less, Seeing more - a dolphin-inspired sonar

In this work, we developed an improved sonar imaging system that visualises and interprets sonar echoes better than conventional sonars. How did we get there? We drew inspiration from nature - dolphins!

Dolphins scan their environment acoustically by transmitting sound pulses, called echolocation clicks, at nearby objects. The echoes are received by the dolphin, giving it an acoustic sense of its surroundings, like a directional acoustic flashlight. This biological sonar is powerful, and superior to man-made imaging sonars of comparable size. The dream of replicating its performance is not lost on scientists, who have been trying to analyse, reverse-engineer and replicate it by asking: "What makes the dolphin sonar so powerful?"

Wouldn’t it be great if we could create a small, effective imaging sonar the size of a dolphin's head, one that could be mounted on underwater robots or small boats and used for seabed mapping and object detection? Such a capability would be particularly important for ocean exploration, a task that is especially relevant in light of the ongoing UN Decade of Ocean Science.

This has been studied since the 1950s, and much progress has been made. Researchers have replicated some features of dolphin sonar in man-made devices and shown that these can improve our sonars. We, too, are intrigued by this problem, and have been studying dolphin biological sonar with specific emphasis on the following question: scientists have long examined the hardware aspects of the dolphin's sonar, but how much do we know about the 'software'? If millions of years of evolution have led to such effective echolocation in dolphins, they must also have developed the ability to process the echoes they receive efficiently to interpret their environment.

Unfortunately, this question is hard to answer fully without access to the dolphin brain to understand what exactly it perceives. But strides have been made on this front. One study methodology developed for this involves experiments in which the dolphin acoustically scans an object underwater, and then picks the matching object from a set of alternatives presented in air. This ensures the dolphin matches information from one sensory modality (acoustic) to another (visual), perceiving the object's shape or its features in some way.

Biomimetic sonar
The biomimetic transmitter we used to insonify the object

We conducted these experiments and 'listened in' on the dolphin while it acoustically scanned the object. We recorded the echoes, then processed and visualised them. If the dolphin used those echoes to discern the object and discriminate it from the alternatives, the echoes must contain information about the object's shape, right? We also developed a dolphin-inspired biomimetic sonar to scan the same objects the dolphin was scanning, and visualised its echoes.

Unfortunately, with conventional processing you can't see much of the object in the visualisation. This processing is too rudimentary to capture the shape features, and fails where the dolphin succeeded. Can we do better with the data available to us?
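By 'conventional processing' we mean, roughly, forming an image by summing the echo samples at the travel-time delays expected for each candidate pixel (delay-and-sum). The sketch below is a deliberately simplified toy - the array geometry, sampling rate and idealised impulse echoes are illustrative assumptions, not our actual experimental setup:

```python
import numpy as np

# Toy delay-and-sum imaging sketch (illustrative; not the real pipeline).
# Assumptions: known sound speed, colocated transmitter, one ideal point scatterer.

c = 1500.0   # sound speed in water, m/s
fs = 200e3   # sampling rate, Hz

# Receiver array: 8 hydrophones along x, 20 cm aperture
rx = np.stack([np.linspace(-0.1, 0.1, 8), np.zeros(8)], axis=1)

# Simulate an echo from a point scatterer at (0.3, 2.0) m
target = np.array([0.3, 2.0])
n = 2048
echoes = np.zeros((len(rx), n))
for i, r in enumerate(rx):
    delay = (np.linalg.norm(target) + np.linalg.norm(target - r)) / c
    echoes[i, int(round(delay * fs))] = 1.0   # idealised impulse echo

# Delay-and-sum: for each candidate pixel, sum samples at the expected delays
xs = np.linspace(-0.5, 0.5, 41)
ys = np.linspace(1.0, 3.0, 41)
image = np.zeros((len(ys), len(xs)))
for iy, y in enumerate(ys):
    for ix, x in enumerate(xs):
        p = np.array([x, y])
        acc = 0.0
        for i, r in enumerate(rx):
            d = (np.linalg.norm(p) + np.linalg.norm(p - r)) / c
            k = int(round(d * fs))
            if k < n:
                acc += echoes[i, k]
        image[iy, ix] = acc

# The brightest pixel should land near the true target position
iy, ix = np.unravel_index(np.argmax(image), image.shape)
print(xs[ix], ys[iy])
```

On this idealised single-scatterer scene the brightest pixel lands on the true target; with extended objects, noise and a small aperture, such images smear out, which is exactly the limitation we ran into.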

One way is to use prior information. This is something humans do all the time - we turn our understanding of reality into expectations that speed up our inferences and decisions. For example, in the absence of other information, the human eye/brain assumes the light falling on an object comes from 'above'. Could the dolphin be using priors in its processing, and are there priors we can use to process the echoes better?

A schematic of the dolphin echolocating on the object during a trial, the sample objects used during the trials, and visualisations of these objects from our biomimetic sonar (with a size similar to that of a dolphin's head) using sparsity-based processing. The object features are clearly visible in the visualisations.

Well, yes - we do know something beforehand about the objects scanned in this study: they have well-defined boundaries (they are not diffuse, like, say, dust clouds), and they occupy only a small fraction of the space being scanned, i.e. the scene is 'sparse'. This information is useful, because we can now expect our visualisation of the echoes (from the dolphin and the biomimetic sonar) to contain only a few parts that correspond to an object.

We incorporated this information into our sonar processing, and the new approach works better than the conventional one, letting us see the object features, as the figure shows. Thus we get one step closer to the size-performance trade-off of the dolphins. A fist-pumping moment!
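One standard way to encode such a sparsity prior is to reconstruct the scene via l1-regularised least squares, solved with an iterative shrinkage-thresholding algorithm (ISTA). The toy sketch below uses a random measurement matrix and made-up sizes purely for illustration - it is not the exact model or algorithm from our paper:

```python
import numpy as np

# Sparse scene recovery via ISTA, a common way to enforce an l1 sparsity prior:
#   minimise  (1/2)||A x - y||^2 + lam * ||x||_1
# Illustrative only: A, lam and the problem sizes here are made-up assumptions.

rng = np.random.default_rng(0)
m, n_pix = 60, 200                       # fewer measurements than pixels
A = rng.standard_normal((m, n_pix)) / np.sqrt(m)

# Ground-truth scene: only 4 of 200 pixels are occupied (a sparse object)
x_true = np.zeros(n_pix)
x_true[[20, 75, 130, 180]] = [1.0, -0.8, 0.6, 1.2]
y = A @ x_true + 0.01 * rng.standard_normal(m)

def soft_threshold(v, t):
    """Shrink values toward zero: the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

lam = 0.02
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz const. of the gradient
x = np.zeros(n_pix)
for _ in range(500):
    grad = A.T @ (A @ x - y)             # gradient of the data-fit term
    x = soft_threshold(x - step * grad, step * lam)

# The recovered scene should be sparse and close to the truth
print(np.count_nonzero(np.abs(x) > 0.05), float(np.max(np.abs(x - x_true))))
```

The shrinkage step zeroes out most pixels each iteration, so the reconstruction is forced to match our prior: a scene with only a few occupied parts, rather than the smeared-out image a conventional least-squares fit would return.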

The success of this opens up many exciting directions for us to explore. What other 'priors' can we use to push our sonar's performance further? Exciting research directions lie ahead.

Our paper is available at https://dx.doi.org/10.1038/s44172-022-00010-x
