Solving the Mystery of the Sonar of Dolphins?

by Douglas Moreman

quasi-passive sonar

Videos of Simulations


To know a solution exists can be a giant step towards finding a solution. -- Anonymous Mathematician

By the end of 2001, dolphins had made it clear, or clear enough to me, that enough information arrives at a dolphin's lower jaw to enable it to roughly "see" by means of echoes of its clicks.
So, when I was struck by a first hunch of how information about the shapes of objects might be extracted from echoes, I assumed that I might be onto something important and went after it. And, indeed, I was able to simulate a mechanism that produced images.

I used the good graces of the US Patent Office to publish a hypothetical biological mechanism, or enough of one, along with some possibly practical ways to imitate it to compute images from echoes of clicks like those of dolphins:
The U.S. Patent "Echo scope" was granted in 2008.
Its approach rests on a hypothesis of massively parallel processing by a set of special neurons, which I call "torons," in the brains of dolphins.
More recently, I discovered statistical power in neurons working in groups. Now in 2018, I am attempting to publish, in patents, an upgrade that increases the opportunities for practical applications.

The methods of solution proposed here begin with this idea:
rather than working from the ears, torons compare times-of-arrival ("toas") of echoes that arrive at echotrigger sensors in the dolphin's jaw (perhaps just in the chin).
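
Below is a minimal sketch, in Python, of one way to read that idea. It assumes, purely for illustration, that each hypothetical toron is tuned to one fixed point in space and "fires" when the measured toa differences at the sensors match the differences that geometry predicts for that point; the sensor positions, sound speed, and tolerance are invented numbers, not claims about dolphins or the patented method.

# A minimal sketch of the toron idea described above: a hypothetical
# toron tuned to one fixed point in space "fires" when measured
# differences in times-of-arrival (toas) at the jaw sensors match the
# differences that geometry predicts for that point. All numbers are
# illustrative assumptions.

import numpy as np

SOUND_SPEED = 1500.0  # meters per second, roughly, in sea water


def predicted_toa_differences(point, sensors):
    """Arrival-time differences (relative to sensor 0) for an echo
    coming from `point` to each sensor position in `sensors`."""
    times = np.linalg.norm(sensors - point, axis=1) / SOUND_SPEED
    return times - times[0]


def toron_fires(point, sensors, measured_toas, tolerance=2e-6):
    """Return True if the measured toas are consistent, within
    `tolerance` seconds, with an echo scattered from `point`."""
    measured_diffs = measured_toas - measured_toas[0]
    predicted = predicted_toa_differences(point, sensors)
    return np.all(np.abs(measured_diffs - predicted) < tolerance)


if __name__ == "__main__":
    # Four sensors spread along a hypothetical "jaw," in meters.
    sensors = np.array([[0.00, 0.00, 0.0],
                        [0.05, 0.01, 0.0],
                        [0.10, 0.02, 0.0],
                        [0.15, 0.01, 0.0]])
    target = np.array([2.0, 0.5, 0.3])  # an illustrative scatterer

    # Simulate toas of one echo feature at each sensor.
    toas = np.linalg.norm(sensors - target, axis=1) / SOUND_SPEED

    print(toron_fires(target, sensors, toas))                   # True: toas fit this point
    print(toron_fires(target + [0.0, 0.5, 0.0], sensors, toas)) # False: half a meter off to the side

Note that with an array this small the matching constrains direction far more tightly than range, which is why the failing test point above is displaced sideways rather than farther out along the line of sight.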

Experiments with my simulators suggest:
one can replace simulated echoes with real echoes and see, not a simulated target, but an object that a real dolphin is actually clicking on. Perhaps some day we will see such imaging on display in dolphin aquaria.

A new invention, spun off from more general developments, is a simply-built improvement on existing fish-finders:
Streaming Fish-Finder in 3D.

Notes on progress in 2014 and 2015.

Mathematics of the Sonar of Dolphins

This new computational method for sonar and radar, Feature-Based Passive (FBP), enables computation of an image from a single "click" like that of a dolphin. The waves used for imaging are not limited to sound. Ideas for applications include
* the world's best fish-finder and an analogous radar application that can help locate sources of enemy fire.
* high-speed, inexpensive imaging for medical triage.

The new approach to computing images from waves has been inspired by the Echotrigger/Toron Theory of the imaging sonar of dolphins. But, since the mathematics applies to waves other than those of sound, a new name, wavar, has been adopted to refer to the general principles.

Wavar, here, is being developed using "experiment-machines" -- simulation software for rapidly crafting and running experiments that probe for information in waves.

The approach, here, is "geometric" in that its calculations use geometry and not sophisticated methods such as those of Fourier analysis.

The new (I think) and simple signal-processing methods herein are "feature-based" and "passive" in that they use times-of-arrival of known features of particular waves and can use, but do not require, knowledge of time and place of emission of those waves.
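
The sketch below makes that passivity concrete, using the same invented, toron-style scoring as the earlier example: only differences in times-of-arrival enter, so an unknown emission time cancels out. The sensor layout, the grid, and the Gaussian scoring are illustrative choices of mine, not the method of the "Echo scope" patent.

# A small, illustrative sketch of passive, feature-based image formation.
# Only differences in times-of-arrival are used, so the unknown emission
# time cancels out. Sensor layout, grid, and scoring are illustrative.

import numpy as np

SOUND_SPEED = 1500.0  # meters per second, roughly, in sea water


def passive_image(sensors, measured_toas, xs, ys, z=0.0, sigma=2e-6):
    """Score each grid point by how well its predicted toa differences
    match the measured ones; return a 2D array of scores (an image)."""
    measured_diffs = measured_toas - measured_toas[0]
    image = np.zeros((len(ys), len(xs)))
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            point = np.array([x, y, z])
            times = np.linalg.norm(sensors - point, axis=1) / SOUND_SPEED
            err = (times - times[0]) - measured_diffs
            # Bright where the mismatch is small, dark where it is large.
            image[i, j] = np.exp(-np.sum(err ** 2) / (2.0 * sigma ** 2))
    return image


if __name__ == "__main__":
    sensors = np.array([[0.00, 0.00, 0.0],
                        [0.05, 0.02, 0.0],
                        [0.10, 0.00, 0.0],
                        [0.15, 0.02, 0.0],
                        [0.07, -0.03, 0.0]])
    target = np.array([1.5, 0.8, 0.0])  # an illustrative scatterer

    # Simulated toas of one echo feature; the 0.123 s offset stands in
    # for the unknown emission time and never affects the image.
    toas = np.linalg.norm(sensors - target, axis=1) / SOUND_SPEED + 0.123

    xs = np.linspace(0.5, 2.5, 41)   # grid chosen to contain the target
    ys = np.linspace(-0.5, 1.5, 41)
    img = passive_image(sensors, toas, xs, ys)
    row, col = np.unravel_index(np.argmax(img), img.shape)
    print("brightest pixel at about:", xs[col], ys[row])  # near (1.5, 0.8)

Because the array is small compared with the range, the bright region is elongated along the line of sight; bearing comes easily to such a scheme, while range is the hard part.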

It seems that in most species of toothed whale for which sonar clicks have been recorded and graphed, the clicks have one prominent instance of a feature called a fang.

A fang is a change in loudness that goes from a low to a high and back to a low in, typically, about 1/100,000 of a second and is much greater than all the other low-to-high transitions in the click. Given the shape of some feature (the fang, perhaps), but not knowing the time or the place of emission of a click, FBP can nonetheless make a picture from echoes arriving at an array of sensors.
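
Here is a simplified sketch of how a fang's time-of-arrival might be picked out of a recording, under the assumption that the fang shows up as the largest rise in the smoothed amplitude envelope over roughly a 10-microsecond window. The synthetic click, the sample rate, and the window length are illustrative, not dolphin data and not the detector of the patent.

# A simplified "fang" detector: the fang is taken to be the largest
# low-to-high rise in the amplitude envelope over roughly a
# 10-microsecond window. All signals and numbers are illustrative.

import numpy as np

SAMPLE_RATE = 1_000_000  # samples per second (1 MHz), illustrative


def fang_time(signal, window_s=1e-5):
    """Return the time (s) at which the amplitude envelope rises most
    sharply over a window of `window_s` seconds."""
    window = max(1, int(window_s * SAMPLE_RATE))
    # Crude envelope: rectified signal smoothed over the window.
    envelope = np.convolve(np.abs(signal), np.ones(window) / window, mode="same")
    # Rise of the envelope across the window length.
    rise = envelope[window:] - envelope[:-window]
    return np.argmax(rise) / SAMPLE_RATE


if __name__ == "__main__":
    # Synthetic "click": low-level noise with one brief, loud burst.
    t = np.arange(0, 2e-3, 1.0 / SAMPLE_RATE)
    rng = np.random.default_rng(0)
    signal = 0.05 * rng.standard_normal(t.size)
    burst = (t > 1.0e-3) & (t < 1.01e-3)                      # ~10 microseconds long
    signal[burst] += np.sin(2 * np.pi * 120_000 * t[burst])   # 120 kHz tone

    print(f"fang arrives near t = {fang_time(signal):.6f} s")  # roughly 0.001 s

Feeding the detected times from each sensor of an array into the toron test sketched earlier is the link between this feature detector and the imaging examples.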

The potential applications of geometric sonar include all areas of sonar, radar, exploration seismology, and medical imaging. And more.

Douglas Moreman

Sonar of Dolphins

Applications of This New Technology.
The Echotrigger/Toron Theory, How Neurons Can Image with Sound
Why "Echolocation" Cannot Explain the Sonic Vision of Dolphins.
Abstract: Hypothetical Neurons.
Sample Program for Experimenting.
Odds and Ends.
"Sonic Imaging," presented at a meeting at Tulane University in 2005. The animation of 2005 (below) represents the possible functioning of a first version of "the world's best fish-finder" -- it operated in "active mode."

Animations (also given at the top of this page).
Animation of active sonar at Tulane in 2005.
A first animation of FBP based on runs of a simulator, April 2013. Uses passive mode in the top view, active mode in the side view.
Cleaned-up FBP images, simply obtained (2013).
Animation of pure passive Feature-Based sonar, May 2013.

Thanks, in memoriam, to two tutors in the use of sounds for detecting and imaging:
John Gitt, sonar.
Donald Haefner, seismology.

This web site was begun in April, 2013.