When sight ceases to be a reliable means of navigating through the world, hearing has an opportunity to take over. Many animals – like bats, killer whales, and dolphins – use echolocation in lieu of visual cues. Echolocation is a technique for determining the location of objects through reflected sound. Without sight in the traditional sense, an individual can use echolocation to construct an auditory map of space, heightening spatial awareness and aiding object identification. Read on to learn more about how you can “see” without eyes.
There are many ways in which sight and hearing are alike: both depend on sensory information to create a mental image, and both rely on energy in the form of waves. The speed of sound is far slower than the speed of light – slow enough that the delay between a sound and its reflection off an object, an echo, is perceptible. The longer the delay, the farther away the object. The frequency content of the reflection, in turn, gives an indication of the object's size.
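The delay-to-distance relationship can be sketched in a few lines. This is a minimal illustration, assuming sound travels at roughly 343 m/s in air at room temperature; the function name is ours, not from any of the devices discussed below:

```python
# Speed of sound in air at about 20 °C – an assumed constant for this sketch.
SPEED_OF_SOUND_M_PER_S = 343.0

def distance_from_echo(delay_s: float) -> float:
    """Return the one-way distance (m) to an object from its echo delay (s)."""
    # The sound travels out to the object and back, so the round-trip
    # delay must be halved to get the one-way distance.
    return SPEED_OF_SOUND_M_PER_S * delay_s / 2
```

For example, an echo arriving 0.02 seconds after the original sound implies an object roughly 3.43 meters away.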
Echolocation techniques can be categorized into two forms: passive and active. The passive technique allows the listener to use incidental reflections from sounds generated by external sources, whereas the active technique uses a beacon to generate a sound while the listener waits for the corresponding reflections from surrounding objects.
One beta project that aims to aid visually impaired individuals with spatial awareness and object identification is BATEYE, an experimental device that produces a soundscape using active echolocation. The device consists of an ultrasonic pulse generator and sensor, a pair of glasses, and an Arduino board. It emits an ultrasonic pulse and records the time the reflection takes to return. Once the distance is calculated from the delay, the device plays a tone back to the user, with the pitch of the tone indicating the distance to the object. It generates these tones every 5 milliseconds, and as the user gets closer to an object, the pitch increases.
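A closer-means-higher pitch mapping like BATEYE's might be sketched as a simple linear interpolation. The working range and frequency bounds below are illustrative assumptions, not BATEYE's actual parameters:

```python
def distance_to_pitch_hz(distance_m: float,
                         max_range_m: float = 4.0,
                         low_hz: float = 200.0,
                         high_hz: float = 2000.0) -> float:
    """Map a measured distance to a tone frequency: closer -> higher pitch.

    max_range_m, low_hz, and high_hz are illustrative values only.
    """
    # Clamp the distance to the sensor's assumed working range.
    d = min(max(distance_m, 0.0), max_range_m)
    # Linear interpolation: 0 m -> high_hz, max_range_m -> low_hz.
    return high_hz - (high_hz - low_hz) * (d / max_range_m)
```

With these assumed values, an object right in front of the user produces a 2000 Hz tone, while one at the edge of range (or beyond) produces 200 Hz.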
Another device is Ausion, developed in India, which looks somewhat like a bulky cellphone and uses earbuds to alert the user to where different objects are. The device can alert the user to the distance of nearby objects and, like BATEYE, uses tones of different frequencies to indicate how far away they are. The notes are drawn from svara, the Indian music equivalent of Western solfège – think do-re-mi-fa-so-la-ti-do. The technology can detect not only how far away an object is, but also whether there is a pothole nearby.
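One way such a scheme could work is to quantize distance into the seven svara notes. The bin width and the nearer-means-higher direction of the mapping below are assumptions for illustration; Ausion's actual scheme is not documented here:

```python
# The seven basic svara notes, from lowest to highest.
SVARA = ["Sa", "Re", "Ga", "Ma", "Pa", "Dha", "Ni"]

def svara_for_distance(distance_m: float, bin_m: float = 1.0) -> str:
    """Return a svara note for a distance; nearer objects get higher notes."""
    index = int(distance_m // bin_m)      # which distance band the object is in
    index = min(index, len(SVARA) - 1)    # anything beyond the last band maps to it
    # Reverse the scale so that closer objects sound higher.
    return SVARA[len(SVARA) - 1 - index]
```

With a 1 m bin, an object half a meter away would sound as the highest note ("Ni"), while anything 6 m or farther would sound as the lowest ("Sa").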
Studies conducted by the Department of Psychology at the University of Cambridge and the Vision and Eye Research Unit at Anglia Ruskin University have shown that when blind people use echolocation, the visual cortex, the part of the cerebral cortex that typically processes information from the retinas, is repurposed to process sound input. The brain is an amazing organ!
The use of echolocation shows us that sight is not the only way to maneuver through our world, and for some, sound provides a unique alternative. Now you can really be like Batman.