The human brain can learn to see with sound
By clicking their tongues like bats, humans can navigate pitch-black rooms and identify objects as small as a drink can using only reflected sound waves.
When Daniel Kish clicks his tongue, he isn't just making a noise; he is casting an invisible net of sound into his surroundings. Kish, who lost his eyesight as an infant, is a pioneer of human echolocation, a technique that allows the brain to map physical space by listening to how sound bounces off objects. By producing sharp pulses between 2 and 10 kilohertz, practitioners like Kish can detect a pole from several meters away or distinguish a brick wall from a wooden fence. The underlying physics is precise: the brain measures the tiny delay between the click and its returning echo to pinpoint an object's distance to within a single centimeter.
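The delay-to-distance arithmetic is simple: the click travels out and back, so the one-way distance is half the round-trip time multiplied by the speed of sound. A minimal sketch of that calculation (the function name and the 343 m/s constant are illustrative assumptions, not from the article):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C (assumed value)

def distance_from_echo(delay_s: float) -> float:
    """Estimate an object's distance in meters from the click-to-echo delay.

    The sound covers the distance twice (out and back), hence the division by 2.
    """
    return SPEED_OF_SOUND * delay_s / 2

# A pole 2 m away returns an echo after about 11.7 ms:
#   round trip = 4 m, 4 / 343 ≈ 0.0117 s
print(distance_from_echo(0.0117))
```

By the same relation, resolving distance to a single centimeter means resolving the echo delay to about 58 microseconds (2 × 0.01 m ÷ 343 m/s), which gives a sense of the timing precision the auditory system would need.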