Human echolocation is the ability of humans to detect objects in their environment by sensing echoes from those objects.

By actively creating sounds such as tapping a cane, stomping the feet, snapping the fingers, or making mouth clicks, people trained to orient by echolocation can interpret the sound waves reflected by nearby objects, accurately identifying their location and size.

Sound and hearing are the mechanisms driving this ability.

Many blind individuals passively use natural environmental echoes to sense details about their environment.

Some blind people can actively produce mouth clicks and are able to gauge information about their environment using the echoes from those clicks.

It is through both passive and active echolocation that blind individuals sense their environment.

Sighted people, who can rely on vision to perceive their environment, often do not readily perceive echoes from nearby objects, due to an echo suppression phenomenon.

With training, sighted individuals with normal hearing can learn to avoid obstacles using only sound, showing that echolocation is a general human ability.

Vision and hearing are alike in that each interprets detections of reflected waves of energy. 

Vision processes light waves that bounce off surfaces throughout the environment and enter the eyes. 

The auditory system processes sound waves as they travel from their source, bounce off surfaces and enter the ears. 

Both neural systems can interpret the complex patterns of reflected energy that their sense organs receive.

These waves of reflected sound energy are referred to as echoes.

Echoes can convey spatial data that are comparable in many respects to those conveyed by light.

Echoes can convey very complex, detailed, and specific information about features of the world at distances far beyond the reach of the longest cane or arm.

Echoes convey information about the nature and arrangement of objects and environmental features: overhangs, walls, doorways and recesses, poles, curbs, steps, people, parked or moving vehicles, trees, and so on.

Echoes provide detailed information about an object's location (direction and distance from the observer), size (height and width), general shape, and density.
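The distance cue in particular follows directly from physics: sound travels out to the object and back, so the echo delay encodes range. A minimal sketch, assuming the standard speed of sound in dry air at about 20 °C (the function name is illustrative, not from the echolocation literature):

```python
# Round-trip echo delay -> distance. Assumes ~343 m/s, the speed of
# sound in dry air at roughly 20 degrees C.

SPEED_OF_SOUND_M_S = 343.0

def echo_distance(delay_s: float, speed: float = SPEED_OF_SOUND_M_S) -> float:
    """Distance to a reflecting object, given the click-to-echo delay.

    The delay covers the path out AND back, hence the division by two.
    """
    return speed * delay_s / 2.0

# A 20 ms delay between a mouth click and its echo corresponds to an
# object about 3.4 m away.
print(echo_distance(0.02))
```

This also illustrates why echolocation works at ranges "beyond the reach of the longest cane": even a few metres of range corresponds to a delay of tens of milliseconds, which trained listeners can exploit.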

Understanding the interrelationships of these qualities allows perception of the nature of an object or of multiple objects.

Studies of the neural basis of human echolocation report activation of primary visual cortex during echolocation, a remapping of brain regions known as neuroplasticity.

Blind echolocators who perceive objects based on echoes show activity in the areas of the brain that normally process visual information in sighted people, primarily the primary visual cortex.

Such visual areas are normally only active during visual tasks. 

The cortex of blind echolocators is plastic and reorganizes such that primary visual cortex, rather than any auditory area, becomes involved in the computation of echolocation tasks.
