EyeMusic: The App That Lets You Hear Shapes

EyeMusic is an app that allows people to hear shapes and colours, with the admirable aim of allowing blind people to ‘see’ for the very first time. Amir Amedi and his student, Ella Striem-Amit, have demonstrated that sounds can activate parts of the brain thought to be dedicated solely to visual processing, findings that may change the way we understand brain function. Indeed, this new technique suggests that the brain can be trained to process information in ways once thought impossible.
It has previously been assumed that neurons specifically designed for visual processing are simply re-assigned to other tasks when vision is lost – or was never present in the first place. This assumption underlies the commonly held belief that when we lose one sense, the others grow stronger. However, studies suggest that this is far from the truth. People who are blind from birth show no particular advantage in smell or touch, and while they may make better use of the sounds they hear, they cannot actually hear any better. Amedi’s studies cast even further doubt on this belief.
One way to ‘see’ one’s surroundings is to use sounds to build internal maps and 3D images of the area. Many blind people already use this technique, which works much like bat echolocation – making clicking noises with the tongue and listening for the echoes in order to build up a picture of the landscape. Amedi and Striem-Amit simply took this notion further. They demonstrated that blind people can be trained to recognise silhouettes of the human body in various postures – after a mere ten hours of listening to various sounds. After this surprisingly small amount of training, the brains of the research subjects showed activity in the same areas that sighted people use – areas that were previously thought to be ‘re-assigned’. All the test subjects were blind from birth, meaning there was no chance that these brain areas could have been primed or trained before vision loss, making the results of this research even more striking.
In an attempt to recreate these remarkable results outside the laboratory, Amedi has developed an iPhone app, EyeMusic. The app trains the user through a series of sounds associated with certain visual patterns and shapes, before combinations of sounds begin to suggest whole visual scenes. The app also collects performance data, so that Amedi and his team can develop future research – and even stronger results. Who knows? Perhaps in future this will be another important addition to the list of ways in which visual problems can be combated, worldwide.
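To make the idea of turning shapes into sound more concrete, here is a minimal sketch of one common sensory-substitution scheme. This is an assumption-laden illustration, not the actual EyeMusic algorithm: it scans a grayscale image column by column from left to right, maps each pixel row to a pitch (higher rows sound higher) and pixel brightness to loudness, then concatenates the column slices into one soundscape.

```python
import numpy as np


def image_to_soundscape(image, duration=2.0, sample_rate=22050,
                        f_min=220.0, f_max=1760.0):
    """Hypothetical sensory-substitution mapping (not EyeMusic's own):
    each image column becomes a short time slice of audio; within a
    slice, higher rows map to higher pitches and brighter pixels to
    louder tones."""
    image = np.asarray(image, dtype=float)
    n_rows, n_cols = image.shape
    samples_per_col = int(duration * sample_rate / n_cols)
    t = np.arange(samples_per_col) / sample_rate
    # Log-spaced frequencies so pitch steps sound even; top row = highest.
    freqs = np.geomspace(f_max, f_min, n_rows)
    slices = []
    for col in range(n_cols):
        amps = image[:, col]  # brightness -> loudness per row
        # One sine per row, scaled by that pixel's brightness, summed.
        tones = amps[:, None] * np.sin(2 * np.pi * freqs[:, None] * t)
        slices.append(tones.sum(axis=0))
    audio = np.concatenate(slices)
    peak = np.abs(audio).max()
    return audio / peak if peak > 0 else audio  # normalise to [-1, 1]


# A tiny diagonal-line "image": a rising bright pixel per column,
# which should be heard as a falling then... actually as a descending
# pitch sweep (row 0 is the highest pitch, and np.eye lights row k
# in column k, so pitch falls as the scan moves right).
img = np.eye(8)
sound = image_to_soundscape(img, duration=0.5)
```

The resulting `sound` array can be written to a WAV file or played through any audio library; the key design choice is simply that space (vertical position, brightness) is re-encoded as time, pitch, and volume.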