Music is a direct pathway to emotions and memories stored deep in the brain, and there is a profound connection between music and the brain's neurons. Now, according to a new study, a person's brain activity can even be used to reconstruct the music they are hearing.
The study, published in the journal PLOS Biology, suggests a direct link between brain activity and music perception that researchers believe could revolutionize the technology that helps speech-impaired people speak.
Such devices, known as neuroprostheses, already help people with paralysis compose text by merely imagining writing it, and some allow users to construct entire sentences with their thoughts alone. When it comes to speech, however, these devices have struggled to capture the natural rhythm and emotional nuance of spoken language, known as "prosody."
Until now, studies haven't been able to produce a natural, human-like sound; the result has been mechanical speech that lacks proper intonation.
Because music naturally combines rhythmic and harmonic elements, the team used it to build a model for decoding and reconstructing sound with richer prosody, and they managed to decode a song from a patient's brain recordings.
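The core idea behind this kind of stimulus reconstruction can be illustrated with a toy example: fit a regression model that maps recorded neural activity to the spectrogram of the sound being heard, then use that model to reconstruct audio features from new recordings. The Python sketch below is a minimal, hypothetical illustration only; the array shapes, random data, and choice of ridge regression are assumptions for demonstration, not the authors' actual pipeline.

```python
# Illustrative sketch only: a simple regression-based decoder mapping
# neural recordings to an audio spectrogram. All data, shapes, and the
# model choice here are hypothetical; the published method is more elaborate.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical data: 2000 time points of activity from 128 electrodes,
# paired with a 32-band spectrogram of the song being heard.
neural = rng.standard_normal((2000, 128))      # (time, electrodes)
spectrogram = rng.standard_normal((2000, 32))  # (time, frequency bands)

# Hold out the final 20% of the recording for evaluation (no shuffling,
# since the data are time series).
X_train, X_test, y_train, y_test = train_test_split(
    neural, spectrogram, test_size=0.2, shuffle=False
)

# Linear decoding: predict each spectrogram band from electrode activity.
decoder = Ridge(alpha=1.0)
decoder.fit(X_train, y_train)
reconstruction = decoder.predict(X_test)

# Correlation between actual and reconstructed bands gauges decoding quality.
r = [np.corrcoef(y_test[:, b], reconstruction[:, b])[0, 1] for b in range(32)]
print(f"mean band correlation: {np.mean(r):.3f}")
```

With real paired neural and audio data, a reconstructed spectrogram like this could then be converted back into an audible waveform, which is what lets researchers "play back" a decoded song.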
“Right now, the technology is more like a keyboard for the mind,” lead author Ludovic Bellier, of the University of California, Berkeley, said in a statement. “You can’t read your thoughts from a keyboard. You need to push the buttons. And it makes kind of a robotic voice; for sure there’s less of what I call expressive freedom.”
Researchers are optimistic that their findings could improve brain-computer interface technology.
“As this whole field of brain-machine interfaces progresses, this gives you a way to add musicality to future brain implants for people who need it,” explained Robert Knight, a UC Berkeley professor of psychology in…