Scientists can now predict what a bird will sing next based on its thoughts

Researchers have managed to read a bird’s brain and predict what it will sing next, using a brain-computer interface and machine learning. The breakthrough could lead to a machine that can read your mind.

The group, from the University of California San Diego (UCSD), has explained how the system works in a new paper published on the preprint server bioRxiv. The technique worked by using electrodes to monitor the birds’ neural activity. That activity, together with the song it produced, was then fed into neural network software, a type of machine learning.

The software was trained to match the neural activity to the song, and it managed to predict what the bird would sing around 30 milliseconds before the bird sang it.
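
The paper itself is not a code release, but the general shape of the approach can be sketched. The Python toy below trains a small feedforward network to map a short window of multi-electrode activity onto the sound features produced a fixed interval later. Every number and dimension in it is an assumption made for illustration, and the data are synthetic; the real system used recordings from singing birds and a more sophisticated decoder.

```python
# Hypothetical sketch, not the authors' code: learn to predict upcoming
# sound features from a window of past neural activity. All data and
# dimensions here are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

n_channels = 32   # assumed electrode count
window = 20       # assumed number of past activity bins fed to the network
n_feats = 8       # assumed low-dimensional description of the sound
lead = 20         # lookahead in bins (~30 ms if each bin is ~1.5 ms)

# Synthetic stand-in data with a "premotor" structure: activity at time t
# shapes the sound `lead` bins later, as it does in a singing bird.
T = 3000
neural = rng.normal(size=(T, n_channels))
mixing = rng.normal(scale=0.5, size=(n_channels, n_feats))
song = np.zeros((T, n_feats))
song[lead:] = np.tanh(neural[:-lead] @ mixing) \
    + 0.05 * rng.normal(size=(T - lead, n_feats))

# Training pairs: (window of past neural activity) -> (future sound features).
idx = np.arange(window - 1, T - lead)
X = np.stack([neural[t - window + 1:t + 1].ravel() for t in idx])
Y = song[idx + lead]

# One hidden layer, trained with full-batch gradient descent on MSE.
hidden, lr = 64, 0.1
W1 = rng.normal(scale=1 / np.sqrt(X.shape[1]), size=(X.shape[1], hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=1 / np.sqrt(hidden), size=(hidden, n_feats))
b2 = np.zeros(n_feats)

for epoch in range(300):
    H = np.tanh(X @ W1 + b1)          # hidden activations
    pred = H @ W2 + b2                # predicted future sound features
    err = pred - Y
    if epoch % 100 == 0:
        print(f"epoch {epoch}: MSE = {(err ** 2).mean():.4f}")
    # Backpropagate the mean-squared error.
    gW2, gb2 = H.T @ err / len(X), err.mean(axis=0)
    gH = (err @ W2.T) * (1 - H ** 2)
    gW1, gb1 = X.T @ gH / len(X), gH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print(f"final MSE = {(err ** 2).mean():.4f}")
```

Predicting a short interval into the future, rather than reconstructing the sound after the fact, is what would make a decoder like this usable in a real-time prosthesis.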

While it is not the first time an animal’s sounds and brain activity have been linked, it is an important breakthrough because birdsong shares many features with human speech. Both are learned from older members of the species, and both are complex compared with, say, the grunts made by monkeys.

Makoto Fukushima, a fellow at the National Institutes of Health who has studied the sounds made by monkeys, told Technology Review that this is why the new results have “important implications for application in human speech.”

The advance that made the new paper possible was a simplification of the analysis of brain activity: rather than trying to decode raw sound directly, the researchers used a physical model of how birds produce song. The same simplification could potentially be used to decipher the complex process that goes into creating human speech.
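
The gain from the physical model is one of dimensionality: raw audio has tens of thousands of samples per second, while models of the songbird vocal organ can regenerate song from a couple of slowly varying control parameters, roughly corresponding to air pressure and vocal-fold tension, so a decoder only has to predict those. The Python toy below illustrates the idea with a van der Pol-style oscillator chosen for illustration; it is a crude stand-in, not the authors’ equations.

```python
# Toy illustration, not the authors' model: a two-parameter oscillator
# standing in for a physical model of the songbird vocal organ. A decoder
# would only need to predict the two control signals, not the waveform.
import numpy as np

def synthesize(alpha, omega, dt=5e-5):
    """Generate a waveform from two per-sample control parameters.

    alpha : gates phonation; > 0 sustains oscillation, <= 0 damps it out
    omega : instantaneous angular frequency (pitch) in rad/s
    Integrated with semi-implicit Euler for numerical stability.
    """
    x, y = 1e-3, 0.0
    out = np.empty(len(alpha))
    for i in range(len(alpha)):
        if alpha[i] > 0:
            # Van der Pol-style self-sustained oscillation ("singing").
            y += dt * (alpha[i] * (1.0 - x * x) * y - omega[i] ** 2 * x)
        else:
            # Plain damped oscillator: the sound dies out ("silence").
            y += dt * (alpha[i] * y - omega[i] ** 2 * x)
        x += dt * y
        out[i] = x
    return out

# A short "syllable": phonation on for 150 ms, pitch sweeping 400 -> 800 Hz.
n = 4000                                           # 200 ms at dt = 50 us
alpha = np.where(np.arange(n) < 3000, 600.0, -600.0)
omega = 2 * np.pi * np.linspace(400.0, 800.0, n)
waveform = synthesize(alpha, omega)
```

Decoding two smooth control signals instead of a full waveform is the kind of simplification the paper describes; the analogue for human speech would be a physical model of the human vocal tract.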

The researchers hope it will lead to a brain-machine interface that could read people’s minds and help those who cannot otherwise communicate.

“Brain Machine Interfaces (BMIs) hold promise to restore impaired motor function and, because they decode neural signals to infer behaviour, can serve as powerful tools to understand the neural mechanisms of motor control,” the authors said.

“Yet complex behaviours, such as vocal communication, exceed state-of-the-art decoding technologies which are currently restricted to comparatively simple motor actions.” 

The new paper is a step towards getting past these restrictions. 

Companies like Facebook are already pursuing research in this area. For example, Facebook has said it hopes people will one day be able to type directly from their brains. Tesla founder Elon Musk started a company called Neuralink in April this year, which also hopes to create brain-machine interfaces.
