Brainwaves rock! Scientists decode Pink Floyd tune straight from the noggin

First up: Another Brick in the Wall, Pt. 1


A group of scientists say they are the first to reconstruct a recognizable song from data collected directly from the brain by monitoring electrical activity and modeling the resultant patterns with regression-based decoding models.

That song is Pink Floyd's "Another Brick in the Wall, Pt. 1" and the question on the mind of the researchers was:

What information about the outside world can be extracted from examining the activity elicited in a sensory circuit and which features are represented by different neural populations?

The research appeared on Tuesday in PLOS Biology. A 190.72-second excerpt of the song was delivered through in-ear monitor headphones, at a volume set individually for each of the 29 patients, between 2009 and 2015.

Because all patients had pharmacoresistant epilepsy, they were each already equipped with electrodes placed on the surface of their brain to look for seizures before the team's research even began. Collectively the group had 2,668 electrodes.

By playing the music and recording the output, the electrodes provided an intracranial electroencephalography (iEEG) data set the boffins examined to determine the parts of the brain stimulated by certain frequencies within the song.

Of the electrodes, 347 ended up being significant in encoding the song's acoustics. They were concentrated in three distinct regions, with a higher proportion skewed toward the right side of the brain. The 16th-note rhythm guitar pattern present in the song particularly activated a certain region of the brain in the temporal lobe – a section linked with processing auditory information and encoding memory.

"These results are in accord with prior research, showing that music perception relies on a bilateral network, with a relative right lateralization," wrote the study authors.

The reconstruction of the song involved training 128 regression models, one for each frequency band of the tune's spectrogram, whose outputs were then assembled into a whole. Previous studies had done something similar with speech, but hadn't captured enough nuance to piece together music.
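To give a flavor of that per-band approach, here is a minimal, hypothetical sketch in Python. The array shapes, the choice of ridge regression, and the random stand-in data are illustrative assumptions only, not the authors' actual pipeline or features.

```python
# Minimal sketch of per-frequency-band regression decoding (not the study's code).
# Assumed setup: X holds iEEG-derived features, shape (n_timepoints, n_features),
# and Y is the song's spectrogram, shape (n_timepoints, 128 frequency bands).
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_timepoints, n_features, n_bands = 5000, 347, 128    # illustrative sizes only
X = rng.standard_normal((n_timepoints, n_features))    # stand-in for electrode features
Y = rng.standard_normal((n_timepoints, n_bands))       # stand-in for the spectrogram

# Hold out the final 20 percent of the recording for reconstruction.
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.2, shuffle=False)

# Fit one linear model per frequency band, then stack the band-by-band
# predictions back into a full spectrogram for the held-out segment.
reconstructed = np.column_stack([
    Ridge(alpha=1.0).fit(X_train, Y_train[:, band]).predict(X_test)
    for band in range(n_bands)
])
print(reconstructed.shape)  # (n_test_timepoints, 128): a decoded spectrogram
```

The decoded spectrogram would then need to be inverted back into audio; the point of the sketch is simply that 128 independent regressions, one per band, can jointly recover something song-shaped.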

"Music is core to human experience, yet the precise neural dynamics underlying music perception remain unknown," said the researchers. Indeed, music is layered with complexity. Notes, rhythm, harmony, and chords all play a role in evoking an emotional response to music in addition to the phonemes, syllables, words, semantics, and syntax that speech offers.

"Another Brick in the Wall, Pt. 1" was a particularly apt song choice. Its crafted instrumentals complemented by a comparatively sparse 41 seconds of lyrics make it what the boffins described as a "rich and complex auditory stimulus."

Plus, it was reportedly tolerable to the oldies in the group.

The reconstructed bits are available to listen to for anyone who doesn't find it creepy to listen to classic rock as interpreted in someone else's head.

Once that hurdle is cleared, those with an ear for Pink Floyd's 11th album can pick out a distorted version of the melody, an appropriate rhythm, and even some lyrics.

Had there been more electrodes, or had they been more precisely placed, the researchers believe the reconstructed song would likely have been even clearer. Further work could achieve this, as well as vary the decoding models' features and targets or add a behavioral dimension.

Applications of such research could help those struggling to communicate because of stroke-induced paralysis, diseases like ALS, or other conditions, by artificially replicating elements of speech better than current technologies allow. ®
