For the first time, scientists identify individual brain cells linked to the linguistic essence of a word

Ultra-detailed brain map shows neurons that encode words’ meaning

One set of neurons (artist's illustration) encodes the meaning of the word 'duck'; an overlapping set encodes the meaning of the word 'egg'. Credit: Juan Gaertner/Science Photo Library

By eavesdropping on the brains of living people, scientists have created the highest-resolution map yet of the neurons that encode the meaning of various words [1]. The results hint that, across individuals, the brain uses the same standard categories to classify words — helping us to turn sound into sense.

The study is based only on English words. But it’s a step along the way to working out how the brain stores words in its language library, says neurosurgeon Ziv Williams at Harvard Medical School in Boston, Massachusetts. By mapping the overlapping sets of brain cells that respond to various words, he says, “we can try to start building a thesaurus of meaning”.

The work was published today in Nature.

Mapping meaning

The brain area called the auditory cortex processes the sound of a word as it enters the ear. But it is the brain’s prefrontal cortex, a region where higher-order brain activity takes place, that works out a word’s ‘semantic meaning’ — its essence or gist.

Previous research [2] has studied this process by analysing images of blood flow in the brain, which is a proxy for brain activity. This method allowed researchers to map word meaning to small regions of the brain.

But Williams and his colleagues found a unique opportunity to look at how individual neurons encode language in real time. His group recruited 10 people about to undergo surgery for epilepsy, each of whom had had electrodes implanted in their brains to determine the source of their seizures. The electrodes allowed the researchers to record activity from around 300 neurons in each person’s prefrontal cortex.

As participants listened to multiple short sentences containing a total of around 450 words, the scientists recorded which neurons fired and when. Williams says that around two or three distinct neurons lit up for each word, although he points out that the team recorded the activity of only a tiny fraction of the prefrontal cortex’s billions of neurons. The researchers then examined how similar in meaning the words that activated the same neurons were to one another.

A neuron for everything

The words that the same set of neurons responded to fell into similar categories, such as actions, or words associated with people. The team also found that words that the brain might associate with one another, such as ‘duck’ and ‘egg’, triggered some of the same neurons. Words with similar meanings, such as ‘mouse’ and ‘rat’, triggered patterns of neuronal activity that were more similar than the patterns triggered by ‘mouse’ and ‘carrot’. Other groups of neurons responded to words associated with more-abstract concepts: relational words such as ‘above’ and ‘behind’, for instance.
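As a rough illustration of what ‘more similar patterns of activity’ means here, picture each word summarized as a vector of firing rates across the recorded neurons, with vectors then compared to one another. The Python sketch below does this with invented numbers for a handful of hypothetical neurons; it is not the authors’ analysis pipeline, just a minimal example of comparing firing patterns with cosine similarity.

```python
# Illustrative sketch only -- not the study's actual analysis.
# Assumes each word is summarized as a vector of firing rates across
# the recorded neurons; the rates below are invented for illustration.
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two firing-rate vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical firing rates (spikes per second) for five neurons.
firing_rates = {
    "mouse":  np.array([4.0, 0.5, 3.2, 0.1, 2.8]),
    "rat":    np.array([3.6, 0.7, 2.9, 0.2, 3.1]),
    "carrot": np.array([0.3, 4.1, 0.2, 3.7, 0.5]),
}

print(cosine_similarity(firing_rates["mouse"], firing_rates["rat"]))     # high similarity
print(cosine_similarity(firing_rates["mouse"], firing_rates["carrot"]))  # low similarity
```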

The categories that the brain assigns to words were similar between participants, Williams says, suggesting human brains all group meanings in the same way.

The prefrontal cortex neurons didn’t distinguish words by their sounds, only by their meanings. When a person heard the word ‘son’ in a sentence, for instance, neurons associated with family members lit up. But those neurons didn’t respond to ‘Sun’ in a sentence, even though the two words sound identical.

Mind reading

To an extent, the researchers were able to determine what people were hearing by watching their neurons fire. Although they couldn’t recreate exact sentences, they could tell, for example, that a sentence contained an animal, an action and a food, in that order.
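One simple way to picture this kind of read-out is a decoder that assigns each word’s firing-rate pattern to the nearest category ‘template’. The sketch below uses a hypothetical nearest-centroid rule with invented numbers; the study’s actual decoding method may well differ, so treat this as a conceptual example only.

```python
# Illustrative sketch only: a nearest-centroid read-out of a word's broad
# semantic category from a firing-rate vector. The categories, centroids and
# test vectors are invented; the study's decoding approach may differ.
import numpy as np

# Hypothetical mean firing-rate patterns ("centroids") per category,
# imagined as estimated from held-out training sentences.
centroids = {
    "animal": np.array([3.8, 0.6, 3.0, 0.2, 2.9]),
    "action": np.array([0.4, 3.9, 0.5, 3.5, 0.3]),
    "food":   np.array([1.0, 1.2, 3.6, 0.4, 0.2]),
}

def decode_category(activity):
    """Return the category whose centroid is closest to the observed activity."""
    return min(centroids, key=lambda c: np.linalg.norm(activity - centroids[c]))

# A new sentence represented as a sequence of per-word firing-rate vectors (invented).
sentence_activity = [
    np.array([3.7, 0.5, 2.8, 0.3, 3.0]),  # expected: animal
    np.array([0.5, 3.8, 0.6, 3.4, 0.2]),  # expected: action
    np.array([1.1, 1.0, 3.5, 0.5, 0.3]),  # expected: food
]
print([decode_category(a) for a in sentence_activity])  # ['animal', 'action', 'food']
```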

“To get this level of detail and have a peek at what’s happening at the single-neuron level is pretty cool,” says Vikash Gilja, an engineer at the University of California, San Diego, and chief scientific officer of the brain–computer interface company Paradromics. He was impressed that the researchers could determine not only the neurons that corresponded to words and their categories, but also the order in which they were spoken.

Recording from neurons is much faster than using imaging; understanding language at its natural speed, he says, will be important for future work developing brain–computer interface devices that restore speech to people who have lost that ability.

doi: https://doi.org/10.1038/d41586-024-02146-6

This story originally appeared on Nature. Author: Sara Reardon