Sometimes we feel that we are on the same “wavelength” as another person because our brain is designed to continuously run ahead and “predict” what our interlocutor is going to say. This is the conclusion of a new American study.
According to experts, the successful prediction of words uttered by our interlocutor occurs due to the synchronization of brain patterns that “anticipate” speech.
The study, published in the “Journal of Neuroscience”, brings to light new information about the brain processes that take place during communication and are related to the processing of spoken language.
The scientists found that our brain is constantly trying to predict what our interlocutor is going to say to us, in order to understand it better and move the discussion forward.
Until now, scientists believed that our brain processes the stimuli received from the environment from the “bottom up”: when we hear someone speak, the auditory cortex processes the sound first and then activates other areas responsible for speech comprehension.
However, more and more neuroscientists seem to support the theory that the brain ultimately analyzes external stimuli from the “top down”, which makes the brain a kind of “prediction machine”.
As the U.S. researchers report, our brain constantly anticipates in order to respond lightning-fast and accurately to whatever is about to happen. For example, it can predict words and sounds from context: from the phrase “grass is…” we can easily predict the continuation – probably the word “green”.
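As a loose analogy (not the study's method), the “predict the next word from context” idea can be sketched with a toy bigram model that simply counts which word most often follows another in a small example corpus; the corpus and function names here are illustrative only:

```python
# Toy bigram predictor: a rough analogy for context-based word prediction.
# This is an illustrative sketch only; the study itself involves no code.
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count which word tends to follow each word in the corpus."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    """Return the continuation most often seen after `word`, or None."""
    followers = counts.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

corpus = [
    "the grass is green",
    "the sky is blue",
    "the grass is green and soft",
]
model = train_bigrams(corpus)
print(predict_next(model, "is"))  # "green" is the most frequent continuation here
```

In this miniature corpus, “green” follows “is” more often than “blue” does, so the model “anticipates” it – a crude echo of how listeners complete a highly predictable phrase before the speaker finishes it.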
“Our findings show that the brains of both the speaker and the listener use the process of language prediction. This results in similar brain patterns in both interlocutors,” said the study’s senior author Dr. Suzanne Dikker from the Department of Psychology, New York University. “This happens even before the speaker utters the phrase he is thinking of.”
“Much of what we know about language and brain function comes from controlled experiments conducted in a laboratory, which examine language as an abstract concept – we hear a series of words, or one word at a time,” said Dr. Jason Zevin from the University of Southern California, who participated in the study. “It does not focus on communication, but rather on the structure of language. Our experiment focuses on how we use language to establish common ground with someone.”
In their experiment, the scientists monitored the brain activity of a speaker while he described images he had just seen. The images depicted different subjects, in order to trigger a different kind of description each time. One of them, for example, showed a penguin hugging a star. Another depicted a guitar stirring a bicycle wheel coming out of a boiling pot of water – an image with an even less predictable description.
In the next phase, a group of volunteers was asked to listen to the speaker’s descriptions while viewing the same pictures. The researchers monitored their brain activity as well.
Comparing the brain responses of the speaker with those of the listeners, the researchers found that the patterns of brain activity in areas responsible for speech processing showed similarities when the listeners could predict what the speaker was going to say.
“In addition to enabling lightning-fast reactions to the stimuli we receive from the environment, our brain’s power to ‘predict’ what another person is going to say may play a decisive role in human communication,” concludes Dr. Dikker.