Scientists have learned to read sentence structure and even visual images from thoughts. Now they worry that companies will introduce advertising into dreams.
Mind-reading ability
The first step toward reading brain signals was taken in 2005 by the scientists Yukiyasu Kamitani and Frank Tong of the ATR Computational Neurobiology Laboratory and Princeton University. They showed that brain and neuronal activity could be monitored using MRI, and learned to determine where a person was directing their gaze.
In 2017, scientists at Carnegie Mellon University reported that the brain composes a person's complex thoughts, such as descriptions of their actions or state, out of 42 components: "personality," "size," "action," and others.
The scientists ran an experiment: volunteers uttered certain phrases about themselves, and a neural network learned to analyze their brain activity. From the MRI data, for example, it could decode the sentence "the giant troll spoke" as the scheme "size - personality - action".
The neural network learned the components of 240 sentences from their neural activity and then recognized unfamiliar phrases with 86% accuracy.
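To make the idea concrete, here is a toy sketch of this style of decoding. It is not the CMU model: the component names, the "semantic signatures," and the simulated activity patterns are all invented for illustration. The real study trained on fMRI recordings; here a nearest-neighbor match against noisy signature vectors stands in for that step.

```python
import math
import random

# Hypothetical semantic signatures over four illustrative components
# (personality, size, action, place) for a few example sentences.
SENTENCES = {
    "the giant troll spoke": [1, 1, 1, 0],
    "the small house stood": [0, 1, 0, 1],
    "the child ran":         [1, 0, 1, 0],
}

def simulate_activity(signature, noise=0.2, rng=random):
    """Stand-in for an fMRI pattern: the signature plus Gaussian noise."""
    return [x + rng.gauss(0, noise) for x in signature]

def decode(activity):
    """Nearest-neighbor decoding: pick the sentence whose signature
    is closest (in Euclidean distance) to the observed pattern."""
    return min(SENTENCES, key=lambda s: math.dist(activity, SENTENCES[s]))

rng = random.Random(0)
pattern = simulate_activity(SENTENCES["the giant troll spoke"], rng=rng)
print(decode(pattern))
```

The real system's advantage, of course, is that it learned the mapping from genuine brain recordings rather than being handed the signatures, but the matching step is the same in spirit.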
“The next step in this direction could be deciphering the topic of a person’s thoughts: what they are thinking about, geology or skateboarding. We are on the way to creating a map of all types of knowledge in the brain,” says Marcel Just, a professor of psychology at Carnegie Mellon University.
In 2019, Yukiyasu Kamitani and his colleagues showed subjects about a thousand images in an experiment designed to understand how visual images affect brain activity. Participants viewed the pictures during an MRI scan, and a neural network matched the tomography data against the original images.
Once the neural network had been trained, the subjects were shown the same images again, and this time the network had to reconstruct them from the MRI data alone.
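The train-then-reconstruct setup can be sketched with a toy stand-in for the neural network. This is not Kamitani's deep network: the "images" and "fMRI responses" below are synthetic, and reconstruction is done by a similarity-weighted average of the training images, a deliberately simple substitute for the learned model.

```python
import math
import random

rng = random.Random(1)

def make_image(n=4):
    """A synthetic n-by-n grayscale image, flattened to one list."""
    return [rng.random() for _ in range(n * n)]

def activity_for(image, noise=0.05):
    """Pretend fMRI response: a noisy view of the image."""
    return [p + rng.gauss(0, noise) for p in image]

# "Training phase": images shown to the subject, with recorded activity.
train_images = [make_image() for _ in range(50)]
train_activity = [activity_for(img) for img in train_images]

def reconstruct(activity):
    """Reconstruct an image as a similarity-weighted average of the
    training images, weighting by closeness of activity patterns."""
    weights = [math.exp(-math.dist(activity, a) ** 2 / 0.1)
               for a in train_activity]
    total = sum(weights)
    size = len(train_images[0])
    return [sum(w * img[i] for w, img in zip(weights, train_images)) / total
            for i in range(size)]

# "Test phase": show a known image again and rebuild it from activity.
target = train_images[0]
recon = reconstruct(activity_for(target))
rmse = math.dist(recon, target) / math.sqrt(len(target))
print(f"per-pixel RMSE: {rmse:.3f}")
```

The low reconstruction error here only reflects how forgiving the toy setup is; the hard part in the real experiment was learning a mapping from genuine, high-dimensional brain data.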