Kyoto Scientists Visualize People's Dreams

A team of researchers led by Yukiyasu Kamitani has managed to decipher the contents of the dreams of three people.

To achieve this feat, they first collected data from three volunteers using fMRI together with EEG/EOG/EMG/ECG recordings, both while the volunteers were asleep and while they were awake. Each time a volunteer was woken up, the researchers also wrote down what the person reported having dreamed about. They then classified as visual any dream containing at least one visual element, and assigned an English label to each element using WordNet. From those labels they built data vectors, with each concept serving as an index, and applied several machine learning techniques based on support vector machines. The fascinating part is that the learning converged for all three volunteers, so the trained algorithms could then be used to tell what each person was dreaming about before waking up.
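The vector-building step described above can be sketched roughly as follows. This is only an illustration under assumed names: the vocabulary of concepts and the reports are made up, but the idea matches the article — each awakening report becomes a binary vector indexed by concept, and one classifier (an SVM in the actual study) would then be trained per index on the corresponding fMRI patterns.

```python
import numpy as np

# Hypothetical vocabulary of WordNet-derived base concepts (assumed names,
# not the actual labels used in the study).
CONCEPTS = ["character", "book", "male", "female", "street", "car"]

def dream_to_vector(reported_concepts, vocabulary=CONCEPTS):
    """Encode one awakening report as a binary data vector:
    entry i is 1 if concept i appeared in the report, else 0.
    In the study, a support vector machine would be trained to
    predict each entry from the fMRI activity recorded before waking."""
    present = set(reported_concepts)
    return np.array([1 if c in present else 0 for c in vocabulary])

# Example report mentioning written characters and something paper/book-like:
v = dream_to_vector(["character", "book"])
print(v.tolist())  # [1, 1, 0, 0, 0, 0]
```

With one such vector per awakening, the decoding problem reduces to a set of independent binary classification tasks, one per concept, which is why linear SVMs are a natural fit.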

To make the result even more striking, they mapped the words for each visual element the volunteers dreamed about to pictures retrieved through Google Images. That is how images very similar to a person's dreams can be displayed. For example, this is the image interpretation of subject 2's dream while he is dreaming:
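The word-to-image mapping step might look something like the sketch below. The researchers retrieved pictures via Google Images; here, to keep the example self-contained, we assume a pre-downloaded local library keyed by concept name (all names and paths are assumptions, not part of the study).

```python
# Hypothetical library of representative images, one per decodable concept.
IMAGE_LIBRARY = {
    "character": "images/character.jpg",
    "male": "images/male.jpg",
    "female": "images/female.jpg",
}

def images_for_dream(decoded_concepts, library=IMAGE_LIBRARY):
    """Return an image path for every decoded concept we have a picture for;
    concepts without a stored image are simply skipped."""
    return [library[c] for c in decoded_concepts if c in library]

print(images_for_dream(["character", "male", "car"]))
# ['images/character.jpg', 'images/male.jpg']
```

Stringing these images together over time is what produces the movie-like visualization of a dream.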

This is what the subject said when he woke up; he had been dreaming about written characters:

“What I was just looking at was some kind of characters. There was something like a writing paper for composing an essay, and I was looking at the characters from the essay or whatever it was. It was in black and white and the writing paper was the only thing that was there. And shortly before that I think I saw a movie with a person in it or something but I can’t really remember.”

Here the volunteer was dreaming about people (male and female).

They also compared these recordings with data collected while the subjects were awake.

This is what the data vectors look like.

Source: Neural Decoding of Visual Imagery During Sleep
