An A.I. that can read your mind is already a reality
A.I. is one of the fastest-growing sectors of technology, as more and more countries emphasize its importance and the changes it can bring. Companies such as Google, Ford, and Sony are working on many projects, which you can read about here.
Another interesting A.I. project has emerged from Japan, and it is all about reading your mind. The A.I. can analyze a person’s brain scan and produce a written description of what the subject had been looking at. Though its descriptions are not yet very detailed, it gives a general idea of the image, such as “A dog is sitting on the floor in front of an open door,” which turned out to be accurate.
A device that can read your mind is definitely something many people have thought of before, but only now is it being developed and researched. Its applications would primarily be in security fields, such as lie detector tests or interrogations, though the researchers have already said it will take a while before the technology gets there.
So far, the purpose of this project is to understand the brain and how it processes information.
Here’s what Ichiro Kobayashi, one of the researchers, had to say:
We aim to understand how the brain represents information about the real world. Towards such a goal, we demonstrated that our algorithm can model and read out perceptual contents in the form of sentences from human brain activity. To do this, we modified an existing network model that could generate sentences from images using a deep neural network, a model of visual system, followed by an RNN, a model that can generate sentences. Specifically, using our data set of movies and movie-evoked brain activity, we trained a new model that could infer activation patterns of DNN from brain activity.
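To make the quoted pipeline concrete, here is a minimal toy sketch of the general idea, not the researchers' actual method: a learned linear map (ridge regression here, as a stand-in for their trained decoder) infers DNN feature activations from brain activity, and a simple greedy word picker stands in for the RNN caption generator. All dimensions, the vocabulary, and the word embeddings are hypothetical and randomly generated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for illustration.
n_voxels, n_feat = 200, 50   # fMRI voxels -> DNN feature units
n_samples = 300

# Step 1: learn a map from brain activity to DNN activations
# ("infer activation patterns of DNN from brain activity").
# Synthetic training data standing in for movie-evoked scans:
X = rng.normal(size=(n_samples, n_voxels))                    # brain activity
W_true = rng.normal(size=(n_voxels, n_feat))
Y = X @ W_true + 0.1 * rng.normal(size=(n_samples, n_feat))   # DNN features

# Closed-form ridge regression: W = (X'X + lam*I)^-1 X'Y
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_voxels), X.T @ Y)

# Step 2: a toy greedy decoder standing in for the RNN sentence
# generator: at each step, pick the vocabulary word whose embedding
# best matches the remaining predicted feature vector.
vocab = ["a", "dog", "sits", "on", "the", "floor"]             # hypothetical
E = rng.normal(size=(len(vocab), n_feat))                      # word embeddings

def caption(features, steps=3):
    words, f = [], features.copy()
    for _ in range(steps):
        i = int(np.argmax(E @ f))
        words.append(vocab[i])
        # Remove the component the chosen word already "explains".
        f = f - E[i] * (E[i] @ f) / (E[i] @ E[i])
    return " ".join(words)

pred = X[0] @ W          # predicted DNN features for one brain scan
print(caption(pred))     # a short word sequence for that scan
```

The two-stage split mirrors the quote: the brain-to-DNN decoder and the sentence generator are trained and used separately, so the caption model itself never sees raw brain data.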
Mind reading may face criticism from the general public as an invasion of privacy, though the researchers have assured us there is nothing to worry about. The project is still in the research and testing phase. We are unsure when, or if, the researchers will make it available for others to use, but we hope it’s soon.