About a year and a half ago, I wrote an article discussing the possibility of a computer being able to read your mind. The article was based on a device, created at MIT, that seemed to be able to do just that: it could turn a person’s internal dialog into computer-synthesized speech. While the MIT device doesn’t truly read your mind, it can verbalize what you are thinking, so it might as well be reading your mind.
Even though neuroscience has not yet progressed to the point of allowing a computer to read someone’s innermost thoughts, the day may be getting closer when we will be able to truly interact with our computers using nothing but thought.
A few weeks ago, I stumbled onto a relatively short article on MIT Technology Review, titled A Map of the Brain Could Help to Guess What You are Reading. The article discussed two findings that really surprised me.
First, the brain behaves in a very similar way regardless of whether someone is reading something or hearing it spoken aloud. For example, if someone were to listen to a podcast, their brain would interpret the spoken words in essentially the same way that it would if the person had read a transcript of the podcast instead.
The second finding discussed in the article is that different types of words stimulate activity in different parts of the brain. Words associated with family members, for example (father, sister, etc.), were found to stimulate activity on the right side of the brain, behind the ear.
Mapping the brain
As part of this study, researchers created a 3D map of the brain, mapping out which parts of the brain respond to various words. The research may ultimately be beneficial in treating dyslexia or speech disorders.
The article makes absolutely no mention of using a computer to read someone’s mind. Even so, it really isn’t all that hard to connect the dots. If (and this is a big if) it is possible to map a vocabulary to the various areas of the human brain, then it may eventually be possible to interpret the brain’s activity.
Now obviously this is not something that can be done today. The currently existing maps outlining the parts of the brain that respond to various words are preliminary at best. Even so, the article hints at the idea that additional research may eventually make it possible for a machine to figure out what a person is reading (or hearing) by monitoring brain activity.
By extension, this same brain mapping may eventually prove to have a correlation with a person’s internal dialog. Having studied neurology myself, I can tell you that while there are differences between the way that the brain handles conscious thought and the way that it interprets what a person hears, there are also similarities. My guess is that it will eventually be possible for a machine to figure out what a person is hearing, reading, or thinking by monitoring brain activity. However, I also believe that we are probably at least 30 years away from being able to truly hack the human brain.
Once it does become possible to interact with a computer using only your mind, there are countless issues that will have to be sorted out before the technology will be ready for mainstream use. First of all, there is the issue of how to actually interface with the brain. The previously mentioned study relied on an MRI. Even if it were possible to ditch the MRI and use something like an electroencephalogram instead, it still wouldn’t be a practical solution for casual use. After all, it would be unrealistic to expect a casual user to shave their head and attach a series of electrodes to their scalp in precise locations.
So many issues
Even if you can somehow get past the technical, legal, ethical, and practicality challenges, there are still other issues that may come into play with regard to operating a computer with your mind.
One such issue is that of task efficiency. As someone who writes an insane amount of material each month, I tend to rely heavily on the use of voice dictation software. As a matter of fact, I am dictating this article. I have to imagine that in at least some way, interacting with the computer by thought would be a lot like using your voice to control a computer.
Having used speech dictation software for many years, I have discovered that there are some things that it works really well for, but there are other things that, while technically possible by voice, are easier to accomplish using a keyboard and mouse. Writing a document or composing an email is very easily handled by voice. At the same time, though, navigating the Windows desktop (switching to a different window, launching an application, etc.) is quicker and easier to do with a mouse. My guess is that the same will eventually hold true for thought interfaces. I can only imagine being able to compose emails or write articles at the speed of thought. However, I don’t think that thought will be an ideal tool for interacting with the Windows desktop.
How many times have you opened a web browser without even stopping to think about what it is that you’re doing? How much more tedious would your job be if you had to stop and consciously think about every single action that you perform on your computer? Total reliance on conscious thought would probably not only slow a person down but would also be mentally exhausting over the course of a day.
Can it be done?
Personally, I think that the type of brain mapping described by the MIT Technology Review article holds enormous potential. Even so, there are at least two things that I can think of that could potentially derail (or at least slow down) the project.
First, the idea of mapping words to specific locations within the human brain seems to be based on the idea that the words are being conveyed in the subject’s native language. I can’t help but wonder how the brain mapping might differ if the test subject were multilingual. Being multilingual probably wouldn’t cause an issue by itself, but if the test were being performed in English and the test subject spoke English as a second language, then I think that there is a good chance that the test results could be skewed. My guess is that words would be mapped to similar areas of the brain regardless of language, but only if the person is truly fluent in the language that is being spoken.
The other issue that may come into play is that of someone’s mind wandering. The MIT Technology Review article discussed a brain mapping process that was based on words read by a test subject. I don’t know about you, but I’ve lost count of the number of times that I have had to go back and reread something because I realized that my mind was wandering while I was reading, and I had absolutely no idea what I had just read. If reading is to be used as a tool for creating a vocabulary-based brain map, then controls will need to be put in place to prevent the test subject from being distracted.
Featured image: Shutterstock