Controlling a Video Game with Brain Waves

This post describes two projects that involve controlling a video game (Quake II) via electrical signals generated by the body, otherwise known as a “Brain Machine Interface”. The first project is an EOG (electro-oculography) video game controller. In an EOG system, electrical signals generated by eye movements are detected by electrodes placed on the surface of the skin near the eyes. The second project demonstrates game control via EEG (electroencephalogram). With EEG, electrodes are placed on the scalp and the neural activity of the brain itself is monitored.

EOG: Video Game Control by Eye Movements

The EOG game control is much easier to implement than EEG. This is partly because EOG produces larger signals, but more importantly, the analysis of EOG signals is straightforward. Eye movements generate a voltage potential, and by placing electrodes over the muscles responsible for specific eye movements, those movements can be detected and mapped to the inputs of a video game controller.
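To give a rough picture of what the detection step can look like, here is a minimal MATLAB sketch of threshold-based classification. It is not the code used in the project; the channel layout and the threshold values are assumptions chosen only to illustrate the idea.

% Minimal sketch of threshold-based EOG classification (not the project's code).
% Assumes two bipolar channels sampled over a short window: 'h' (horizontal)
% and 'v' (vertical), in microvolts and already band-pass filtered.
% The threshold values are illustrative guesses.
function movement = classify_eog(h, v)
    THRESH = 150;         % assumed deflection threshold (microvolts)
    BLINK_THRESH = 400;   % blinks produce a much larger vertical spike

    if max(abs(v)) > BLINK_THRESH
        movement = 'blink';
    elseif max(h) > THRESH
        movement = 'left';
    elseif min(h) < -THRESH
        movement = 'right';
    elseif max(v) > THRESH
        movement = 'up';
    elseif min(v) < -THRESH
        movement = 'down';
    else
        movement = 'none';
    end
end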

Here you can see me placing the electrodes on the subject (Il Park). The black cap and chin strap are completely unnecessary for this EOG experiment, but we thought it looked cool and decided to use it anyway.

This is a block diagram of the system used to control Quake II. A multi-channel Tucker Davis data acquisition system was used to acquire the EOG signals and provide some noise filtering. The buffered and filtered signals were then passed along to a PC running MATLAB, where a crude pattern matching scheme was used to discriminate between the five different eye movements assigned to game control. When MATLAB detected an intended movement, the corresponding command was sent to the running Quake II game. The commands had to be sent via TCP/IP since the game was running on a separate computer.
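To make the MATLAB-to-game link concrete, here is a minimal sketch of the command path. It is not the actual project code: it assumes a small helper program on the game PC that listens on a TCP port and translates plain-text commands into Quake II key presses, and the IP address, port number, and command names are all made up.

% Minimal sketch of sending a detected movement to the game PC over TCP/IP
% (illustrative only). Assumes a helper listener on the game machine that
% turns these text commands into key presses; the address, port, and command
% strings are assumptions. tcpclient requires a reasonably recent MATLAB release.
game = tcpclient('192.168.0.42', 5000);          % game PC address (assumed)

commands = containers.Map( ...
    {'left', 'right', 'up', 'down', 'blink'}, ...
    {'turnleft', 'turnright', 'forward', 'back', 'fire'});

movement = classify_eog(h, v);                   % from the sketch above
if isKey(commands, movement)
    write(game, uint8([commands(movement) newline]));   % one command per detection
end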

The results from this project were very good. There was little to no perceptible lag during gameplay, and accuracy averaged 95%. Obviously you wouldn’t want to play the entire game this way. With only five eye movements to choose from (left, right, up, down, and blink), control was limited. I was still able to shoot and kill some enemies in the game, which you can see in the video. If you are interested in knowing more about how this system works, check out this paper.

This is a video of the system in action. The video shows the output of the Quake II game, the subject’s eye movements (that would be me), and the real-time data analysis happening on a separate computer. It’s kind of eerie at the end of the video when the camera zooms out and you can see me controlling the game without moving my hands.

Download video: EOG_game_play.wmv

Download project details: EOG Video Game Control.pdf


EEG: Video Game Control by Thought (Brain Machine Interface)


Unlike EOG signal detection, a brain machine interface using EEG signals isn’t as straightforward. As mentioned earlier, EEG relies on an array of electrodes that contact the scalp. The electrodes are sewn into an elastic cap that is worn on the head (see the pictures), and a conductive gel is squirted around each electrode to maximize conductivity between the data acquisition equipment and the subject. In contrast to an EOG interface, which places an electrode over a specific muscle or nerve, an EEG interface involves some guesswork to find a control scheme that will work. Not only are the signals captured during EEG fairly noisy, but getting the electrodes in the same place for every experiment is a challenge in itself.

Please excuse me while I vent for a second here…

Every few months various media outlets (CNN, MSNBC, etc.) “re-discover” brain machine interfaces (BMI). Every time someone moves a cursor on a computer screen with their mind, the media covers it like it’s the best thing since sliced bread. I’ve got news for everyone: brain machine interfaces have been around for a long time. Recently CNN ran an article about some engineers at Honda who hooked up a robotic hand to an MRI machine. When the subject made a peace sign, the robotic hand made the same motion a few seconds later. Big deal. They could just as easily have made the robotic hand play the piano when the subject makes a peace sign... though that wouldn’t have created the illusion that their system is actually decoding all the subtle neural signals required to make such a complex hand movement. The CNN reporter not only gave Honda credit for inventing brain machine interfacing (which has probably been around for 20 years now), but claimed that in the future this technology will replace keyboards and buttons on cellphones. Haha! Good one! There are some fundamental reasons this will never happen, and they become obvious to anyone who attempts any kind of BMI experiment. Maybe in a few months they’ll do an article on me playing a video game with my mind.
Back to the EEG video game control…

A common misconception about brain machine interfacing is that you just stick an electrode over the part of the brain that controls a particular movement and ‘voila’, you can detect any time someone makes that movement. If only it were that easy. There are several reasons why this isn’t the case. First, the human brain has billions of neurons versus the limited number of electrodes that can be placed on the subject’s head (see the picture to the left for the electrode placement). Even if there were a single neuron responsible for a certain hand movement, what are the chances an electrode (which is enormous in comparison) is going to be in exactly the right place to pick up that signal? Second, there is a lot of noise in a system like this, and filtering out all the other neural activity is a real challenge. Third, the truth is, we’re not even sure how the brain works yet. Granted, there are some generalities we know about how intended physical movement originates in the premotor cortex and shifts into the motor cortex, but there is still much to be learned in the area of motor control.

The term “brain waves” refers to the different frequency bands of neural activity in the human brain. These frequency bands can be associated with particular mental states. For example, delta waves occur between 0.5 and 4 Hz and are seen during deep sleep. The brain wave band used for controlling the video game in this experiment is the mu band (between 8 and 12 Hz). Mu-rhythms originate over the sensorimotor cortex and are suppressed during movement preparation, motor imagery, and actual movement. In other words, these signals are present when the subject is relaxed, but they disappear when the subject moves a body part or even thinks about moving one.


These plots show frequency power spectra during “no imagined movement” and “imagined movement”. Notice the mu-rhythms at about 12 Hz.
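The mu-band power visible in these spectra can be estimated in MATLAB with a standard power spectral density routine. This is a minimal sketch under assumed parameters (sampling rate, electrode, window length), not the analysis code used in the project; pwelch comes from the Signal Processing Toolbox.

% Minimal sketch of mu-band (8-12 Hz) power estimation (assumed approach).
% 'eeg' is a vector of samples from one sensorimotor electrode (e.g. C3 or C4);
% the sampling rate is an assumption.
fs = 256;                                            % assumed sampling rate (Hz)
[pxx, f] = pwelch(eeg, hamming(fs), [], [], fs);     % power spectral density, 1 s windows
mu_idx = f >= 8 & f <= 12;                           % indices of the mu band
mu_power = trapz(f(mu_idx), pxx(mu_idx));            % integrate the PSD over the mu band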

In this experiment, mu-rhythms were analyzed to control the computer with my mind. Again, it is nearly impossible to discriminate signals from different parts of the motor cortex through surface electrodes placed on the scalp. BUT, the left and right hemispheres of the brain can be analyzed separately, which gives two channels for computer control. Basically, the mu-rhythm power was measured for each hemisphere of the brain, and when it crossed a certain threshold that hemisphere was considered to be in an “on” state. Two hemispheres with two states each gives four possible outputs for game control. These were used to make the character in Quake II go left, right, forward (and fire), or stop.
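As a rough illustration of that mapping, here is a minimal MATLAB sketch. The threshold value and the particular pairing of hemisphere states to game commands are assumptions made for illustration, not the parameters actually used.

% Minimal sketch of the two-hemisphere, two-state control scheme (illustrative).
% mu_left and mu_right are mu-band power estimates for each hemisphere,
% e.g. computed as in the previous sketch from electrodes over C3 and C4.
THRESH = 2.0;                        % assumed "on" threshold (arbitrary units)
left_on  = mu_left  > THRESH;        % left hemisphere "on"?
right_on = mu_right > THRESH;        % right hemisphere "on"?

% The pairing of the four states to commands below is assumed.
if left_on && right_on
    cmd = 'forward_and_fire';
elseif left_on
    cmd = 'left';
elseif right_on
    cmd = 'right';
else
    cmd = 'stop';
end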

Controlling the computer with your mind requires a little training. Training was done with a visual feedback system that let you see your brain waves in real time and try various things to modify them. The strategy that worked best involved small imagined movements. In practice this amounted to sitting in a chair, perfectly relaxed, and imagining moving a finger, thereby modifying the mu-rhythms in the corresponding hemisphere. So when I imagined moving my right finger, the video game character moved right; when I imagined moving my left finger, the character moved left. The accuracy for this style of EEG computer control was about 60%. Clearly not as good as EOG.
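A feedback display of this kind can be put together with a simple loop. The sketch below is an assumed arrangement, not the system used here: get_eeg_window is a hypothetical stand-in for whatever call returns the latest window of samples from the acquisition hardware, and the sampling rate and update count are made up.

% Minimal sketch of a visual-feedback training display (assumed arrangement).
% get_eeg_window() is a hypothetical function returning the most recent
% one-second window of samples from electrodes over each hemisphere.
fs = 256;                                        % assumed sampling rate (Hz)
figure;
hBar = bar([0 0]);
set(gca, 'XTickLabel', {'left hemisphere', 'right hemisphere'});
ylim([0 10]);
title('mu-band power (8-12 Hz)');

for k = 1:300                                    % fixed number of feedback updates
    [c3, c4] = get_eeg_window();                 % hypothetical acquisition call
    mu = [bandpower(c3, fs, [8 12]), bandpower(c4, fs, [8 12])];
    set(hBar, 'YData', mu);                      % update the bars the subject watches
    drawnow;
end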

If you want a more technical description of this system, read this paper:

EEG Based Game Control.doc


One Response to Controlling a Video Game with Brain Waves

  1. Tuomas Karmakallio says:

    Huge thanks for making such a clear reference material source. I will try out your matlab code next week, and see if I can get this thing to work with openEEG.