It’s All in Your Head: Merging Brainwave Science and UX

by Will Dickey

We’ve been using our hands to make tools and interact with our environment for millennia. Even with today’s technology, we’re still largely stuck with trackpads and touch screens. Recent leaps in technology, however, have opened up some profound new ways to manipulate our surroundings using only our minds.

Electroencephalography, commonly known as EEG, measures the voltage changes caused by brain activity across the scalp. These devices are widely used by neuroscientists to study mental disorders and sleep patterns. Until recently, EEG equipment has remained bulky, expensive, and confined to clinical settings, but the technology is now beginning to reach a much broader audience. Personal EEG devices hold the potential to change the way everyone interacts with machines, and the way machines interact with us.

Companies like Neurosky and Emotiv are bringing this technology into the mainstream by giving everyone the ability to track their own brain patterns. Within a few minutes of setting up a Neurosky headset, you can see your current brain activity displayed as a colorful spectrum of fluctuating meters. This activity is generally quantified in terms of frequency bands ranging from delta waves (the most common waves during sleep) to gamma waves (associated with intense focus). Neurosky also aggregates this data into higher-level Attention and Meditation meters.
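To get a feel for what this looks like in practice, here’s a rough Python sketch of pulling those Attention and Meditation values off a Mindwave. It assumes the headset is paired through NeuroSky’s ThinkGear Connector, which (as we understand it) streams newline-delimited JSON over a local TCP socket, port 13854 by default. The field names follow that protocol as we understand it, so treat this as a starting point rather than gospel.

```python
# Minimal sketch: reading Attention and Meditation values from a Mindwave.
# Assumes the headset is paired through NeuroSky's ThinkGear Connector,
# which streams newline-delimited JSON over a local TCP socket (port 13854
# by default). Field names below follow that protocol as we understand it.
import json
import socket

HOST, PORT = "127.0.0.1", 13854

with socket.create_connection((HOST, PORT)) as sock:
    # Ask the connector for parsed JSON rather than raw samples.
    sock.sendall(json.dumps({"enableRawOutput": False, "format": "Json"}).encode())

    buffer = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break  # connector closed the connection
        buffer += chunk
        # Messages are newline-separated; keep any partial line for the next pass.
        *lines, buffer = buffer.split(b"\n")
        for line in lines:
            if not line.strip():
                continue
            msg = json.loads(line)
            esense = msg.get("eSense")
            if esense:
                print(f"attention={esense['attention']:3d}  "
                      f"meditation={esense['meditation']:3d}")
```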

[Image: Mindwave brainwave-sensing headset by Neurosky]

Emotiv produces a headset that reads brain waves and uses an API to allow recognition of facial expressions, emotions and conscious thought—that is, the headset can (sort of) read your mind. Although the device is aimed at the gaming market, the potential for use in areas outside of gaming is enormous. In a TEDx talk, Emotiv founder Tan Le demonstrates how the headset makes it possible to control both virtual and physical objects with mere thoughts (and a little concentration).

Access to our brain activity can give us new insights into both our conscious and unconscious actions. During user testing, researchers usually need to prompt users to describe their reactions to what they are seeing, and it’s often difficult for users to articulate exactly what they are thinking and feeling. With EEG technology, we can now “see” these reactions taking shape on the screen, informing designers whether a page really held a user’s attention or whether a task was frustrating.
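To make that concrete, here’s a toy example of how attention readings might be rolled up per task in a usability session. The readings and task boundaries below are made up for illustration; in practice the readings would come from the headset stream and the boundaries from the moderator’s session notes.

```python
# Illustrative sketch: summarising Attention readings per usability task.
# All numbers below are hypothetical sample data.
from statistics import mean

# (seconds into session, attention on a 0-100 scale)
readings = [(2, 55), (8, 62), (15, 71), (22, 40), (30, 35), (38, 48), (45, 80)]

# Task name -> (start_second, end_second), as logged by the moderator
tasks = {
    "find_pricing_page": (0, 20),
    "complete_signup_form": (20, 40),
    "locate_help_docs": (40, 60),
}

for name, (start, end) in tasks.items():
    window = [a for t, a in readings if start <= t < end]
    if window:
        print(f"{name:25s} mean attention: {mean(window):.0f}")
    else:
        print(f"{name:25s} no readings in window")
```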

Personal EEG devices could also let our surroundings respond to us. If we have devices that can understand our state of mind and emotions, we could design environments that change based on our mood or attention level. If we’re feeling down, our phone could send us a funny picture or play our favorite song. If we’re in a state of flow at work, a light could switch on to tell our co-workers: “Not now, Chief, I’m in the zone.”
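As a back-of-the-napkin illustration, that “in the zone” light could be little more than a threshold with a bit of debouncing. Everything here is hypothetical: attention_stream stands in for the headset feed and set_light for whatever smart bulb is being controlled. The point is simply how little logic it takes once the attention signal is available.

```python
# Hypothetical sketch of a "do not disturb" light driven by an Attention meter:
# turn the light on only when attention stays above a threshold for a sustained
# stretch, so a momentary spike doesn't flag us as "in the zone".
ATTENTION_THRESHOLD = 70   # 0-100, eSense-style scale
SUSTAINED_SAMPLES = 30     # e.g. ~30 seconds at one reading per second

def update_light(attention_stream, set_light):
    streak = 0
    light_on = False
    for attention in attention_stream:
        streak = streak + 1 if attention >= ATTENTION_THRESHOLD else 0
        should_be_on = streak >= SUSTAINED_SAMPLES
        if should_be_on != light_on:
            set_light(should_be_on)
            light_on = should_be_on

# Example run with canned values standing in for live readings:
if __name__ == "__main__":
    fake_readings = [65] * 10 + [85] * 40 + [50] * 10
    update_light(iter(fake_readings),
                 lambda on: print("light on" if on else "light off"))
```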

At this point, consumer EEG technology seems sufficiently advanced to tell us about our general state of mind. Narrowing down to specific states and emotions, however, is proving to be very difficult. One reason is that brain “architecture” varies between individuals (even twins), which leads to different electrical signatures. Another challenge is the styling of the headsets. Although we have come a long way from the brain-sucking-octopus look found in the medical industry, neither Emotiv nor Neurosky has produced something I could comfortably wear on a daily basis. This may change; Canada-based InteraXon is producing the most user-friendly headset I’ve seen, due out next year.

Here at Fresh Tilled Soil, we got our hands on a Neurosky Mindwave and have begun investigating the potential of EEG technology for UX by tinkering with its capabilities. We’ll keep you posted on our findings.

About Fresh Tilled Soil

Fresh Tilled Soil is a Boston-based user interface and experience design firm focused on human-centered digital design.