The piles of monitors in the front room at the offices of Emotiv Systems in downtown San Francisco make it look like an electronics shop holding a going-out-of-business sale. A back room, which contains workstations, is more organised, but not exactly a hive of activity: even in the middle of the day, there are only a few staff fiddling with hardware. It doesn’t look like the headquarters of a company that is on the verge of changing the way people interact with computers.
The conference room is where Emotiv’s magnetic, intelligent CEO Tan Le holds court. Here, she demonstrates to visitors Emotiv’s innovative product: the Epoc, a form-fitting, 14-sensor electroencephalography (EEG) headset that lets people control their computers without touching a key. Though it’s not the only such product on the market, the Epoc has become the device that’s garnering the most praise and attention at tech conferences and laboratories across the world. But, while demonstrating it, Tan, 32, is always sure to make the point that this isn’t the final version.
“I don’t want this technology to be a fad and then go away,” she says. “The idea of having a brain-computer [interface] isn’t a new one. People have wanted to do it for a long time. We want to make sure that the experience you get, with whatever it is you’re trying to do with it, is as good as possible. It’s going to take time. It’s not going to be quick.” For years now, numerous tech companies have been involved in a race to bring to market portable devices that read brainwaves in a way similar to how hospital EEG machines work, but at a much lower cost. Inexpensive gaming devices began appearing last Christmas. Mattel debuted its $80 Mindflex game last year to enthusiastic response.
NeuroSky is helping to usher in the next generation of gaming technology using a single “dry sensor” placed on the player’s forehead.
Meanwhile, the drive is on to create more efficient — yet still complex — EEG headsets. Design remains a challenge. One device, tested this year at the Federal Institute of Technology in Lausanne in Switzerland for a mind-operated wheelchair, makes the user look like a character from a community-theatre production of Woody Allen’s Sleeper.
Only Emotiv has decided to open-source its product. Last year, rather than make the headset available to consumers, the company made the business decision to market it to developers and researchers instead. That way, says Tan, when Emotiv finally does launch its public campaign, it will have a thousand applications available rather than just two or three. When asked how long that will take, she says: “Twelve to 18 months.”
So far, Emotiv has shipped 10,000 Epoc headsets. A development team in Russia has been creating software that allows users to search online based on visual recall of images. The research arm of the US Defence Department is awarding grants to test mind-controlled prosthetics and something it calls “brainwave binoculars”. In addition to the mind-operated wheelchairs, there are robot arms and countless games in the works — nearly as many applications of brain control as, well, anyone’s mind can imagine.
For data-gatherers, Epoc offers new opportunities to study brain response — whether for market research or to understand schizophrenia. Matthew Oyer, a tech hobbyist in Princeton, New Jersey, designed a special cap to measure his dog’s brainwaves, inspired, he says, by the talking dog in the Pixar movie Up. In Australia, performance artist Karen Casey adopted the headset for a project that involved “realtime interactivity for EEG-generated video art”. She’d already developed her own software that would allow performers to play a keyboard remotely, or, in one case, “examine the cyborg being through the adoption of a virtual-reality persona”. “The Emotiv Epoc headsets were definitely the technology we were waiting for,” she says.
Early feedback is extremely positive. Robert Oschler, an independent software developer in Florida, was one of the Epoc’s early adopters, more out of curiosity than an actual belief that it would work. “I was sceptical,” he says. He wrote a program to control a Rovio robot via a Skype connection and, much to his surprise, it worked beautifully.
“The first time I set up my robots and they followed me, like it was telerobotics, it blew my mind. I had a strong emotional reaction. I realised that all the stuff that sounded like marketing was actually real.” He was hooked. Oschler began working on an application that could gauge real-time emotions. Within a few weeks, he had, in his own words, “put the chocolate and the peanut butter together” and uploaded a YouTube demo.
While watching a trailer for a cartoon, Oschler tracks four basic emotions: happiness, sadness, fear and excitement. Then, when the video ends, he recalls those emotions, and the trailer automatically rewinds to the moment when he felt that emotion most strongly. The Epoc, he says, “opens up a whole range of interactions with computers that just wouldn’t be possible otherwise.”
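Oschler hasn’t described his implementation in detail, but the behaviour he demonstrates — log emotion scores while the video plays, then seek back to the timestamp where a recalled emotion scored highest — can be sketched in a few lines of Python. Everything below, including the function names and the simulated readings, is an illustrative assumption, not his or Emotiv’s actual code:

```python
# Sketch of the peak-emotion rewind idea. The emotion scores would
# come from the headset's SDK in a real application; here they are
# simulated as (timestamp_seconds, {emotion: score}) pairs.
EMOTIONS = ("happiness", "sadness", "fear", "excitement")

def peak_timestamp(log, emotion):
    """Return the timestamp at which `emotion` scored highest."""
    if emotion not in EMOTIONS:
        raise ValueError(f"unknown emotion: {emotion}")
    return max(log, key=lambda entry: entry[1][emotion])[0]

# Simulated readings taken while a trailer plays:
log = [
    (0.0,  {"happiness": 0.2, "sadness": 0.1, "fear": 0.1, "excitement": 0.3}),
    (5.0,  {"happiness": 0.8, "sadness": 0.1, "fear": 0.1, "excitement": 0.5}),
    (10.0, {"happiness": 0.4, "sadness": 0.2, "fear": 0.6, "excitement": 0.9}),
]

# Rewind the player to the moment excitement peaked:
seek_to = peak_timestamp(log, "excitement")  # → 10.0
```

The interesting part is how little logic is needed once the headset supplies per-emotion scores: the “rewind” is just an argmax over the logged time series.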
In another research example, a series of four videos shows a young woman, “Cora” (not her real name), staring at a monitor. A car accident had left her paralysed, with no control over her limbs, or even her neck. Only her facial muscles work. She wears the Epoc on her head. In the videos, Cora works with a therapist while playing a computer game called Spirit Mountain, which comes with the device. In a quasi-mystical dojo-type setting, the player is supposed to work with a “master” who will “train” her brain to interact with the computer. After training, if the user thinks of an action such as “lift”, the program is supposed to respond. But as the video progresses and Cora trains, she seems disengaged at best, essentially bored. The video ends with the viewer just as bored, wondering what the fuss is about.
The second video, made shortly after, shows Cora fully engaged in the game but clearly struggling to work the Epoc in conjunction with her mind. When she finally gets the headset to transmit an intention, her face lights up with sheer delight. A third video shows her interacting fully, holding commands for 30 seconds or more and playing the game expertly. By the fourth and final video, she’s not only controlling the game but also holding her head upright for the first time in a decade. When the team at Emotiv saw this, they knew that they were on to something.

“I think the world is going to get to a point where everything is run on remote control based on biosignals from an individual,” Tan says, with the confident ambition that’s become her public hallmark. “It won’t be something peripheral, outside your body that you have to tell what to do. That’s not adequate, because our world is exploding in terms of information and content, and the information and content are changing radically.”
Other researchers have been more critical of the device. A team at the University of Massachusetts Dartmouth used the Epoc to help to develop an early version of the “NeuroPhone”, which can, among other things, dial a number based on the user merely seeing a picture of the person they want to call. It functioned adequately, says Andrew Campbell, a professor who worked on it. But, he warns, the Epoc is still in its infancy and best confined to the laboratory. “When people use Epoc headsets, they’re sitting down in front of the computer,” he says. “It works beautifully in idealised conditions. But if you get it out in the world, it’s more problematic.”
Tan knows full well that the Epoc hasn’t yet reached its full potential. All day and often deep into the night, she fields calls, keeps an eye on her production facilities in the Philippines and reviews sketches from videogame designers. Her daily schedule includes taking calls from major corporations, the US military and small software developers with obscure technical questions. She tests the Epoc constantly, fully aware that her product is being used in ways that she could never have imagined. “Emotiv cannot usher in a revolution by itself,” she says. “But we can certainly create a platform that can allow it to happen.”