A new user interface
New possibilities are opening up for human interaction with machines. EMOTIV’s devices and machine learning algorithms convert brain waves into digital signals that can be used to control anything that speaks in 1’s and 0’s.
Control with thoughts
EMOTIV’s Mental Commands algorithm recognizes trained thoughts, which can be assigned to control virtual and real objects by thought alone. Brain control can replace traditional input devices like keyboards, enhance interactive experiences and provide new ways for people with disabilities to engage with their surroundings.
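As a rough illustration of the pattern, a trained-command stream can be dispatched to device actions once the classifier's confidence clears a threshold. Everything here is hypothetical: the event structure, the command labels ("push", "lift"), and the toy `Drone` actuator are stand-ins, not the actual SDK interface.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class CommandEvent:
    command: str   # trained thought label, e.g. "push" (hypothetical name)
    power: float   # classifier confidence in [0.0, 1.0]

class Drone:
    """Toy actuator standing in for any device that speaks in 1's and 0's."""
    def __init__(self) -> None:
        self.position = [0, 0]  # [altitude, distance]
    def forward(self) -> None:
        self.position[1] += 1
    def ascend(self) -> None:
        self.position[0] += 1

def dispatch(events: List[CommandEvent],
             actions: Dict[str, Callable[[], None]],
             threshold: float = 0.5) -> None:
    # Act only on commands the classifier is reasonably confident about.
    for ev in events:
        if ev.power >= threshold and ev.command in actions:
            actions[ev.command]()

drone = Drone()
dispatch(
    [CommandEvent("push", 0.8), CommandEvent("lift", 0.3), CommandEvent("push", 0.9)],
    {"push": drone.forward, "lift": drone.ascend},
)
print(drone.position)  # → [0, 2]: two confident "push" events, one rejected "lift"
```

The confidence threshold is the key design choice: too low and stray thoughts trigger actions, too high and the interface feels unresponsive.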
With EMOTIV’s Performance Metrics, an individual’s real-time cognitive and emotional state can be used to passively modulate an application. Adapt a VR experience based on a user’s engagement, or change the difficulty of an interactive training application based on the user’s focus level.
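One way such passive modulation could work is a simple feedback loop: nudge difficulty up when a normalized focus metric is high, and down when it drops. The thresholds, step size, and 0–1 scaling below are illustrative assumptions, not EMOTIV's actual metric ranges.

```python
def adjust_difficulty(current: float, focus: float,
                      low: float = 0.3, high: float = 0.7,
                      step: float = 0.1) -> float:
    """Raise difficulty when focus is high, lower it when focus is low.

    `focus` is assumed to be a normalized 0-1 metric; thresholds and
    step size are purely illustrative.
    """
    if focus > high:
        current += step
    elif focus < low:
        current -= step
    return min(1.0, max(0.0, current))  # clamp to the valid range

difficulty = 0.5
for focus_sample in [0.9, 0.9, 0.2]:   # two focused readings, then a lapse
    difficulty = adjust_difficulty(difficulty, focus_sample)
print(round(difficulty, 1))  # → 0.6: raised twice, lowered once
```

The dead band between `low` and `high` keeps difficulty stable when focus hovers mid-range, avoiding constant oscillation.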
A brain-computer interface (BCI) directly integrates our thoughts and emotions with the technology that we use every day. Whether commanding drones or wheelchairs, creating music or art, or adapting digital experiences to real-time emotions, the interface between the brain and computer has never been easier.
Performance using EEG
Lisa Park demonstrates and explains her interactive installation “Eunoia II”, a performance in which an EEG (brainwave-sensing) headset provides real-time feedback on her brainwaves and emotional reactions. Emotional values picked up by the headset are translated into sound waves, which create vibrations in pools of water placed atop speakers.
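A sonification step like this can be sketched as mapping an emotion reading to the parameters of a waveform. The mapping below (arousal sets pitch, valence sets amplitude) is a minimal invented example; Park's actual pipeline is not documented here.

```python
import math

def emotion_to_wave(valence: float, arousal: float,
                    n_samples: int = 8, sample_rate: int = 8000) -> list:
    """Map a normalized (valence, arousal) pair to a short sine wave.

    Assumptions: both inputs are in [0, 1]; arousal drives pitch
    (220-660 Hz), valence drives amplitude (0.2-1.0). Feeding these
    samples to a speaker would vibrate the water above it.
    """
    freq = 220 + 440 * arousal
    amp = 0.2 + 0.8 * valence
    return [amp * math.sin(2 * math.pi * freq * t / sample_rate)
            for t in range(n_samples)]

wave = emotion_to_wave(valence=0.5, arousal=0.5)
print(len(wave))  # → 8 samples, each bounded by the 0.6 amplitude
```

In a live installation this function would run continuously on the streaming emotion values, re-synthesizing audio as the performer's state changes.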
Full-sensory virtual driving experience
Acura’s immersive, full-sensory virtual driving experience is controlled by 30 emotional, cognitive, and physical inputs, creating a unique environment whose landscape, color and music change in real time to reflect the driver’s moods.