Does EMOTIV really measure signals from my brain?
EEG signals captured from around the head are contaminated with signals from muscle movements. In other words, we (and every other EEG system) measure BRAIN signals which are sometimes overlaid with MUSCLE ACTIVATION potentials (EMG) and also EOG – electro-oculographic signals, which result from the motion of the eyeball (a strongly polarised organ, with dense nerve connections on one side and almost none on the other, so each eye movement shifts a sizeable electrical potential across the scalp).
Nearly every other EEG system either completely discards any data collected during EMG and EOG events, or at least tries to filter them out while recovering the actual brain signals. We decided to take this much further. We gratefully accept the muscle signals and use the distribution of sensors around the face to triangulate muscle sources and to build classification systems that identify specific facial expressions, such as SMILE and BLINK. This is the Facial Expressions detection suite. We make no secret of the fact that Facial Expression detections are based on muscle signals; in fact, we are delighted to let people know we are making such good use of data that most people throw on the junk pile.
Mental Commands and Performance Metrics are completely different. Performance Metrics is a measurement of your emotional state. It DOES NOT accept muscle or facial expression data as input – we strictly use brain signals, otherwise what's the point? Because we have a handle on facial expressions, we can characterise the kind of filtering we need to remove EMG and EOG from the data, and our detections continue to operate even through a lot of normal activity. Sometimes they simply shut down if the data is too corrupted, especially Frustration and Engagement.
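To make the idea of removing EMG and EOG more concrete, here is a minimal, illustrative sketch (not EMOTIV's actual pipeline) of frequency-based cleanup: ocular drift sits mostly below a few Hz and muscle activity mostly above ~30 Hz, so a band-pass filter can attenuate both while leaving mid-band brain rhythms such as alpha largely intact. The sample rate, cutoffs, and synthetic signal components below are all assumed values chosen for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256.0                      # assumed sample rate in Hz
t = np.arange(0, 4, 1 / fs)     # 4 seconds of synthetic data

# Synthetic "recording": a 10 Hz alpha rhythm (brain signal) overlaid
# with a slow EOG-like drift (0.5 Hz) and a fast EMG-like component (80 Hz).
alpha = np.sin(2 * np.pi * 10 * t)
eog = 3.0 * np.sin(2 * np.pi * 0.5 * t)
emg = 0.8 * np.sin(2 * np.pi * 80 * t)
recording = alpha + eog + emg

# 4th-order Butterworth band-pass, 3-30 Hz: passes the alpha band while
# attenuating slow ocular drift and high-frequency muscle activity.
b, a = butter(4, [3 / (fs / 2), 30 / (fs / 2)], btype="band")
cleaned = filtfilt(b, a, recording)  # zero-phase filtering

# The cleaned trace should be much closer to the pure brain component.
err_before = np.sqrt(np.mean((recording - alpha) ** 2))
err_after = np.sqrt(np.mean((cleaned - alpha) ** 2))
print(err_after < err_before)
```

Real systems go well beyond this (spatial filtering and per-artifact classifiers, as described above), since EMG overlaps the EEG band; this sketch only shows why knowing where an artifact lives in frequency makes it removable.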
Mental Commands works on what you use for training. It is quite possible to include a bit of facial muscle activation in the classifier, but to be honest that is not as reliable or reproducible as proper, pure brain signals. We prefer our users to relax and remain reasonably still while training Mental Commands, and most people learn to do this after playing with the system for a while. With perfectly calm training data (that is, where there is no muscle involvement, only brain activity) it is then possible to use the Mental Commands in normal activities, including while making facial movements, because only the relevant brain signals are built into the classifier. We allowed a small component of muscle signals into the training system so that novices would stand a better chance of working the system the first time, but nearly everyone soon learns to leave that behind. Mental Commands work perfectly well for completely locked-in subjects (those with ABSOLUTELY no muscle activity) – which proves that brain signals must be involved.