Using Brain-Computer Interfaces to detect human satisfaction in human-robot interaction

Ehsan Tarkesh Esfahani. Department of Mechanical Engineering, University of California Riverside, USA


This article discusses the use of a brain–computer interface (BCI) to obtain emotional feedback from a human in response to the motion of humanoid robots in collaborative environments. The purpose of this study is to detect the human satisfaction level and use it as feedback for correcting and improving the behavior of the robot so as to maximize human satisfaction. This article describes experiments and algorithms that use human brain activity, collected through a BCI, to estimate the level of satisfaction. Users wear an electroencephalogram (EEG) headset and control the movement of the robot by mental imagery. The robot's response to this mental command may differ from what the user intended, which affects the emotional satisfaction level. The headset records brain activity from 14 locations on the scalp. The power spectral density of each EEG frequency band and the four largest Lyapunov exponents of each EEG signal form the feature vector. The Mann–Whitney–Wilcoxon test is then used to rank all the features. The highest-ranked features are selected to train a linear discriminant classifier (LDC) to determine the satisfaction level. Our experimental results show an accuracy of 79.2% in detecting the human satisfaction level.
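The pipeline described above (band-power features from 14 EEG channels, Mann–Whitney–Wilcoxon feature ranking, linear discriminant classification) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the sampling rate, band edges, trial length, and number of selected features are assumptions, and the Lyapunov-exponent features are omitted since they have no simple off-the-shelf routine.

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import mannwhitneyu
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
FS = 128            # sampling rate in Hz (assumed)
N_CH = 14           # 14 scalp electrodes, as in the abstract
BANDS = {"theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}   # band edges assumed

def band_powers(trial):
    """Mean power spectral density per EEG band, per channel."""
    feats = []
    for ch in trial:                      # trial shape: (N_CH, n_samples)
        f, psd = welch(ch, fs=FS, nperseg=FS)
        for lo, hi in BANDS.values():
            feats.append(psd[(f >= lo) & (f < hi)].mean())
    return np.array(feats)                # 14 channels x 4 bands = 56 features

# Synthetic stand-in for recorded trials: 40 "unsatisfied" (label 0) and
# 40 "satisfied" (label 1) 2-second trials; satisfied trials get a higher
# signal variance so the two classes are separable.
y = np.array(40 * [0] + 40 * [1])
X = np.array([band_powers((1 + 0.5 * lbl) * rng.standard_normal((N_CH, 2 * FS)))
              for lbl in y])

# Rank features with the Mann-Whitney-Wilcoxon test; keep the top-ranked
# ones (keeping 10 is an assumption).
pvals = np.array([mannwhitneyu(X[y == 0, j], X[y == 1, j]).pvalue
                  for j in range(X.shape[1])])
top = np.argsort(pvals)[:10]

# Train the linear discriminant classifier on the selected features.
clf = LinearDiscriminantAnalysis().fit(X[:, top], y)
print(f"training accuracy: {clf.score(X[:, top], y):.2f}")
```

In the actual study the classifier would of course be evaluated on held-out trials rather than the training set; the sketch only shows how the three stages (feature extraction, ranking, classification) fit together.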
