Music Emotion Capture: sonifying emotions in EEG data

G. Langroudi, A. Jordanous and L. Li


People’s emotions are not always obviously detectable, due to difficulties in expressing emotions or to geographic distance (e.g. when people communicate online). There are also many occasions where it would be useful for a computer to detect users’ emotions and respond to them appropriately. A person’s brain activity gives vital clues to the emotions they are experiencing at any one time. The aim of this project is to detect, model and sonify people’s emotions. To achieve this, there are two tasks: (1) to detect emotions from current brain activity as measured by an EEG device; (2) to play appropriate music in real time, representing the user’s current emotional state. Here we report a pilot study implementing the Music Emotion Capture system. In future work we plan to improve how this project performs emotion detection through EEG, and to generate new music based on emotion-based characteristics of music. Potential applications arise in collaborative/assistive software and in brain–computer interfaces for non-verbal communication.
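The pipeline sketched in task (2) can be illustrated with a minimal mapping from a detected emotional state to musical parameters. The sketch below is purely illustrative and assumes an emotional state modelled on a valence/arousal plane (a common choice in affective computing, not confirmed by the abstract) with both values already extracted from EEG and normalised to [-1, 1]; the function name and parameter ranges are hypothetical.

```python
def emotion_to_music(valence: float, arousal: float) -> dict:
    """Map a hypothetical (valence, arousal) emotion estimate to
    simple musical parameters for real-time playback."""
    # Higher arousal -> faster tempo (linear mapping onto 60-180 BPM).
    tempo_bpm = 120 + 60 * arousal
    # Positive valence -> major mode, negative valence -> minor mode.
    mode = "major" if valence >= 0 else "minor"
    # More intense (higher-arousal) states play louder (MIDI velocity 64-127).
    velocity = int(64 + 63 * max(arousal, 0.0))
    return {"tempo_bpm": tempo_bpm, "mode": mode, "velocity": velocity}


# Example: a calm, pleasant state maps to a slower major-mode passage.
print(emotion_to_music(0.8, -0.5))
```

A real system would update this mapping continuously as new EEG samples arrive, smoothing the estimates to avoid abrupt musical changes.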
