The EMOIO project is developing an emotionally responsive, neuro-adaptive brain-computer interface. Funded by the German Federal Ministry of Education and Research (BMBF), scientists from Fraunhofer IAO carried out the first live demonstration at the BMBF future congress “Bringing technology to people,” held in Bonn on June 26-27.
Emotions play a crucial role in our interaction with technology – but even smart systems are not yet able to react appropriately to human emotion. In the EMOIO project, Fraunhofer IAO is working with the University of Stuttgart’s Institute of Human Factors and Technology Management IAT to investigate how techniques from neuroscience might be applied to detect and interpret users’ emotions from their brain activity. The intention is to relay these emotions to computer systems via a brain-computer interface – meaning that the systems’ design and behaviour could be adapted to the needs of individual users.
During the live demonstration, a test subject was presented with emotion-inducing content such as pictures of baby animals and scenes of war. While the subject looked at the pictures, their brain activity was monitored using electroencephalography and near-infrared spectroscopy, and analysed by an algorithm in real time. The algorithm sifts brain signals for patterns that have been shown to correlate with positive and negative emotions. In this way, it can categorise a test subject’s reaction to an image as positive or negative within a few seconds. The result can then be relayed to any computer system.
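The article does not disclose the classifier used, but the pipeline it describes – extract features from a sliding window of brain-signal samples, then score them against patterns learned from labelled recordings – can be sketched in miniature. The sketch below is a minimal illustration under stated assumptions: the features (window mean and variance) stand in for the band-power features a real EEG/NIRS pipeline would compute, and the logistic-regression weights are hypothetical placeholders for parameters learned in training.

```python
import math
import random

def window_features(samples):
    """Mean and variance of a signal window - simple stand-ins for the
    band-power features a real EEG pipeline would extract."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return [mean, var]

def classify_emotion(features, weights, bias):
    """Logistic-regression score: probability >= 0.5 -> 'positive'.
    In a real system the weights would come from training on
    labelled recordings of emotional responses."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    p = 1.0 / (1.0 + math.exp(-z))
    return ("positive" if p >= 0.5 else "negative"), p

# Hypothetical pre-trained parameters (illustrative only).
WEIGHTS, BIAS = [1.2, -0.8], 0.1

# Simulate a short stream of samples and classify the latest window,
# as a real-time loop would do every few seconds.
random.seed(0)
stream = [random.gauss(0.5, 0.2) for _ in range(512)]
label, prob = classify_emotion(window_features(stream[-256:]), WEIGHTS, BIAS)
print(label, round(prob, 3))
```

A production system would replace the toy features with spectral features from the EEG and haemodynamic features from the NIRS channels, but the shape of the loop – window, featurise, score, relay – is the same.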
The emotional classification determined by the algorithm is relevant not only to computer systems, but may also be of interest to users themselves. For this reason, Fraunhofer IAO has developed a mobile app that shows users their classified emotion in real time. The user’s current emotional state is represented by an emoticon, together with a dynamic chart showing how that state has changed over time. Future versions of the app will let users comment on their emotional experiences and add contextual information such as the place, situation, or image.