Tux Flow

My first experiment at Inria!! Tux Racer, an open-source 3D video game, can be controlled with a Motor Imagery BCI system using an EEG cap. 28 participants took part in the experiment!
Thank you all!!

Motor Imagery means that the commands to manipulate the game are mental commands: imagining a left or right hand movement, without actually moving the hands. The system detects a decrease in the amplitude of mu waves while the movement is being imagined, and an increase (beta rebound) right after the tension is released. In more scientific terms, these phenomena are called Event-Related Desynchronisation and Synchronisation (ERD/ERS). Detecting them requires filtering out signal noise (and there is looots of noise; any muscle movement of the participant, for example, makes the EEG cap move and destroys the signals) together with other signal processing methods. Machine learning techniques then decode and label (left/right) the EEG features resulting from the signal processing. For such ventures we use OpenViBE, a great FREE (open-source) software that enables non-programmers to apply signal processing and machine learning techniques. With the help of this software, LSL, and Jérémy Frey, we maneuvered Tux (the virtual penguin) with Motor Imagery, skiing to the left or right to catch the maximum number of fish.
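To give a rough idea of what such a pipeline does under the hood, here is a minimal Python sketch of the same principle: band-pass filter the EEG around the mu band, compute log band power per channel, and train a simple classifier to label left vs. right trials. The sampling rate, channel setup, and synthetic data below are illustrative assumptions, not our actual OpenViBE scenario.

```python
# Minimal ERD-style left/right decoding sketch (illustrative only, not our exact OpenViBE setup).
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 250  # sampling rate in Hz (assumed)

def mu_band_power(trials, fs=FS, band=(8.0, 12.0)):
    """Band-pass each trial in the mu band and return log band power per channel.

    trials: array of shape (n_trials, n_channels, n_samples)
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trials, axis=-1)
    power = np.mean(filtered ** 2, axis=-1)   # mean mu-band power per channel
    return np.log(power)                      # log scale stabilises the variance

# Synthetic stand-in data: 40 trials, 2 channels (think C3/C4), 2 s each.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((40, 2, 2 * FS))
y = np.repeat([0, 1], 20)                     # 0 = left-hand imagery, 1 = right-hand imagery

# Mimic ERD: the channel contralateral to the imagined hand gets a weaker mu rhythm.
t = np.arange(2 * FS) / FS
mu = np.sin(2 * np.pi * 10 * t)               # 10 Hz "mu rhythm"
X_raw[y == 0, 0] += 0.2 * mu                  # left trials: weak mu on channel 0
X_raw[y == 0, 1] += 1.0 * mu
X_raw[y == 1, 0] += 1.0 * mu                  # right trials: weak mu on channel 1
X_raw[y == 1, 1] += 0.2 * mu

features = mu_band_power(X_raw)
clf = LinearDiscriminantAnalysis()
print("Cross-validated left/right accuracy:", cross_val_score(clf, features, y, cv=5).mean())
```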

As it is difficult to learn to give mental commands, users need help, motivation, etc. The idea of this project was to help users reach a state of Flow by using a ludic environment, tasks with adaptive difficulty, and an immersive audio background that follows the task.
The experiment resulted in a paper we submitted to the Graz BCI Conference 2017.

Our protocol consisted of a ~10-minute Graz MI BCI protocol, then six runs of the Tux Racer game with music in the background or six runs without, followed by questionnaires measuring the Flow state and music quality (see image below):

The magnet symbolically represents the adaptive condition (attracting the penguin closer to the target, the fish).
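For anyone curious what "attracting the penguin closer to the target" can look like, below is a hypothetical sketch of such an adaptive assistance rule: the steering command sent to the game is a blend of the decoded Motor Imagery command and the direction of the nearest fish, with the blending weight acting as the "magnet" strength. The function and parameter names are my own illustration, not the actual game code.

```python
def assisted_steering(decoded_cmd, tux_x, fish_x, magnet=0.3):
    """Blend the decoded Motor Imagery command with a pull toward the nearest fish.

    decoded_cmd: classifier output in [-1, 1] (-1 = steer left, +1 = steer right)
    tux_x, fish_x: horizontal positions of the penguin and of the target fish
    magnet: 0 = no assistance, 1 = penguin fully attracted to the fish
    """
    toward_fish = 1.0 if fish_x > tux_x else -1.0   # direction of the target
    return (1.0 - magnet) * decoded_cmd + magnet * toward_fish

# Example: the user imagines "left" (-0.6) but the fish is slightly to the right.
print(assisted_steering(-0.6, tux_x=0.0, fish_x=0.4))   # -> -0.12, softened toward the fish
```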
