Citation
Mindtrack: Using Brain-Computer Interface To Translate Emotions Into Music

Material Information

Title:
Mindtrack: Using Brain-Computer Interface To Translate Emotions Into Music
Series Title:
19th Annual Undergraduate Research Symposium
Creator:
Sirocchi, Sofia
Language:
English
Physical Description:
Undetermined

Subjects

Subjects / Keywords:
Center for Undergraduate Research
Genre:
Conference papers and proceedings
Poster

Notes

Abstract:
The present work describes Mindtrack, a Brain-Computer Musical Interface that uses real-time brainwave data to allow a user to expressively shape progressive music. In Mindtrack, the user wears an EMOTIV Insight electroencephalogram (EEG) headset. The raw EEG data is converted into brain-wave components and then into high-level EEG characteristics (such as emotion), which are used to control the music's tempo and key signature; tempo and key are calculated according to the emotion detected by the EEG device. Other musical parameters, such as harmony, rhythm, and melody, are specified by the user. In Mindtrack, the brain is the sole instrument used to translate emotions to music. Mindtrack has the potential to increase the quality of life for persons with physical impairments who still desire to express themselves musically. Furthermore, Mindtrack can be used for music therapy, recreation, and rehabilitation.
General Note:
Research authors: Bhaveek Desai, Benjamin Chen, Sofia Sirocchi, Kyla A. McMullen - University of Florida
General Note:
Emerging Scholars Program
General Note:
Faculty Mentor: Kyla A. McMullen - Center for Undergraduate Research, Emerging Scholars Program

Record Information

Source Institution:
University of Florida
Rights Management:
Copyright Sofia Sirocchi. Permission granted to University of Florida to digitize and display this item for non-profit research and educational purposes. Any reuse of this item in excess of fair use or other copyright exemptions requires permission of the copyright holder.

UFDC Membership

Aggregations:
UF Undergraduate Honors Theses

Full Text


Mindtrack: Using Brain-Computer Interface to Translate Emotions into Music
Bhaveek Desai, Benjamin Chen, Sofia Sirocchi, Kyla A. McMullen, PhD
Department of Computer and Information Science and Engineering, College of Engineering, University of Florida, Gainesville, FL

Introduction
Brain-computer interfaces (BCIs) read brainwave data from a user, enabling a range of mind-controlled applications such as brain-drone racing [1]. This work describes the Mindtrack system, which uses a BCI device to translate emotion-driven electroencephalography (EEG) data into music.

Methodology
EMOTIV Insight: five dry electrodes (AF3, AF4, T7, T8, Pz).
Data Acquisition Client: translates raw EEG from the BCI into brain waves.
User Datagram Protocol (UDP) Server: broadcasts brainwave data in real time.
UDP Client: receives the broadcast data.
Emotion Detector: calculates arousal and valence from the brain waves to recognize emotions.
Music Synthesizer: decides tempo and key signature; the user puts on the headset and selects melody, harmony, and rhythm.
(Illustrative sketches of these stages appear at the end of this text.)

Approach
Using a commercially available BCI device, emotional valence and arousal were calculated from alpha- and beta-band power (see the formulas below). The valence and arousal levels were then mapped to various components of music created by the user.

Results
Alpha and beta waves from AF3 and AF4 were used for emotion detection.
Five degrees of emotion are readable by the system: very sad, sad, neutral, happy, very happy.
Valence and arousal values, which vary among individuals, were calculated and interpreted to construct musical elements.
The user selected the melody (stepwise, arpeggio, or pentatonic), rhythm (swing or eighths), and harmony (root, third, fourth, fifth, or sixth).
User selections were mapped to melody, rhythm, and harmony; valence decided the key signature, and arousal was mapped to tempo.

Conclusions
Mindtrack translated emotions into music with minimal latency, which shows significant promise for using BCIs in emotional and musical contexts. This work was published in the proceedings of the 2018 International Conference on Digital Arts, Media and Technology.

Future Work
Allow individuals with physical limitations to create music and express themselves.
Offer a new instrument that can be played recreationally.
Modify the system to support different instruments, harmonization, and rhythmic variation based on emotion.
Expand the range of emotions that BCI devices can interpret.
Create soundscapes rather than a solo instrumental track.

References
1. Brain Drone Racing BCI Competition, 2016. http://www.braindronerace.com/.
2. E. R. Miranda, W. L. Magee, J. J. Wilson, J. Eaton, and R. Palaniappan. Brain-computer music interfacing (BCMI): from basic research to the real world of special needs. Music and Medicine, 3(3):134-140, 2011.
3. R. Ramirez and Z. Vamvakousis. Detecting emotion from EEG signals using the Emotive Epoc device, pages 175-184. Springer Berlin Heidelberg, Berlin, Heidelberg, 2012.

Acknowledgements
The Mindtrack team would like to thank Dr. Chris Crawford and Dr. Marvin Andujar for their generous and patient guidance in helping the team understand the BCI device's design and functionality.

System Overview (figure)
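The valence and arousal formulas from the Approach section are not preserved in this text version. A plausible reconstruction, assuming Mindtrack follows the band-power ratios of Ramirez and Vamvakousis [3] adapted to the Insight's AF3 and AF4 prefrontal electrodes, is:

\[
\text{arousal} = \frac{\beta_{AF3} + \beta_{AF4}}{\alpha_{AF3} + \alpha_{AF4}},
\qquad
\text{valence} = \frac{\alpha_{AF4}}{\beta_{AF4}} - \frac{\alpha_{AF3}}{\beta_{AF3}}
\]

Here \(\alpha\) and \(\beta\) denote alpha- and beta-band power at the named electrode: beta power dominating alpha suggests higher arousal, and the left-right asymmetry of the alpha/beta ratio suggests valence.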
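A minimal sketch of the UDP server and client stages described in the Methodology, assuming brainwave band powers arrive as a Python dict; the port number, field layout, and the get_band_powers() callable are hypothetical illustrations, not part of the original system:

import json
import socket

BROADCAST_ADDR = ("255.255.255.255", 9000)  # hypothetical port

def serve(get_band_powers):
    """Broadcast brainwave band powers over UDP in real time.

    get_band_powers: hypothetical callable returning a dict such as
    {"AF3": {"alpha": 1.2, "beta": 0.8}, "AF4": {...}, ...}.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    while True:
        sample = get_band_powers()  # one reading per EEG window
        sock.sendto(json.dumps(sample).encode(), BROADCAST_ADDR)

def listen():
    """UDP client stage: receive one broadcast brainwave sample."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", 9000))
    data, _ = sock.recvfrom(4096)
    return json.loads(data)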
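Continuing the sketch, the Emotion Detector stage could compute valence and arousal from the broadcast band powers; the dict layout matches the hypothetical one used in the server sketch above, and the formulas are the reconstruction given earlier, not confirmed by the poster:

def valence_arousal(powers):
    """Compute valence and arousal from AF3/AF4 alpha and beta band power,
    following the reconstructed formulas above (an assumption, per [3])."""
    a3, b3 = powers["AF3"]["alpha"], powers["AF3"]["beta"]
    a4, b4 = powers["AF4"]["alpha"], powers["AF4"]["beta"]
    arousal = (b3 + b4) / (a3 + a4)
    valence = a4 / b4 - a3 / b3
    return valence, arousal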
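Finally, a sketch of how the Music Synthesizer stage might realize the mappings reported in the Results (valence to key signature, arousal to tempo), assuming the five emotion degrees are thresholded from valence; the thresholds, key choices, and BPM range here are illustrative placeholders rather than the authors' actual values:

# Hypothetical per-user thresholds; the poster notes that valence and
# arousal values vary among individuals, so these would be calibrated.
EMOTIONS = ["very sad", "sad", "neutral", "happy", "very happy"]

def classify_emotion(valence, thresholds=(-0.5, -0.1, 0.1, 0.5)):
    """Map a valence value onto the system's five emotion degrees."""
    for label, cutoff in zip(EMOTIONS, thresholds):
        if valence < cutoff:
            return label
    return EMOTIONS[-1]

def key_signature(emotion):
    """Valence (via emotion) decides the key: minor keys for sad emotions,
    major keys for happy ones. The specific keys are assumptions."""
    return {
        "very sad": "C minor",
        "sad": "A minor",
        "neutral": "C major",
        "happy": "G major",
        "very happy": "D major",
    }[emotion]

def tempo_bpm(arousal, low=60, high=160):
    """Arousal is mapped to tempo; a linear map over an assumed BPM range."""
    arousal = max(0.0, min(1.0, arousal))  # clamp to [0, 1]
    return low + arousal * (high - low)

For example, valence = 0.3 and arousal = 0.5 would yield "happy", G major, and 110 BPM under these placeholder values.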