Transforming EEG Data into Sound and Visuals

Hello TouchDesigner programmers,

I recently got a Muse EEG headset, and with the help of the Mind Monitor app I’m able to record EEG data to a CSV file while meditating.

Now I’m trying to convert the EEG band data (Alpha, Beta, Gamma, Delta) from the CSV file into sound and visuals that progress over time, so that both the sound and the visuals have movement. This is the first time I’m working with CSV files in TouchDesigner.

I’d really appreciate any help executing this idea.

Warm Regards
Tejaswi Meshram

Hey,

the File In CHOP loads CSV files… have you tried that route?
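Another common route is a Table DAT, which parses the CSV so you can read it from Python (or feed it to a DAT to CHOP). A minimal sketch, assuming a Mind Monitor-style file with a header row; the operator name 'table_eeg' and the column header 'Alpha_TP9' are assumptions about your export:

```python
table = op('table_eeg')  # Table DAT with its File parameter pointed at the CSV

header = [c.val for c in table.row(0)]   # first row holds the column names
alpha_col = header.index('Alpha_TP9')    # locate the alpha column

# collect every alpha value as a float, skipping the header row
# (empty cells in the export would need guarding before float())
alpha_values = [float(table[r, alpha_col].val) for r in range(1, table.numRows)]
debug(len(alpha_values), 'samples, first:', alpha_values[0])
```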

Wasn’t the “Muse Direct” app also capable of streaming OSC?

cheers
Markus


Hello Markus,

Thank you so much for the reply.

I’ll try the File In CHOP and let you know.

And yes, with the Mind Monitor app I can stream the data via OSC; I haven’t tried “Muse Direct”. But I was thinking I could record the data from the Muse while meditating, and then start working on the data visualisation after the data is collected.
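For that recorded-file route, here is a rough sketch of playing the rows back over time with an Execute DAT, so the visuals move the way the session did. The names ('table_eeg', 'constant_alpha', 'Alpha_TP9') and the 10 rows-per-second rate are assumptions about the file:

```python
# Execute DAT callback: step through the recorded rows at a fixed rate and
# push the current alpha value into a Constant CHOP channel each frame.
def onFrameStart(frame):
    table = op('table_eeg')   # Table DAT holding the CSV
    rate = 10.0               # assumed samples per second in the recording
    row = 1 + int(absTime.seconds * rate) % (table.numRows - 1)  # skip header, loop
    col = [c.val for c in table.row(0)].index('Alpha_TP9')
    # 'const0value' is the first value parameter of a Constant CHOP
    # (named 'value0' in older builds)
    op('constant_alpha').par.const0value = float(table[row, col].val)
    return
```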

Warm Regards
Tejaswi


Did you find a way? I’m trying the same thing: I can’t connect my Emotiv directly, so I want to use the CSV, but I don’t know how to use the column data as a parameter. Any suggestions?
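One usual route is Table DAT → DAT to CHOP → export the channel to the parameter. For a quick test you can also drop an expression straight into the parameter’s expression field; 'table_eeg' and the 'Alpha_TP9' header are assumptions about the Emotiv CSV:

```python
# In a parameter's expression field: read one cell per frame, skipping the
# header row and looping over the file. Names are hypothetical.
float(op('table_eeg')[1 + int(absTime.frame) % (op('table_eeg').numRows - 1), 'Alpha_TP9'].val)
```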


Hello, I would also like to find out how to convert EEG data into visuals, but I don’t really know where to start. I am a music composer but have no background in video. It is for a beautiful sound healing project in Costa Rica. Any help would be much appreciated! Thank you


Hi Tejaswi, interesting to read about this. Did you succeed with this in the end, and are you still using it? Thank you

Hello Jeff, yes, I was able to achieve a few things in my process. I had live EEG data from my Muse headset streaming into TouchDesigner via Mind Monitor. In TouchDesigner I used an OSC In CHOP to receive all the EEG data, then processed it inside GLSL shaders for the visuals; that’s how I was able to animate visuals via EEG.
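For anyone following along, a minimal sketch of that wiring: an OSC In CHOP receiving Mind Monitor’s band-power messages, and a script pushing one value into a GLSL TOP uniform. The operator names are assumptions, and the OSC address follows Mind Monitor’s /muse/elements/… pattern, so check yours:

```python
osc  = op('oscin1')   # OSC In CHOP listening on the port set in Mind Monitor
glsl = op('glsl1')    # GLSL TOP that renders the visuals

# the OSC In CHOP names channels after the address, minus the leading slash
alpha = osc['muse/elements/alpha_absolute'].eval()

glsl.par.uniname0 = 'uAlpha'   # uniform name as declared in the shader
glsl.par.value0x  = alpha      # read in GLSL as: uniform float uAlpha;
```

Inside the pixel shader you would then declare `uniform float uAlpha;` and use it to drive color, displacement, and so on.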

The idea behind this process was to understand when we enter the meditative state and how long we are able to hold it.
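A minimal sketch of that “when do we enter, how long do we hold” question, as a CHOP Execute DAT watching a smoothed alpha channel; the threshold here is a made-up number to tune per person and per session:

```python
import time

THRESH = 0.8               # hypothetical band-power threshold, tune per session
state = {'since': None}    # wall-clock time we entered the meditative zone

def onValueChange(channel, sampleIndex, val, prev):
    if val >= THRESH and state['since'] is None:
        state['since'] = time.time()                 # just entered
    elif val < THRESH and state['since'] is not None:
        held = time.time() - state['since']          # just left
        debug(f'held the meditative state for {held:.1f}s')
        state['since'] = None
    return
```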


I’m working on this same thing. It isn’t using the Muse, but it’s the same general idea: signals interacting with video. One of my goals is to put filters over streaming video of sacred geometry that can only be removed when two or more meditators all reach coherence. This series of videos showed me that path: YouTube, “Create Interactive Multisensory Experiences with TouchDesigner & Playtron MIDI” by Okamirufu Vizualizer.
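A hedged sketch of that gate, using simultaneous above-threshold alpha as a simple stand-in for true phase coherence; the operator names, OSC address, and threshold are all assumptions:

```python
THRESH = 0.7   # hypothetical shared alpha threshold

def all_coherent():
    # one OSC In CHOP per headset; address follows Mind Monitor's pattern
    alphas = [op('oscin1')['muse/elements/alpha_absolute'].eval(),
              op('oscin2')['muse/elements/alpha_absolute'].eval()]
    return all(a >= THRESH for a in alphas)

# e.g. crossfade between filtered and clean video with a Cross TOP
op('cross1').par.cross = 1.0 if all_coherent() else 0.0
```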

So which device is being used for the EEG data collection, and what process are you following to import the data into TouchDesigner?

Muse 2 headsets, with the OSC signals coming in. I added synthesizers that convert plant biodata to MIDI, to let the mind and the flora & fungi create joyful harmonies and melodies together.

@Tejaswi123 What have you created since you started this journey?