Lab Streaming Layer - LSL integration

Hey TD Fam,

I am approaching a project which includes some brain-computer interaction. The devices and systems that I need to hook into use Lab Streaming Layer (LSL). I found a Python interface to LSL here.

I was wondering if any of you already have some experience with this before I dive in?

Thank you loads :slight_smile:


Hey @daniel.dalfovo I don’t have any solution, sorry :frowning:
I was wondering if you have found anything promising since you posted. I am looking for a solution to integrate LSL into TD as well. If you are up for it, I would love to collaborate on or expand anything you might have started.

Hey @fdiazsmith, the Python integration via pylsl actually works quite nicely. I was able to receive quite a few LSL streams as well as send LSL streams (Markers etc.). There are some helpful examples in the lib folder → pip install page for download.

This is a small example of how to stream data from an EEG device:
"""Example program to show how to read a multi-channel time series from LSL."""

from pylsl import StreamInlet, resolve_stream

# first resolve an EEG stream on the lab network
print("looking for an EEG stream...")
streams = resolve_stream('type', 'EEG')

# create a new inlet to read from the stream
inlet = StreamInlet(streams[0])

while True:
    # get a new sample (you can also omit the timestamp part if you're not
    # interested in it)
    sample, timestamp = inlet.pull_sample()
    print(timestamp, sample)

Maybe that gets you up and running!

Super! Thank you for sharing, I will give it a shot. I am sure it will be helpful since I am also working with EEG data.

Hey folks, I’m also working in this space, using the OpenBCI platform. I can get static file data from an old 8-bit Cyton board (recorded with the OpenBCI GUI) into TouchDesigner and then iterate through it, but I have yet to get any real-time data directly from the board into TouchDesigner. From what I can tell, the data is pretty messy in its raw form. I started reading up on LSL and found your posts.

Does anyone have a simple demo of the CHOP or DAT setup needed to capture the LSL once it is set up?

Hi Daniel, thanks for the code. I’m using something almost identical with some additions - code pasted below. It works when run from Visual Studio Code.
I have installed pylsl but I’m still not receiving a stream, nor any errors. This is in a Script DAT. I tried cutting and pasting your code into a Text DAT and running it, but it just hung. (Muse was streaming through Petal.) I’m on Windows 10. What did you use to stream successfully?

# me - this DAT
# scriptOp - the OP which is cooking
#
# press 'Setup Parameters' in the OP to call this function to re-create the parameters.
def onSetupParameters(scriptOp):
    page = scriptOp.appendCustomPage('Custom')
    p = page.appendFloat('Valuea', label='Value A')
    p = page.appendFloat('Valueb', label='Value B')
    return

# called whenever custom pulse parameter is pushed
def onPulse(par):
    return

def onCook(scriptOp):
    scriptOp.clear()
    
    # -*- coding: utf-8 -*-
    """
    Estimate Relaxation from Band Powers

    This example shows how to buffer, epoch, and transform EEG data from a single
    electrode into values for each of the classic frequencies (e.g. alpha, beta, theta).
    Furthermore, it shows how ratios of the band powers can be used to estimate
    mental state for neurofeedback.

    The neurofeedback protocols described here are inspired by
    *Neurofeedback: A Comprehensive Review on System Design, Methodology and Clinical Applications* by Marzbani et al.

    Adapted from https://github.com/NeuroTechX/bci-workshop
    """

    import numpy as np  # Module that simplifies computations on matrices
    from pylsl import StreamInlet, resolve_byprop  # Module to receive EEG data
    from scipy.signal import butter, lfilter  # For the notch filter
    import utils  # Our own utility functions
    
    # Handy little enum to make code more readable
    class Band:
        Delta = 0
        Theta = 1
        Alpha = 2
        Beta = 3
    
    # Apply a notch filter to the EEG data
    #def apply_notch_filter(data, notch_freq=60, fs=256):
        #nyquist = 0.5 * fs
        #notch = notch_freq / nyquist
        #b, a = butter(2, [notch - 0.5, notch + 0.5], btype='bandstop')
        #return lfilter(b, a, data)
        
    # EXPERIMENTAL PARAMETERS
    # Modify these to change aspects of the signal processing

    # Length of the EEG data buffer (in seconds)
    # This buffer will hold the last n seconds of data and be used for calculations
    BUFFER_LENGTH = 5
    print("hello")
    # Length of the epochs used to compute the FFT (in seconds)
    EPOCH_LENGTH = 1

    # Amount of overlap between two consecutive epochs (in seconds)
    OVERLAP_LENGTH = 0.8

    # Amount to 'shift' the start of each next consecutive epoch
    SHIFT_LENGTH = EPOCH_LENGTH - OVERLAP_LENGTH

    # Index of the channel(s) (electrodes) to be used
    # 0 = left ear, 1 = left forehead, 2 = right forehead, 3 = right ear
    INDEX_CHANNEL = [0]
   
    # NOTE: inside a Script DAT, __name__ is not "__main__", so this guard
    # silently skips everything below when the DAT cooks in TouchDesigner
    if __name__ == "__main__":

        # 1. CONNECT TO EEG STREAM
        # Search for active LSL streams
        print('Looking for an EEG stream...')
        streams = resolve_byprop('type', 'EEG', timeout=2)
        if len(streams) == 0:
            raise RuntimeError("Can't find EEG stream.")

        # Set active EEG stream to inlet and apply time correction
        print("Start acquiring data")
        inlet = StreamInlet(streams[0], max_chunklen=12)
        eeg_time_correction = inlet.time_correction()

        # Get the stream info and description
        info = inlet.info()
        description = info.desc()

        # Get the sampling frequency
        # This is an important value that represents how many EEG data points are
        # collected in a second. This influences our frequency band calculation.
        # For the Muse 2016, this should always be 256
        fs = int(info.nominal_srate())

        # 2. INITIALIZE BUFFERS
        # Initialize raw EEG data buffer
        eeg_buffer = np.zeros((int(fs * BUFFER_LENGTH), 1))
        filter_state = None  # for use with the notch filter

        # Compute the number of epochs in "buffer_length"
        n_win_test = int(np.floor((BUFFER_LENGTH - EPOCH_LENGTH) / SHIFT_LENGTH + 1))

        # Initialize the band power buffer (for plotting)
        # Bands will be ordered: [delta, theta, alpha, beta]
        band_buffer = np.zeros((n_win_test, 4))

        # 3. GET DATA
        # The try/except structure lets you quit the while loop by aborting
        # the script with <Ctrl-C>
        print('Press Ctrl-C in the console to break the while loop.')

        try:
            print('Inside try block')
            # The following loop acquires data, computes band powers, and calculates neurofeedback metrics based on those band powers
            while True:
                # 3.1 ACQUIRE DATA
                # Obtain EEG data from the LSL stream
                eeg_data, timestamp = inlet.pull_chunk(
                    timeout=1, max_samples=int(SHIFT_LENGTH * fs))

                # Only keep the channel we're interested in
                ch_data = np.array(eeg_data)[:, INDEX_CHANNEL]

                # Notch filtering is handled by utils.update_buffer via
                # notch=True below (apply_notch_filter above is commented out)

                # Update EEG buffer with the new data
                eeg_buffer, filter_state = utils.update_buffer(
                    eeg_buffer, ch_data, notch=True,
                    filter_state=filter_state)

                # 3.2 COMPUTE BAND POWERS
                # Get the newest samples from the buffer
                data_epoch = utils.get_last_data(eeg_buffer, EPOCH_LENGTH * fs)

                # Compute band powers
                band_powers = utils.compute_band_powers(data_epoch, fs)
                band_buffer, _ = utils.update_buffer(band_buffer,
                                                     np.asarray([band_powers]))

                # 3.3 COMPUTE NEUROFEEDBACK METRICS
                # Update the Table DAT in TouchDesigner with the calculated metrics
                # Assuming you have a Table DAT named "table1" in your TouchDesigner project
                # 'op' is a TouchDesigner built-in, so no import is needed
                table = op('table1')  # Reference the Table DAT

                # Calculate the Alpha Relaxation metric
                alpha_metric = band_buffer[-1][Band.Alpha] / band_buffer[-1][Band.Delta]
                print('Alpha Metric:', alpha_metric)
                # Append a new row to the Table DAT with the latest metric value
                table.appendRow([alpha_metric])

        except KeyboardInterrupt:
            print('Closing!')
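In case anyone reading along doesn't have the bci-workshop `utils` module, the band powers it computes can be approximated with plain NumPy along these lines. The band edges here are common textbook values and may differ from what `utils.compute_band_powers` actually uses:

```python
import numpy as np

def band_powers(epoch, fs):
    """Approximate mean power per classic EEG band for one channel.

    epoch: 1-D array of samples; fs: sampling rate in Hz.
    Band edges are common textbook values, not necessarily what the
    bci-workshop utils module uses.
    """
    # Window the epoch to reduce spectral leakage, then take the power spectrum.
    windowed = epoch * np.hamming(len(epoch))
    spectrum = np.abs(np.fft.rfft(windowed)) ** 2
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)

    bands = {'delta': (1, 4), 'theta': (4, 8),
             'alpha': (8, 12), 'beta': (12, 30)}
    # Mean power over the FFT bins that fall inside each band.
    return {name: spectrum[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in bands.items()}

# Sanity check: a pure 10 Hz sine should put most power in the alpha band.
fs = 256
t = np.arange(fs) / fs            # one 1-second epoch
powers = band_powers(np.sin(2 * np.pi * 10 * t), fs)
assert max(powers, key=powers.get) == 'alpha'
```

With that, the alpha metric above would be `powers['alpha'] / powers['delta']`.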

Hey @ellelala - sorry for being late on this. I think you need to split your code into two parts: one part that only gets executed once (importing libraries, declaring all variables, connecting to the stream, ...) and another part that handles the EEG stream update, i.e. everything from `inlet.pull_chunk()` onward. I don’t think your pasted code will work nicely in the onCook() function of a DAT - it will cook too often. Have you checked out Python Extensions? That would be a good starting point for structuring your code into one-time initialization and constant cooking.
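To make that split concrete, here is a rough sketch of how an Extension could be structured. The class name, method names, and stream properties are my own placeholders, not tested inside TD:

```python
# Rough sketch of the "initialize once / update every cook" split as a
# TouchDesigner Python Extension. Names here are placeholders.
class LSLExt:
    def __init__(self, ownerComp):
        self.ownerComp = ownerComp
        self.inlet = None

    def Connect(self):
        # One-time setup: resolve the LSL stream and open an inlet.
        # Call this once, e.g. from an Execute DAT's onStart().
        from pylsl import StreamInlet, resolve_byprop
        streams = resolve_byprop('type', 'EEG', timeout=2)
        if streams:
            self.inlet = StreamInlet(streams[0], max_chunklen=12)

    def Update(self):
        # Per-cook work: only pull whatever samples arrived since last call.
        # timeout=0.0 keeps the call non-blocking so it won't stall TD's frame.
        if self.inlet is None:
            return []
        samples, timestamps = self.inlet.pull_chunk(timeout=0.0)
        return samples
```

The key design point is that `Connect()` does all the slow, blocking work exactly once, while `Update()` is cheap and non-blocking, so a Script CHOP or Execute DAT can call it every frame.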

Hi Daniel, sorry for my late reply as well. Got sucked into some other projects. This makes a lot of sense, I’ll give that a whirl. I may not be as proficient at Python as I might need to be, but I will check out Python Extensions. Thanks again, I’ll let you know how it goes.