Combining EEG and eye tracking:
a workflow for your mobile experiment

by Sara Pizzamiglio, Ph.D.
Scientific Consultant (Brain Products)

Are you fascinated by mobile EEG and would like to explore real-world scenarios, but are unsure of how to prepare your setup and overcome its technical challenges? Read on to learn about how we combined EEG with wearable eye tracking from our new partner Tobii Pro. We’ll show you how to optimize your setup and provide some inspiration for your next investigations.

Abstract

Recording EEG in the real world presents some challenges; combining it with wearable eye tracking can offer a solution to technical limitations such as marking events. In this article, we describe one example application of combined mobile EEG and wearable eye tracking for exploring natural behavior outside of the lab. To record EEG data, we used the Brain Products LiveAmp, while eye movements were detected by the Tobii Pro Glasses 3 wearable eye tracker. Here, we provide a step-by-step guide on how to prepare and optimize your setup, together with some useful tips for maximizing the quality of your data. Lastly, we describe how to merge the two signals and perform a combined analysis in BrainVision Analyzer 2.

Charm and challenges of mobile and out-of-the-lab scenarios

Advances in technology, paired with the enthusiasm of many scientists, have opened the door to expanding research horizons, whereby human behavior is studied in its natural environment, outside of the lab. Investigations in real-world and ecologically valid scenarios aim to observe the responses of the brain in natural settings. Imagine, for example, that you are interested in how the brain reacts to the sight of beautiful paintings in a museum, or in identifying which objects in a shop window attract the most attention. These scenarios are now possible; however, they still bring a few challenges, for example:

  • For participants to behave as naturally as possible in these contexts, the equipment must be minimal and lightweight, so minimalistic setups should be favored.
  • Experimental paradigms are less controlled, as they are not executed on a computer. Consequently, marking the times of relevant events within the EEG data can be challenging.

Among the many approaches you may follow, including eye tracking in your study allows you to overcome these technical limitations while additionally providing you with deeper and more insightful observations of daily activities.

The photo gallery: a mobile EEG and wearable eye tracking application

The combination of EEG and eye tracking is becoming more popular thanks to the complementary information eye tracking provides to the study of brain activity. In real-world scenarios, you can combine wearable eye tracking with a mobile EEG setup to monitor your participants’ gaze while they view real-world objects and thereby mark the times of important events. For example, the start of a fixation on an object of interest can be used as a reference point for defining epochs in the EEG data, allowing you to analyze the potentials time-locked to fixation onset (i.e., Fixation-Related Potentials, FRPs).

Imagine that we are in a photo gallery, walking along its corridors and looking at different pictures. Based on previous research (1, 2), our goal is to investigate how the brain reacts to the sight of simple objects and human faces. The eye tracker provides information on fixation times, which we then import into the EEG data and eventually use for FRP analyses.

Video: The photo gallery. Our participant is looking at pictures of objects and faces on a wall whilst EEG activity and eye movements are recorded.

For this specific application, we designed the setup with the following materials:

  • To record EEG activity, we use the actiCAP snap with active slim electrodes combined with the small and lightweight LiveAmp amplifier from Brain Products. Electrode configuration and impedance measurement, trigger options and data recording are handled in the BrainVision Recorder software.
  • To track eye movements, the latest generation of wearable eye trackers from Tobii Pro, the Glasses 3 are used. Calibration of the eye tracker, live view and data recording are managed via the Glasses 3 controller application.
  • To analyze the eye tracking data and identify the relevant times of interest, we employ the advanced and comprehensive Tobii Pro Lab software.
  • To analyze the EEG data and merge them with eye tracking data, we use the Add Channels transform as introduced with BrainVision Analyzer 2.2.1.

Figure 2: Mobile EEG and wearable eye tracking application workflow. The diagram shows each step and material involved in the workflow from the start (i.e., EEG and eye tracking data recording) up to the end (i.e., offline merging of EEG and eye tracking data and subsequent combined analyses).

Let’s walk through the setup, recording and analysis steps of our application!

1. Synchronization and trigger sharing

The first and most important challenge is that eye tracking and EEG produce two completely different types of signals, recorded from two separate devices. Identifying common points of interest within the two signals is extremely important for the success of this combined measurement; thus, when planning your setup, you should always ask yourself: “How can I make sure that the two signals are synchronized?”.

In this setup, the Glasses 3 have a 3.5 mm jack In/Out Sync Port through which TTL pulses can be received and/or sent out. A built-in sync-out signal is emitted automatically as soon as the device starts recording and follows the pre-defined sequence shown below:


Figure 3: The Glasses 3 sync-out signal. The Recording Unit emits a sync-out signal (0 V – 3.3 V) as soon as the device starts recording. A start sequence of three consecutive TTL pulses, each 500 ms long, marks the start of the recording. After the start sequence, 1000 ms TTL pulses are sent regularly every 10 seconds. When the recording is stopped, no specific sequence is delivered.
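If you want to verify offline that this pattern actually reached the EEG, the check can be scripted in a few lines. Below is a minimal Python sketch (not part of the Brain Products or Tobii Pro software; the edge times are hypothetical example values standing in for the trigger onsets/offsets recorded in the EEG):

```python
import numpy as np

# Minimal sketch: check that trigger edges recorded in the EEG match the
# Glasses 3 sync-out pattern (three 500 ms start pulses, then 1000 ms pulses
# roughly every 10 s). The edge times below are hypothetical; in practice they
# would be read from the recorded trigger channel / marker file.
rising  = np.array([0.0, 1.0, 2.0, 12.0, 22.0, 32.0])   # pulse onsets [s]
falling = np.array([0.5, 1.5, 2.5, 13.0, 23.0, 33.0])   # pulse offsets [s]

widths = falling - rising
tol = 0.05  # allow 50 ms of jitter

start_ok   = np.allclose(widths[:3], 0.5, atol=tol)             # three 500 ms pulses
width_ok   = np.allclose(widths[3:], 1.0, atol=tol)             # 1000 ms pulses
spacing_ok = np.allclose(np.diff(rising[3:]), 10.0, atol=tol)   # one every 10 s

print(f"Start sequence OK: {start_ok}, recurring pulses OK: {width_ok and spacing_ok}")
```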

The sync-out signal automatically generates markers within the eye tracking data and, if simultaneously transmitted to an external device, it ensures synchronization between the different streams. In our setup, this can easily be achieved because the LiveAmp amplifier has a 2.5 mm jack trigger port which can receive a 1-bit trigger signal as an input. Therefore, connecting the two devices with a simple cable (3.5 mm jack on one end, 2.5 mm jack on the other) lets both recordings share the same triggers!

The picture below gives an overview of the setup we designed for our mobile application:


Figure 4: Mobile EEG and wearable eye tracking application setup. The Glasses 3 Head Unit (i.e., the glasses) is connected to the Recording Unit and communicates wirelessly with the computer running the Glasses 3 controller application. The actiCAP slim electrodes are connected to the LiveAmp amplifier, which transmits data via Bluetooth® to the computer running BrainVision Recorder while also saving them locally on the memory card. A 3.5 mm jack to 2.5 mm jack cable connects the Sync Port of the Glasses 3 Recording Unit to the 1-bit trigger input port of the LiveAmp, ensuring synchronization between the two data streams.

2. Prepare your mobile EEG measurement with actiCAP slim electrodes and LiveAmp

The next step when preparing your experiment is to set up your EEG recording workspace in BrainVision Recorder. You can read how to do this step by step in the dedicated user manual, but keep in mind the following best practices to optimize your setup and data quality:

  • Select the highest sampling rate available for your setup to ensure the best temporal precision and synchronization with the eye tracker. For example, with 32 EEG channels and all 3D accelerometer directions enabled, you may go up to 500 Hz.

  • Save the data both on the recording computer and locally on the amplifier’s internal 32 GB memory card to avoid any data loss due to possible interference or Bluetooth® dropouts (see the quick offline sanity check sketched after this list).
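Once a recording is finished, it takes only a few lines of code to confirm that these settings took effect and that the sync triggers from the glasses arrived in the EEG. Here is a minimal Python sketch using MNE-Python (the file name is hypothetical, and the marker descriptions depend on your trigger setup):

```python
import mne

# Minimal post-recording sanity check (file name hypothetical): confirm the
# sampling rate and the presence of the sync triggers in the saved
# BrainVision files.
raw = mne.io.read_raw_brainvision("photo_gallery_sub01.vhdr", preload=False)

print(f"Sampling rate: {raw.info['sfreq']} Hz")   # expected: 500 Hz in our setup
print(f"Channels: {len(raw.ch_names)}")
print(f"Duration: {raw.times[-1]:.1f} s")

# The Glasses 3 sync pulses arrive as stimulus markers; the exact description
# string ("Stimulus/...") depends on how the trigger input is configured.
sync_markers = [a for a in raw.annotations if a["description"].startswith("Stimulus")]
print(f"Sync triggers found: {len(sync_markers)}")
```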

Once you have created a new workspace, you are now ready to prepare the cap. For a complete tutorial make sure to watch this video!

3. Prepare your wearable eye tracking measurement with Glasses 3

Once the EEG equipment and workspace are prepared, you can proceed with the eye tracking equipment. Setting up the Glasses 3 is very easy and quick via the dedicated controller application, available for computers running Windows® 10 or later, as well as for some tablets and smartphones running Android 9 or later.

Prepare the glasses so that they properly fit your participant (e.g., select the right nose pad) and let your participant wear them. Make sure the Head Unit is connected to the Recording Unit and then switch the device on by gently pressing the start button of the Recording Unit for a few seconds. Connect the glasses to the computer, tablet, or smartphone via WLAN or Ethernet. There are two versions of the Glasses 3: wireless (both WLAN and Ethernet connections available) and wired (Ethernet connection only). For this application, we used the wireless version in order to achieve a completely mobile setup.

As soon as the connection from the glasses to your computer, tablet or smartphone is established, start the controller application; here, you will be able to see the live view. The glasses need to be connected to the controller application to start the recording, to view online what the participant is looking at, and to eventually stop the recording. However, the Recording Unit saves data locally on an internal memory card even when the WLAN connection with the controller application is lost.

When ready, start by calibrating the glasses, making sure the environmental conditions are right. Once this has been achieved, you are ready to start your experiment: connect the sync port of the Glasses 3 to the trigger input port of the LiveAmp via the dedicated cable, start the EEG recording and then the eye tracking. Monitor the EEG data in BrainVision Recorder to make sure that the start sequence of the glasses’ sync-out signal, as well as the subsequent triggers, is correctly received throughout the whole recording.

4. Analyze wearable eye tracking data in Tobii Pro Lab

The very first step in your analysis pipeline is to analyze your wearable eye tracking data with Tobii Pro Lab. Open the software, create a new Glasses project and import your data; to do so, remove the memory card from the Recording Unit and connect it to the analysis computer. Start with a quality check of your data. For example, you could verify that all the markers related to the sync-out signal of the Recording Unit are available. Then, follow these three steps to complete your analyses:

4.1. Run an Assisted or Manual Mapping

This is an important step for the definition of eye movements on real-world objects. The gaze data generated by wearable eye trackers, like the Glasses 3, is by default referenced to the coordinate system of the device. However, to extract meaningful information for your study, it is important that the data is mapped relative to a coordinate system with its origin fixed in the environment around the participant wearing the glasses. Tobii Pro Lab can change the coordinate system to which the gaze data is referenced via the Mapping function within the Analyse tab. This process can be run automatically by the software on a selected time period of interest (i.e., Assisted Mapping) or manually by the user scrolling through each recording time point (i.e., Manual Mapping). The procedure maps the gaze data onto still images (i.e., snapshots or screenshots) of the objects of interest. Once the mapping process is completed, you will be able to analyse your gaze data referenced to the objects of interest in the real world through visualization tools (e.g., heatmaps and gaze plots), as well as to draw Areas of Interest.
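As an aside, one common computer-vision way to perform this kind of re-referencing (not necessarily how Tobii Pro Lab implements Assisted Mapping) is to estimate a homography between a scene-camera frame and the snapshot and project the gaze point through it. The following purely illustrative Python/OpenCV sketch uses hypothetical file names and a hypothetical gaze coordinate:

```python
import cv2
import numpy as np

# Illustrative sketch only: map a gaze point from scene-camera coordinates of
# one video frame onto a still snapshot of the wall, via feature matching and
# a homography. File names and the gaze coordinate are hypothetical.
frame = cv2.imread("scene_frame.png", cv2.IMREAD_GRAYSCALE)
snapshot = cv2.imread("wall_snapshot.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(2000)
kp_f, des_f = orb.detectAndCompute(frame, None)
kp_s, des_s = orb.detectAndCompute(snapshot, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_f, des_s), key=lambda m: m.distance)[:200]

src = np.float32([kp_f[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_s[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

gaze_px = np.float32([[[612.0, 344.0]]])                 # gaze in scene-camera pixels
gaze_on_snapshot = cv2.perspectiveTransform(gaze_px, H)  # gaze in snapshot pixels
print("Gaze in snapshot coordinates:", gaze_on_snapshot.ravel())
```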

4.2. Define Areas of Interest (i.e., AOIs)

Draw AOIs on the snapshots/screenshots of each object of interest to extract information about the eye movements that landed on them. For our analyses, we are most interested in the fixations that occurred on each object/face and their respective start times.
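Conceptually, an “AOI hit” boils down to a point-in-region test on the mapped fixation coordinates. Tobii Pro Lab computes this for you; the minimal sketch below only illustrates the idea (all coordinates, AOI bounds and fixation times are hypothetical):

```python
# Conceptual sketch of an AOI hit test: a fixation "hits" an AOI when its
# mapped centroid falls inside the AOI bounds. All values are hypothetical.
aois = {
    "face_01":   (100, 200, 400, 500),   # (x_min, y_min, x_max, y_max) in snapshot pixels
    "object_03": (600, 150, 900, 450),
}

fixations = [
    {"start_s": 12.34, "x": 250, "y": 320},   # mapped fixation centroid + onset time
    {"start_s": 15.02, "x": 710, "y": 300},
]

for fix in fixations:
    for name, (x0, y0, x1, y1) in aois.items():
        if x0 <= fix["x"] <= x1 and y0 <= fix["y"] <= y1:
            print(f"AOI hit on '{name}' starting at {fix['start_s']:.2f} s")
```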

4.3. Export your gaze data and related events

Once all the required analyses are completed, go to the Data Export tab, check that all the relevant data for your combined analysis is selected, and choose the Pro Lab Output File format (i.e., .plof) for the export. For our goal, it is important to export the start times of each fixation within the AOIs (i.e., the “AOI Hit” events).

5. Merge and analyse combined eye tracking and EEG data with BrainVision Analyzer 2

Before importing the eye tracking data and relevant events into BrainVision Analyzer 2, it is good practice to perform a quality check of the EEG data. For example, you should verify that there are no losses or pauses within the data, and you should also check that the triggers shared with the eye tracker are all available and complete. This is important because the two recordings have different lengths and sampling rates; BrainVision Analyzer 2 will use the shared triggers to realign them on a common time axis.
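To give an intuition of what this realignment involves, here is a minimal Python sketch (all marker times are hypothetical) that fits a linear mapping between the shared sync marker times of the two recordings and uses it to place an eye-tracking event on the EEG time axis. Analyzer performs this step for you during the import:

```python
import numpy as np

# Minimal sketch of time-axis realignment via the shared sync triggers
# (Analyzer's Add Channels import handles this internally; all values are
# hypothetical). The same physical pulses appear in both recordings, so a
# linear fit captures the clock offset and any drift between the devices.
eeg_sync_s = np.array([3.200, 13.201, 23.202, 33.202, 43.203])  # trigger times in the EEG [s]
et_sync_s  = np.array([0.000, 10.000, 20.000, 30.000, 40.000])  # sync markers in the eye tracking [s]

slope, offset = np.polyfit(et_sync_s, eeg_sync_s, 1)

def et_to_eeg_time(t_et):
    """Map an eye-tracking timestamp [s] onto the EEG time axis [s]."""
    return slope * t_et + offset

# Example: a fixation ("AOI Hit") starting at 15.3 s in the eye-tracking data
print(f"Fixation onset on the EEG time axis: {et_to_eeg_time(15.3):.3f} s")
```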

Select the Add Channels solution from Transformations > Data Preprocessing and identify your .plof file under Import files. In the second dialog, select the shared synchronization markers in the EEG (i.e., Markers in Active Node section) as well as in the eye tracking file (i.e., Markers in Import File section) to allow the software to realign the two signals. If you click on the Details button, you will see if there is an equal number of synchronization markers in both data sets. Move to the following dialog and select the eye tracking data (i.e., gaze directions, pupil sizes, etc.) and related events you would like to import. For our purpose, we would import all the AOI Hit events. Once the import and merging process is complete, the selected eye tracking data will appear as additional signal lines below the EEG channels, whilst the selected events will appear as markers available across all channels.


Figure 5: Example of data display in BrainVision Analyzer 2 after applying the Add Channels transformation. Additional data channels from the eye tracker were added below the EEG channels. Markers for Areas of Interest (AOI) were imported as well and they are visible as global interval markers across all channels.

You can now proceed with your analyses, for example computing Fixation-Related Potentials. You can epoch your EEG data based on the imported eye tracking events by selecting the Segmentation tool in Transformations > Segment Analysis Functions. In the first dialog of the transformation, select the option “Create new Segments based on a marker position” and in the second dialog select your AOI Hit marker. BrainVision Analyzer 2 will generate as many epochs as there were fixations on the AOI. From this point, be creative with as many group and statistical analyses as you need!
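If you prefer to script parts of the analysis outside Analyzer, the same epoching step can be sketched in Python with MNE-Python (the file name and marker descriptions are hypothetical, and we assume the merged data set was exported from Analyzer in BrainVision format):

```python
import mne

# Minimal FRP-epoching sketch (file name and marker description hypothetical):
# segment the EEG around each imported fixation-onset ("AOI Hit") marker and
# average the epochs to obtain a Fixation-Related Potential.
raw = mne.io.read_raw_brainvision("merged_eeg_eyetracking.vhdr", preload=True)

# Keep only the markers corresponding to fixations on the Areas of Interest
events, event_id = mne.events_from_annotations(raw)
aoi_ids = {name: code for name, code in event_id.items() if "AOI" in name}

# Epoch from 200 ms before to 800 ms after fixation onset, baseline-corrected
epochs = mne.Epochs(raw, events, event_id=aoi_ids,
                    tmin=-0.2, tmax=0.8, baseline=(-0.2, 0.0), preload=True)

frp = epochs.average()   # the Fixation-Related Potential
frp.plot()
```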

Conclusion

Your experiments will surely involve different scenarios and answer different research questions, but we hope this article has given you some food for thought as well as valuable considerations for your own setup. If you want to know more, you can watch the dedicated webinars on our webinar channel. Additionally, if mobile EEG and real-world investigations interest you, stay tuned for upcoming events that will cover these topics and many more!

Want to know more? … Get in touch!