by Dr. Thomas Emmerling
Scientific Consultant (Brain Products)
We are very proud to announce the winners of the first ever MoBI Award 2017! With the MoBI Award, Brain Products wants to recognize excellence in various areas of Mobile Brain/Body Imaging research. This field has been growing rapidly and holds a lot of potential for the future. During SPR 2017 in Vienna, Brain Products was proud to award Rob Zink the first prize for his paper “Mobile EEG on the bike: disentangling attentional and physical contributions to auditory attention tasks.” The first prize of the MoBI Award included a complete LiveAmp system and 3,000 Euro.
When submissions for the MoBI Award 2017 closed on July 31st with 36 submitted publications, we were truly overwhelmed by the number and quality of the submissions. The newest generation of mobile EEG hardware has enabled scientists to investigate questions of cognition and neurophysiology outside the lab. With the great help of our jury members and the jury chair, Prof. Klaus Gramann, the submissions were scored based on their positive impact on society, the innovation and impact of the research, and – of course – their scientific quality.
You can find the 13 highest-ranked submissions (including the three winners) on the MoBI Award website. We would like to thank all the jury members for their time and dedication, the SPR board for the opportunity to hold the award ceremony at this renowned conference and, last but not least, all the scientists who submitted their work for the MoBI Award 2017.
Below, we would like to recognize our three winners. Rob Zink, Suzanne Dikker, and Ivan Volosyak have compiled short summaries of their papers to share in this press release. Congratulations again!
We will soon open submissions for the MoBI Award 2018, which will be presented at the MoBI Conference 2018 in Berlin, Germany (July 11 – 14, 2018) – stay tuned and sign up here to stay up to date!
Brain Products MoBI Award 2017: 1st place
Zink, R., Hunyadi, B., Van Huffel, S., & De Vos, M. (2016). Mobile EEG on the bike: disentangling attentional and physical contributions to auditory attention tasks. J Neural Eng, 13(4), 046017. https://doi.org/10.1088/1741-2560/13/4/046017
Objective: In the past few years there has been growing interest in studying brain function in natural, real-life situations. Mobile EEG makes it possible to study the brain in real, unconstrained environments, but it faces the intrinsic challenge that observed changes in brain activity cannot readily be attributed either to the increased cognitive demands of the complex natural environment or to physical involvement. In this work we aim to disentangle the influence of the cognitive demands and distractions that arise in such outdoor, unconstrained recordings.
Approach: We evaluate the ERP and single-trial characteristics of a three-class auditory oddball paradigm recorded in outdoor scenarios while pedaling on a fixed bike or biking freely around. In addition, we carefully evaluate trial-specific motion artifacts through independent gyroscope measurements and control for muscle artifacts.
Main results: A decrease in P300 amplitude was observed in the free biking condition as compared to the fixed bike conditions. Above-chance P300 single-trial classification was achieved in highly dynamic real-life environments while biking outdoors. Certain significant artifact patterns were identified in the free biking condition, but neither these nor the increase in movement (as derived from continuous gyroscope measurements) can fully explain the differences in classification accuracy and P300 waveform. The increased cognitive load in real-life scenarios is shown to play a major role in the observed differences.
Significance: Our findings suggest that auditory oddball results measured in natural real-life scenarios are influenced mainly by increased cognitive load due to being in an unconstrained environment.
Brain Products MoBI Award 2017: 2nd place
Dikker, S., Wan, L., Davidesco, I., Kaggen, L., Oostrik, M., McClintock, J., Rowland, J., Michalareas, G., Van Bavel, J.J., Ding, M., & Poeppel, D. (2017). Brain-to-Brain Synchrony Tracks Real-World Dynamic Group Interactions in the Classroom. Current Biology, 27(9), 1375-1380.
In their article “Brain-to-Brain Synchrony Tracks Real-World Dynamic Group Interactions in the Classroom” (Current Biology, 2017), Suzanne Dikker and Lu Wan – together with a team of neuroscientists, programmers, and educators from New York University, the University of Florida, Utrecht University, and the Max Planck Institute for Empirical Aesthetics – demonstrate that the synchronization of brainwaves between students during class reflects how much they like the class and each other. They followed a group of 12 high school students and their teacher for an entire semester and recorded their brain activity during their regular biology classes, using portable electroencephalogram (EEG) technology (the EMOTIV EPOC system) combined with custom software that allowed multiple participants to be recorded simultaneously onto a single device.
The researchers compared the EEG readings of the students to each other and then explored which factors might predict the level of synchronized brain activity between students, relating it to their self-reports on classroom engagement (e.g., students’ appreciation ratings of different teaching styles and their day-to-day focus level) and to measures of classroom social dynamics: students were not only asked how much they liked each other and the teacher, but also reported on how much they liked group activities in general. Both classroom engagement and social dynamics have been shown to be critical for learning.

The results showed that the more a student’s brainwaves were in sync with those of the classroom as a whole, the more likely he or she was to give the course a favorable rating. Similarly, the greater the synchrony between an individual student and his or her classmates, the more likely they were to give positive ratings to the instructor’s teaching style. The researchers also examined whether brain-to-brain synchrony reflected how much students like each other. They found that pairs of students who felt closer to each other were more in sync during class, but only if they had interacted with each other face-to-face immediately before class. This suggests that having face-to-face interaction right before sharing an experience matters, even if you are not directly interacting during that experience (such as when watching a video). Finally, students who considered group activities important in their lives exhibited higher synchrony with their classmates.
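To illustrate the general idea of comparing students’ EEG signals to each other, pairwise synchrony can be sketched as a simple correlation between per-student time courses. This is a deliberate simplification for illustration only: the study itself used a more sophisticated spectral interdependence measure, and the toy data below are synthetic.

```python
import numpy as np

def pairwise_synchrony(signals):
    """Pairwise synchrony as Pearson correlation between per-student
    EEG time courses (a simplified stand-in for the paper's spectral
    synchrony metric).

    signals: array of shape (n_students, n_samples)
    returns: symmetric (n_students, n_students) correlation matrix
    """
    return np.corrcoef(signals)

# Toy data: 3 "students", 1000 samples; students 0 and 1 share a
# common underlying signal, student 2 is independent.
rng = np.random.default_rng(0)
common = rng.standard_normal(1000)
signals = np.vstack([
    common + 0.3 * rng.standard_normal(1000),
    common + 0.3 * rng.standard_normal(1000),
    rng.standard_normal(1000),
])
sync = pairwise_synchrony(signals)
```

In this sketch, `sync[0, 1]` comes out high (shared signal) while `sync[0, 2]` stays near zero, mirroring the intuition that students attending to the same class content should show elevated pairwise synchrony.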
Previous studies have typically measured single individuals or one-on-one interactions in highly controlled laboratory settings. By contrast, the Current Biology work uses mobile technology to study dynamic social interactions in a complex group setting outside of the laboratory, shedding light on the role of brain synchrony in a more natural environment and thus providing a potentially promising new avenue to investigate the brain basis of everyday social interaction.
Brain Products MoBI Award 2017: 3rd place
Saboor, A., Rezeika, A., Stawicki, P., Gembler, F., Benda, M., Grunenberg, T., & Volosyak, I. (2017). SSVEP-Based BCI in a Smart Home Scenario. In: Rojas, I., Joya, G., Catala, A. (Eds.), Advances in Computational Intelligence. IWANN 2017. Lecture Notes in Computer Science, vol 10306. Springer, Cham. https://doi.org/10.1007/978-3-319-59147-6_41
Brain-Computer Interfaces (BCIs) form a field of Human-Computer Interaction (HCI) in which brain activity is recorded and analyzed in real time. BCIs provide a novel way of communicating between humans and computers or machines that does not require any muscle activity. In steady-state visually evoked potential (SSVEP)-based BCIs, the brain signals are generated in response to visual stimulation. SSVEP-based BCI systems have proved to be efficient and can be used under different environmental conditions (noisy / calm), in different physical states (resting / talking / walking), and irrespective of gender or age. They have various areas of application, such as controlling wheelchairs or robots, entertainment and games, user authentication, and many more.
Most of the SSVEP-based BCI systems used today are limited in terms of portability: they are bulky because they rely on monitors or LCD panels. A system that can be carried easily was therefore needed to make the BCI truly portable.
This study explored the use of a BCI together with the Internet of Things (IoT) in a smart home scenario. To this end, the concept of a portable BCI system was tested with the semi-transparent smart glasses Epson Moverio BT-200. Visual stimuli flickering at 6 Hz, 7 Hz, 8 Hz, and 9 Hz were displayed in the four corners of the graphical display presented on the BT-200 augmented reality glasses. QR codes were used to identify the desired devices that the user could control with the BCI. In order to simulate a real-life scenario, the participants were asked to walk from one room to another on different floors by using the elevator. During this task, the participants controlled the light switches, the elevator, and a coffee machine by focusing on SSVEP stimuli displayed on the smart glasses. These appliances were fitted with wirelessly connected embedded systems (ESP-32 Thing). The signal processing was performed by a custom-made processing unit using the minimum energy combination method (MEC). For EEG signal acquisition, a 24-bit active channel amplifier (actiCHamp, manufactured by Brain Products, Gilching, Germany) was used. During the EEG signal acquisition (8 EEG channels), an analog notch filter around 50 Hz was used to negate the noise originating from nearby power cables, and a digital band-pass filter was applied between 3 and 60 Hz.
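The filtering steps described above can be sketched in a few lines. This is a minimal digital approximation with assumed parameters: the sampling rate (250 Hz here) and filter orders are not stated in the summary, and the 50 Hz notch in the actual system was analog hardware rather than a digital filter.

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

FS = 250  # assumed sampling rate in Hz (not given in the summary)

def preprocess(eeg, fs=FS):
    """Apply a 50 Hz notch and a 3-60 Hz band-pass to EEG data,
    mirroring the filtering described for the SSVEP recordings.
    Sketch only: the original notch was an analog hardware filter.
    """
    # Digital notch at 50 Hz to suppress power-line interference
    b_notch, a_notch = iirnotch(50.0, Q=30.0, fs=fs)
    eeg = filtfilt(b_notch, a_notch, eeg, axis=-1)
    # 4th-order Butterworth band-pass, 3-60 Hz, covering the
    # 6-9 Hz stimulation frequencies and their low harmonics
    b_bp, a_bp = butter(4, [3.0, 60.0], btype="bandpass", fs=fs)
    return filtfilt(b_bp, a_bp, eeg, axis=-1)
```

Zero-phase filtering (`filtfilt`) is used here so the filter does not shift the latency of the SSVEP response, which matters when the subsequent classifier compares phase or amplitude across channels.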
The BCI performance was assessed by calculating the accuracy, i.e. the number of correct command classifications divided by the total number of classified commands. In total, seven SSVEP commands needed to be selected by each of the seven participants. The majority of participants successfully controlled the system with very few errors, achieving an overall accuracy of 85.70%.
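The accuracy measure itself is simple to state in code. The example counts below are illustrative only; the paper reports the 85.70% figure but not a per-selection breakdown.

```python
def bci_accuracy(correct, total):
    """Accuracy = correct command classifications / total classified
    commands, as defined in the study."""
    if total <= 0:
        raise ValueError("no classified commands")
    return correct / total

# Illustrative: 7 commands per participant x 7 participants = 49
# selections; e.g. 42 correct selections would give roughly the
# reported overall accuracy.
acc = bci_accuracy(42, 49)
```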
The results showed that a variety of people can efficiently use an SSVEP-based BCI with the display interface provided on smart glasses. All participants completed their tasks smoothly. Moreover, none of the subjects reported any discomfort, pain, or dizziness while performing the experiment. These promising results thus indicate that the combination of augmented reality and SSVEP could provide intuitive and reliable smart home control.