Non-invasive brain-machine interfaces to powered exoskeletons for restoration of walking
by Jose L ‘Pepe’ Contreras-Vidal and Atilla Kilicarslan
University of Houston (USA), Department of Electrical and Computer Engineering
This article reviews research at the University of Houston on the design of non-invasive and reliable brain-machine interface (BMI) systems for the control of powered exoskeletons for restoration and rehabilitation of gait in persons with paraplegia and other forms of paralysis. The BMI system, based on risk-free scalp electroencephalography (EEG), actively includes the patient in the control loop, thereby making the assistive device ‘active’ and engaging while stimulating cortical plasticity. This represents a paradigm shift in how users perceive and control restoration of mobility using powered exoskeletons.
The working principles of wheelchairs for people with lower-limb paralysis have remained the same for centuries. In the last decade, researchers around the world have focused their efforts on building human-machine systems (HMS) that provide mobility closer to the natural walking patterns of able-bodied people. HMS can be categorized in terms of their form factor, the type of mobility they provide, their need for external support mechanisms while walking, and the extent to which the user is involved in the control of the robot – that is, the level of human-machine interaction (HMI) required to use the exoskeleton. Among commercial robotic exoskeleton systems, the REX exoskeleton (REX Bionics, New Zealand) can walk without external support (such as a walker or canes) in a balanced set of configurations. The user can initiate walking (or turning, sitting, standing and stepping up/down) with a joystick attached to the handlebar of the system.
In these systems the physical human-robot interface is critical, as the user’s body, including hips, knees and ankles, must be aligned and properly secured to the exoskeleton to prevent physical injuries. At the University of Houston, our research focuses on providing more natural and intuitive ways of supplying command signals to such systems – that is, the brain-machine interface. With the exception of the Hybrid Assistive Limb (HAL, Cyberdyne, Japan) exoskeleton, which combines autonomous and myoelectric (EMG) control, most exoskeletons are currently controlled by an external operator or by detecting the user’s intent from upper-body gestures, hand-controlled joysticks, or motion sensors embedded in walkers or instrumented canes. Although other control signals are usable, such as eye movements, lip/tongue motions, and voice commands, it can be argued that the most natural way would be to harness one’s own brain activity and use it to drive such systems. In this way, the user would just think about walking or sitting, and the system would interpret the changes in his/her brain activity – that is, his/her motor intentions – as commands to the exoskeleton. And the system would do this in real-time while leaving the user’s hands, voice and eyes free for communication and tool use.
Our Laboratory for Non-invasive Brain-Machine Interface Systems at the University of Houston is one of the leading laboratories in this field of research. We use a non-invasive active EEG system (BrainAmp DC, Brain Products, Gilching, Germany) to acquire brain waves that are processed in real-time to extract the user’s intent to control exoskeletons such as the REX robot. An active, wireless 64-channel electrode cap (actiCAP and MOVE systems, Brain Products, Gilching, Germany) is placed on the user’s head, and his/her brain waves are measured in real-time as he/she performs a series of motions with the exoskeleton. We then use advanced algorithms to map the slow modulation of the user’s brainwaves (amplitude modulation of the slow cortical potentials) to each motion performed with the exoskeleton. During an initial calibration phase, we record as few as 5–10 minutes of EEG to build the decoder model that extracts motor intent, specific to the tasks performed with the robot. We then ask the user to control the exoskeleton using only his/her thoughts (thereby putting the user in the loop): we ask the user to think about walking (kinesthetically) while we evaluate the model with his/her real-time EEG data. In one instantiation of our approach, the model output is a series of discrete commands to the exoskeleton, obtained via a decoder model based on machine learning algorithms.
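To make the calibrate-then-decode idea concrete, the sketch below shows, in Python, how slow-cortical-potential amplitude features could feed a classifier that emits discrete commands. This is an illustration only, not the laboratory’s actual decoder: the sampling rate, the 0.1–2 Hz band, the 1-second windowing, the rectified-amplitude feature, and the choice of linear discriminant analysis (scikit-learn) are all assumptions, and the “EEG” here is synthetic.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 100  # Hz -- hypothetical (down)sampled EEG rate

def scp_features(eeg, fs=FS):
    """Band-pass to the slow cortical potential range (~0.1-2 Hz),
    then return the mean rectified amplitude per channel for each
    1-second window."""
    b, a = butter(2, [0.1, 2.0], btype="band", fs=fs)
    slow = filtfilt(b, a, eeg, axis=0)
    n_win = slow.shape[0] // fs
    return np.abs(slow[: n_win * fs]).reshape(n_win, fs, -1).mean(axis=1)

# Synthetic calibration data standing in for a short recording:
# two tasks (e.g. 'walk' vs. 'stand'), 64 channels, 60 s each.
rng = np.random.default_rng(0)
t = np.arange(60 * FS) / FS
walk = rng.normal(size=(60 * FS, 64)) + 2 * np.sin(2 * np.pi * 0.5 * t)[:, None]
stand = rng.normal(size=(60 * FS, 64))

X = np.vstack([scp_features(walk), scp_features(stand)])  # (120, 64) features
y = np.array([1] * 60 + [0] * 60)  # 1 = walk intent, 0 = stand

decoder = LinearDiscriminantAnalysis().fit(X, y)    # calibration phase
command = decoder.predict(scp_features(stand)[:1])  # one 'real-time' window
print(decoder.score(X, y), command)
```

In a real closed-loop session, `predict` would run on each incoming window of live EEG and its output would be translated into an exoskeleton command (walk, stop, sit, and so on).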
Our ongoing studies show that it is possible to efficiently decode a patient’s brainwaves in real-time to extract his/her motor intent. Our group was the first to demonstrate EEG-based closed-loop BMI control of the REX exoskeleton in patients with spinal cord injury (SCI). This research is in progress, and we are moving forward with innovative methods to increase the long-term reliability and accuracy of the system in settings outside the laboratory. The personal impact that ‘NeuroRex’, as we call the BMI-controlled REX system, can have on its users is best reflected in the following quote from one of our SCI volunteers, who uses the system frequently:
“It was great at the beginning when I used the exoskeleton, standing up again and walking with just a push of joystick. But it is after I started using the brain-control interface that I felt the exoskeleton is not carrying me around, but I am controlling it!”
We are currently developing and extending our real-time algorithms to address some of the disadvantages of using non-invasive methods for such tasks. As discussed in the literature, EEG signals are susceptible to physiological and non-physiological distortions (artifacts), which in turn can degrade signal quality and decoder accuracy. Our laboratory has already shown that EEG provides enough information to decode a user’s motor intent, surface EMG activity, and even continuous parameters of walking (such as knee angle and velocity). Integrating real-time artifact removal techniques to further increase signal quality will no doubt yield more robust applications and increase the overall rate of success. Nevertheless, our working brain-machine interface (BMI) to REX has already proven very robust, with first-time SCI users achieving brain control within the first session [2-4]. This September, in Baiona (Spain), we collaborated with Brain Products and Rex Bionics to bring a new SCI volunteer to the First International Workshop on Wearable Robots (WeRob 2014) to demonstrate real-time closed-loop control of a powered exoskeleton using an EEG-based neural interface. Our volunteer, Lee Warn (an SCI survivor from New Zealand), happily agreed to be fitted with the EEG skullcap and REX. After a brief period of decoder training, Lee was able to control REX’s movements with his thoughts (see Lee’s short report below). This was a great moment: our BMI system proved robust in a real environment (we performed the demonstration outdoors and in front of many scientists while discussing the system) and easy to calibrate and operate by new subjects.
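A simple way to see what real-time artifact handling involves is threshold-based window flagging: windows whose amplitude swing exceeds what brain activity could plausibly produce are marked for rejection or repair. The sketch below is a crude stand-in for the real-time techniques mentioned above, not our laboratory’s method; the sampling rate, threshold, and synthetic “recording” are all hypothetical.

```python
import numpy as np

def flag_artifacts(eeg, fs, thresh=100.0):
    """Flag 1-second windows whose peak-to-peak amplitude exceeds a
    threshold on any channel -- threshold in the same units as the
    signal (e.g. microvolts)."""
    n_win = eeg.shape[0] // fs
    wins = eeg[: n_win * fs].reshape(n_win, fs, -1)
    ptp = wins.max(axis=1) - wins.min(axis=1)  # per window, per channel
    return (ptp > thresh).any(axis=1)          # True = contaminated window

# Synthetic 10-s, 8-channel recording with ~10 uV background noise
# and one injected blink-like transient (all values hypothetical).
rng = np.random.default_rng(1)
fs = 100
eeg = rng.normal(scale=10.0, size=(10 * fs, 8))
eeg[350:420, 2] += 300.0  # transient spanning windows 3 and 4
mask = flag_artifacts(eeg, fs)
print(np.flatnonzero(mask))  # indices of windows to discard or repair
```

Production systems go well beyond this, using adaptive filtering or source-separation methods that remove the artifact while keeping the underlying neural signal, rather than discarding the whole window.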
This work of societal and personal impact is only possible thanks to the collaborative efforts of a team of engineers at the University of Houston, our clinical partners at the Houston Methodist Hospital Research Institute led by Dr. Robert Grossman, and our dedicated volunteers. We are now conducting longitudinal clinical trials at the Houston Methodist Hospital in collaboration with Dr. Grossman to evaluate the benefits and risks associated with NeuroRex. This three-year study examines not only brain and gait adaptations due to therapy with NeuroRex, but also potential benefits across multiple physiological systems, such as bladder function, bowel movements, cardiovascular health, and skin condition.
The NeuroRex research is partially funded by the TIRR Mission Connect Foundation and The Cullen Foundation. The neural interface development and decoding of lower-limb movements have been supported in part by awards by the National Institute of Neurological Disorders and Stroke (NINDS) and the National Science Foundation (NSF).
For further information: email to Jlcontreras-vidal[at]uh.edu, or visit www.facebook.com/UHBMIST
[1] Contreras-Vidal JL, Presacco A, Agashe H, Paek A. (2012). Restoration of whole body movement: toward a noninvasive brain-machine interface system. IEEE Pulse. 2012 Jan; 3(1): 34-7.
[2] Presacco A, Forrester LW, Contreras-Vidal JL. (2012). Decoding intra-limb and inter-limb kinematics during treadmill walking from scalp electroencephalographic (EEG) signals. IEEE Trans Neural Syst Rehabil Eng. 2012 Mar; 20(2): 212-9.
[3] Kilicarslan A, Prasad S, Grossman RG, Contreras-Vidal JL. (2013). High accuracy decoding of user intentions using EEG to control a lower-body exoskeleton. Conf Proc IEEE Eng Med Biol Soc. 2013; 2013: 5606-9. doi: 10.1109/EMBC.2013.6610821.
[4] Contreras-Vidal JL, Grossman RG. (2013). NeuroRex: a clinical neural interface roadmap for EEG-based brain machine interfaces to a lower body robotic exoskeleton. Conf Proc IEEE Eng Med Biol Soc. 2013; 2013: 1579-82. doi: 10.1109/EMBC.2013.6609816.
How it feels to walk using your thoughts
My name is Lee Warn and I am a paraplegic who has travelled to the other side of the planet from New Zealand to walk using nothing but my thoughts! I never thought this was even possible a few years ago, but here I was in a 9th Century castle, thanks to Rex legs and a crazy skull cap from Brain Products, with more than a little computer assistance from Prof. Jose Contreras-Vidal and Prof. Atilla Kilicarslan.
I had the privilege to be invited along as ambassador for Rex Bionics to the WeRob (Wearable Robotics) workshops in Spain. At this event, Rex Bionics asked if I would like to experience walking in Rex exoskeleton using only thought control. Yes, of course I accepted.
Spain as a country was amazing. I spent the majority of my time in the 9th Century castle Parador de Baiona, where the WeRob event was taking place, which truly is a sight to behold. Here we are, with 21st Century exoskeletons and other robotic gadgets in a castle with such a strong historical past. The work that Rex Bionics, Brain Products and the University of Houston (Texas) have done is remarkable to say the least.
Whilst getting set up in the brain skull cap, I talked a lot and asked questions, mostly because I was so nervous. I kept thinking, “…could I make it work, is my brain strong enough to put out a signal that makes Rex move, what if I can’t make it work?”
Dr. Roland Csuhaj from Brain Products and Prof. Atilla Kilicarslan were both very comforting and helped calm me by even laughing at my bad jokes. Yet, they showed no signs of the same concerns I had.
After getting all gelled up, electrode cap on, they let the computers record enough data to build a model that makes the system work. Basically, I thought of it like this:
The computer controls Rex and, with my hands free, builds a picture (model) of what brain activity I have while walking. From there, I think of the same thing I was thinking while the computer was controlling Rex, reproducing the same pattern for the computer program and moving Rex with my thoughts.
After a very short training (less than half an hour) we were ready to go, so we took the whole system outside (Rex and I were WiFi-linked to the computers, which told Rex what to do) because the weather was so lovely. Everyone was hesitant as to how I and the complicated system would respond to the new stimuli of being outside and the crowd, and I was told they used about half the normal model …