This article reviews research at the University of Houston on the design of non-invasive and reliable brain-machine interface (BMI) systems for the control of powered exoskeletons for restoration and rehabilitation of gait in persons with paraplegia and other forms of paralysis. The BMI system, based on risk-free scalp electroencephalography (EEG), actively includes the patient in the control loop, thereby making the assistive device ‘active’ and engaging while stimulating cortical plasticity. This represents a paradigm shift in how users perceive and control restoration of mobility using powered exoskeletons.
The working principles of wheelchairs for people with lower-limb paralysis have remained the same for centuries. In the last decade, researchers around the world have focused their efforts on building human-machine systems (HMS) that provide mobility closer to the natural gait patterns of able-bodied people. HMS can be categorized in terms of their form factor, the type of mobility they provide, their need for external support mechanisms while walking, and the extent to which the user is involved in the control of the robot – that is, the level of human-machine interaction (HMI) required to use the exoskeleton. Among commercial robotic exoskeleton systems, the REX exoskeleton (REX Bionics, New Zealand) is capable of walking without external support (such as a walker or canes) in a balanced set of configurations. The user can initiate walking (or turning, sitting, standing and stepping up/down) with a joystick attached to the handlebar of the system.
In these systems the physical human-robot interface is critical, as the user's body, including the hips, knees and ankles, must be aligned and properly secured to the exoskeleton to prevent physical injury. At the University of Houston, our research focuses on providing more natural and intuitive ways of supplying command signals to such systems – that is, the brain-machine interface. With the exception of the Hybrid Assistive Limb (HAL, Cyberdyne, Japan) exoskeleton, which combines autonomous and myoelectric (EMG) control, most exoskeletons are currently controlled via an external operator or by detecting the user's intent from upper-body gestures, hand-controlled joysticks, or motion sensors embedded in walkers or instrumented canes. Although other control signals are usable, such as eye movements, lip/tongue motions and voice commands, it can be argued that the most natural approach is to harness one's own brain activity and use it to drive such systems. In this way, the user would simply think about walking or sitting, and the system would interpret the changes in his/her brain activity – that is, his/her motor intentions – as commands to the exoskeleton. The system would do this in real-time while leaving the user's hands, voice and eyes free for communication and tool use.
Our Laboratory for Non-invasive Brain-Machine Interface Systems at the University of Houston is one of the leading laboratories in this field of research. We use a non-invasive active EEG system (BrainAmp DC, Brain Products, Gilching, Germany) to acquire brain waves that are processed in real-time to extract the user's intent to control exoskeletons such as the REX robot. An active and wireless 64-channel electrode cap (actiCAP and MOVE systems, Brain Products, Gilching, Germany) is placed on the user's head, and his/her brain waves are measured in real-time as he/she performs a series of motions with the exoskeleton. We then use advanced algorithms to map the slow modulation of the user's brainwaves (amplitude modulation of the slow cortical potentials) to each motion performed with the exoskeleton. During an initial calibration phase, we record as few as 5–10 minutes of EEG to build the decoder model that extracts motor intent specific to the tasks performed with the robot. We then ask the user to control the exoskeleton using only his/her thoughts (thereby putting the user in the loop): the user thinks about walking (kinesthetically) while we evaluate the model with his/her real-time EEG data. In one instantiation of our approach, the model output is a series of discrete commands that drive the exoskeleton, produced by a decoder model based on machine learning algorithms.
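The calibrate-then-decode loop described above can be sketched in code. This is a minimal illustration only: the channel count, window length, command labels ("walk"/"stop"), the moving-average smoothing and the nearest-centroid classifier are all simplifying assumptions, not the actual NeuroRex pipeline, which uses more sophisticated decoding of slow cortical potentials.

```python
import numpy as np

# --- Hypothetical parameters (illustrative, not the actual NeuroRex settings) ---
WIN = 50        # samples per decoding window
CHANNELS = 8    # a real cap has 64 channels; 8 keeps the toy example small

def smooth_slow(x, k=25):
    """Crude moving-average smoothing of each EEG channel, standing in for
    the delta-band (slow cortical potential) filtering used in practice."""
    kernel = np.ones(k) / k
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, x)

def extract_features(eeg):
    """One feature per channel: mean amplitude of the smoothed window."""
    return smooth_slow(eeg).mean(axis=0)

class NearestCentroidDecoder:
    """Toy decoder: calibration stores one mean feature vector (centroid)
    per command; at run time the nearest centroid's command is emitted."""
    def fit(self, windows, labels):
        feats = np.array([extract_features(w) for w in windows])
        labels = np.array(labels)
        self.centroids = {c: feats[labels == c].mean(axis=0) for c in set(labels)}
        return self

    def predict(self, eeg_window):
        f = extract_features(eeg_window)
        return min(self.centroids, key=lambda c: np.linalg.norm(f - self.centroids[c]))

# "Calibration" on synthetic data: pretend "walk" windows carry a positive offset.
rng = np.random.default_rng(0)
walk = [rng.normal(1.0, 0.5, (WIN, CHANNELS)) for _ in range(20)]
stop = [rng.normal(-1.0, 0.5, (WIN, CHANNELS)) for _ in range(20)]
dec = NearestCentroidDecoder().fit(walk + stop, ["walk"] * 20 + ["stop"] * 20)

# "Online" phase: a fresh window is mapped to a discrete exoskeleton command.
print(dec.predict(rng.normal(1.0, 0.5, (WIN, CHANNELS))))
```

In a real system, each newly acquired window would be decoded this way in a continuous loop, with the resulting discrete command forwarded to the exoskeleton controller.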
Our ongoing studies show that it is possible to efficiently decode a patient's brainwaves in real-time to extract his/her motor intent. Our group was the first to demonstrate EEG-based closed-loop BMI control of the REX exoskeleton in patients with spinal cord injury (SCI). This research is in progress, and we are moving forward with new, innovative methods to increase the long-term reliability and accuracy of the system in settings outside the laboratory. The personal impact that 'NeuroRex', as we call the BMI-controlled REX system, can have on its users is best reflected in the following quote by one of our SCI volunteers, who uses the system frequently:
We are currently developing and extending our real-time algorithms to address some of the disadvantages of non-invasive methods for such tasks. As discussed in the literature, EEG signals are susceptible to physiological and non-physiological distortions (artifacts), which can degrade signal quality and decoder accuracy. Our laboratory has already shown that EEG provides enough information to decode a user's motor intent, surface EMG activity, or even the continuous parameters of walking (such as knee angle and velocity). Integrating real-time artifact removal techniques to further increase signal quality will no doubt yield more robust applications and increase the overall rate of success. Even so, our working brain-machine interface (BMI) to REX has proven very robust, with first-time SCI users achieving brain control within the first session [2-4]. This September, in Baiona (Spain), we collaborated with Brain Products and Rex Bionics to bring a new SCI volunteer to the First International Workshop on Wearable Robots (WeRob 2014) to demonstrate real-time closed-loop control of a powered exoskeleton using an EEG-based neural interface. Our volunteer, Lee Warn (an SCI survivor from New Zealand), happily agreed to be fitted with the EEG skullcap and REX. After a brief period of decoder training, Lee was able to control REX's movements with his thoughts (see Lee's short report below). This was a great moment: our BMI system proved robust in real environments (we performed the demonstration outdoors and in front of many scientists while discussing the system) and easy to calibrate and operate for new subjects.
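To illustrate why artifact handling matters, the sketch below flags and discards decoding windows whose amplitude falls outside a plausible EEG range. The 100 µV threshold and the simulated eye-blink are assumptions chosen for illustration; real-time pipelines favor more sophisticated artifact removal (e.g., adaptive filtering) over simply rejecting windows, since rejection leaves the decoder blind during the artifact.

```python
import numpy as np

# Illustrative threshold: scalp EEG rarely exceeds ~100 microvolts, so
# larger excursions typically indicate blinks, muscle activity, or motion.
AMP_LIMIT = 100.0

def reject_artifact_windows(eeg_windows, limit=AMP_LIMIT):
    """Keep only decoding windows whose peak absolute amplitude stays
    within `limit` – a crude stand-in for real-time artifact handling."""
    return [w for w in eeg_windows if np.max(np.abs(w)) <= limit]

rng = np.random.default_rng(1)
clean = rng.normal(0.0, 10.0, (50, 8))        # plausible resting EEG (µV)
blink = clean.copy()
blink[10:15, :2] += 400.0                     # simulated blink on frontal channels

kept = reject_artifact_windows([clean, blink])
print(len(kept))  # only the uncontaminated window survives
```

In a closed-loop setting, a flagged window would simply not update the exoskeleton command, so a blink cannot trigger a spurious motion.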
This work of societal and personal impact is only possible thanks to the collaborative efforts of a team of engineers at the University of Houston, our clinical partners at the Houston Methodist Hospital Research Institute led by Dr. Robert Grossman, and our dedicated volunteers. We are now conducting longitudinal clinical trials at the Houston Methodist Hospital in collaboration with Dr. Grossman to evaluate the benefits and risks associated with NeuroRex. This three-year study examines not only brain and gait adaptations due to therapy with NeuroRex, but also potential benefits across multiple physiological systems, including bladder function, bowel movements, cardiovascular health and skin condition.
The NeuroRex research is partially funded by the TIRR Mission Connect Foundation and The Cullen Foundation. The neural interface development and the decoding of lower-limb movements have been supported in part by awards from the National Institute of Neurological Disorders and Stroke (NINDS) and the National Science Foundation (NSF).
For further information, email Jlcontreras-vidal[at]uh.edu or visit www.facebook.com/UHBMIST