A new research paper integrates robotics with neuroscience, decoding information from the brain to control an exoskeleton and then sending sensory feedback to the brain in real time.
The paper was authored by Jun Morimoto, head of the Department of Brain Robot Interface at ATR Computational Neuroscience Laboratories, and Mitsuo Kawato, director of the ATR Brain Information Communication Research Laboratory Group.
The researchers integrated computational neuroscience, brain-motivated robotics and brain-machine interface (BMI) to devise real-time brain-to-exoskeleton information transmission.
“A human volunteer wears a whole-body exoskeleton robot. Her/his brain activity is measured and decoding is carried out in real time. The decoded brain information is used to influence the robot control algorithms to realise brain-to-robot information transmission,” the researchers explained in their paper <i>Creating the brain and interacting with the brain: an integrated approach to understanding the brain</i>.
“Because the robot is attached to a human body, robot motion generates multi-modal sensory feedback to the brain by implementing robot-to-brain information transmission.”
The benefits of this kind of technology lie mostly in healthcare, where patients with disorders or injuries could use it to retrain specific physical functions.
“In recent years, it has been found that using brain activity to control a robotic assistive system is also useful to help stroke patients recover their motor functions.
“These exoskeleton robots can be used as assistive or prosthetic devices for stroke and spinal-cord injury patients in rehabilitation programmes.
“As many countries are facing the problem of ageing populations, the development of an exoskeleton robot to assist user movements is becoming an important research topic,” the researchers said.
The researchers developed a ‘decoded neurofeedback’ (DecNef) method that decodes information from functional magnetic resonance imaging (fMRI) brain activity and provides real-time neurofeedback of the decoded information, enabling unconscious reinforcement learning by the volunteers.
“First, a decoder is constructed that classifies fMRI multi-voxel [like a pixel] patterns according to such specific brain information as visual attributes, emotional states or normal and pathological states in psychiatric disorders.
“In the induction stage, volunteers unconsciously control their brain activity to produce a desirable pattern while being guided by a real-time reward signal computed as decoder output.”
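The two stages quoted above — constructing a decoder from multi-voxel activity patterns, then feeding its output back as a reward signal — can be illustrated with a toy sketch. This is not the authors’ implementation: the simulated data, the logistic-regression decoder, and every parameter below are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multi-voxel patterns: 200 trials x 50 voxels, two brain states.
# (Hypothetical data standing in for real fMRI multi-voxel activity.)
n_trials, n_voxels = 200, 50
labels = rng.integers(0, 2, n_trials)            # 0 = baseline, 1 = target state
signal = rng.normal(0, 1, n_voxels)              # voxel signature of target state
patterns = rng.normal(0, 1, (n_trials, n_voxels)) + np.outer(labels, signal)

# Stage 1: construct a decoder (logistic regression via gradient descent).
w, b = np.zeros(n_voxels), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(patterns @ w + b)))    # predicted probability of target
    w -= 0.5 * (patterns.T @ (p - labels)) / n_trials
    b -= 0.5 * np.mean(p - labels)

def reward(pattern):
    """Stage 2: real-time reward signal computed as the decoder's output."""
    return float(1 / (1 + np.exp(-(pattern @ w + b))))
```

In the induction stage, a pattern resembling the target state would earn a reward near 1, and a baseline-like pattern a reward near 0, guiding the volunteer without any explicit instruction about what is being decoded.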
The method can also be used as a non-invasive BMI system for treating patients with psychiatric disorders, and might provide a “revolutionary” cure for autism spectrum disorders where pharmacological and cognitive/behavioural therapies have proved ineffective, the researchers said.
The researchers also developed a custom-made exoskeleton robot, driven by electroencephalogram (EEG) signals, to assist the lower-limb movements of human users. They investigated whether a volunteer’s intended movements could be extracted from EEG signals even while an exoskeleton robot was assisting those movements.
A volunteer tested the system by using motor imagery to follow up-and-down movement cues on a display screen in front of him. He was able to control the exoskeleton successfully through his brain activity.
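As a rough illustration of how imagined movement might be read out of EEG, the toy sketch below classifies simulated epochs by mu-band (8–12 Hz) power, which motor imagery is known to suppress. The signal model, sampling rate, and threshold rule are illustrative assumptions, not the decoding pipeline used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n_samp = 128, 256                    # 2-second epochs at 128 Hz (toy values)
t = np.arange(n_samp) / fs

def epoch(imagery):
    """Simulated single-channel EEG: motor imagery suppresses the ~10 Hz mu rhythm."""
    amp = 0.3 if imagery else 1.0        # toy event-related desynchronisation
    return amp * np.sin(2 * np.pi * 10 * t) + 0.2 * rng.normal(size=n_samp)

def mu_power(x):
    """Log power in the 8-12 Hz mu band, computed via FFT."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(n_samp, 1 / fs)
    band = (freqs >= 8) & (freqs <= 12)
    return np.log(spec[band].sum())

# Calibrate a threshold from labelled training epochs.
rest = [mu_power(epoch(False)) for _ in range(40)]
move = [mu_power(epoch(True)) for _ in range(40)]
threshold = (np.mean(rest) + np.mean(move)) / 2

def decode(x):
    """1 = imagined movement (drive the exoskeleton), 0 = rest."""
    return int(mu_power(x) < threshold)
```

In a real system the decoder output would be translated into exoskeleton commands in a closed loop, with the resulting movement providing sensory feedback to the brain, as the paper describes.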
“DecNef can also be regarded as an extended and specific type of BMI. The neurofeedback itself is one component in a methodology frequently used in BMI. However, DecNef is an extended version of neurofeedback and uses sophisticated decoding and reward feedback, which are not necessarily always involved in BMI.
“The most compelling neurofeedback device would be an exoskeleton humanoid robot, as it can provide a rich set of sensory as well as kinaesthetic information of all body parts. While decoding from a user brain, we can manipulate an XoR [exoskeleton robot] to control the spatio-temporal activity patterns of the user's brain to a larger extent than with the currently available narrow neurofeedback channel.”
The researchers also plan to examine uses of exoskeleton robots in neuroscience beyond the treatment of physical disorders and injuries.
“We especially need to investigate how the interaction between the human body and the mind with XoR and its control algorithms can produce emotional information that fuels reinforcement learning.”