The partially paralyzed man had two robotic arms hooked up to his brain through a brain-machine interface.
In a remarkable technological development, a partially paralyzed man was able to feed himself with the use of specialized robotic arms. Research carried out at Johns Hopkins Applied Physics Laboratory led to a brain-machine interface that allowed a pair of robotic arms to be hooked up to his brain. The man hadn't been able to use his fingers in 30 years, but with the help of the specialized robotic arms, he was able to use utensils to cut food and bring it to his mouth, according to the study published in Frontiers in Neurorobotics. "Although our results are preliminary, we are excited about giving users with limited capability a true sense of control over increasingly intelligent assistive machines," Dr. Francesco Tenore, a senior project manager in APL's Research and Exploratory Development Department, told The Independent.
The technology employs a combination of brain-computer interfaces and intelligent robotic systems to help the man control the arms and eat his food. The development could help people with paralysis or other neurological disorders. "This research is a great example of this philosophy where we knew we had all the tools to demonstrate this complex bimanual activity of daily living that non-disabled people take for granted," said Tenore. "Many challenges still lie ahead, including improved task execution, in terms of both accuracy and timing, and closed-loop control without the constant need for visual feedback," he added.
In this particular case, a shared control system was used to carry out a set of task-specific motions. While the robot is trained to carry out a specific function such as "cut the food," it must be able to estimate when the brain-machine interface user is satisfied with their input and is done with the active step so that the robot can move on to the next step. At the start of each task, the robot waits for the user to provide initial input (gestures), then waits for the user to stop providing input, before carrying out the task.
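The hand-off described above can be sketched as a small state machine. This is an illustrative reconstruction, not the study's actual implementation: the subtask names, the gesture values, and the idea of modeling "no input this tick" as `None` are all assumptions made for the example.

```python
from enum import Enum, auto

class Phase(Enum):
    WAIT_FOR_INPUT = auto()   # robot idles until the user's first gesture
    COLLECT_INPUT = auto()    # robot accumulates gestures while the user acts

class SharedControlTask:
    """Illustrative shared-control loop: for each trained subtask
    (e.g. "cut the food"), the robot waits for the user's initial
    gesture, keeps collecting input until the user stops, and only
    then executes the subtask autonomously."""

    def __init__(self, subtasks):
        self.subtasks = list(subtasks)

    def run(self, input_stream):
        # input_stream yields decoded user gestures; None means
        # "no input this tick" (hypothetical encoding for the sketch).
        stream = iter(input_stream)
        completed = []
        for subtask in self.subtasks:
            phase = Phase.WAIT_FOR_INPUT
            gestures = []
            for gesture in stream:
                if phase is Phase.WAIT_FOR_INPUT:
                    if gesture is not None:
                        phase = Phase.COLLECT_INPUT
                        gestures.append(gesture)
                elif gesture is None:
                    # User stopped providing input: the robot treats
                    # this subtask as confirmed and moves on.
                    break
                else:
                    gestures.append(gesture)
            # Here the real system would execute the trained motion.
            completed.append((subtask, gestures))
        return completed

# Hypothetical session: two subtasks, with pauses separating the
# user's gesture bursts.
task = SharedControlTask(["cut_food", "bring_to_mouth"])
log = task.run([None, "move_right", "move_right", None, "pinch", None])
```

The key design point the sketch captures is that the robot never acts while the user is still providing input; the absence of input is itself the signal to proceed.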
The study says there is still considerable progress to be made before a person can control the arms to the degree they desire. In this particular experiment, a participant with microelectrode arrays implanted in sensorimotor brain regions provided the commands to the robotic arms. "Using neurally-driven shared control, the participant successfully and simultaneously controlled movements of both robotic limbs to cut and eat food in a complex bimanual self-feeding task," the study noted, adding that the development has "major implications for restoring complex movement behaviors for those living with sensorimotor deficits." With continued progress in the field, the study says, people with sensorimotor impairments, such as a spinal cord injury, could navigate daily tasks such as self-feeding. Brain-machine interfaces have the potential to increase the independence of such individuals by providing control signals to prosthetic limbs. The technology could be used to restore function by decoding neural signals for a variety of applications, including handwriting, restoring speech, perceiving artificial stimulation, controlling external robotic limbs, and more.