Researchers at the University of California, San Francisco, have enabled a paralyzed man to reliably control a robotic arm using brain signals transmitted to a computer.
He was able to grasp, move, and release objects simply by imagining himself performing the actions. The device, known as a brain-computer interface (BCI), worked for a record seven months without needing significant adjustment.
Until now, such devices had worked for only a day or two.
This BCI relies on an artificial intelligence (AI) model that adapts to small changes in brain activity as a person repeatedly imagines a movement, gradually improving its accuracy.
“This blending of learning between humans and AI is the next phase for these brain-computer interfaces,” said Professor Karunesh Ganguly, a neurologist at the UCSF Weill Institute for Neurosciences. “It’s what we need to achieve sophisticated, lifelike function.”
The study, funded by the US National Institutes of Health, was published on March 6 in the journal Cell.
One of the study participants, who lost the ability to move and speak after a stroke years ago, can now control the robotic arm by imagining specific movements.
The key advance was understanding how brain activity shifts from day to day as the participant repeatedly imagines making these movements.
Once the system was trained to account for these changes, it maintained its performance for months.
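This report does not spell out the algorithmic details, but the basic idea of a decoder that is nudged along as brain patterns drift can be sketched in a few lines of Python. Everything below is illustrative and synthetic; the channel count, learning rate and simple linear model are assumptions, not the study's actual architecture.

```python
# Illustrative sketch only, with synthetic data: the channel count, learning rate
# and linear model are assumptions, not the study's actual architecture. It shows
# the general idea of a decoder that is nudged toward each new day's data so it
# tracks gradual drift in brain activity instead of being retrained from scratch.
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_outputs = 64, 3                          # recorded channels -> 3-D imagined arm movement
true_map = rng.normal(size=(n_channels, n_outputs))    # the (drifting) relationship in the brain
weights = np.zeros((n_channels, n_outputs))            # decoder weights, refined day by day

for day in range(30):
    # Each day the underlying mapping drifts slightly, as the article describes.
    true_map += 0.01 * rng.normal(size=true_map.shape)

    features = rng.normal(size=(200, n_channels))      # that day's imagined-movement trials
    intended = features @ true_map                      # movements the participant intended

    prediction = features @ weights
    error = np.abs(prediction - intended).mean()
    print(f"day {day:2d}: mean decoding error before update = {error:.3f}")

    # A small gradient step toward that day's data keeps the decoder in sync.
    gradient = features.T @ (prediction - intended) / len(features)
    weights -= 0.2 * gradient
```

Run over the simulated month, the decoding error shrinks and then stays low despite the daily drift, which is the behavior the researchers describe at a high level.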
Professor Ganguly had previously studied patterns of brain activity in animals and observed that these patterns changed as the animals learned new movements.
He suspected the same process was happening in humans, which would explain why earlier BCIs quickly lost their ability to interpret brain signals.
Ganguly and Dr. Nikhilesh Natraj, a neurology researcher, worked with a participant who had been paralyzed by a stroke and could not speak or move.
The participant had small sensors implanted on the surface of his brain that could detect neural activity when he imagined moving.
To investigate whether these brain patterns changed over time, the participant was asked to imagine moving different parts of his body, such as his hands, feet and head.
Although he could not physically move, his brain still generated signals corresponding to these imagined movements.
The BCI recorded these signals and revealed that while the overall activity patterns stayed the same, their precise locations in the brain shifted slightly from day to day.
The researchers then asked the participant to imagine simple finger, hand and thumb movements over two weeks while the system learned to interpret his brain activity. Initially, the robotic arm's movements were imprecise.
To improve accuracy, the participant practiced with a virtual robotic arm that gave him feedback on how closely his imagined movements matched the intended actions.
Eventually, he was able to get the virtual arm to perform the desired tasks. When the participant then began practicing with the real robotic arm, only a few practice sessions were needed to transfer his skills to the real world. He was able to use the robotic arm to pick up blocks, turn them and move them to new locations.
He was even able to open a cabinet, take out a cup and hold it under a water dispenser. Months later, he could still control the robotic arm after a brief 15-minute “tune-up” to adjust for the changes in his brain activity that had accumulated over time.
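The report does not say what that tune-up involves technically, but one plausible reading is a quick recalibration from a short block of cued, imagined movements. The sketch below is a guess along those lines, using synthetic data and an assumed linear readout rather than the study's actual procedure.

```python
# Hypothetical sketch of a brief "tune-up": a short block of cued, imagined movements
# is recorded and a lightweight linear readout is refit by least squares so decoding
# matches the cues again. Synthetic numbers; the article does not describe the actual
# recalibration procedure.
import numpy as np

rng = np.random.default_rng(1)
n_channels = 64

def tune_up(features, cued_targets):
    """Refit the linear readout from neural features to 3-D arm movement
    using a short block of calibration trials."""
    readout, *_ = np.linalg.lstsq(features, cued_targets, rcond=None)
    return readout

# Months later, the brain signals have drifted and the old readout no longer fits.
old_readout = rng.normal(size=(n_channels, 3))
drift = np.eye(n_channels) + 0.1 * rng.normal(size=(n_channels, n_channels))

calib = rng.normal(size=(300, n_channels)) @ drift       # a few minutes of cued trials
cued = (calib @ np.linalg.inv(drift)) @ old_readout      # movements the participant was cued to imagine

new_readout = tune_up(calib, cued)

# Fresh trials later the same day: the refit readout decodes them well again.
test = rng.normal(size=(100, n_channels)) @ drift
test_cued = (test @ np.linalg.inv(drift)) @ old_readout
before = np.abs(test @ old_readout - test_cued).mean()
after = np.abs(test @ new_readout - test_cued).mean()
print(f"decoding error before tune-up: {before:.3f}, after: {after:.3f}")
```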
Ganguly and his team are now working to refine the AI model so the robotic arm moves faster and more smoothly. They also plan to test the system in a home setting. For people with paralysis, the ability to perform simple tasks such as feeding themselves or getting a drink of water could be life-changing.
“I'm very confident that we've learned how to build the system now, and that we can make this work,” Ganguly said.