1st March 2017

Two-way communication in brain-machine interface achieved for the first time

Optical techniques for imaging and stimulating brain activity could lead to a new generation of more precise, bidirectional neural prostheses.

 

Credit: © Daniel Huber, UNIGE

 

Since the early 1970s, scientists have been developing brain-machine interfaces, chiefly for use in neural prostheses for paralysed patients and amputees. A prosthetic limb controlled directly by brain activity can partially restore lost motor function. This is achieved by decoding neuronal activity recorded with electrodes and translating it into robotic movements. However, such systems have limited precision, due to the absence of sensory feedback from the artificial limb.

Neuroscientists at the University of Geneva (UNIGE), Switzerland, looked at whether it was possible to transmit this missing sensation back to the brain by stimulating neural activity in the cortex. They discovered that not only was it possible to create an artificial sensation of neuroprosthetic movements, but that the underlying learning process occurs very rapidly. These findings, published in the scientific journal Neuron, were obtained by using modern imaging and optical stimulation tools, an alternative to the classical electrode approach.

Motor function is at the heart of all behaviour and allows us to interact with the world. Therefore, replacing a lost limb with a robotic prosthesis is the subject of much research, yet successful outcomes are rare. Why is that? Until now, brain-machine interfaces have been operated by relying largely on visual perception: the robotic arm is controlled by looking at it. The direct flow of information between the brain and machine thus remains unidirectional. However, movement perception is not only based on vision, but mostly on proprioception – the sensation of where the limb is located in space.

“We have therefore asked whether it was possible to establish a bidirectional communication in a brain-machine interface: to simultaneously read out neural activity, translate it into prosthetic movement and reinject sensory feedback of this movement back in the brain,” explains Daniel Huber, professor in the Department of Basic Neurosciences at UNIGE.

In contrast to traditional invasive approaches using electrodes, Huber’s team specialises in optical techniques for imaging and stimulating brain activity. Using a method called two-photon microscopy, they routinely measure the activity of hundreds of neurons with single cell resolution: “We wanted to test whether mice could learn to control a neural prosthesis by relying uniquely on an artificial sensory feedback signal”, explains Mario Prsa, researcher at UNIGE and the first author of the study. “We imaged neural activity in the motor cortex. When the mouse activated a specific neuron, the one chosen for neuroprosthetic control, we simultaneously applied stimulation proportional to this activity to the sensory cortex using blue light.”

Neurons of the sensory cortex were rendered photosensitive to this light, allowing them to be activated by a series of optical flashes and thus integrate the artificial sensory feedback signal. The mouse was rewarded upon every above-threshold activation, and just 20 minutes later, once the association was learned, the rodent was generating the correct neuronal activity with increasing frequency.
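The closed-loop paradigm described above can be summarised as a simple control cycle: read the activity of the chosen control neuron, deliver feedback stimulation proportional to that activity, and trigger a reward whenever activity crosses a threshold. The sketch below is purely illustrative and not the authors' actual pipeline: the function names, threshold and gain values are assumptions, the activity is simulated, and in the real experiment the readout came from two-photon imaging while feedback was delivered optogenetically.

```python
import random

THRESHOLD = 2.0   # arbitrary reward threshold on neuronal activity (assumption)
GAIN = 0.5        # stimulation intensity per unit of activity (assumption)

def read_control_neuron():
    """Stand-in for two-photon imaging of the chosen motor-cortex neuron."""
    return random.uniform(0.0, 3.0)  # simulated activity level

def stimulate_sensory_cortex(intensity):
    """Stand-in for blue-light stimulation of photosensitised sensory neurons."""
    return intensity  # in reality this would drive an optical stimulator

def closed_loop_trial():
    """One cycle: read activity, feed back proportional stimulation,
    and reward the animal if activity exceeds the threshold."""
    activity = read_control_neuron()
    feedback = stimulate_sensory_cortex(GAIN * activity)  # proportional feedback
    rewarded = activity >= THRESHOLD                      # above-threshold reward
    return activity, feedback, rewarded

if __name__ == "__main__":
    random.seed(0)
    for _ in range(5):
        activity, feedback, rewarded = closed_loop_trial()
        print(f"activity={activity:.2f} feedback={feedback:.2f} rewarded={rewarded}")
```

Learning, in this framing, is the animal discovering how to push `activity` above the threshold more often, guided only by the artificial feedback channel.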

This means that the artificial sensation was not only perceived, but successfully integrated as feedback on the prosthetic movement. In this manner, the brain-machine interface functions bidirectionally. The Geneva researchers think this fabricated sensation is assimilated so rapidly because it most likely taps into very basic brain functions: feeling the position of our limbs occurs automatically, without much thought, and probably reflects fundamental neural circuit mechanisms. In the future, this type of bidirectional interface could allow robotic arms to be moved with greater precision, objects to be felt through touch, or the force needed to grasp them to be perceived.

At present, the neuroscientists at UNIGE are examining how to produce more efficient sensory feedback. They can currently provide it for a single movement – but is it also possible to deliver multiple feedback channels in parallel? This research lays the groundwork for developing a new generation of more precise, bidirectional neural prostheses.
