We’ve all seen the videos of paralyzed or locked-in patients moving a robot arm to direct something tasty toward their mouth. While impressive, many folks still wonder just how good these brain-machine interfaces (BMIs) are, and, more importantly for the patients themselves, how good they will become.

Studying the brain activity of two patients with Lou Gehrig’s disease has given researchers insight into how neurons control muscle movement. (Credit: Oliver Burston)

Researchers led by Stanford engineer Krishna Shenoy have recently developed new computational methods to make BMIs significantly more accurate. They liken the process to estimating the trajectory of a cannonball at dusk, where you have only a few hundred noisy pixels to see it with. The pixels here correspond to measurements of the activity of neurons in the arm region of the motor cortex.

Most BMIs today try to estimate the neural state without really modeling the dynamics of the network itself. Typically they use static methods like principal component analysis or factor analysis, which leave you with a lot of noise in single-trial experiments. But a wise cannoneer also knows that the cannonball has its own dynamics, in this case given by standard Newtonian mechanics. By folding the network dynamics into the estimate, the neural trajectory from one state to the next can be tracked and used to generate more accurate movements.
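To make that concrete, here is a minimal sketch of the idea in Python. It compares a static, frame-by-frame readout of noisy measurements against a textbook Kalman filter that also exploits an assumed linear dynamics model. Every number here (the dynamics matrix, noise levels, dimensions) is made up for illustration; the published decoder is considerably more sophisticated.

```python
import numpy as np

# Minimal sketch (not the authors' actual decoder): a latent state follows
# x_t = A x_{t-1} + process noise, and is observed as y_t = C x_t + sensor
# noise. Compare decoding each frame alone vs. using the dynamics model.
rng = np.random.default_rng(0)

T, n_state, n_obs = 100, 2, 50                # timesteps, latent dims, noisy "pixels"
A = np.array([[0.99, 0.1], [-0.1, 0.99]])     # assumed rotational dynamics
C = rng.normal(size=(n_obs, n_state))         # observation (readout) matrix
Q = 0.01 * np.eye(n_state)                    # process-noise covariance
R = 4.0 * np.eye(n_obs)                       # large sensor noise ("dusk")

# Simulate a true trajectory and its noisy observations.
x = np.zeros((T, n_state))
x[0] = [1.0, 0.0]
for t in range(1, T):
    x[t] = A @ x[t - 1] + rng.multivariate_normal(np.zeros(n_state), Q)
y = x @ C.T + rng.normal(scale=np.sqrt(R[0, 0]), size=(T, n_obs))

# Static estimate: least-squares readout of each frame independently.
x_static = y @ np.linalg.pinv(C).T

# Dynamics-aware estimate: a standard Kalman filter.
x_kf = np.zeros_like(x)
x_hat = np.zeros(n_state)
P = np.eye(n_state)
for t in range(T):
    if t > 0:                                 # predict using the dynamics model
        x_hat = A @ x_hat
        P = A @ P @ A.T + Q
    S = C @ P @ C.T + R                       # innovation covariance
    K = P @ C.T @ np.linalg.inv(S)            # Kalman gain
    x_hat = x_hat + K @ (y[t] - C @ x_hat)    # correct with the new observation
    P = (np.eye(n_state) - K @ C) @ P
    x_kf[t] = x_hat

print("static RMSE  :", np.sqrt(np.mean((x_static - x) ** 2)))
print("filtered RMSE:", np.sqrt(np.mean((x_kf - x) ** 2)))
```

Running this, the filtered estimate comes out noticeably closer to the true trajectory than the static readout, which is the whole argument in miniature: knowing how the state evolves buys you accuracy that more electrodes alone can't.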

The researchers recorded activity from Utah-style 100-electrode arrays implanted in monkeys choosing objects on a screen. They then estimated the variance in the neural state by modeling the spike counts as a 20-dimensional dynamical system. The results are shown in the video below. The first session shows the monkey successively moving its arm to a spot, while the second shows the BMI model moving the cursor to the spot.
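What "a 20-dimensional dynamical system" means in practice: a low-dimensional latent state evolves smoothly in time and drives the spike counts observed on the electrodes. The simulation below is a hypothetical illustration of that structure; only the dimensions (20 latent variables, 100 electrodes) come from the setup described above, and everything else is assumed.

```python
import numpy as np

# Hypothetical illustration of "spike counts as a 20-dimensional dynamical
# system": a 20-dim latent state evolves linearly and drives Poisson spike
# counts on 100 electrodes. Model details are assumptions, not the paper's fit.
rng = np.random.default_rng(1)

T, n_latent, n_electrodes = 500, 20, 100
A = 0.97 * np.linalg.qr(rng.normal(size=(n_latent, n_latent)))[0]  # stable dynamics
C = 0.5 * rng.normal(size=(n_electrodes, n_latent))   # latent-to-electrode loadings
d = np.log(5.0) * np.ones(n_electrodes)               # baseline ~5 spikes per bin

x = np.zeros((T, n_latent))
x[0] = rng.normal(size=n_latent)
counts = np.zeros((T, n_electrodes), dtype=int)
for t in range(T):
    if t > 0:
        x[t] = A @ x[t - 1] + 0.1 * rng.normal(size=n_latent)
    rates = np.exp(np.clip(C @ x[t] + d, -10, 10))    # Poisson rate per bin
    counts[t] = rng.poisson(rates)

# Sanity check: most of the count variance lives in ~20 dimensions.
centered = counts - counts.mean(axis=0)
eigvals = np.linalg.eigvalsh(np.cov(centered.T))[::-1]
print("variance in top 20 PCs: %.0f%%" % (100 * eigvals[:20].sum() / eigvals.sum()))
```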

In case you weren’t following the counts too closely, the monkey hit 10 targets in 9.9 seconds with its hand, and hit them in 11.4 seconds using the pure thought-control device. (Credit: Jonathan Kao, Shenoy Lab) That roughly one-character-per-second speed puts us within a factor of three of your average typist, if we assume an average rate of 180 letters per minute (3 letters per second).
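For the arithmetic-inclined, the quoted numbers work out as follows:

```python
# Back-of-the-envelope check on the speeds quoted above.
targets, arm_time, bmi_time = 10, 9.9, 11.4
typist_lpm = 180  # assumed average typing rate from the article

print("arm   : %.2f targets/s" % (targets / arm_time))   # ~1.01
print("BMI   : %.2f targets/s" % (targets / bmi_time))   # ~0.88
print("typist: %.2f letters/s" % (typist_lpm / 60))      # 3.00
```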

Perhaps the larger point here is that the virtual ‘thought arm’ is already nearly as fast as the real kinematic arm in this instance. Think about that for a minute. If a fairly rough-hewn algorithm based on a relatively small sample of neurons is already keeping pace, does anyone doubt that this flavor of mind control will soon far outstrip physical control for many mission-critical tasks? And when that happens, who won’t be toying with the idea of showing up at the clinic and presenting with some form of early-onset ALS symptomatology, even if only to qualify for an affordable healthcare-provided championship jousting exosuit?

Consider the recently tested EyeControl glasses, which are now being made available to ALS patients and those with similar neurological diseases. If you have to surrender your eye movements and blinks to a machine in order to accomplish a task, you invariably give up a significant portion of your remaining basic freedoms. But since the brain is a highly parallel machine, eye movements are probably not going to significantly modulate the activity of neurons in the motor cortex; in fact, there is a fair bit of experimental evidence for that.

What this means is that things like external eye-control glasses may be an interim stop-gap, but what you ultimately want to wait in line for is something more cerebral. Pilot clinical trials of this new BMI technology for human use are already underway. Considering that this advance is really just a procedural refinement on an existing platform, we should expect it in clinics near you.

Source: ExtremeTech, August 3, 2015
http://www.extremetech.com/extreme/211388-brain-controlled-prosthesis-nearly-as-good-as-one-finger-typing
