Sci Transl Med. 2013 Nov 6;5(210):210ra154. doi: 10.1126/scitranslmed.3006159.

A brain-machine interface enables bimanual arm movements in monkeys

Peter J Ifft et al. Sci Transl Med. 2013.

Abstract

Brain-machine interfaces (BMIs) are artificial systems that aim to restore sensation and movement to paralyzed patients. So far, BMIs have enabled only one arm to be moved at a time. Control of bimanual arm movements remains a major challenge. We have developed and tested a bimanual BMI that enables rhesus monkeys to control two avatar arms simultaneously. The bimanual BMI was based on the extracellular activity of 374 to 497 neurons recorded from several frontal and parietal cortical areas of both cerebral hemispheres. Cortical activity was transformed into movements of the two arms with a decoding algorithm called a fifth-order unscented Kalman filter (UKF). The UKF was trained either during a manual task performed with two joysticks or by having the monkeys passively observe the movements of avatar arms. Most cortical neurons changed their modulation patterns when both arms were engaged simultaneously. Representing the two arms jointly in a single UKF decoder resulted in improved decoding performance compared with using separate decoders for each arm. As the animals' performance in bimanual BMI control improved over time, we observed widespread plasticity in frontal and parietal cortical areas. Neuronal representation of the avatar and reach targets was enhanced with learning, whereas pairwise correlations between neurons initially increased and then decreased. These results suggest that cortical networks may assimilate the two avatar arms through BMI control. These findings should help in the design of more sophisticated BMIs capable of enabling bimanual motor control in human patients.
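
As a concrete illustration of the joint-decoder idea described above, the sketch below implements a plain linear Kalman filter over a single 4D kinematic state covering both arms. It is a simplified stand-in for the paper's fifth-order unscented Kalman filter, written against synthetic data; the dimensions, noise levels, and variable names are illustrative assumptions, not the authors' implementation.

import numpy as np

# Minimal sketch (not the authors' code): a linear Kalman filter decoding a
# joint 4D kinematic state (left x, left y, right x, right y) from binned
# firing rates. The paper uses a 5th-order unscented Kalman filter; this
# simplified linear version only illustrates the joint two-arm state idea.

rng = np.random.default_rng(0)
n_neurons, n_bins, dim = 100, 2000, 4                   # illustrative sizes

# Synthetic training data: smooth 4D trajectories and noisy linear tuning.
X = np.cumsum(rng.normal(size=(n_bins, dim)), axis=0)   # kinematics
H_true = rng.normal(size=(n_neurons, dim))
Y = X @ H_true.T + rng.normal(scale=5.0, size=(n_bins, n_neurons))

# Fit the state-space model by least squares, as is standard for Kalman decoders.
A = np.linalg.lstsq(X[:-1], X[1:], rcond=None)[0].T     # state transition
W = np.cov((X[1:] - X[:-1] @ A.T).T)                    # process noise
H = np.linalg.lstsq(X, Y, rcond=None)[0].T              # observation model
Q = np.cov((Y - X @ H.T).T)                             # observation noise

# Decode with the usual predict/update recursion. Because the state holds both
# arms, covariation between the two limbs is captured by a single model.
x, P = np.zeros(dim), np.eye(dim)
decoded = []
for y_t in Y:
    x, P = A @ x, A @ P @ A.T + W                       # predict
    S = H @ P @ H.T + Q
    K = P @ H.T @ np.linalg.inv(S)                      # Kalman gain
    x = x + K @ (y_t - H @ x)                           # update
    P = (np.eye(dim) - K @ H) @ P
    decoded.append(x.copy())
decoded = np.array(decoded)

print("decoded vs. true left-arm x correlation:",
      round(np.corrcoef(decoded[:, 0], X[:, 0])[0, 1], 3))

Because the filter's state vector holds both arms at once, any covariation between the limbs enters the model directly, which is one plausible reason a joint 4D decoder can outperform two separate 2D decoders, as reported above.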

Conflict of interest statement

Competing interests: The authors declare that they have no competing interests.

Figures

Fig. 1. Large-scale electrode implants and behavioral tasks
(A) Monkey C (left) and monkey M (right) were chronically implanted with eight and four 96-channel arrays, respectively. (B) The monkey is seated in front of a screen showing two virtual arms and uses either joystick movements or modulations in neural activity to control the avatar arms. (C) 441 sample waveforms from typical monkey C recording sessions, with waveform color indicating the recording site (shown in A). (D) Left to right: the trial sequence began with both hands holding a center target for a random interval; next, two peripheral targets appeared, which had to be reached and held with the respective hands to receive a juice reward. (E, F) Raster plots of spike events from 438 neurons (y-axis) over time (x-axis) for a single unimanual (E) and bimanual (F) trial. Target locations and position traces for the trial are indicated to the right of the raster panel.
Fig. 2. Comparison of bimanual behavioral training with cursor and avatar actuators
(A) Two 2D cursors or (B) two avatar arms were controlled by joystick movements. In both environments, the target for each hand is a white circle. (C, D) Percentage of total trials containing a threshold amount of movement (avatar arm reaching beyond 80% of the distance from center to target) with the left arm (C) or right arm (D), shown in the lower panels. The first ten sessions of avatar and cursor bimanual training, conducted on alternating days, are shown separately by blue and red markers. * denotes p < 0.05, t-test.
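
For illustration, a minimal sketch of the behavioral metric and comparison described above: the fraction of trials in which an arm moved beyond 80% of the center-to-target distance, compared between avatar and cursor sessions with a t-test. All data, session counts, and distributions below are synthetic assumptions.

import numpy as np
from scipy import stats

# Sketch of the behavioral metric in panels C-D: fraction of trials in which
# an arm reached beyond 80% of the center-to-target distance, compared between
# avatar and cursor sessions with a t-test (all data below are synthetic).

def fraction_threshold(reach_extent, threshold=0.8):
    """reach_extent: per-trial max displacement / center-to-target distance."""
    return np.mean(np.asarray(reach_extent) >= threshold)

rng = np.random.default_rng(6)
avatar_sessions = [fraction_threshold(rng.beta(5, 2, 100)) for _ in range(10)]
cursor_sessions = [fraction_threshold(rng.beta(3, 3, 100)) for _ in range(10)]
t, p = stats.ttest_ind(avatar_sessions, cursor_sessions)
print(f"avatar vs. cursor: t = {t:.2f}, p = {p:.4f}")
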
Fig. 3. Modulations of cortical neurons during manually performed unimanual and bimanual movements
(A) Peri-event time histograms (PETHs) of a representative left M1 neuron aligned on target appearance (grey line) for each of 16 left and right target location combinations during bimanual movements. Below the 4×4 grid are corresponding PETHs for the same neuron during unimanual trials in each of the four directions. (B) Same layout as (A) for the population of left M1 neurons. Each row of each color plot represents a single neuron, and the pixel color is the normalized firing rate, or z-score (color scale at bottom). (C–D) Representative neuron (C) and neuronal population (D) in the supplementary motor area (SMA). (E–H) Mean |Δz| for each of the four movement directions during unimanual (red) and bimanual (blue) trials for the left (top) and right (bottom) arms: one M1 neuron (E), the population of M1 neurons (F), one SMA neuron (G), and the population of SMA neurons (H).
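
A minimal sketch of how a PETH and a baseline-referenced z-score can be computed from spike and event times; the bin width, analysis windows, and synthetic spike train below are assumptions, not the parameters used in the paper.

import numpy as np

# Sketch of a peri-event time histogram (PETH) aligned on target appearance,
# with modulation expressed as a z-score against a pre-event baseline, in the
# spirit of Fig. 3. Bin width, windows, and the synthetic spikes are assumptions.

def peth_zscore(spike_times, event_times, window=(-1.0, 1.0), bin_size=0.05,
                baseline=(-1.0, -0.5)):
    n_edges = int(round((window[1] - window[0]) / bin_size)) + 1
    edges = np.linspace(window[0], window[1], n_edges)
    counts = np.zeros((len(event_times), n_edges - 1))
    for i, ev in enumerate(event_times):
        aligned = spike_times[(spike_times >= ev + window[0]) &
                              (spike_times < ev + window[1])] - ev
        counts[i], _ = np.histogram(aligned, bins=edges)
    rate = counts.mean(axis=0) / bin_size                # trial-averaged rate (Hz)
    centers = edges[:-1] + bin_size / 2
    base = rate[(centers >= baseline[0]) & (centers < baseline[1])]
    z = (rate - base.mean()) / (base.std() + 1e-9)       # z-score vs. baseline
    return centers, z

# Synthetic example: mean |z| after the event is a simple per-direction
# modulation measure analogous to the mean |Δz| plotted in panels E-H.
rng = np.random.default_rng(1)
spikes = np.sort(rng.uniform(0, 100, size=2000))         # ~20 Hz unit over 100 s
events = np.arange(5.0, 95.0, 5.0)                       # target-onset times (s)
t, z = peth_zscore(spikes, events)
print("mean |z| after target onset:", round(np.abs(z[t > 0]).mean(), 2))
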
Fig. 4. Directional tuning during bimanual versus unimanual movements
(A) Fraction of neurons in each cortical area that had significant tuning to both arms during unimanual (red) and bimanual (blue) trials, determined by regression. (B) Absolute difference between the preferred direction of the contralateral arm computed from bimanual trials and that computed from unimanual trials, shown separately for each cortical area. (C) Same analysis as (B), but showing the difference in preferred direction for the ipsilateral arm. All data are mean ± standard error. The analysis was compiled from the activity of 492 M1 neurons, 203 SMA neurons, 90 S1 neurons, and 61 PPC neurons.
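
For illustration, a cosine-tuning regression is one standard way to estimate a preferred direction from trial firing rates; the sketch below shows the idea on synthetic data, though it is not necessarily the exact regression model used for Fig. 4.

import numpy as np

# Sketch of a cosine-tuning regression (a standard approach; not necessarily
# the exact regression model used in the paper). Firing rate is regressed on
# the x/y movement components; the preferred direction is the angle of the
# fitted coefficient vector. Significance testing is omitted for brevity.

def preferred_direction(rates, vel_xy):
    """rates: (n_trials,) firing rates; vel_xy: (n_trials, 2) movement vectors."""
    X = np.column_stack([np.ones(len(rates)), vel_xy])  # intercept + x + y
    b = np.linalg.lstsq(X, rates, rcond=None)[0]
    return np.degrees(np.arctan2(b[2], b[1])) % 360     # preferred direction (deg)

# Synthetic example: a unit tuned to 135 degrees.
rng = np.random.default_rng(2)
angles = rng.uniform(0, 2 * np.pi, 300)
vel = np.column_stack([np.cos(angles), np.sin(angles)])
rates = 10 + 5 * np.cos(angles - np.radians(135)) + rng.normal(0, 1, 300)
print("estimated preferred direction (deg):",
      round(preferred_direction(rates, vel), 1))
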
Fig. 5. Neuron dropping curves for joystick control
(A) Neuron dropping curves for unimanual joystick control, (B) bimanual joystick control using two 2D decoding models, (C) inter-hand spacing, and (D) bimanual joystick control using one 4D decoding model. Curves are shown separately for each cortical area, indicated by color. (E–F) Offline predictions using the 2D UKF for unimanual movements (E) and the 4D UKF for bimanual movements (F).
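
A neuron dropping curve measures how offline prediction accuracy changes as units are removed from the decoded population. The sketch below illustrates the procedure on synthetic data with a simple ridge-regression decoder standing in for the UKF; subset sizes, repetitions, and the decoder are assumptions.

import numpy as np

# Sketch of a neuron dropping curve: repeatedly decode with random subsets of
# units of decreasing size and record prediction accuracy (Pearson's r of a
# ridge-regression decoder standing in for the UKF; population sizes, noise
# levels, and the decoder itself are illustrative assumptions).

rng = np.random.default_rng(3)
n_neurons, n_bins = 200, 1000
kin = np.cumsum(rng.normal(size=(n_bins, 4)), axis=0)          # 4D bimanual state
H = rng.normal(size=(n_neurons, 4))
rates = kin @ H.T + rng.normal(scale=8.0, size=(n_bins, n_neurons))

def decode_r(units):
    """Mean correlation between true and ridge-predicted kinematics."""
    Xtr, Xte = rates[:800][:, units], rates[800:][:, units]
    Ytr, Yte = kin[:800], kin[800:]
    W = np.linalg.solve(Xtr.T @ Xtr + 1.0 * np.eye(len(units)), Xtr.T @ Ytr)
    pred = Xte @ W
    return np.mean([np.corrcoef(pred[:, d], Yte[:, d])[0, 1] for d in range(4)])

sizes = [10, 25, 50, 100, 200]
curve = [np.mean([decode_r(rng.choice(n_neurons, s, replace=False))
                  for _ in range(10)]) for s in sizes]
for s, r in zip(sizes, curve):
    print(f"{s:4d} units: mean r = {r:.2f}")
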
Fig. 6. Passive observation and brain control paradigms
(A) A monkey was seated in front of a screen with both arms gently restrained and covered by an opaque material during the passive observation and brain control (BC) without arm movements experiments. (B) Actual left and right arm X-position (black) compared with predicted X-position (red) for passive observation sessions. Pearson's correlation, r, is indicated. (C) Performance of monkey C (left) and monkey M (right), quantified as the fraction of correct trials. Different decoding model parameter settings are shown separately for monkey C (red and blue markers), as are BC without arm movements sessions (black, both monkeys). Sessions with fewer than 10 attempted trials were set to zero due to insufficient data (open circles). (D) Fraction of trials in which the left arm (green circles) and right arm (blue circles) acquired their respective targets during brain control. Linear fits for the learning trends of each paradigm are shown as in (C). (E–F) Fraction of correct k-NN predictions of target location for each arm (blue/green) over the trial period during both passive observation (E) and BC without arm movements (F) in monkey C (left column) and monkey M (right column). (G) Mean k-NN target prediction fraction correct from neuron dropping curves, separated by cortical area for each monkey (same columns as E–F). UKF, unscented Kalman filter.
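
A minimal sketch of k-NN classification of target location from single-trial firing rates, in the spirit of panels E–G; the value of k, the leave-one-out evaluation, and the synthetic data are assumptions rather than the authors' settings.

import numpy as np

# Sketch of k-NN classification of target location from binned firing rates,
# in the spirit of panels E-G (k, the feature window, and the leave-one-out
# evaluation are assumptions, not the authors' exact settings).

def knn_fraction_correct(features, labels, k=5):
    """Leave-one-out k-NN accuracy; features: (n_trials, n_neurons)."""
    correct = 0
    for i in range(len(labels)):
        d = np.linalg.norm(features - features[i], axis=1)
        d[i] = np.inf                                   # exclude the test trial
        nearest = labels[np.argsort(d)[:k]]
        votes = np.bincount(nearest)
        correct += votes.argmax() == labels[i]
    return correct / len(labels)

# Synthetic example: 4 target locations, weakly target-tuned population.
rng = np.random.default_rng(4)
n_trials, n_neurons, n_targets = 200, 100, 4
labels = rng.integers(0, n_targets, n_trials)
target_means = rng.normal(size=(n_targets, n_neurons))
features = target_means[labels] + rng.normal(scale=1.5, size=(n_trials, n_neurons))
print("fraction correct:", knn_fraction_correct(features, labels))
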
Fig. 7. Cortical plasticity during passive observation and brain control without arm movement experiments
(A) UKF prediction performance, r, over time, using passive observation data from the beginning of each session. (B) Mean correlation, r, of neural firing among the recorded neuronal populations throughout the passive observation and brain control without arm movements (BC without arms) epochs of the training sessions. (C) Mean inter- and intra-hemispheric (red) and inter- and intra-area (blue) correlation vs. session. (D) Neuron-by-neuron correlation, indicated by pixel color, on the first (left) and last (right) day of BC without arms training for monkey C. Within each panel, neurons are sorted by cortical area and mean correlation strength. (E) Same as (D), but for monkey M. (A–C) Left column: monkey C; right column: monkey M. UKF, unscented Kalman filter; SMA, supplementary motor area; PPC, posterior parietal cortex.
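
The correlation analyses in this figure rest on pairwise correlations of binned firing rates. The sketch below computes a neuron-by-neuron correlation matrix on synthetic data and averages it within and between cortical areas; the area assignments, bin counts, and common-drive signal are assumptions for illustration only.

import numpy as np

# Sketch of the pairwise-correlation analysis underlying Fig. 7: correlate
# binned firing rates across all neuron pairs, then average within and
# between cortical areas (area assignments and bin counts here are synthetic).

rng = np.random.default_rng(5)
n_neurons, n_bins = 120, 3000
shared = rng.normal(size=(n_bins, 1))                       # common drive
rates = 0.5 * shared + rng.normal(size=(n_bins, n_neurons)) # binned rates
areas = np.array(["M1"] * 60 + ["SMA"] * 40 + ["PPC"] * 20)

C = np.corrcoef(rates.T)                                    # neuron x neuron
np.fill_diagonal(C, np.nan)                                 # ignore self-pairs

for a in ["M1", "SMA", "PPC"]:
    within = C[np.ix_(areas == a, areas == a)]
    between = C[np.ix_(areas == a, areas != a)]
    print(f"{a}: within-area r = {np.nanmean(within):.3f}, "
          f"between-area r = {np.nanmean(between):.3f}")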
