Abstract
Neuroprosthetic limbs reconnect severed neural pathways for control of (and increasingly sensation from) an artificial limb. However, the plastic interaction between robotic and biological components is poorly understood. To gain such insight, we developed a novel noninvasive neuroprosthetic research platform that enables bidirectional electrical communication (action, sensory perception) between a dexterous artificial hand and neuronal cultures living in a multichannel microelectrode array (MEA) chamber. Artificial tactile sensations from robotic fingertips were encoded to mimic slowly adapting (SA) or rapidly adapting (RA) mechanoreceptors. Afferent spike trains were used to stimulate neurons in one region of the neuronal culture. Electrical activity from neurons at another region in the MEA chamber was used as the motor control signal for the artificial hand. Results from artificial neural networks (ANNs) showed that the haptic model used to encode RA or SA fingertip sensations affected biological neural network (BNN) activity patterns, which in turn impacted the behavior of the artificial hand. That is, the finger tapping behavior exhibited by this closed-loop neurorobotic system differed significantly (p<0.01) between the two haptic encoding methods across two different neuronal cultures and over multiple days. These findings suggest that our noninvasive neuroprosthetic research platform can be used to devise high-throughput experiments exploring how neural plasticity is affected by the mutual interactions between perception and action.
I. INTRODUCTION
Amputation of an upper limb is a devastating injury that impacts millions of people worldwide [1]. Severance of afferent neural pathways deprives amputees of the rich multimodal sensations of touch afforded by the broad distribution of mechanoreceptors in the human fingertips [2], adversely impacting motor control of prosthetic limbs [3]. The field of neuroprosthetics has tremendous potential to restore severed sensations of touch to amputees by use of electrodes implanted in peripheral nerves of the residual limb [4]. The robotic capability to sense haptic properties is well-established [2], and prosthetic fingertip sensations can be encoded into frequency-modulated spike trains to convey graded biomimetic sensations of touch to amputees outfitted with neuroprosthetic limbs [5]. Mechanoreceptors in human glabrous skin fall into two broad categories: SA and RA. Each type of mechanoreceptor is responsible for processing and encoding a different aspect of tactile experience, depending on its specific spatiotemporal properties and response functions. Generally speaking, SA mechanoreceptors detect static pressure, texture, and lateral skin stretch, while RA mechanoreceptors sense sliding contact and high frequency vibration [2], among other sensations and modalities that combine the two.
Several prior works fruitfully explored the control and robotic embodiment of biological neural networks (BNNs) in closed-loop architectures with MEAs in vitro. Potter and colleagues demonstrated control of neural network bursting by modulating the stimulation voltage based on the culture-wide firing rate [7]. Building upon this, the same group developed multiple closed-loop MEA architectures for action control. One study developed a stimulation technique for goal-directed motion guidance of an animated display using living cortical neurons [8]. Another demonstrated how the dynamics of BNNs impact control of a robotic arm for an artistic painting display [9], and yet another controlled the motion of a simulated mobile robot [10].
A breakthrough work integrated action and perception. It used a biological interface made from rats' cortical neurons cultured in MEA chambers. This bidirectional mobile robot control architecture mapped robotic sensory input to stimulate and alter neuronal dynamics, which in turn shaped robotic behavior in an obstacle avoidance paradigm [11]. Compartmentalizing the MEA chamber into different sections yielded different system-level dynamics owing to better separation between the input and output signals, which shaped performance in the simulated obstacle avoidance task [12].
In a challenging paradigm in vivo, a study examined improvement in neuroprosthetic hand control in a subject who, following amputation, was implanted with a transverse intrafascicular multichannel electrode interface to provide a proportional sense of grip force via her ulnar nerve, conveying signals that SA mechanoreceptors had provided prior to amputation [3]. The subject successfully demonstrated sensory-motor integration by improving grip force control of the artificial hand. Yet, the subject reported incongruent sensations of vibration (similar to what would naturally be transmitted by RA, not SA, mechanoreceptors) [3], revealing the hurdles that remain in restoring haptic information with neurophenomenological fidelity.
Given this inherent complexity, there remains much to learn in the field of touch sensation restoration. Nevertheless, regulatory, ethical, and financial constraints remain considerable challenges for state-of-the-art experimentation in vivo. For these reasons, only a limited number of patients have used bidirectional neuroprosthetic hands thus far [6], limiting research progress.
To circumvent these bottlenecks, we propose a novel noninvasive neuroprosthetic research platform. It enables bidirectional electrical communication between a dexterous artificial hand and a living BNN cultured in an MEA (Fig. 1). To demonstrate this platform, RA or SA tactile sensations from the robotic fingertip (Fig. 1(e)-(g)) are used to biomimetically stimulate the neurons in the MEA (Fig. 1(a), red), and the recorded neuronal activity (Fig. 1(a), black) is decoded to control the robotic hand (Fig. 1(b)-(d)). This form of BNN embodiment removes many of the regulatory and financial barriers associated with invasive human studies. This platform could catalyze a deeper understanding of the interaction between the biological and artificial components of neuroprosthetic limbs, with high-throughput implementation, to aid in restoring severed sensations of touch to amputees.
Recently, ANNs have been used to pattern match inputs and outputs within bidirectional BNNs [16]. In this paper, we show that ANNs can be used to classify differences in BNN activity due to different pulse train patterns, in this case bioinspired SA or RA encodings of fingertip tactile sensations. Furthermore, we show that different neuronal activity patterns elicit different robotic behaviors in a closed-loop fashion. We describe the functional architecture of the closed-loop neurorobotic system (II) and detail its operational algorithms for efferent motor decoding (III) and afferent sensation encoding (IV). We next present our MEA culturing procedure and experimental setup (V), and provide proof of concept that different encoding methods (SA or RA) change both the robotic hand's behavior and the closed-loop BNN dynamics that govern its behavior (VI). We then provide a short conclusion and outlook (VII).
II. Closed-Loop Neurorobotic System Overview
Two subsystems comprise this closed-loop platform. The first subsystem is the robotic unit (Fig. 2, right panel), which includes a Shadow Hand (Shadow Robot Company, London) fit with a BioTac SP tactile sensor array (SynTouch, CA) and a PC running the Robot Operating System (ROS, Open Robotics, CA) that manages efferent (Fig. 2(d), ROS Node 1) and afferent (Fig. 2(h), ROS Node 2) computations. Node 1 uses efferent signaling from the BNNs (Fig. 2(a)-(c)) to compute the motor control signals for the artificial hand. Node 2 converts the BioTac pressure signals into afferent pulse trains of action potentials in real time (Fig. 2(g),(h)). The pulse trains were passed to a custom-fabricated action potential generator (APG) board (Fig. 2(i)) that triggered neurostimulations according to a neurocomputational model of haptic encoding, discussed in Section IV.
The second subsystem is the neurophysiological unit (Fig. 2, left panel), which includes the head stage of the MEA (MultiChannel Systems, Reutlingen, Germany) that houses the BNN culture, the signal collector unit, the interface board, and their interconnections. The MEA has a 200 μm inter-electrode distance and 30 μm diameter electrodes made from titanium nitride. Data acquisition from the MEA relies on a dedicated high-performance workstation optimized for storage capacity, fast data transfer, and temporal accuracy of the high-density, high-frequency signals (60 channels, 20 kHz) sampled from the neuronal culture. Online data were filtered with lowpass and highpass Butterworth filters with cutoff frequencies set to 3.5 kHz and 100 Hz, respectively, to reliably eliminate unstable baselines in real time and avoid artifacts. An additional data stream was preserved for further offline classification with ANNs.
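As a minimal sketch of an equivalent band-limiting step (assuming SciPy and the 20 kHz sampling rate; only the cutoff frequencies are reported above, so the filter order here is illustrative), the 100 Hz-3.5 kHz passband could be applied as follows:

```python
import numpy as np
from scipy import signal

FS = 20_000  # MEA sampling rate (Hz), per Section II

# Combine the 100 Hz highpass and 3.5 kHz lowpass into one Butterworth bandpass,
# expressed as second-order sections for numerical stability.
sos = signal.butter(4, [100, 3500], btype="bandpass", fs=FS, output="sos")

def filter_channel(raw_trace: np.ndarray) -> np.ndarray:
    """Suppress baseline drift and high-frequency noise on one MEA channel."""
    return signal.sosfilt(sos, raw_trace)

# Example: filter one second of synthetic wideband noise from a single channel.
clean = filter_channel(np.random.randn(FS))
```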
In the present work, a pair of electrodes was selected for their healthy spontaneous activity. One site was assigned the function of recording electrode: it provided the efferent signals to operate the artificial hand. Its spike events (Fig. 2(a)) were routed through ROS Node 1 (Fig. 2(d)) to elicit the robotic finger tapping behavior. The other site (Fig. 2(k)) was assigned the function of stimulation electrode, and it received the haptic feedback encoded from the robotic fingertip sensations. To ensure compatibility with spike signaling in the BNN, the tactile signals received from the BioTac sensor were transformed into afferent trains of action potentials (Fig. 1(e)-(g), Fig. 2(g)-(i)) via the Izhikevich model [14].
The SCB-68A DAQ (National Instruments, TX) was placed at the interface between both subsystems (Fig. 2(c)). Real-time feedback tests were performed to quantify the delay between an instructed stimulation in Simulink (ROS Node 2), its detection by the MEA's proprietary software (MultiChannel Systems), and the return of the signal back to Simulink. Closed-loop latency was consistently between 0.8 and 1.0 ms.
III. Efferent Decoding For Motor Control
An efferent neurorobotic control signal was implemented in ROS Node 1 (Fig. 2(d)), with spike trains from the recording site of the MEA as input to specify the desired joint angle (θd) of the Shadow Hand's index finger metacarpophalangeal (MCP) joint. Briefly, the algorithm performed three functions: (i) thresholding of its biological neural input signal to separate spikes from background noise, (ii) temporal aggregation to recruit MEA neural activity that satisfies criteria over a neurophysiologically meaningful time interval, and (iii) generation of a desired index finger MCP joint angle signal for triggering robotic fingertip tapping motions to generate fingertip forces.
For computational efficiency, we used the extracellular multiunit MEA activity, VMEA (Fig. 2(a)), from the selected recording electrode as an input to the efferent decoding algorithm for motion control of the Shadow Hand. Spikes (S) were detected as:
$$S = \begin{cases} 1, & V_{MEA} \ge V_{thresh} \\ 0, & \text{otherwise} \end{cases} \tag{1}$$
where Vthresh is the action potential spike detection threshold. Subsequently, S is integrated over a window of time, BinSize (50 ms; Fig. 1(b),(c)), and compared to a spatiotemporal aggregation coefficient, Sthresh (3 spikes, to provide a tapping rate within the operational bandwidth of the finger). The algorithm then outputs a 100 ms TTL pulse, MEAout, determined by:
$$MEA_{out} = \begin{cases} 1, & \sum_{BinSize} S \ge S_{thresh} \\ 0, & \text{otherwise} \end{cases} \tag{2}$$
MEAout was then passed through a low-pass filter. MEAout determines the desired joint angle (θd) of the Shadow Hand finger:
$$\theta_d = \theta_{open} + \left(\theta_{contact} - \theta_{open}\right)\,\overline{MEA}_{out} \tag{3}$$
where the overbar denotes the low-pass-filtered MEAout. The desired joint angle θopen corresponds to a fully open hand, where the fingertip does not contact anything, whereas θcontact corresponds to index finger flexion that creates fingertip contact with a surface, producing tactile forces. The measured joint angle, θ, is realized by a PID joint angle controller of the tendon-driven Shadow Hand. The joint angle (θ) is related to the fingertip force (F) by:
$$M(\theta)\ddot{\theta} + C(\theta,\dot{\theta})\dot{\theta} + \nu\dot{\theta} + G(\theta) = \tau - \tau_f - J^{T}(\theta)F \tag{4}$$
where M(θ) is the inertia matrix, C(θ, θ̇) is the matrix representing the Coriolis and centrifugal forces, ν is the viscous friction coefficient, G(θ) is the vector representing the gravitational effect, τ is the actuating joint torque, τf is the joint friction, J(θ) is the Jacobian matrix of the finger kinematics, and F is the contact force at the fingertip [15].
Both the fingertip force (FDC) and the rate of change of the fingertip force (FAC) are used within the Izhikevich neurocomputational model to generate the SA and RA afferent action potential pulse trains that electrically stimulated the neuronal cultures (Fig. 1(e)-(g)).
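A minimal sketch of the efferent decoding steps (1)-(3) is given below. It assumes the 20 kHz sampling rate and uses illustrative values for Vthresh, θopen, and θcontact (the actual threshold was tuned per culture, Section V.C, and the joint angle limits are not reported); the 100 ms TTL pulse width and the low-pass filtering of MEAout are omitted for brevity.

```python
import numpy as np

FS = 20_000          # MEA sampling rate (Hz)
BIN_SIZE = 0.050     # temporal aggregation window (s), BinSize in (2)
S_THRESH = 3         # spikes per bin required to trigger a tap
V_THRESH = 20e-6     # spike detection threshold (V); illustrative value
THETA_OPEN = 0.0     # MCP angle for a fully open hand (rad); illustrative value
THETA_CONTACT = 0.6  # MCP angle that brings the fingertip into contact; illustrative

def decode_bin(v_mea: np.ndarray) -> tuple[int, float]:
    """Apply (1)-(3) to one 50 ms window of recorded MEA voltage.

    Returns (MEAout, desired MCP joint angle).
    """
    above = v_mea > V_THRESH
    # (1): count threshold crossings (rising edges) as spike events S
    spikes = int(np.sum(above[1:] & ~above[:-1]))
    mea_out = int(spikes >= S_THRESH)                     # (2) temporal aggregation
    theta_d = THETA_CONTACT if mea_out else THETA_OPEN    # (3) desired joint angle
    return mea_out, theta_d

# Example: a synthetic 50 ms window containing three suprathreshold events.
window = np.random.normal(0.0, 5e-6, int(FS * BIN_SIZE))
window[[100, 400, 700]] = 50e-6
print(decode_bin(window))
```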
IV. Afferent Encoding of Tactile Sensations
The afferent neurorobotic feedback signals were implemented in ROS Node 2 (Fig. 2(h)), which produced SA or RA encodings of tactile sensations as feedback to the MEA (Fig. 2(g)-(k)).
A. Izhikevich Model for SA and RA Mechanoreceptors
Upon robotic fingertip contact with the environment, the Izhikevich neurocomputational model [14] was employed to convert the tactile fingertip forces (FDC, FAC; (4)) into spike trains of action potentials representative of SA and RA mechanoreceptors [2] (sample data shown in Fig. 1(e)-(g)). The neuron input current, Iinput, was generated for the SA and RA experiments, respectively, as:
$$I_{input}^{SA} = k_{SA}\,F_{DC} \tag{5}$$
$$I_{input}^{RA} = k_{RA}\,F_{AC} \tag{6}$$
where kSA and kRA are constants for tuning the tactile firing patterns. SA and RA pulse trains using input currents (5) and (6), respectively, were generated using the Izhikevich neuron model [14]:
$$\dot{v} = X v^{2} + Y v + Z - u + W\,I_{input} \tag{7}$$
$$\dot{u} = a\,(b v - u) \tag{8}$$
$$\text{if } v \ge 30\ \text{mV:}\quad v \leftarrow c,\quad u \leftarrow u + d,\quad MEA_{in} = 1 \tag{9}$$
where v is the membrane potential, u is the adaptation variable, and X, Y, Z, W are standard parameters of the model. Parameters a, b, c, and d are the decay rate of the adaptation variable, its sensitivity to the membrane potential, the after-spike membrane reset potential, and the after-spike adaptation increment, respectively (Table 1). MEAin is the signal sent to trigger a stimulation of the neurons cultured in the MEA system (Fig. 2(j)-(k)). This model was implemented in real time using Python 3.
Table 1. Izhikevich model parameters.

| a | b | c | d | X | Y | Z | W |
|---|---|---|---|---|---|---|---|
| 0.1 | 0.2 | −0.65 | 8 | 0.04 | 5 | 140 | 1 |
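A minimal real-time-style sketch of the encoding in (5)-(9), using the Table 1 parameters, is shown below. The gains kSA and kRA are illustrative placeholders, the 30 mV spike cutoff follows the standard Izhikevich formulation [14], and forward Euler integration with a 1 ms step stands in for whatever integration scheme the Python 3 implementation actually used.

```python
import numpy as np

# Table 1 parameters.
A, B, C, D = 0.1, 0.2, -0.65, 8
X, Y, Z, W = 0.04, 5, 140, 1
K_SA, K_RA = 1.0, 1.0   # tuning gains from (5)-(6); illustrative placeholders
V_PEAK = 30.0           # spike cutoff of the Izhikevich model (mV)
DT = 1.0                # forward Euler step (ms)

def encode(f_dc: np.ndarray, f_ac: np.ndarray, mode: str) -> np.ndarray:
    """Convert fingertip force (SA) or force rate (RA) into a binary MEAin pulse train."""
    i_input = K_SA * f_dc if mode == "SA" else K_RA * f_ac   # (5) or (6)
    v, u = C, B * C                                          # resting initial conditions
    mea_in = np.zeros_like(i_input)
    for k, i_k in enumerate(i_input):
        v += DT * (X * v**2 + Y * v + Z - u + W * i_k)       # (7)
        u += DT * A * (B * v - u)                            # (8)
        if v >= V_PEAK:                                      # (9): emit a pulse and reset
            mea_in[k] = 1.0
            v, u = C, u + D
    return mea_in

# Example: a 500 ms ramp of static fingertip force encoded with the SA model.
force = np.linspace(0, 10, 500)
pulses = encode(force, np.gradient(force), mode="SA")
```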
B. Action Potential Generator Board
A custom-made APG board was developed to forward the action potential-like electrical stimuli representing the RA and SA spiking patterns to the MEA. The APG board has eight independently customizable stimulation channels, two of which were used in the present study. Stimulation pulses are generated digitally by the onboard Teensy 3.6 microcontroller and output via two quad 16-bit digital-to-analog converters with a high-speed SPI interface. The digital-to-analog output was low-pass filtered and fed to an output amplifier providing a user-selectable gain. The analog signal was passed to the MEA system's interface board to stimulate the electrode chosen to receive afferent information. The stimulation waveform was a positive-first biphasic pulse (2 mV amplitude, 400 ms/phase).
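For reference, the stimulation waveform can be sketched as a simple sample array. The amplitude and phase duration below are the values reported above; the DAC sample rate is an assumption, since the Teensy firmware details are not given.

```python
import numpy as np

def biphasic_pulse(amplitude_v: float, phase_s: float, fs: int) -> np.ndarray:
    """Positive-first biphasic pulse: +amplitude then -amplitude, each held for one phase."""
    n = int(round(phase_s * fs))
    return np.concatenate([np.full(n, amplitude_v), np.full(n, -amplitude_v)])

# Waveform with the parameters reported in the text, at an assumed 20 kHz DAC rate.
pulse = biphasic_pulse(amplitude_v=2e-3, phase_s=0.4, fs=20_000)
```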
V. Experimental Setup and Analysis Strategy
A. Culturing Biological Neural Networks in the MEA
Primary cortical neurons were harvested from postnatal day 0-1 mouse pups. All animal procedures were approved by the Institutional Animal Care and Use Committee and complied with the National Institutes of Health Guidelines for the Care and Use of Laboratory Animals. Pups were euthanized by quick decapitation, and brains were immediately removed and placed in ice-cold dissection medium (1 mM sodium pyruvate, 0.1% glucose, 10 mM HEPES, 1% penicillin/streptomycin in HEPES-buffered saline solution). Cortices were extracted under a dissecting microscope and pooled together. The tissue was digested with 0.25% trypsin in the dissection buffer for 15 min at 37°C, followed by further incubation with 0.04% DNase I (Sigma-Aldrich) for 5 min at room temperature. The digested tissue was triturated 10 times with a fire-polished glass pipette, and cells were pelleted by centrifugation. Cells were split and plated at ~5,000 cells/mm² in two MEA-60 chambers (MultiChannel Systems) (Fig. 1(a)). Culture 1 used neuronal basal medium supplemented with 2% B27 (Invitrogen) and 1% penicillin/streptomycin, while Culture 2 used BrainPhys™ neuronal culture medium (STEMCELL Technologies Inc., Vancouver, BC). The culture medium in both cultures was half-exchanged every three days. Spontaneous structural and functional connectivity was allowed to mature before the bidirectional neurorobotic experiments.
B. Baseline MEA Recording Protocol
All experiments began with verification that system noise was within acceptable bounds using a fixed-resistance test chamber provided by MultiChannel Systems. The test chamber was recorded for 1 min outside the incubator and then placed inside the incubator, allowing 20 min for the temperature to stabilize. Confirmation data were subsequently recorded for 1 min. After confirming that the system noise level was stable and consistent, the chamber containing the BNN was inserted into the headstage (Fig. 2) and allowed to settle for 5 min. The chamber was then recorded for 5 min with no stimulation to obtain a baseline of spontaneous neural activity.
C. Preventing Crosstalk from Stimulation Electrode to Recording Electrode
To study synaptic plasticity of the sensorimotor network in the MEA culture, it was important to ensure that the stimulation (MEAin, Fig. 2(k)) was not strong enough to propagate through the culture medium to the recording electrode (MEAout, Fig. 2(a)) and cause depolarization of the efferent neuronal population directly. Therefore, we chose the activation threshold (Vthresh) for VMEA (1) to be higher than observed crosstalk from the stimulation electrode. We verified that no temporally coincident spiking activity exceeded the background noise level at the stimulation site’s 8 neighboring electrodes. In this way, we ensured that electrical activity from VMEA and MEAout (Fig. 2(a),(b)) was due to the synaptic connections between the recording (Fig. 2(a)) and stimulation (Fig. 2(k)) electrodes in the MEA chamber, not due to direct stimulation from a distance.
D. Closed-Loop Noninvasive Neuroprosthetic Hand Experiments
Next, we ran the closed-loop neurorobotic experiment with the SA input current (5) encoding the fingertip tactile sensations through the Izhikevich model (7)-(9) for 5 min. Upon completion, another 5 min recording was made without stimulation. The experiment was then repeated with the RA encoding of fingertip sensations (6), followed by another 5 min post-stimulation recording session. Microscope images were taken before the baseline recordings and after the experiment. This procedure was repeated for both cultures of neurons. Culture 1 experiments were conducted on the 35th day in vitro (DIV), while the culture still had stable spontaneous activity on a pair of electrodes. Experiments with Culture 2 were repeated over four consecutive days (DIV 20-23), while it exhibited stable spontaneous activity on a pair of electrodes, alternating the order of the RA and SA encoding methods each day.
E. Effect of the BNN Activity on Robotic Behavior
To investigate the coordination between the robotic behavior and the self-organization of evoked neural information, we analyzed the Inter-Tap-Interval (ITI) of the fingertip for both the SA (Fig. 3(b)-(e)) and RA (Fig. 3(f)-(i)) closed-loop experiments. The ITI represents the time interval between finger taps triggered by the neural activity. We extracted the timestamps of each neural event, MEAout, and calculated the time interval between consecutive events. MATLAB was used to perform an unbalanced ANOVA of the data to determine whether the RA and SA encoding methods significantly impacted the ITIs in both cultures.
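The analysis was run in MATLAB; a minimal Python equivalent, shown here with placeholder tap timestamps in place of the recorded MEAout events, could look like this:

```python
import numpy as np
from scipy import stats

def inter_tap_intervals(tap_times: np.ndarray) -> np.ndarray:
    """Inter-Tap-Intervals (s) from the timestamps of MEAout events."""
    return np.diff(np.sort(tap_times))

# Placeholder timestamp arrays standing in for the SA and RA closed-loop runs.
sa_taps = np.cumsum(np.random.exponential(1.2, size=100))
ra_taps = np.cumsum(np.random.exponential(0.8, size=120))

iti_sa = inter_tap_intervals(sa_taps)
iti_ra = inter_tap_intervals(ra_taps)

# One-way ANOVA with unequal group sizes (the "unbalanced ANOVA" above).
f_stat, p_value = stats.f_oneway(iti_sa, iti_ra)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```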
F. Artificial Neural Networks to Classify BNN Activity
We next investigated whether the robotic ITI behaviors produced by the RA and SA encoding methods impacted BNN activity. After conducting the online experiments, the recorded BNN data were classified offline via neural network pattern recognition (nprtool, MATLAB). The objective of the offline analysis was to investigate whether the SA or RA tactile sensation encoding methods produced different evoked patterns of activity in the BNNs measured at the recording site (Fig. 2(a)). Additionally, the offline analysis compared potential ANN methods for recognizing patterns of MEA culture dynamics to apply to future real-time experiments.
Four methods were explored for classifying SA and RA activity, using techniques ranging from computationally inexpensive (conducive to real-time control) to computationally expensive (to yield higher classification accuracy). In the mid-range of computational expense, the time-based signals of neuronal activity were used (VMEA-time, Fig. 4(c),(k)). On the higher side of computational expense, the time-based features were decomposed into their frequency components to explore the potential for high classification accuracy (VMEA-Spectrogram, Fig. 4(b),(j)). On the lower side of computational expense, the TTL spikes triggered by the detected neural activity ((1), S-time, Fig. 4(d),(l)) and their corresponding spectral components (S-Spectrogram, Fig. 4(a),(i)) were also used to train ANNs to classify SA and RA activity. For the frequency-based classification approaches, signal power features extracted from the time domain data were calculated using a 512-point FFT with a 0.08 s frame length and a Hanning window with 90% overlap. For all four signal processing approaches, we classified recorded MEA activity into either SA or RA classes over both 1 s and 2 s time windows.
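A sketch of the spectrogram feature extraction is given below, assuming SciPy and the 20 kHz MEA sampling rate. At 20 kHz a 0.08 s frame spans more than 512 samples, so the exact framing used above is not fully recoverable; for illustration, 512-sample Hann windows with 90% overlap are used here.

```python
import numpy as np
from scipy import signal

FS = 20_000  # MEA sampling rate (Hz)

def spectrogram_features(trace: np.ndarray) -> np.ndarray:
    """Log-power spectrogram features (frequency x time) of one recorded window."""
    nperseg = 512                   # illustrative frame length (samples)
    noverlap = int(0.9 * nperseg)   # 90% frame overlap
    _, _, sxx = signal.spectrogram(trace, fs=FS, window="hann",
                                   nperseg=nperseg, noverlap=noverlap)
    return 10 * np.log10(sxx + 1e-12)

# Example: features for one 1 s classification window of synthetic data.
features = spectrogram_features(np.random.randn(FS))
print(features.shape)
```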
This data preprocessing was used to create and train a unique ANN for each of the four approaches applied over the two different time windows. Their performances were evaluated using cross-entropy and confusion matrices. A two-layer feedforward network with sigmoid hidden neurons and softmax output neurons was used to classify the collected data into the SA and RA classes. The hidden layer contained 100 neurons for all configurations. The networks were trained with scaled conjugate gradient backpropagation.
To train and test the ANNs, the collected data were divided into three subsets: training, validation, and testing. The ANNs were trained using 70% of the data, and the network was adjusted based on the error generated from this dataset. Network generalization was measured using the 15% validation dataset, and training stopped when the generalization error stopped improving. The 15% testing dataset was not used during ANN training, providing a performance measure independent of the training and validation performance measures. Mean classification accuracy rates were generated by running each classifier 10 times with randomization of the training and testing data.
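The classification itself was performed with MATLAB's nprtool; as a rough stand-in, the sketch below uses scikit-learn's MLPClassifier with one hidden layer of 100 logistic (sigmoid) units and validation-based early stopping. Scaled conjugate gradient training is not available in scikit-learn, so the Adam solver substitutes for it, and the feature matrix here is placeholder data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

# Placeholder data: one flattened feature vector per 1 s (or 2 s) window; 0 = SA, 1 = RA.
X = np.random.randn(600, 256)
y = np.random.randint(0, 2, size=600)

accuracies = []
for _ in range(10):  # 10 runs with re-randomized splits, as described above
    # Hold out 15% for testing; ~15% of all data is carved from the remainder
    # as the validation set used for early stopping (70/15/15 overall).
    X_trval, X_test, y_trval, y_test = train_test_split(X, y, test_size=0.15, stratify=y)
    clf = MLPClassifier(hidden_layer_sizes=(100,), activation="logistic",
                        solver="adam", early_stopping=True,
                        validation_fraction=0.15 / 0.85, max_iter=1000)
    clf.fit(X_trval, y_trval)
    accuracies.append(accuracy_score(y_test, clf.predict(X_test)))

print(f"mean test accuracy over 10 runs: {np.mean(accuracies):.3f}")
print(confusion_matrix(y_test, clf.predict(X_test)))
```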
A two factor ANOVA was performed to assess whether the time window segmentation duration or the signal processing method significantly impacted the classification accuracy.
VI. RESULTS
Transfer of information through the noninvasive neuroprosthetic platform began when MEA efferent site activity (VMEA, Fig. 4(c),(k)) rose above the voltage threshold, Vthresh, to trigger a spike, S (1). When S was triggered Sthresh = 3 times within BinSize = 50 ms, MEAout was triggered (2) and the desired joint angle of the finger (θd) increased from θopen to θcontact (3). This caused the finger joint controller to increase the joint angle (θ, Fig. 4(e)) so that the fingertip contacted the environment (Fig. 3(k)), increasing the fingertip force (FDC, Fig. 4(f), (4)) and the force rate of change (FAC, Fig. 4(g)). Illustrative data in Fig. 4(h) show the spike trains (MEAin) produced by the SA and RA encodings of the fingertip forces ((5), (6)), which were used to stimulate the BNN. This process repeated cyclically (Fig. 3(a)) for both the SA (Fig. 3(b)-(e)) and RA (Fig. 3(f)-(i)) experiments, producing repetitive finger tapping behavior with a variable ITI dependent on the afferent tactile encoding.
A. BNN Activity Impacted Robotic Behavior
The ANOVA revealed that the RA and SA stimulation encodings significantly impacted the ITIs in all cases (Fig. 4(o)). In Culture 1, the ITI of the finger using the SA encoding method was significantly longer than with the RA method (p<0.01). In Culture 2, the ITI with the SA encoding method was also significantly longer on each of the four days of experiments, and the mean across all four days likewise showed a statistically significant difference (p<0.05), likely due to different organizations of the evoked neural activity. This indicates that the tactile information fed into the stimulation site altered the BNN behaviors that the robotic hand adopted within the closed-loop system. In other words, the haptic information was embodied by the neuronal cultures, producing functional specialization.
B. Encoding Method of Tactile Sensations Impacted BNN Activity
For the experiments with neuronal Culture 1, the ANN classification accuracy tended to increase as the time window lengthened, with more information to process. The highest accuracy was obtained via S-Spectrogram at 85.73% ± 4.00%, while VMEA-time reached 85.29% ± 2.58%. The accuracy for VMEA-Spectrogram was 82.62% ± 5.24%, while the accuracy for S-time was 69.72% ± 5.92% (Fig. 4(m)). For the Culture 2 experiments, the highest classification accuracy was obtained with VMEA-time, at 81.60% ± 3.50% (Fig. 4(n)).
In each culture, the two factor ANOVA indicated that both the time window and the signal processing approach for the ANNs significantly impacted the classification accuracy (p<0.01). Interaction was not significant (p>0.05).
The relatively high classification accuracy with both cultures suggests that the different tactile encoding methods, in this case SA or RA, impacted the BNNs' patterns of activity in a distinguishable manner that can be further explored for online classification during future real-time experiments.
VII. CONCLUSION
We have created a novel bidirectional noninvasive neuroprosthetic research platform that can be used to study the interaction between living BNNs and embodied robotic systems. Results showed that encoding tactile sensations from a robotic fingertip into an SA or RA action potential stimulation pattern impacted the behavior of the BNNs in the MEAs across multiple cultures and days. Correspondingly, the different patterns of coupled BNN activity impacted the behavior of the artificial hand in a closed-loop fashion, as evidenced by the statistically significant differences in ITI with both cultures. By demonstrating the ability to classify BNN patterns with ANNs, we show a pipeline toward an online approach for classifying tactile interactions through different biomimetic stimulation patterns in the embodied BNN. This platform of robotically embodied biological neurons could enable invasive experimental paradigms to be modeled in a noninvasive environment [17], [18], accelerating the rate of neuroprosthetic research for understanding the complex aspects of sensorimotor integration and restoration.
Acknowledgments
Research reported in this publication was supported by the National Institute of Biomedical Imaging and Bioengineering of the National Institutes of Health under Award Number R01EB025819. This research was also supported by the National Institute of Aging under 3R01EB025819-04S1, National Science Foundation awards #1317952, #1536136, and #1950400, and pilot grants from Florida Atlantic University’s Brain Institute and I-SENSE.
Contributor Information
Craig Ades, Ocean and Mechanical Engineering Department, Florida Atlantic University, Boca Raton, FL 33431 USA.
Moaed A. Abd, Ocean and Mechanical Engineering Department, Florida Atlantic University, Boca Raton, FL 33431 USA.
E Du, Ocean and Mechanical Engineering Department, Florida Atlantic University, Boca Raton, FL 33431 USA.
Jianning Wei, Biomedical Science Department, Florida Atlantic University, Boca Raton, FL 33431 USA.
Emmanuelle Tognoli, Center for Complex Systems and Brain Sciences, Florida Atlantic University, Boca Raton, FL 33431 USA.
Erik D. Engeberg, Ocean and Mechanical Engineering Department, Florida Atlantic University, Boca Raton, FL 33431 USA
VIII. REFERENCES
- [1] Ziegler-Graham K, MacKenzie EJ, Ephraim PL, Travison TG, and Brookmeyer R, "Estimating the prevalence of limb loss in the United States: 2005 to 2050," Arch. Phys. Med. Rehabil., vol. 89, pp. 422–429, 2008.
- [2] Dahiya RS, Metta G, Valle M, and Sandini G, "Tactile sensing—from humans to humanoids," IEEE Trans. Robotics, vol. 26, pp. 1–20, 2009.
- [3] Clemente F, Valle G, Controzzi M, Strauss I, Iberite F, Stieglitz T, et al., "Intraneural sensory feedback restores grip force control and motor coordination while using a prosthetic hand," J. Neural Eng., vol. 16, p. 026034, 2019.
- [4] Horch K, Meek S, Taylor TG, and Hutchinson DT, "Object discrimination with an artificial hand using electrical stimulation of peripheral tactile and proprioceptive pathways with intrafascicular electrodes," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 19, pp. 483–489, 2011.
- [5] George JA, et al., "Biomimetic sensory feedback through peripheral nerve stimulation improves dexterous use of a bionic hand," Science Robotics, vol. 4, 2019.
- [6] Günter C, Delbeke J, and Ortiz-Catalan M, "Safety of long-term electrical peripheral nerve stimulation: review of the state of the art," J. NeuroEng. Rehabil., vol. 16, pp. 1–16, 2019.
- [7] Wagenaar DA, Madhavan R, Pine J, and Potter SM, "Controlling bursting in cortical cultures with closed-loop multi-electrode stimulation," J. Neurosci., vol. 25, pp. 680–688, 2005.
- [8] Bakkum DJ, Chao ZC, and Potter SM, "Spatio-temporal electrical stimuli shape behavior of an embodied cortical network in a goal-directed learning task," J. Neural Eng., vol. 5, p. 310, 2008.
- [9] Bakkum DJ, Shkolnik AC, Ben-Ary G, Gamblen P, DeMarse TB, and Potter SM, "Removing some 'A' from AI: Embodied cultured networks," in Embodied Artificial Intelligence, Springer, 2004, pp. 130–145.
- [10] DeMarse TB, Wagenaar DA, Blau AW, and Potter SM, "The neurally controlled animat: biological brains acting with simulated bodies," Autonomous Robots, vol. 11, pp. 305–310, 2001.
- [11] Novellino A, D'Angelo P, Cozzi L, Chiappalone M, Sanguineti V, and Martinoia S, "Connecting neurons to a mobile robot: an in vitro bidirectional neural interface," Computational Intelligence and Neuroscience, vol. 2007, 2007.
- [12] Tessadori J, Bisio M, Martinoia S, and Chiappalone M, "Modular neuronal assemblies embodied in a closed-loop environment: toward future integration of brains and machines," Front. Neural Circuits, vol. 6, 2012.
- [13] Yi Z, Zhang Y, and Peters J, "Biomimetic tactile sensors and signal processing with spike trains: A review," Sensors and Actuators A: Physical, vol. 269, pp. 41–52, 2018.
- [14] Izhikevich EM, "Simple model of spiking neurons," IEEE Trans. Neural Netw., vol. 14, no. 6, pp. 1569–1572, 2003.
- [15] Nguyen K and Perdereau V, "Fingertip force control based on max torque adjustment for dexterous manipulation of an anthropomorphic hand," in Proc. 2013 IEEE/RSJ Int. Conf. Intelligent Robots and Systems (IROS), 2013, pp. 3557–3563.
- [16] Sachihiko S, et al., "Designing a bidirectional, adaptive neural interface incorporating machine learning capabilities and memristor-enhanced hardware," Chaos, Solitons & Fractals, vol. 142, p. 110504, 2021.
- [17] Abd M, Ingicco J, Hutchinson D, Tognoli E, and Engeberg E, "Multichannel haptic feedback unlocks prosthetic hand dexterity," Scientific Reports, 2022.
- [18] Abd M, Paul R, Aravelli A, Bai O, Lagos L, Lin M, and Engeberg E, "Hierarchical tactile sensation integration from prosthetic fingertips enables multi-texture surface recognition," Sensors, vol. 21, no. 13, 2021, doi: 10.3390/s21134324.