Brain-Computer Interfaces & Neuroprostheses
Invasive Cortical Decoding
While noninvasive BCI platforms provide an excellent test bed for validating cortical control strategies with human subjects, a direct neural interface will be essential for achieving the degrees of freedom needed to control advanced upper-limb prostheses that mimic a human hand. Although previous research efforts have demonstrated cortical control of a computer cursor or decoded movement intent and arm trajectory, we are among the first to investigate neuroprosthetic control of dexterous movements, such as individual finger control and reach-to-grasp movements.
Overall system architecture for a closed-loop setup. Simultaneous neural and motion capture data are acquired in real time and fed to a Virtual Integration Environment for processing and decoding. The final decoded output controls an actual limb or a virtual limb.
In our work, we demonstrate how different neural recording signals from motor areas, namely single- and multi-unit activity, local field potentials (LFPs), and the electrocorticogram (ECoG), can be used to decode an entire suite of dexterous movements, including 18 joint angles of the hand during four different reach-to-grasp tasks. The neural activity is recorded from non-human primate experiments performed at collaborating institutions such as the University of Rochester.
Our group focuses on developing novel cortical control strategies for decoding both discrete cognitive states and continuous movement kinematics. Specifically, we have designed algorithms based on Hidden Markov Models (HMMs), Artificial Neural Networks (ANNs), Recurrent Neural Networks (RNNs), and Kalman filters to decode these states and the continuous kinematics of the hand. As more advanced prosthetic limbs become commercially available, this work paves the way for real-time, dexterous manipulation of a multi-fingered upper-limb neuroprosthesis under direct neural control.
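To illustrate one of the continuous-decoding approaches mentioned above, the sketch below runs a standard Kalman filter over binned neural activity to estimate hand kinematics. It is a minimal, self-contained example with simulated data; the state dimensions, model matrices, and toy signals are hypothetical and are not the lab's actual decoder.

```python
import numpy as np

# Kalman-filter decoder sketch (illustrative only; all matrices and data
# below are hypothetical, not the lab's actual model or recordings).
# State x_t holds hand kinematics (e.g., joint angles); observation z_t
# holds binned neural firing rates.
# Model:  x_t = A x_{t-1} + w,  w ~ N(0, W)
#         z_t = H x_t + v,      v ~ N(0, Q)

def kalman_decode(z, A, H, W, Q, x0, P0):
    """Filter a sequence of observation bins; return decoded states."""
    x, P = x0, P0
    out = []
    for zt in z:
        # Predict: propagate state and uncertainty through the dynamics.
        x = A @ x
        P = A @ P @ A.T + W
        # Update: correct the prediction with the neural observation.
        S = H @ P @ H.T + Q                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ (zt - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        out.append(x.copy())
    return np.array(out)

# Toy usage: a 2-D kinematic state decoded from 5 simulated units.
rng = np.random.default_rng(0)
A = 0.95 * np.eye(2)                        # smooth kinematic dynamics
H = rng.normal(size=(5, 2))                 # neural tuning matrix
W, Q = 0.01 * np.eye(2), 0.1 * np.eye(5)
true_x = np.cumsum(rng.normal(scale=0.1, size=(100, 2)), axis=0)
z = true_x @ H.T + rng.normal(scale=0.3, size=(100, 5))
xhat = kalman_decode(z, A, H, W, Q, np.zeros(2), np.eye(2))
```

In practice the model matrices A, H, W, and Q are fit from paired neural and motion-capture training data (e.g., by least squares), and the filter then runs causally, one bin at a time, in the closed-loop setting described above.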
Simultaneous neural and kinematic activity are recorded from non-human primates as they reach toward and manipulate four different objects in space. The different grasps the monkeys used for each object are illustrated above. The colored stick figures show the corresponding reconstruction of hand kinematics.
Soumyadipta Acharya, PhD, MD, MSE
Heather Benz
Xiaoxu Kang
Geoffrey Newman
Kyle Rupp
Ryan Smith
Johns Hopkins Applied Physics Laboratory
Marc H. Schieber, MD, PhD - University of Rochester Medical Center
Nathan Crone, MD - Johns Hopkins School of Medicine (Neurology)
Defense Advanced Research Projects Agency (DARPA)
Mollazadeh M, Aggarwal V, Law A, Davidson A, Schieber MH, Thakor NV, Coherency between Spike and LFP Activity during Fine Hand Movements in M1, Conf Proc IEEE Eng Med Biol Soc on Neural Eng, 1:506-09, 2009

Aggarwal V, Acharya S, Tenore F, Etienne-Cummings R, Schieber MH, Thakor NV, Asynchronous Decoding of Dexterous Finger Movements using M1 Neurons, IEEE Trans Neural Syst Rehabil Eng, 16(1):3-14, 2008

Acharya S, Tenore F, Aggarwal V, Etienne-Cummings R, Schieber MH, Thakor NV, Decoding finger movements using volume-constrained neuronal ensembles in M1, IEEE Trans Neural Syst Rehabil Eng, 16(1):15-23, 2008

Acharya S, Aggarwal V, Tenore F, Shin HC, Etienne-Cummings R, Schieber MH, Thakor NV, Towards a Brain-Computer Interface for dexterous control of a multi-fingered prosthetic hand, Conf Proc IEEE Eng Med Biol Soc on Neural Eng, 1:200-203, 2007

Copyright © 2008 JHU Neuroengineering & Biomedical Instrumentation Lab
All rights reserved.