Thakor Lab YouTube page
Videos
 

GEAR demo (Intel-Cornell Cup 2016 Grand Prize winner)
May 23, 2016

GEAR was created as an assistive computer interface for individuals with limited upper limb function. The main goal of the team, which includes a quadruple amputee, was to transfer dexterous control from the hands to the feet. The device enables people with disabilities, such as upper limb amputees, to retain the complex control of computers that typically requires hands and fingers.


Amputee Grasp Demo
November 19, 2015

A transradial amputee performs a series of grasping tasks with his prosthesis. The addition of skin-like fingertip sensors provides tactile feedback to the prosthesis, helping to improve grasping function.


Egg Grasp Demo
November 19, 2015

A demonstration of novel neuromorphic fingertip sensors providing tactile feedback to a prosthetic hand. Real-time tactile feedback allows the prosthesis to grasp delicate objects without breaking them.


Tactile SmartSkin Material Demo
November 19, 2015

A flexible, compliant skin-like material is being tested for creating a SmartSkin for a prosthetic limb. The stretchable material will allow embedded sensors to provide realistic, neuromorphic tactile feedback to the prosthetic system.


Reach Decoding from Ipsilateral ECoG
December 29, 2014

Offline decoding of reaching movements in three-dimensional space using ipsilateral ECoG electrodes over damaged motor regions. The video depicts simulated results using the virtual MPL controlled by APL software. PLOS ONE


Online ECoG Control of a Semi-Autonomous BMI
December 12, 2013

ECoG control of an object manipulation task. The natural ECoG correlates of reaching and grasping were used to detect an intended movement. Upon detection, the system initiated a complex sequence of movements to retrieve an object selected by tracking the user's eye movements. IEEE TNSRE
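The shared-control scheme described above can be sketched as a simple dispatch step, under assumed interfaces: an ECoG classifier flags movement intent, the eye tracker supplies the selected object, and a pre-planned reach-and-grasp sequence is then executed autonomously. All names below are hypothetical placeholders, not the lab's actual software.

```python
# Hedged sketch of a semi-autonomous BMI control cycle. The two inputs
# stand in for (1) an ECoG intent detector and (2) an eye-tracking object
# selector; the returned list is a canned action sequence the robotic
# limb would then execute without further user input.

def semi_autonomous_step(move_intent_detected, gazed_object):
    """Return the action sequence to dispatch this cycle, or [] if no
    intent was detected or no object has been selected by gaze."""
    if not (move_intent_detected and gazed_object):
        return []
    # Pre-planned sequence executed autonomously once intent is detected.
    return [f"reach:{gazed_object}", f"grasp:{gazed_object}", "retrieve"]
```

The design point of such a system is that the user supplies only the low-bandwidth decisions (when to move, which object), while the trajectory details are automated.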


Online ECoG BMI Control of Reach and Grasp
October 24, 2013

Simultaneous ECoG control of reaching and grasping. The natural ECoG correlates of reaching and grasping were mapped to corresponding movements performed by the modular prosthetic limb. IEEE TNSRE

 

 

Virus in Microfluidic Chamber
August 3, 2011

Varicella zoster virus moves inside an axon in a channel of a microfluidic chamber.


 

MORPH Prosthesis Control - JASPER Test
May 27, 2011

A transradial amputee demonstrates the advantages of the Myoelectrically-Operated RFID Prosthetic Hand (MORPH) in a battery of functional hand dexterity tasks. EMG thresholding in conjunction with RFID tag readings allows the hand to automatically form complex preshapes and to operate individual digits dexterously.
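The EMG-plus-RFID scheme above can be sketched as follows, assuming a tag-to-preshape lookup and a single amplitude threshold on the EMG signal. Tag IDs, preshape names, and the threshold value are illustrative assumptions, not the actual MORPH implementation.

```python
# Hypothetical sketch of MORPH-style control: a nearby RFID tag selects
# a grasp preshape, and a simple EMG amplitude threshold decides when to
# actuate it. Values and names below are illustrative only.

PRESHAPES = {
    "tag_mug": "cylindrical_grasp",   # assumed tag-to-grasp mapping
    "tag_key": "lateral_pinch",
    "tag_card": "precision_pinch",
}

EMG_THRESHOLD = 0.3  # normalized EMG amplitude (assumed units)

def select_command(tag_id, emg_amplitude):
    """Return the preshape to execute, or None if the EMG signal is
    below threshold or the tag is unknown (hand keeps its posture)."""
    if emg_amplitude < EMG_THRESHOLD:
        return None
    return PRESHAPES.get(tag_id)
```

The tag carries the grasp context, so the user's EMG only has to answer the simpler question of when to act.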


 

MORPH Intelligent Switching (Doorknob Test)
May 27, 2011

An able-bodied subject demonstrates the ability of a Myoelectrically-Operated RFID Prosthetic Hand (MORPH) to intelligently switch between the control of multiple degrees of freedom. In this example, a tennis ball acts as an RFID-tagged doorknob. Once the hand has closed on the object (as if it were a doorknob), it automatically switches from aperture to orientation control. Finally, a co-contraction of the antagonistic muscle pair initiates the opening of the hand and its automatic reorientation.
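The switching behavior in the doorknob test amounts to a small state machine, sketched below under assumed state names (these are illustrative, not the lab's API): aperture control by default, orientation control after closing on a tagged object, and a return to aperture control on co-contraction.

```python
# Hedged sketch of MORPH's intelligent mode switching. The controller
# starts in aperture (open/close) mode, switches to wrist-orientation
# mode once the hand closes on an RFID-tagged object, and returns to
# aperture mode (opening the hand) on a co-contraction event.

class MorphModeSwitcher:
    def __init__(self):
        self.mode = "aperture"

    def update(self, closed_on_tagged_object, co_contraction):
        """Advance the mode one control cycle and return it."""
        if self.mode == "aperture" and closed_on_tagged_object:
            self.mode = "orientation"   # e.g. turning the doorknob
        elif self.mode == "orientation" and co_contraction:
            self.mode = "aperture"      # open hand, reorient wrist
        return self.mode
```

Routing the same two EMG channels to different degrees of freedom depending on state is what lets one muscle pair drive a multi-DoF hand.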


 

Accelerometer-Controlled Auto-Orientation
May 27, 2011

An accelerometer on the hand provides closed-loop control of the hand's wrist-rotation unit. An able-bodied subject demonstrates how this advancement can be used to maintain a fixed wrist angle relative to gravity, even when the prosthetic socket is disturbed.
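One common way to realize such gravity-referenced closed-loop control is a proportional controller on the roll angle estimated from the accelerometer's gravity vector; the sketch below takes that approach. The gain, axis conventions, and signs are assumptions for illustration, not the device's actual control law.

```python
import math

# Illustrative proportional controller for accelerometer-based
# auto-orientation: the accelerometer reading gives the gravity vector
# in the hand frame, and the wrist rotator is driven so the measured
# roll angle tracks a target angle relative to gravity, even as the
# socket (and thus the forearm) is disturbed.

K_P = 2.0  # proportional gain (assumed)

def wrist_rotation_command(accel_y, accel_z, target_roll=0.0):
    """Return a wrist-rotator velocity command (rad/s) driving the
    gravity-referenced roll angle toward target_roll (radians)."""
    measured_roll = math.atan2(accel_y, accel_z)  # roll from gravity
    error = target_roll - measured_roll
    return K_P * error
```

Because gravity is sensed in the hand frame, any disturbance of the socket shows up directly as a roll error that the controller then cancels.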


 

RFID Prosthesis Control Concept Demo
May 27, 2011

A virtual reality demonstration of the conceptual idea of integrating an RFID reader with myoelectric prosthetic hand control.


 

RFID Control of a Michelangelo Prosthetic Hand
May 27, 2011

A benchtop demonstration of the advantages that come from integrating an RFID reader with myoelectric prosthetic hand control.


 

MORPH Control of a Prosthetic Hand
May 27, 2011

An able-bodied subject demonstrates the ability of a Myoelectrically-Operated RFID Prosthetic Hand (MORPH) to control more than just a prosthetic terminal device. RFID tags placed around the perimeter of a mousepad control the movement of the mouse cursor, and EMG control is remapped from hand operation to left and right mouse clicks.


 

Amputee Plays Video Game Using EMG
September 26, 2009

Using advanced signal processing algorithms developed by researchers at Johns Hopkins University, a congenital amputee is able to control the popular Wolfenstein video game using only EMG signals from the residual limb.


 

Discovery Channel: Cyborgs
July 16, 2009

Cyborgs may be strictly sci-fi, but brain-computer interfacing is real. Jorge Ribas finds out how this technology is helping people.


 

Individual Finger Control of a Virtual Prosthetic Device Using Surface EMG
March 24, 2009

An able-bodied subject shows how EMG signals recorded only from the surface of his arm can be used to control the individual fingers of a virtual prosthesis in real time.


 

BCI Control of a Prosthetic Hand with Haptic Feedback
October 12, 2006

A subject uses brain control to grasp different objects with a prosthetic hand. The user receives visual and vibrotactile haptic feedback indicating how hard the object is being gripped.


 

Noninvasive Brain-Computer Interface to Control a Prosthetic Hand
August 28, 2006

A member of the Thakor lab at Johns Hopkins University controls a prosthetic hand using EEG signals from his brain.


 

Monkey Plays Piano with Prosthetic Hand
June 26, 2006

Neural signals from a monkey's brain, streamed and processed in real time, are translated into movement of a prosthetic hand. The monkey was trained to move each individual finger, and advanced signal processing algorithms developed by researchers at Johns Hopkins University translated this information into a sequence of movements to play the piano.


 

Brain Control of a Robotic Hand
May 16, 2006

Researchers in the Brain-Computer Interface group at Johns Hopkins University show how brain signals can be used to open and control a multi-fingered hand.



Copyright © 2008 JHU Neuroengineering & Biomedical Instrumentation Lab
All rights reserved.