Amputee practicing with the AI-powered neuroprosthetic hand
We use AI models based on recurrent neural networks (RNNs) to read and accurately decode the amputee’s intent to move individual fingers from peripheral nerve activity. The AI models are deployed on an NVIDIA Jetson Nano as a portable, self-contained unit. With this AI-powered nerve interface, the amputee can control a neuroprosthetic hand with life-like dexterity and intuitiveness.
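The decoding idea can be sketched as follows: an RNN consumes a window of nerve-signal features over time and emits an activation level for each finger. This is a minimal illustrative sketch, not the authors' trained model — the channel count, hidden size, weights, and the `decode_intent` function are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 16   # hypothetical number of nerve-signal feature channels
HIDDEN = 32       # hypothetical hidden-state size
N_FINGERS = 5     # one intent output per finger

# Randomly initialized weights stand in for a trained network.
W_xh = rng.standard_normal((HIDDEN, N_CHANNELS)) * 0.1
W_hh = rng.standard_normal((HIDDEN, HIDDEN)) * 0.1
b_h = np.zeros(HIDDEN)
W_hy = rng.standard_normal((N_FINGERS, HIDDEN)) * 0.1
b_y = np.zeros(N_FINGERS)

def decode_intent(signal_window):
    """Run a vanilla RNN over a (time, channels) window and return
    per-finger intent activations in [0, 1] from the final hidden state."""
    h = np.zeros(HIDDEN)
    for x_t in signal_window:                       # iterate over time steps
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)    # recurrent state update
    return 1.0 / (1.0 + np.exp(-(W_hy @ h + b_y)))  # sigmoid per finger

# Decode one 50-step window of (random placeholder) nerve features.
window = rng.standard_normal((50, N_CHANNELS))
intent = decode_intent(window)
print(intent.shape)  # one activation level per finger
```

In the deployed system, a network of this general shape runs continuously on the Jetson Nano, turning streaming nerve data into finger commands for the prosthetic hand.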
This clinical trial is part of DARPA’s Hand Proprioception and Touch Interfaces (HAPTIX) program:
More photos:
Papers:
1. Nguyen & Drealan et al. (2021) A Portable, Self-Contained Neuroprosthetic Hand with Deep Learning-Based Finger Control:
2. Luu & Nguyen et al. (2021) Deep Learning-Based Approaches for Decoding Motor Intent from Peripheral Nerve Signals:
3. Nguyen et al. (2021) Redundant Crossfire: A Technique to Achieve S