Model-based Control by Able-bodied Participant

 

 

Description: This video demonstrates the real-time EMG-driven musculoskeletal model-based control of a virtual hand on a computer screen by an able-bodied participant. The virtual hand closely mimics the participant's own movements, even when both the wrist and finger joints are moved simultaneously.
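As context for viewers unfamiliar with the approach, the Python sketch below illustrates the general idea behind EMG-driven musculoskeletal modeling, not the lab's actual model: recorded EMG is rectified and low-pass filtered into a muscle activation estimate, and a simplified Hill-type relation maps activation and joint angle to joint torque. All filter settings, muscle parameters, and the torque relation are illustrative assumptions.

import numpy as np
from scipy.signal import butter, filtfilt

def emg_to_activation(raw_emg, fs=1000.0, lowpass_hz=6.0):
    """Rectify zero-mean EMG and low-pass filter it into a normalized envelope
    (a common, simplified activation estimate; cutoff and order are assumed)."""
    rectified = np.abs(raw_emg - np.mean(raw_emg))
    b, a = butter(2, lowpass_hz / (fs / 2), btype="low")
    envelope = filtfilt(b, a, rectified)
    peak = np.max(envelope)
    return envelope / peak if peak > 0 else envelope

def activation_to_torque(activation, joint_angle, f_max=300.0, moment_arm=0.04):
    """Simplified Hill-type torque: max isometric force scaled by activation and a
    Gaussian force-length factor, times a constant moment arm (all values assumed)."""
    optimal_angle = 0.0  # assumed optimal joint angle (rad)
    force_length = np.exp(-((joint_angle - optimal_angle) ** 2) / 0.5)
    return activation * f_max * force_length * moment_arm  # torque in N*m

if __name__ == "__main__":
    fs = 1000.0
    t = np.arange(0, 2.0, 1.0 / fs)
    # Synthetic EMG burst whose amplitude slowly modulates over time
    emg = 0.5 * np.random.randn(t.size) * (0.5 + 0.5 * np.sin(2 * np.pi * 0.5 * t))
    act = emg_to_activation(emg, fs)
    torque = activation_to_torque(act, joint_angle=0.2)
    print(f"peak activation: {act.max():.2f}, peak torque: {torque.max():.2f} N*m")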

 

 

 

EMG-driven Musculoskeletal Model-based Upper Limb Control

 

 

Description: This video demonstrates the real-time EMG-driven musculoskeletal model-based control of a virtual hand on a computer screen by a participant with transradial amputation. Using only neural signals from the residual limb during bilateral mirrored movements, the model reasonably matches the participant's movement intent observed in the sound limb.

 

 

 

Advances in Tuning Powered Lower Limb Prostheses

 

 

Description: Automatically Tuning Powered Prosthesis Using Cyber Expert System

 

 

Our Research Was Highlighted on NSF Science 360

 

 

Description: NSF Science Now: Episode 29 featured our research!

 

 

Neurally-controlled Powered Transfemoral Prosthesis

 

 

Description: F. Zhang, M. Liu, S. Harper, M. Lee, and H. Huang, "Engineering Platform and Experimental Protocol for Design and Evaluation of a Neurally-controlled Powered Transfemoral Prosthesis", Journal of Visualized Experiments, 2014

 

 

Unstable Powered Prosthetic Device

 

 

Description: This video shows what happens when a powered prosthetic device makes a control error that throws the user off balance.

 

 

Stable Powered Prosthetic Device

 

 

Description: This video shows what happens when a powered prosthetic device makes a control error that does not affect the user's balance. Powered lower limb prostheses hold promise for improving the mobility of amputees, but errors in the technology may also cause some users to stumble or fall. New research from the lab of Helen Huang examines exactly what happens when these technologies fail, with the goal of developing a new generation of more robust powered prostheses. Huang is an associate professor in the joint biomedical engineering program at North Carolina State University and the University of North Carolina at Chapel Hill.

 

 

 

Prosthetic arm control based on EMG pattern recognition

 

 

 

Description: The video shows the basic performance of our robust EMG pattern recognition algorithm for prosthetic arms. This project is sponsored by the National Institute on Disability and Rehabilitation Research (NIDRR).
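The video does not detail the algorithm; as a hedged illustration only, the Python sketch below shows a common EMG pattern-recognition pipeline of the kind used for prosthetic arm control: time-domain features (mean absolute value, waveform length, zero crossings) are extracted from sliding analysis windows and classified with linear discriminant analysis. Window lengths, channel counts, and class labels are assumptions for the example, not details of our algorithm.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def td_features(window):
    """Time-domain features per EMG channel: mean absolute value, waveform length,
    and zero-crossing count (window shape: samples x channels)."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    zc = np.sum(np.diff(np.signbit(window).astype(int), axis=0) != 0, axis=0)
    return np.concatenate([mav, wl, zc])

def make_windows(emg, labels, win=200, step=50):
    """Slice continuous EMG (samples x channels) into overlapping analysis windows."""
    X, y = [], []
    for start in range(0, emg.shape[0] - win, step):
        X.append(td_features(emg[start:start + win]))
        y.append(labels[start + win - 1])  # label taken at the end of the window
    return np.array(X), np.array(y)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_channels, fs = 4, 1000
    # Synthetic two-class data: "hand open" vs "hand close" with different EMG amplitude
    emg_open = 0.2 * rng.standard_normal((5 * fs, n_channels))
    emg_close = 0.8 * rng.standard_normal((5 * fs, n_channels))
    emg = np.vstack([emg_open, emg_close])
    labels = np.array([0] * (5 * fs) + [1] * (5 * fs))
    X, y = make_windows(emg, labels)
    clf = LinearDiscriminantAnalysis().fit(X, y)
    print(f"training accuracy: {clf.score(X, y):.2f}")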

 

 

Preliminary Testing of a Powered Prosthetic Knee Prototype on a Transfemoral Amputee

 

 

 

Description: This video shows the first test of our powered prosthesis design on a patient with a transfemoral amputation. The treadmill speed gradually ramps up, and the subject produces a smooth walking pattern after following instructions from a physical therapist.

 

 

 

 

Description: This video demonstrates the performance of our neurally-controlled artificial leg in supporting an amputee while negotiating stairs.

 

 

 

Neural Control of Powered Artificial Legs Enables Seamless Terrain Transitions

 

 

Description: This video shows our preliminary testing of a neurally-controlled, powered transfemoral prosthesis. The user's intent to switch walking terrain is automatically identified by our neural-machine interface (Huang et al. 2011), which then modulates the autonomous control of the powered prosthesis. In this video, a transfemoral amputee switches walking terrain seamlessly, safely, and intuitively. Compared to current approaches, such as using body motions or a remote control to switch the prosthesis control mode during terrain transitions, our approach represents a breakthrough in the function of powered lower limb prostheses, which in turn can further improve the quality of life of patients with lower limb amputations.

 

 

 

Testing a Real-Time Neural-Machine Interface Using Virtual Reality

 

 

 

Description: This video shows the real-time performance of a neural-machine interface (NMI) based on neuromuscular-mechanical fusion for artificial legs. The system was tested in real time on a patient with a transfemoral amputation. As the subject walked on different terrains, the NMI recognized the locomotion mode (i.e., level-ground walking, stair ascent, and stair descent), and a virtual reality (VR) system animated it in real time. The NMI achieved high accuracy in predicting the user's locomotion modes with sufficient prediction time. The designed system can be an effective training tool for leg amputees learning to neurally control their artificial legs. Detailed methods for the fusion-based NMI design can be found in our recent publication (Huang et al. IEEE TBME, 2011).
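The defining step named in the description is the fusion of neuromuscular (EMG) and mechanical signals before classification. The Python sketch below illustrates only that step under assumed sensors: per-window EMG features and mechanical features (here, a hypothetical 6-axis load cell on the prosthetic pylon) are concatenated into a single feature vector for a locomotion-mode classifier. It is a simplified stand-in, not the published NMI.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

MODES = ["level_walk", "stair_ascent", "stair_descent"]  # modes named in the description

def fused_features(emg_win, mech_win):
    """Concatenate neuromuscular and mechanical features for one analysis window.
    emg_win: samples x EMG channels; mech_win: samples x mechanical channels (assumed load cell axes)."""
    emg_feats = np.concatenate([
        np.mean(np.abs(emg_win), axis=0),                      # mean absolute value
        np.sum(np.abs(np.diff(emg_win, axis=0)), axis=0)])     # waveform length
    mech_feats = np.concatenate([
        np.mean(mech_win, axis=0),                             # mean force/moment
        np.max(mech_win, axis=0) - np.min(mech_win, axis=0)])  # range within the window
    return np.concatenate([emg_feats, mech_feats])

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X, y = [], []
    for mode_idx, scale in enumerate([1.0, 1.6, 0.6]):  # synthetic per-mode signal scaling
        for _ in range(60):
            emg_win = scale * 0.3 * rng.standard_normal((150, 7))        # 7 EMG channels (assumed)
            mech_win = scale * rng.standard_normal((150, 6)) + mode_idx  # 6-axis load cell (assumed)
            X.append(fused_features(emg_win, mech_win))
            y.append(mode_idx)
    clf = LinearDiscriminantAnalysis().fit(np.array(X), np.array(y))
    pred = clf.predict(np.array(X))
    print("predicted mode for first window:", MODES[pred[0]])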

 

 

 

Towards Active Stumble Recovery in Lower Limb Amputees

 

 

 

Description: Recent advances in the design of powered artificial legs have increased the potential for lower limb amputees to actively recover from stumbles. To achieve this goal, promptly and accurately identifying stumbles is essential. This study aimed to (1) select potential stumble-detection data sources that react reliably and quickly to stumbles and can be measured from a prosthesis, and (2) investigate two different approaches, based on the selected data sources, to detect stumbles and classify stumble types in patients with transfemoral (TF) amputations during ambulation. Detailed methods and results can be found in our recent publication (Zhang et al. IEEE TNSRE 2011).
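For readers who want a concrete picture of prompt detection from prosthesis-measurable signals, the Python sketch below flags a stumble when a synthetic shank acceleration signal exceeds a fixed threshold and reports the detection latency. The signal choice, threshold, and latency calculation are assumptions for illustration and are not the methods of the cited study.

import numpy as np

def detect_stumble(accel, fs=500.0, threshold=30.0):
    """Return the time (ms) of the first sample where shank acceleration magnitude
    exceeds an assumed threshold, or None if no stumble is flagged."""
    over = np.flatnonzero(np.abs(accel) > threshold)
    if over.size == 0:
        return None
    return 1000.0 * over[0] / fs

if __name__ == "__main__":
    fs = 500.0
    t = np.arange(0, 2.0, 1 / fs)
    accel = 5.0 * np.sin(2 * np.pi * 1.0 * t)  # normal swing-phase acceleration (synthetic)
    accel[600:615] += 60.0                     # simulated trip perturbation at t = 1.2 s
    latency = detect_stumble(accel, fs)
    print(f"stumble flagged {latency - 1200.0:.1f} ms after the perturbation onset"
          if latency is not None else "no stumble detected")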

 

 

 

Real Time Recognition of Transitions between Sitting and Standing (Tested on a transfemoral amputee)

 

 

 

Description: This video shows the real-time recognition of sit-to-stand and stand-to-sit transitions based on EMG signals from the gluteal muscles and residual thigh muscles of an above-knee amputee. The algorithm was implemented on an embedded system, with the output indicated on the monitor. The system achieved high accuracy in identifying the tested task transitions. Even though the subject moved the instrumented leg and shifted body weight while sitting or standing, no erroneous task switches occurred. More details about this work can be found in our recent publication (Zhang et al. IEEE TII 2011).
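As a simplified illustration of the kind of logic involved, the Python sketch below uses a small state machine over smoothed EMG activity from two assumed channels (gluteal and residual thigh): a posture change is declared only after activity stays above a threshold for a dwell time, which suppresses brief weight-shift bursts. Thresholds, dwell time, and channel choices are assumptions, not the published algorithm.

import numpy as np

def recognize_transitions(glut_env, thigh_env, fs=1000.0, on_thresh=0.4, dwell_s=0.3):
    """Toy state machine over normalized EMG envelopes (gluteal + residual thigh).
    A transition is declared only after combined activity stays above threshold for a
    dwell time, then the detector re-arms once activity drops (all values assumed)."""
    state, events = "sitting", []
    above, armed = 0, True
    dwell = int(dwell_s * fs)
    for i, activity in enumerate((glut_env + thigh_env) / 2.0):
        if activity > on_thresh:
            above += 1
            if armed and above >= dwell:  # sustained effort -> declare posture change once
                state = "standing" if state == "sitting" else "sitting"
                events.append((i / fs, state))
                armed = False             # wait for activity to drop before re-arming
        else:
            above, armed = 0, True
    return events

if __name__ == "__main__":
    fs = 1000.0
    t = np.arange(0, 6.0, 1 / fs)
    burst = ((t > 1.0) & (t < 1.6)) | ((t > 4.0) & (t < 4.6))  # two sustained efforts
    blip = (t > 2.5) & (t < 2.6)                               # brief weight shift (ignored)
    glut = 0.8 * burst + 0.6 * blip + 0.05
    thigh = 0.7 * burst + 0.5 * blip + 0.05
    for time_s, new_state in recognize_transitions(glut, thigh, fs):
        print(f"t = {time_s:.2f} s: transition to {new_state}")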