A Wearable Computer Vision System With Gimbal Enables Position-, Speed-, and Phase-Independent Terrain Classification for Lower Limb Prostheses

Bibliographic Details
Main Authors: Linrong Li (Author), Xiaoming Wang (Author), Qiaoling Meng (Author), Hongliu Yu (Author)
Format: Book
Published: IEEE, 2023.
Subjects:
Online Access: Connect to this object online.

MARC

LEADER 00000 am a22000003u 4500
001 doaj_0cdf8ce0d62e48e0a2de9fa884c5c6e6
042 |a dc 
100 1 0 |a Linrong Li  |e author 
700 1 0 |a Xiaoming Wang  |e author 
700 1 0 |a Qiaoling Meng  |e author 
700 1 0 |a Hongliu Yu  |e author 
245 0 0 |a A Wearable Computer Vision System With Gimbal Enables Position-, Speed-, and Phase-Independent Terrain Classification for Lower Limb Prostheses 
260 |b IEEE,   |c 2023-01-01T00:00:00Z. 
500 |a 1558-0210 
500 |a 10.1109/TNSRE.2023.3331273 
520 |a Computer vision can provide information about the upcoming walking environment to lower-limb assistive robots, enabling more accurate and robust decisions for high-level control. However, current computer vision systems on lower-limb devices are still constrained by the disturbances that arise in the interaction between the human, the machine, and the environment, which hinder optimal performance. In this paper, we propose a gimbal-based terrain classification system that adapts to different lower-limb movements, walking speeds, and gait phases. We use a linear active disturbance rejection controller to achieve fast, disturbance-rejecting control of the gimbal, which allows the vision system to remain continuously and stably focused on the desired field-of-view angle during lower-limb motion. We also deploy a lightweight MobileNetV2 model on an embedded vision module for real-time, highly accurate inference. The proposed system classifies and predicts terrain independently of mounting position (thigh or shank), gait phase, and walking speed. It is therefore applicable to subjects with different physical conditions (e.g., non-disabled subjects and individuals with transfemoral amputation) without parameter tuning, which contributes to plug-and-play terrain classification. Finally, our approach promises to improve the adaptability of lower-limb assistive robots on complex terrain, allowing the wearer to walk more safely. 
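
Note: the abstract above names a linear active disturbance rejection controller (LADRC) as the mechanism that keeps the camera's view angle steady despite leg motion. As a point of reference only, the following Python sketch shows a textbook one-axis LADRC loop (extended state observer plus PD law) for a second-order gimbal axis. All names, gains, and the toy plant are illustrative assumptions, not the authors' implementation described in the article.

# Minimal LADRC sketch for one gimbal axis, assuming a second-order plant
# angle'' = b0 * u + f(t), where f(t) lumps leg-motion disturbances.
class LinearADRC:
    def __init__(self, b0, omega_c, omega_o, dt):
        self.b0 = b0              # nominal input gain of the gimbal axis (assumed)
        self.kp = omega_c ** 2    # PD gains from the controller bandwidth
        self.kd = 2 * omega_c
        # Extended state observer gains from the observer bandwidth (pole placement)
        self.beta1 = 3 * omega_o
        self.beta2 = 3 * omega_o ** 2
        self.beta3 = omega_o ** 3
        self.dt = dt
        self.z1 = self.z2 = self.z3 = 0.0  # estimates: angle, rate, total disturbance

    def update(self, setpoint, measured_angle, u_prev):
        # ESO step: track angle, angular rate, and the lumped disturbance.
        e = self.z1 - measured_angle
        self.z1 += self.dt * (self.z2 - self.beta1 * e)
        self.z2 += self.dt * (self.z3 + self.b0 * u_prev - self.beta2 * e)
        self.z3 += self.dt * (-self.beta3 * e)
        # PD law on the estimates, then cancel the estimated disturbance.
        u0 = self.kp * (setpoint - self.z1) - self.kd * self.z2
        return (u0 - self.z3) / self.b0

# Toy usage: hold pitch at 0 rad against a constant disturbance at a 1 kHz loop rate.
ctrl = LinearADRC(b0=50.0, omega_c=30.0, omega_o=100.0, dt=0.001)
angle, rate, u = 0.2, 0.0, 0.0
for _ in range(3000):
    u = ctrl.update(setpoint=0.0, measured_angle=angle, u_prev=u)
    accel = 50.0 * u - 5.0        # toy plant: b0*u plus a constant disturbance
    rate += 0.001 * accel
    angle += 0.001 * rate
print(f"final pitch error: {angle:.4f} rad")  # converges toward 0 for this toy plant
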
546 |a EN 
690 |a Computer vision 
690 |a terrain classification 
690 |a continuous prediction 
690 |a phase-independent 
690 |a gimbal control 
690 |a Medical technology 
690 |a R855-855.5 
690 |a Therapeutics. Pharmacology 
690 |a RM1-950 
655 7 |a article  |2 local 
786 0 |n IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol 31, Pp 4539-4548 (2023) 
787 0 |n https://ieeexplore.ieee.org/document/10312774/ 
787 0 |n https://doaj.org/toc/1558-0210 
856 4 1 |u https://doaj.org/article/0cdf8ce0d62e48e0a2de9fa884c5c6e6  |z Connect to this object online.