Bionic Hand Control with Real-Time B-Mode Ultrasound Web AI Vision

DOI: 10.62487/vjzf7254 | Publication Date: 2025-04-05
ABSTRACT
Aim: This basic research study aimed to assess the ability of Web AI Vision to classify anatomical movement patterns in real-time B-mode ultrasound scans for controlling a virtual bionic limb.

Methods: A MobileNetV2 model, implemented via the TensorFlow.js library, was used for transfer learning and feature extraction from 400 images of the distal forearm of one individual participant, corresponding to four different hand positions (100 images each): fist, thumb in palmar abduction, fist with an extended forefinger, and open palm.

Results: After 32 epochs of training at a learning rate of 0.001 and a batch size of 16, the model achieved 100% validation accuracy and a test loss (cross-entropy) of 0.0067 in differentiating the scans associated with the specific hand positions. During manual testing on 40 images excluded from training, validation, and testing, the model correctly predicted the hand position in all cases (100%), with a mean predicted probability of 98.9% (SD ± 0.6). When tested on cine loops and in live scanning, the model successfully performed predictions at a 20 ms interval, achieving 50 predictions per second.

Conclusion: The study demonstrated that such ultrasound- and AI-powered bionic limbs can be easily and automatically retrained and recalibrated in a privacy-safe manner on the client side, within a web environment, and without extensive computational costs. Using the same scanner that controls the limb, patients can efficiently retrain and adjust the model as needed, without relying on external services. The advantages of this combination warrant further research into AI-based muscle analysis and the utilization of ultrasound-powered AI in rehabilitation medicine, neuromuscular disease management, and advanced prosthetic control for amputees.
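ILLUSTRATIVE CODE SKETCHES

The abstract describes transfer learning on MobileNetV2 via TensorFlow.js but does not reproduce the authors' code. Below is a minimal TypeScript sketch of a comparable client-side setup, assuming the publicly hosted MobileNetV2 feature-vector checkpoint on TF Hub as the frozen extractor; the head architecture (a single 128-unit hidden layer) and all identifiers are illustrative, while the learning rate (0.001) and cross-entropy loss follow the abstract.

```ts
import * as tf from '@tensorflow/tfjs';

// The four hand positions described in the abstract.
const NUM_CLASSES = 4;
const LABELS = [
  'fist',
  'thumb palmar abduction',
  'fist with extended forefinger',
  'open palm',
];

// Frozen MobileNetV2 feature extractor; this TF Hub checkpoint is an
// assumption, as the abstract does not name the exact weights used.
async function loadFeatureExtractor(): Promise<tf.GraphModel> {
  return tf.loadGraphModel(
    'https://tfhub.dev/google/tfjs-model/imagenet/mobilenet_v2_100_224/feature_vector/3/default/1',
    { fromTFHub: true }
  );
}

// Small trainable classification head on the 1280-d MobileNetV2 features.
function buildHead(): tf.Sequential {
  const head = tf.sequential({
    layers: [
      tf.layers.dense({ inputShape: [1280], units: 128, activation: 'relu' }),
      tf.layers.dense({ units: NUM_CLASSES, activation: 'softmax' }),
    ],
  });
  head.compile({
    optimizer: tf.train.adam(0.001),  // learning rate from the abstract
    loss: 'categoricalCrossentropy',  // cross-entropy, as reported
    metrics: ['accuracy'],
  });
  return head;
}
```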
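Training the head on pre-extracted features with the reported hyperparameters (32 epochs, batch size 16) might look as follows; the 20% validation split and the dataset-assembly helpers are assumptions, since the abstract does not specify how the 400 frames were split.

```ts
// Convert one B-mode frame (e.g. a <canvas> element) into a MobileNetV2
// input and run the frozen extractor. tf.tidy frees the intermediate
// tensors; only the returned feature tensor survives.
function extractFeatures(
  extractor: tf.GraphModel,
  frame: HTMLCanvasElement
): tf.Tensor2D {
  return tf.tidy(() => {
    const input = tf.browser.fromPixels(frame)
      .toFloat()
      .resizeBilinear([224, 224])
      .div(255)        // the TF Hub checkpoint expects inputs in [0, 1]
      .expandDims(0);
    return extractor.predict(input) as tf.Tensor2D; // shape [1, 1280]
  });
}

// Stack per-frame features and one-hot labels into training tensors.
function buildDataset(
  extractor: tf.GraphModel,
  frames: HTMLCanvasElement[],
  labels: number[] // class index per frame, 0..3
): { xs: tf.Tensor2D; ys: tf.Tensor2D } {
  const featureList = frames.map((f) => extractFeatures(extractor, f));
  const xs = tf.concat(featureList) as tf.Tensor2D;
  featureList.forEach((t) => t.dispose()); // free per-frame features
  const ys = tf.oneHot(tf.tensor1d(labels, 'int32'), NUM_CLASSES)
    .toFloat() as tf.Tensor2D;
  return { xs, ys };
}

// Train with the hyperparameters reported in the abstract.
async function trainHead(head: tf.Sequential, xs: tf.Tensor2D, ys: tf.Tensor2D) {
  await head.fit(xs, ys, {
    epochs: 32,
    batchSize: 16,
    validationSplit: 0.2, // assumed; the abstract does not state the split
    shuffle: true,
  });
}
```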
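For live control, a fixed 20 ms prediction cadence reproduces the 50 predictions per second reported in the abstract. The sketch below assumes scanner frames arrive on a canvas and that a hypothetical onPose callback drives the virtual hand.

```ts
// Predict the hand position from live frames every 20 ms (50 predictions/s,
// as reported). `onPose` stands in for whatever drives the virtual hand.
function startLiveControl(
  extractor: tf.GraphModel,
  head: tf.Sequential,
  frameSource: HTMLCanvasElement,
  onPose: (label: string, probability: number) => void
): () => void {
  const timer = setInterval(() => {
    tf.tidy(() => {
      const features = extractFeatures(extractor, frameSource);
      const probs = (head.predict(features) as tf.Tensor).dataSync();
      let best = 0;
      for (let i = 1; i < probs.length; i++) {
        if (probs[i] > probs[best]) best = i;
      }
      onPose(LABELS[best], probs[best]);
    });
  }, 20);
  return () => clearInterval(timer); // call the returned function to stop
}
```

Running inference on the GPU backend and disposing intermediates via tf.tidy keeps per-frame memory flat, which is what makes a sustained 50 Hz loop feasible in the browser.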