Navigating Virtual Environments Using Leg Poses and Smartphone Sensors

DOI: 10.3390/s19020299 Publication Date: 2019-01-14T17:20:07Z
ABSTRACT
Realization of navigation in virtual environments remains a challenge, as it involves complex operating conditions. Decomposing such complexity is attainable by fusing sensors and machine learning techniques. Identifying the right combination of sensory information and the appropriate technique is a vital ingredient for translating physical actions into virtual movements. The contributions of our work include: (i) synchronization of movements using suitable multiple sensor units, and (ii) selection of significant features and an algorithm to process them. This paper proposes an innovative approach that allows users to move in virtual environments simply by moving their legs towards the desired direction. The necessary hardware includes only a smartphone strapped to the subjects' lower leg. Data from the gyroscope, accelerometer and compass of the mobile device are transmitted to a PC, where the movement is accurately identified. Once identified, the corresponding movement of the avatar in the virtual environment is realized. After pre-processing the data with the box plot outliers approach, it was observed that Artificial Neural Networks provided the highest identification accuracy: 84.2% on the training dataset and 84.1% on the testing dataset.
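
The abstract's processing pipeline (box plot outlier removal followed by an Artificial Neural Network classifier) can be sketched as follows. This is an illustrative sketch only, not the authors' published code: the feature layout, window contents, class labels, and network size are assumptions, and scikit-learn is used here simply as a convenient stand-in for a small feed-forward ANN.

```python
# Illustrative sketch only: the sensor feature layout, labels, and network
# architecture below are assumptions, not taken from the paper.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def remove_boxplot_outliers(X, y, k=1.5):
    """Drop samples with any feature outside the box-plot (IQR) whiskers."""
    q1 = np.percentile(X, 25, axis=0)
    q3 = np.percentile(X, 75, axis=0)
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    mask = np.all((X >= lower) & (X <= upper), axis=1)
    return X[mask], y[mask]

# Hypothetical feature matrix: each row is one window of gyroscope,
# accelerometer and compass readings streamed from the leg-mounted smartphone;
# labels are the leg-pose / movement classes to be mapped onto avatar motion.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 9))        # e.g. 3 sensors x 3 axes per window
y = rng.integers(0, 4, size=1000)     # e.g. forward / back / left / right

X_clean, y_clean = remove_boxplot_outliers(X, y)
X_train, X_test, y_train, y_test = train_test_split(
    X_clean, y_clean, test_size=0.3, random_state=0)

# A small feed-forward ANN; the paper does not specify its architecture here.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("train accuracy:", clf.score(X_train, y_train))
print("test accuracy:", clf.score(X_test, y_test))
```

With real sensor windows in place of the synthetic data, the train/test accuracies reported by this script would correspond to the 84.2%/84.1% figures quoted in the abstract.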