Multi‐stage part‐aware graph convolutional network for skeleton‐based action recognition
DOI: 10.1049/ipr2.12469
Publication Date: 2022-03-09
ABSTRACT
Recently, graph convolutional networks have shown excellent results in skeleton-based action recognition. This paper presents a multi-stage part-aware graph convolutional network to address the problems of model over-complication, parameter redundancy, and the lack of long-range dependency information. The network has a multi-stream input and a two-stream output, which greatly reduces model complexity and improves accuracy without losing sequence information. The two branches share the same backbone, comprising six multi-order feature extraction blocks and three temporal attention calibration blocks, and the outputs of the two branches are fused. In the multi-order feature extraction block, a channel-spatial attention mechanism and a graph condensation module are proposed, which extract more discriminative features and identify the relationships between body parts. In the temporal attention calibration block, the temporal dependencies between frames in the skeleton sequence are modeled. Experimental results show that the proposed network outperforms many mainstream methods on the NTU and Kinetics datasets; for example, it achieves 92.4% accuracy on the cross-subject benchmark of the NTU-RGBD 60 dataset.
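The abstract does not give implementation details, but the following PyTorch sketch illustrates the general shape of one such block: a spatial graph convolution over a fixed skeleton adjacency combined with a channel-spatial attention mechanism. All names (ChannelSpatialAttention, GraphConvBlock), dimensions, and design choices below are illustrative assumptions in the style of common ST-GCN implementations, not the authors' code.

```python
# Illustrative sketch only: module names, dimensions, and the attention design
# are assumptions loosely following common ST-GCN-style implementations; the
# paper's actual multi-order feature extraction block is not specified here.
import torch
import torch.nn as nn


class ChannelSpatialAttention(nn.Module):
    """Hypothetical channel-spatial attention: reweights feature channels,
    then skeleton joints, for input of shape (N, C, T, V)."""

    def __init__(self, channels: int, num_joints: int, reduction: int = 4):
        super().__init__()
        # Channel attention: squeeze over time (T) and joints (V).
        self.channel_fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )
        # Spatial (joint) attention: squeeze over channels (C) and time (T).
        self.joint_fc = nn.Sequential(nn.Linear(num_joints, num_joints), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, t, v = x.shape
        ch = self.channel_fc(x.mean(dim=(2, 3)))      # (N, C) channel weights
        x = x * ch.view(n, c, 1, 1)
        jt = self.joint_fc(x.mean(dim=(1, 2)))        # (N, V) joint weights
        return x * jt.view(n, 1, 1, v)


class GraphConvBlock(nn.Module):
    """One spatial graph convolution over a fixed skeleton adjacency A,
    followed by the attention above; a stand-in for a feature extraction
    block of the kind the abstract describes."""

    def __init__(self, in_channels: int, out_channels: int, adjacency: torch.Tensor):
        super().__init__()
        self.register_buffer("A", adjacency)          # (V, V), assumed normalised
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        self.attn = ChannelSpatialAttention(out_channels, adjacency.size(0))
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, T, V); aggregate neighbour joints via A, then mix channels.
        x = torch.einsum("nctv,vw->nctw", x, self.A)
        return self.relu(self.attn(self.conv(x)))


if __name__ == "__main__":
    V = 25                                            # e.g. NTU-RGBD joint count
    A = torch.eye(V)                                  # placeholder adjacency
    block = GraphConvBlock(3, 64, A)
    out = block(torch.randn(8, 3, 50, V))             # (batch, C, frames, joints)
    print(out.shape)                                  # torch.Size([8, 64, 50, 25])
```

In a full two-branch model of the kind described, six such blocks would form each backbone, with temporal attention calibration blocks interleaved and the two branch outputs fused before classification.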