Hengnu Chen

ORCID: 0000-0001-7859-0109
Research Areas
  • Tensor Decomposition and Applications
  • Parallel Computing and Optimization Techniques
  • Power System Optimization and Stability
  • Advanced Neural Network Applications
  • Solar Radiation and Photovoltaics
  • Advanced Neuroimaging Techniques and Applications
  • Sparse and Compressive Sensing Techniques

Tsinghua University
2021

Recurrent neural networks (RNNs) are powerful in tasks oriented to sequential data, such as natural language processing and video recognition. However, because modern RNNs have complex topologies and expensive space/computation complexity, compressing them has become a hot and promising topic in recent years. Among the many compression methods, tensor decomposition, e.g., tensor train (TT), block term (BT), tensor ring (TR), and hierarchical Tucker (HT), appears to be the most promising approach, since a very high compression ratio might be obtained...

10.1109/tnnls.2021.3105961 article EN IEEE Transactions on Neural Networks and Learning Systems 2021-09-17
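
As a rough illustration of the tensor-train (TT) format mentioned in the abstract above (not the paper's implementation; the tt_svd helper, the folding shape, and the rank bound below are assumptions chosen only for demonstration), a weight matrix can be folded into a higher-order tensor and factorized into TT cores via sequential truncated SVDs:

    import numpy as np

    def tt_svd(tensor, max_rank):
        """Factor a dense tensor into tensor-train (TT) cores via sequential
        truncated SVDs -- an illustrative TT-SVD sketch, not the paper's code."""
        dims = tensor.shape
        cores, rank_prev = [], 1
        mat = tensor.reshape(rank_prev * dims[0], -1)
        for k in range(len(dims) - 1):
            U, S, Vt = np.linalg.svd(mat, full_matrices=False)
            rank = min(max_rank, S.size)
            cores.append(U[:, :rank].reshape(rank_prev, dims[k], rank))
            mat = (S[:rank, None] * Vt[:rank]).reshape(rank * dims[k + 1], -1)
            rank_prev = rank
        cores.append(mat.reshape(rank_prev, dims[-1], 1))
        return cores

    # Hypothetical example: fold a 256 x 1024 weight matrix into an 8th-order
    # tensor (shapes and rank chosen only for illustration) and decompose it.
    W = np.random.randn(256, 1024)
    cores = tt_svd(W.reshape(4, 4, 4, 4, 4, 4, 8, 8), max_rank=8)
    print("dense parameters:", W.size)
    print("TT parameters:   ", sum(c.size for c in cores))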

In recent years, tensor computation has become a promising tool for solving problems in big data analysis, machine learning, medical imaging, and EDA. To ease the memory intensity of processing, decomposition techniques, especially tensor-train decomposition (TTD), are widely adopted to compress extremely high-dimensional data. Despite TTD's potential to break the curse of dimensionality, researchers have not yet leveraged its full computational potential, mainly for two reasons: 1) executing TTD itself is time-...

10.1109/tcad.2021.3058317 article EN IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 2021-02-10
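
To make the "curse of dimensionality" point above concrete, the sketch below (an illustration only; the 10-way shape, mode size, and TT-rank are assumed, not taken from the paper) contracts random TT cores into a full tensor and compares the number of stored parameters in each representation:

    import numpy as np

    # Assumed sizes for illustration: a 10-way tensor with mode size 4,
    # stored either densely (4**10 entries) or as TT cores with TT-rank 3.
    d, n, r = 10, 4, 3
    ranks = [1] + [r] * (d - 1) + [1]
    cores = [np.random.randn(ranks[k], n, ranks[k + 1]) for k in range(d)]

    # Contract the cores G_1, ..., G_d back into the full tensor they represent.
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    full = full.reshape(full.shape[1:-1])   # drop the size-1 boundary ranks

    dense_params = full.size                      # n**d
    tt_params = sum(c.size for c in cores)        # sum of r_{k-1} * n * r_k
    print(f"dense: {dense_params}  TT: {tt_params}  "
          f"compression: {dense_params / tt_params:.0f}x")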