Pyramid Attention Networks for Image Restoration

Subjects: Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV); Machine Learning (cs.LG); Machine Learning (stat.ML)
Keywords: Image restoration; Image denoising; Demosaicing; Compression artifact reduction; Super-resolution
DOI: 10.3929/ethz-b-000627374 Publication Date: 2023-08-08
ABSTRACT
Self-similarity is an image prior widely used in image restoration algorithms: small but similar patterns tend to recur at different locations and scales. However, recent deep convolutional neural network-based methods for image restoration do not take full advantage of self-similarity, because they rely on self-attention modules that process information only at a single scale. To address this problem, we present a novel Pyramid Attention module for image restoration, which captures long-range feature correspondences from a multi-scale feature pyramid. Motivated by the observation that corruptions, such as noise or compression artifacts, drop drastically at coarser image scales, our attention module is designed to borrow clean signals from "clean" correspondences at coarser levels. The proposed pyramid attention module is a generic building block that can be flexibly integrated into various neural architectures. Its effectiveness is validated through extensive experiments on multiple image restoration tasks: image denoising, demosaicing, compression artifact reduction, and super-resolution. Without any bells and whistles, our PANet (pyramid attention module with simple network backbones) produces state-of-the-art results with superior accuracy and visual quality. Our code is available at https://github.com/SHI-Labs/Pyramid-Attention-Networks
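To make the idea concrete, the following is a minimal NumPy sketch of the core mechanism the abstract describes: full-resolution pixels act as queries, while the keys/values come from every level of a downsampled feature pyramid, so each output location can attend to (and "borrow" from) coarser, less-corrupted correspondences. This is an illustrative toy, not the authors' implementation; the function names, pooling scheme, and cosine-similarity affinity are assumptions chosen for brevity.

```python
import numpy as np

def avg_pool2(x):
    """Downsample a (C, H, W) feature map by 2x average pooling."""
    C, H, W = x.shape
    return x[:, : H // 2 * 2, : W // 2 * 2].reshape(
        C, H // 2, 2, W // 2, 2
    ).mean(axis=(2, 4))

def pyramid_attention(x, levels=3, temperature=1.0):
    """Toy single-head pyramid attention over a (C, H, W) feature map.

    Queries: pixels of the full-resolution map.
    Keys/values: pixels from all levels of an average-pooled pyramid,
    so each output pixel can draw on coarser-scale correspondences.
    """
    C, H, W = x.shape
    # Build the multi-scale key/value bank.
    pyr = [x]
    for _ in range(levels - 1):
        pyr.append(avg_pool2(pyr[-1]))
    kv = np.concatenate([p.reshape(C, -1) for p in pyr], axis=1)  # (C, N)
    q = x.reshape(C, -1)                                          # (C, HW)
    # Cosine-similarity affinity between queries and all pyramid pixels.
    qn = q / (np.linalg.norm(q, axis=0, keepdims=True) + 1e-8)
    kn = kv / (np.linalg.norm(kv, axis=0, keepdims=True) + 1e-8)
    logits = qn.T @ kn / temperature                              # (HW, N)
    logits -= logits.max(axis=1, keepdims=True)                   # stable softmax
    attn = np.exp(logits)
    attn /= attn.sum(axis=1, keepdims=True)
    out = (kv @ attn.T).reshape(C, H, W)  # attention-weighted sum of values
    return x + out                        # residual connection
```

In the paper's setting this module would operate on learned CNN features and be trained end to end; the sketch only shows why attending across pyramid levels lets a noisy query pixel aggregate signal from its smoother coarse-scale matches.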