MuLA-GAN: Multi-Level Attention GAN for Enhanced Underwater Visibility

Keywords: Discriminative model, Robustness, Visibility
DOI: 10.1016/j.ecoinf.2024.102631
Publication Date: 2024-05-11
ABSTRACT
The underwater environment presents unique challenges, including color distortions, reduced contrast, and blurriness, that hinder accurate analysis. This work introduces MuLA-GAN, a novel approach that leverages Generative Adversarial Networks (GANs) together with a specifically adapted Multi-Level Attention mechanism for comprehensive underwater image enhancement. MuLA-GAN integrates spatial attention within the GAN architecture to prioritize the learning of discriminative features crucial for precise restoration. These features encompass local detail information within image regions, leveraged by spatial attention and captured at various scales across the entire image through multi-level attention. This allows the model to identify and enhance objects, textures, and edges obscured by distortions, while reconstructing a visually clearer representation of the scene by analyzing low-level details as well as high-level object shapes and global information. By selectively focusing on these features, MuLA-GAN excels at capturing and preserving intricate details in underwater imagery, which is essential for marine research, exploration, and resource-management applications. Extensive evaluations on diverse datasets (UIEB test, UIEB challenge, U45, and UCCS) demonstrate MuLA-GAN's superior performance compared to existing methods. Additionally, a specialized bio-fouling aquaculture dataset confirms the model's robustness in challenging environments. On the UIEB test dataset, MuLA-GAN achieves exceptional Peak Signal-to-Noise Ratio (PSNR) of 25.59 and Structural Similarity Index (SSIM) of 0.893, surpassing Water-Net (24.36 PSNR, 0.885 SSIM). This work addresses a significant research gap in underwater image enhancement by demonstrating the effectiveness of combining GANs with attention mechanisms, and it offers a tailored framework for restoring underwater image quality. The source code is publicly available on GitHub at https://github.com/AhsanBaidar/MuLA_GAN.git.
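ILLUSTRATIVE CODE SKETCHES
To make the multi-level spatial attention idea described in the abstract more concrete, the following PyTorch sketch applies a spatial attention block at several encoder scales. This is a hedged illustration only: the class names SpatialAttention and MultiLevelAttentionEncoder, the layer widths, and the overall layout are assumptions, not the authors' implementation; the actual MuLA-GAN code is in the linked GitHub repository.

# Hypothetical sketch of multi-level spatial attention (not the authors' exact code).
import torch
import torch.nn as nn


class SpatialAttention(nn.Module):
    """Produce a per-pixel attention map from pooled channel statistics."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Average- and max-pool along the channel dimension, then fuse into one map.
        avg_map = x.mean(dim=1, keepdim=True)
        max_map = x.amax(dim=1, keepdim=True)
        attn = torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * attn  # re-weight features by spatial importance


class MultiLevelAttentionEncoder(nn.Module):
    """Toy encoder that applies spatial attention at several scales (multi-level)."""

    def __init__(self, in_ch: int = 3, widths=(32, 64, 128)):
        super().__init__()
        self.stages = nn.ModuleList()
        prev = in_ch
        for w in widths:
            self.stages.append(nn.Sequential(
                nn.Conv2d(prev, w, 3, stride=2, padding=1),
                nn.LeakyReLU(0.2, inplace=True),
                SpatialAttention(),
            ))
            prev = w

    def forward(self, x: torch.Tensor):
        feats = []
        for stage in self.stages:
            x = stage(x)      # each stage halves the spatial resolution
            feats.append(x)   # attention-refined features at this level
        return feats


if __name__ == "__main__":
    enc = MultiLevelAttentionEncoder()
    levels = enc(torch.randn(1, 3, 256, 256))
    print([f.shape for f in levels])

In a full GAN pipeline, features like these would feed a decoder that produces the enhanced image, with a discriminator judging realism; those parts are omitted here for brevity.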
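The abstract also reports PSNR and SSIM scores on the UIEB test set. The snippet below is a minimal sketch of how these metrics are commonly computed, assuming 8-bit RGB images as NumPy arrays and scikit-image for SSIM; it is not tied to the authors' evaluation script, and the sample data is a placeholder.

# Minimal PSNR/SSIM sketch; in practice, load a ground-truth / enhanced image pair.
import numpy as np
from skimage.metrics import structural_similarity


def psnr(reference: np.ndarray, enhanced: np.ndarray, data_range: float = 255.0) -> float:
    """Peak Signal-to-Noise Ratio in dB between a reference and an enhanced image."""
    mse = np.mean((reference.astype(np.float64) - enhanced.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(data_range ** 2 / mse)


def ssim(reference: np.ndarray, enhanced: np.ndarray) -> float:
    """Structural Similarity Index for color (channel-last) 8-bit images."""
    return structural_similarity(reference, enhanced, channel_axis=-1, data_range=255)


if __name__ == "__main__":
    # Placeholder data standing in for a UIEB reference / enhanced pair.
    ref = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)
    out = np.clip(ref + np.random.randint(-10, 10, ref.shape), 0, 255).astype(np.uint8)
    print(f"PSNR: {psnr(ref, out):.2f} dB, SSIM: {ssim(ref, out):.3f}")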