Image super-resolution using generative adversarial networks with EfficientNetV2.


Full Description

Bibliographic Details
Main Authors: AlTakrouri, Saleh; Mohd. Noor, Norliza; Ahmad, Norulhusna; Justinia, Taghreed; Usman, Sahnius
Format: Article
Language: English
Published: Science and Information Organization, 2023
Online Access: http://eprints.utm.my/105362/1/SalehAltakrouri2023_ImageSuperResolutionUsingGenerativeAdversarial.pdf
http://eprints.utm.my/105362/
http://dx.doi.org/10.14569/IJACSA.2023.01402100
Item Description
Summary: Image super-resolution transforms a low-resolution image into a higher-resolution one, recovering detailed information that helps identify targets. Super-resolution has potential applications in various domains, such as medical image processing, crime investigation, remote sensing, and other image-processing applications. The goal of super-resolution is to obtain an image with minimal mean square error and improved perceptual quality. This study therefore introduces a perceptual-loss-minimization technique based on efficient learning criteria. The proposed image reconstruction technique uses an image super-resolution generative adversarial network (ISRGAN), in which the discriminator is trained with EfficientNet-v2 to obtain better image quality. The proposed ISRGAN with EfficientNet-v2 achieved minimal losses of 0.02, 0.1, and 0.015 at the generator, the discriminator, and self-supervised learning, respectively, with a batch size of 32. The minimal mean square error and mean absolute error are 0.001025 and 0.00225, and the maximal peak signal-to-noise ratio and structural similarity index measure are 45.56985 and 0.9997, respectively.
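The summary quotes MSE, MAE, and PSNR figures for the reconstructed images. For readers unfamiliar with these metrics, a minimal NumPy sketch of how they are typically computed for images scaled to [0, 1] (hypothetical helper functions, not the authors' evaluation code):

```python
import numpy as np

def mse(ref, out):
    """Mean square error between two images in [0, 1]."""
    return float(np.mean((ref - out) ** 2))

def mae(ref, out):
    """Mean absolute error between two images in [0, 1]."""
    return float(np.mean(np.abs(ref - out)))

def psnr(ref, out, max_val=1.0):
    """Peak signal-to-noise ratio in dB: 10 * log10(MAX^2 / MSE)."""
    err = mse(ref, out)
    return float("inf") if err == 0 else 10.0 * np.log10(max_val ** 2 / err)

# Example: a reconstruction that is uniformly off by 0.1
reference = np.zeros((8, 8))
reconstruction = np.full((8, 8), 0.1)
print(mse(reference, reconstruction))   # ≈ 0.01
print(mae(reference, reconstruction))   # ≈ 0.1
print(psnr(reference, reconstruction))  # ≈ 20.0 dB
```

Lower MSE/MAE and higher PSNR indicate a closer match to the reference; the structural similarity index (SSIM) also reported in the summary additionally accounts for local luminance, contrast, and structure rather than raw pixel differences.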