
A Fast-Dehazing Technique using Generative Adversarial Network model for Illumination Adjustment in Hazy Videos

NOPR - NISCAIR Online Periodicals Repository

 
 
 
Title A Fast-Dehazing Technique using Generative Adversarial Network model for Illumination Adjustment in Hazy Videos
 
Creator Naidu, T M Praneeth
Sekhar, P Chandra
 
Subject Depth estimation
Discriminator model
Generative adversarial networks
Generator model
ResNet
 
Description 328-337
Haze significantly lowers the quality of captured photos and videos. Besides affecting the dependability of monitoring equipment, this can be potentially dangerous. Problems caused by hazy conditions have increased in recent years, necessitating the development of real-time dehazing techniques. Intelligent vision systems, such as surveillance and monitoring systems, depend fundamentally on the characteristics of the input images, which have a significant impact on object-detection accuracy. This paper presents a fast video dehazing technique using a Generative Adversarial Network (GAN) model. The haze in the input video is estimated from the scene depth extracted using a pre-trained monocular-depth ResNet model. Based on the amount of haze, an appropriate model, trained for that specific haze condition, is selected. The novelty of the proposed work is that the generator model is kept simple to obtain faster results in real time, while the discriminator is kept complex to make the generator more effective. The traditional loss function is replaced with a Visual Geometry Group (VGG) feature loss for better dehazing. The proposed model produces better results than existing models: the Peak Signal-to-Noise Ratio (PSNR) obtained for most frames is above 32 dB, and the execution time is under 60 milliseconds, which makes the proposed model well suited for video dehazing.
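The depth-based haze estimation and model selection described in the abstract can be sketched using the standard atmospheric scattering model, where transmission is t(x) = exp(-beta * d(x)) for scene depth d. The threshold values, the `beta` default, and the model names below are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def estimate_haze_level(depth, beta=1.0):
    """Estimate haze density from a depth map via the standard
    atmospheric scattering model t(x) = exp(-beta * d(x)).
    Lower mean transmission means denser haze."""
    t = np.exp(-beta * np.asarray(depth, dtype=float))
    return 1.0 - float(t.mean())  # 0 = clear scene, -> 1 = opaque haze

def select_model(haze_level, thresholds=(0.3, 0.6)):
    """Pick a generator trained for the matching haze band.
    Thresholds and model names are hypothetical placeholders."""
    light, dense = thresholds
    if haze_level < light:
        return "light_haze_generator"
    if haze_level < dense:
        return "medium_haze_generator"
    return "dense_haze_generator"
```

A zero-depth (haze-free) scene yields a haze level of 0 and routes to the lightest model, while a distant scene saturates toward 1 and routes to the dense-haze model.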
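The VGG feature loss mentioned in the abstract is a perceptual loss: instead of comparing raw pixels, it compares feature maps of the dehazed and ground-truth frames. A minimal sketch follows; the `toy_extractor` (simple horizontal gradients) is only a stand-in for a pre-trained VGG network, chosen so the example stays self-contained:

```python
import numpy as np

def feature_loss(pred, target, extractor):
    """Perceptual (feature) loss: mean squared distance between
    feature maps rather than raw pixels. In the paper the
    extractor is a pre-trained VGG network; here it is any
    callable mapping an image to a feature array."""
    fp, ft = extractor(pred), extractor(target)
    return float(np.mean((fp - ft) ** 2))

def toy_extractor(img):
    """Hypothetical stand-in feature extractor (NOT VGG):
    horizontal gradients, so the loss compares local structure
    instead of absolute pixel values."""
    img = np.asarray(img, dtype=float)
    return img[:, 1:] - img[:, :-1]
```

With this extractor, adding a constant brightness offset leaves the feature loss at zero even though the pixel-wise MSE is large, illustrating why a feature loss can favour structural fidelity over exact pixel agreement.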
 
Date 2023-03-07T12:03:06Z
2023-03-07T12:03:06Z
2023-03
 
Type Article
 
Identifier 0022-4456 (Print); 0975-1084 (Online)
http://nopr.niscpr.res.in/handle/123456789/61517
https://doi.org/10.56042/jsir.v82i03.71760
 
Language en
 
Publisher NIScPR-CSIR, India
 
Source JSIR Vol.82(03) [March 2023]