Correlating traditional image quality metrics and DNN-based object
detection: a case study with compressed camera data
Abstract
Assisted and automated driving (AAD) systems rely heavily on data collected from perception sensors, such as cameras. Prior research has explored the quality of camera data via traditional, well-established image quality assessment (IQA) metrics (e.g. PSNR, SSIM, BRISQUE) or has considered how noisy or degraded data affects perception algorithms (e.g. deep neural network (DNN)-based object detection), but no existing work addresses the holistic relationship between IQA and DNN performance. This work proposes that traditional IQA
metrics, designed to evaluate digital image quality according to human
visual perception, can help predict the level of sensor data degradation that perception algorithms can tolerate before their performance deteriorates. Consequently, a correlation analysis was conducted between 17
selected IQA metrics (full-reference and no-reference) and DNN average
precision. The evaluated data was increasingly compressed to generate
degradation and artefacts. Notably, the experimental results show that
several IQA metrics had a strong positive correlation (exceeding
correlation scores of 0.7) with average precision, with IW-SSIM and DSS
having very high correlation (> 0.9). Interestingly, retraining BRISQUE on compressed data yields an exceptionally high positive correlation (> 0.97), making it well suited to predicting the performance of DNN object detectors. By
effectively relating traditional image quality metrics to DNN performance, this research offers practical tools to understand and predict perception degradation from data quality, supporting the development of automated driving systems.
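The kind of correlation analysis described above can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: it uses PSNR as a stand-in full-reference IQA metric, Gaussian noise as a stand-in for compression degradation, and entirely hypothetical average-precision (AP) values for the detector.

```python
import numpy as np

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio between a reference and a degraded image."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_val ** 2 / mse)

def pearson(x, y):
    """Pearson correlation coefficient between two score vectors."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm @ ym) / np.sqrt((xm @ xm) * (ym @ ym)))

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64)).astype(np.float64)

# Increasing degradation levels (a stand-in for heavier compression).
noise_levels = [1, 5, 10, 20, 40]
psnr_scores = []
for sigma in noise_levels:
    degraded = np.clip(ref + rng.normal(0.0, sigma, ref.shape), 0, 255)
    psnr_scores.append(psnr(ref, degraded))

# Hypothetical detector AP values that fall as degradation grows.
ap_scores = [0.62, 0.58, 0.51, 0.40, 0.22]

# A strong positive r means the IQA metric tracks detector performance.
r = pearson(psnr_scores, ap_scores)
```

In the paper's setting, `psnr_scores` would be replaced by each of the 17 IQA metrics computed on increasingly compressed camera frames, and `ap_scores` by the measured AP of the object detector on the same frames.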