Image Quality Assessment

Measuring digital picture quality, as perceived by human observers, is increasingly important in the many applications in which humans are the ultimate consumers of visual information. Standard dynamic range (SDR) images are ubiquitous and provide 8 bits per color channel per pixel. High dynamic range (HDR) images, which can be captured by smartphones and digital cameras, provide an enhanced range of luminance and chrominance values by using 16 or 32 bits per color channel per pixel.

We have evaluated full-reference, reduced-reference, and no-reference objective measures against the subjective scores for synthetic SDR and natural HDR image databases gathered in controlled experiments in our imaging lab. For the HDR image database, we also conducted a large-scale subjective study on a crowdsourced platform, gathering more than 300,000 opinion scores on more than 1,800 images from more than 5,000 unique observers, and compared those results against objective visual quality measures.

For the mean-subtracted contrast-normalized (MSCN), standard deviation, and gradient images of synthetic SDR and natural HDR images, we show that the amplitude statistics are well characterized by generalized Gaussian distributions, and that visual distortions show up as deviations from these scene statistics. Among no-reference measures, those based on scene statistics have the highest correlations with human visual quality scores for synthetic SDR and natural HDR images, just as they do in many studies involving natural SDR images.
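The scene-statistics idea above can be illustrated with a minimal sketch: compute MSCN coefficients by local mean subtraction and contrast normalization, then estimate the shape parameter of a generalized Gaussian fit by moment matching. The function names and the sigma of the local Gaussian window are assumptions for illustration, not the dissertation's exact implementation.

```python
import numpy as np
from scipy import ndimage, special

def mscn_coefficients(image, sigma=7/6):
    """Mean-subtracted contrast-normalized (MSCN) coefficients of a grayscale image."""
    mu = ndimage.gaussian_filter(image, sigma)          # local mean
    var = ndimage.gaussian_filter(image * image, sigma) - mu * mu
    std = np.sqrt(np.abs(var))                          # local contrast
    return (image - mu) / (std + 1.0)                   # +1 stabilizes flat regions

def fit_ggd_shape(coeffs, shapes=np.arange(0.2, 5.0, 0.01)):
    """Estimate the generalized Gaussian shape parameter a by moment matching:
    for a zero-mean GGD, E[x^2]/(E|x|)^2 = Gamma(1/a)Gamma(3/a)/Gamma(2/a)^2."""
    coeffs = coeffs.ravel()
    rho = np.mean(coeffs**2) / (np.mean(np.abs(coeffs))**2 + 1e-12)
    table = special.gamma(1/shapes) * special.gamma(3/shapes) / special.gamma(2/shapes)**2
    return shapes[np.argmin(np.abs(table - rho))]

# Pristine natural scenes have MSCN statistics close to Gaussian (shape near 2);
# distortions push the fitted shape away from that value.
rng = np.random.default_rng(0)
img = ndimage.gaussian_filter(rng.normal(size=(128, 128)), 2)  # toy "scene"
alpha = fit_ggd_shape(mscn_coefficients(img))
```

Distortion-aware no-reference measures in the BRISQUE family use such fitted GGD parameters as quality-predictive features.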

The description above comes from the spring 2016 PhD dissertation by Dr. Debarati Kundu at The University of Texas at Austin.

Digital Video Camera Introduction

Handheld video cameras, especially those in cell phones, have become increasingly popular because of their portability and price. With handheld cameras, consumers can conveniently shoot and share videos anytime and anywhere. However, the quality of videos shot with handheld cameras is severely affected by unintentional camera motion, such as the up-and-down motion caused by walking or the jitter caused by hand shake. In addition, the rolling shutter effect often appears when there is fast camera motion, whether intentional or not.

The rolling shutter effect is a common kind of distortion in CMOS image sensors, which have come to dominate the cellphone camera sensor market over CCD sensors due to lower power consumption and higher data throughput. In a CMOS sensor camera, the rows of a frame are read and reset sequentially from top to bottom, as shown in the figure below. When there is fast relative motion between the scene and the video camera, a frame can be distorted because each row was captured under a different 3D-to-2D projection.
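The row-by-row readout can be simulated with a minimal sketch: under constant horizontal motion, each successive row sees the scene shifted a little further, which turns vertical edges into slanted ones. The function name and the constant-velocity assumption are illustrative simplifications.

```python
import numpy as np

def simulate_rolling_shutter(frame, px_per_row):
    """Simulate rolling-shutter skew: each row is read out later than the one
    above it, so under constant horizontal motion of px_per_row pixels per
    row-readout interval, row y sees the scene shifted by px_per_row * y."""
    out = np.empty_like(frame)
    for y in range(frame.shape[0]):
        shift = int(round(px_per_row * y))
        out[y] = np.roll(frame[y], shift)  # horizontal shift of this row (wraps)
    return out

# A vertical bar becomes a slanted bar: the hallmark skew distortion.
frame = np.zeros((8, 8), dtype=int)
frame[:, 3] = 1                            # vertical bar at column 3
skewed = simulate_rolling_shutter(frame, 1)
```

A global-shutter camera exposes all rows under one projection, so the same motion would blur the bar rather than slant it.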

The rolling shutter effect usually appears as skew, smear, and wobble distortions. The figure below shows an example of the skew distortion caused by rolling shutter (left) and the rectified frame (right).

Our research aims to rectify the rolling shutter effect and remove unwanted jitter from videos (the latter is also known as "video stabilization"). Both rolling shutter rectification and video stabilization consist of three major steps:

  1. camera motion estimation,
  2. camera motion regeneration and
  3. frame synthesis.
These steps are shown in the figure below.
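The three steps above can be sketched for a simplified case: horizontal-only motion, with the per-row shifts of step 1 assumed given (e.g. from feature tracking). The function name, the Gaussian smoothing of the camera path, and the translation-only row warp are all illustrative assumptions, not the dissertation's exact method.

```python
import numpy as np
from scipy import ndimage

def stabilize(frames, row_shifts, smooth_sigma=5):
    """row_shifts[f][y]: estimated horizontal shift of row y in frame f
    (step 1: camera motion estimation, assumed given here)."""
    row_shifts = np.asarray(row_shifts, dtype=float)
    # Step 2: camera motion regeneration — one pose per frame (rectifies
    # intra-frame rolling shutter) plus temporal smoothing across frames.
    frame_path = row_shifts.mean(axis=1)
    smooth_path = ndimage.gaussian_filter1d(frame_path, smooth_sigma)
    out = []
    for f, frame in enumerate(frames):
        warped = np.empty_like(frame)
        for y in range(frame.shape[0]):
            # Step 3: frame synthesis — warp each row by the difference
            # between its original and regenerated motion.
            delta = row_shifts[f, y] - smooth_path[f]
            warped[y] = np.roll(frame[y], -int(round(delta)))
        out.append(warped)
    return out
```

A full system would use 2D or 3D camera poses and interpolated warps instead of integer row shifts; this sketch only mirrors the three-step structure.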

In the first step, only one camera pose per frame is needed if the video is captured by a global-shutter camera. For rolling-shutter cameras, however, camera motion must be estimated for each row.

In the second step, rolling shutter rectification only needs to fix a single camera motion for all of the rows in each frame, while video stabilization also needs to smooth the sequence of camera motions across frames. In fact, rolling shutter rectification can be understood as intraframe video stabilization.
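The contrast between the two tasks in this step can be shown concretely on hypothetical per-row horizontal shifts: rectification collapses each frame's rows to one motion, while stabilization additionally smooths that motion across frames.

```python
import numpy as np
from scipy import ndimage

# Hypothetical per-row horizontal camera shifts for 4 frames of 3 rows each.
row_motion = np.array([[0., 1., 2.],
                       [5., 6., 7.],
                       [1., 2., 3.],
                       [6., 7., 8.]])

# Rolling-shutter rectification: one motion per frame (here, the middle row's).
rectified = row_motion[:, 1]   # → [1., 6., 2., 7.]

# Video stabilization: additionally smooth that sequence across frames.
stabilized = ndimage.gaussian_filter1d(rectified, sigma=1)
```

After rectification the per-frame path still jitters (1, 6, 2, 7); smoothing removes that interframe jitter, which is exactly the extra work stabilization does.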

In the last step, new frames are synthesized based on the difference between the original and the regenerated camera motion.

The description above comes from the spring 2014 PhD dissertation by Dr. Chao Jia at The University of Texas at Austin.


Mail comments about this page to bevans@ece.utexas.edu.