
NVIDIA Worsens the Quality of an SDR Monitor When Compared to an HDR Monitor


In an attempt to demonstrate how important HDR (High Dynamic Range) support is to the image quality of future monitors and televisions, NVIDIA set up its own booth at Computex 2017 to present the difference between SDR (Standard Dynamic Range) and HDR images.

Computex 2017: NVIDIA Modifies the Factory Settings of an SDR Monitor to Highlight the Image Quality of HDR Monitors

However, it seems the company went further in its attempt to highlight how impressive HDR image quality is: to this end, it modified the SDR monitor's image settings to make it look even worse.

This was discovered by the YouTube channel Hardware Canucks, which says it had access to the settings of the monitor NVIDIA used to run the demo. The company reportedly changed the standard monitor's factory defaults, reducing its brightness, contrast, and even gamma values, which degraded image quality considerably.
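Hardware Canucks' report only says that brightness, contrast, and gamma were lowered from the factory defaults; the exact values used at the booth are not public. As a rough illustration only, the following Python sketch (using the Pillow library; the adjustment factors and the input file name are hypothetical) shows how reducing those three settings combines to darken and flatten an image.

```python
# Rough illustration only: the adjustment factors below are hypothetical,
# not the values NVIDIA actually used. Requires Pillow (pip install pillow).
from PIL import Image, ImageEnhance

def degrade_sdr(img, brightness=0.7, contrast=0.7, gamma=0.8):
    """Simulate a detuned SDR panel by lowering brightness, contrast, and gamma."""
    out = ImageEnhance.Brightness(img).enhance(brightness)  # factor < 1.0 darkens
    out = ImageEnhance.Contrast(out).enhance(contrast)      # factor < 1.0 flattens contrast
    # Gamma via an 8-bit lookup table: v_out = 255 * (v_in / 255) ** (1 / gamma);
    # gamma < 1.0 here crushes the midtones.
    lut = [round(255 * (v / 255) ** (1 / gamma)) for v in range(256)]
    return out.point(lut * len(out.getbands()))

if __name__ == "__main__":
    frame = Image.open("demo_frame.png").convert("RGB")  # hypothetical input file
    degrade_sdr(frame).save("demo_frame_degraded.png")
```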

Resetting the SDR monitor to its factory settings resulted in less blurring and a clearer image, narrowing the gap with the HDR monitor, so NVIDIA appears to have deliberately degraded the SDR side of its presentation.

That said, it must be acknowledged that the perceived difference between SDR and HDR varies from viewer to viewer, since not everyone notices the same increase in image quality. Even so, modifying and worsening the default brightness, contrast, and gamma settings is a clear sign that NVIDIA tried too hard to showcase the power of HDR.

Source: Techpowerup
