HDR Tiers: A Guide
Let’s bring the screen to life.
High Dynamic Range technology, better known by its acronym HDR, increases the contrast a screen can display. The brighter highlights and deeper shadows of HDR produce more realistic color. Because HDR matters so much to monitor image quality, we have created this guide to walk you through the questions you might have about it. Topics include the different HDR levels, which HDR format is best, whether HDR works with PC gaming, what each HDR tier signifies, and how HDR affects image quality.
So, without further ado, let’s dive in!
What are the different levels of HDR?
Although it may not be immediately apparent, there are three main HDR formats: HDR10, HDR10+, and Dolby Vision. Other types of HDR exist, but they are not commonly used. So, you might be asking: what is the difference between these formats? Before deciding which format to choose, it is essential to check certain criteria. The basic measurements for HDR quality are:
- Bit depth: the number of colors a movie/show/game includes
- Brightness: the maximum brightness of the screen
- Metadata: the amount of information sent to the screen
In addition, the factors by which one can define HDR are:
- Color Depth
- Tone mapping
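To make bit depth concrete, the number of distinct colors a panel can show grows exponentially with bits per channel, since each of the red, green, and blue channels gets its own set of levels. A quick calculation:

```python
# Number of distinct colors = (2^bits_per_channel)^3, because each of the
# red, green, and blue channels offers 2^bits_per_channel shades.
def color_count(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel  # shades per channel
    return levels ** 3              # all R/G/B combinations

print(color_count(8))   # 8-bit (SDR baseline): 16,777,216 colors
print(color_count(10))  # 10-bit (common for HDR): 1,073,741,824 colors
```

This is why 10-bit panels matter for HDR: the extra shades per channel reduce visible banding in smooth gradients like skies.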
The most basic HDR format is HDR10, while HDR10+ and Dolby Vision are more advanced versions. To compare the three, refer to the bullet-pointed lists below and find the HDR format best suited to your needs:
HDR10
- Bit Depth: Good
- Peak Brightness Minimum: Good
- Metadata: Static
HDR10+
- Bit Depth: Great
- Peak Brightness Minimum: Good
- Metadata: Dynamic
Dolby Vision
- Bit Depth: Great
- Peak Brightness Minimum: Great
- Metadata: Best
It is important to note that virtually all HDR content is available in HDR10 as a baseline, with HDR10+ growing rapidly on Blu-ray releases and large streaming services such as Amazon Prime Video. Apart from availability, how do you know which HDR option is best? Let's explore that in the next section.
Which HDR mode is best?
When comparing the three main HDR formats (HDR10, HDR10+, Dolby Vision), there is much to decipher before deciding which mode is the best. Although we mentioned above that HDR10 is the most accessible format, it does not offer the same level of quality as its two other counterparts.
HDR10+, on the other hand, offers similar benefits to Dolby Vision, but it has been slow to be adopted by TV makers and therefore has not displaced HDR10 as the default.
Therefore, the highest quality of HDR comes in the form of Dolby Vision, thanks to its superior brightness, color, and metadata marks.
Is HDR better than OLED?
OLED is a display technology that has grown popular in recent years, prized for the incredible contrast it can depict on screen. However, comparing HDR to OLED can get confusing because both affect on-screen contrast. So, when OLED technology hit the market, many started to ask which one was better.
To answer this question, it is important to understand what HDR and OLED each do. HDR describes content with a large variation in light levels, while OLED is a panel technology built from individually controlled pixels that emit their own light. Therefore, it's better to view OLED and HDR as working together for the best picture experience rather than comparing them against each other.
So, the answer to this question is divided: OLED offers a better contrast ratio than LED, and, for that reason, it works better with HDR in dark settings. On the other hand, LED works better with HDR in bright settings due to its extreme brightness. Whether you choose OLED or LED, HDR will help increase the color variation of your screen in any setting.
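The contrast advantage of OLED comes down to simple arithmetic: contrast ratio is peak white luminance divided by black luminance, and OLED pixels can switch off entirely. A sketch using hypothetical nit values (real panel figures vary by model):

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    # Contrast ratio = brightest white / darkest black.
    # OLED pixels turn fully off, so black approaches 0 nits and the
    # ratio approaches infinity.
    if black_nits == 0:
        return float("inf")
    return peak_nits / black_nits

print(contrast_ratio(600, 0.3))  # hypothetical LED panel: 2000.0 (i.e. 2000:1)
print(contrast_ratio(600, 0.0))  # OLED with pixels fully off: inf
```

This is why OLED shines with HDR in dark rooms: no backlight glow dilutes the shadows.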
What are the DisplayHDR Tiers?
Aside from the main three HDR levels, DisplayHDR tiers are put in place in order to set an open standard for HDR quality. Each tier has specific criteria for measuring quality, such as the following:
- Color Gamut
- Bit Depth
- Rise Time
VESA, the standards body behind the DisplayHDR certification, created new tiers under the title "True Black" starting in 2019 to accommodate the OLED technology entering the market. Let's dive into each HDR tier and its criteria.
DisplayHDR 400
This tier is the minimum rating. A monitor within this tier must have:
- at least 400 nits of peak brightness
- 95 percent coverage of sRGB color space
- 8-bit color depth
- Global Dimming
DisplayHDR 500
With a slight increase in quality, the requirements for DisplayHDR 500 are:
- Local dimming
- 10-bit color depth
- Over 90 percent coverage of DCI-P3 color space
DisplayHDR 600
This tier has almost the same requirements as DisplayHDR 500; the only difference is brightness. Monitors rated DisplayHDR 600 reach a significantly higher brightness level than the lower tiers, and especially than SDR.
DisplayHDR 1000
In contrast to the 500 and 600 tiers, there is a significant jump in quality for monitors that qualify as DisplayHDR 1000. The main factors in this jump are black levels and peak brightness. The lower black levels and higher brightness make any monitor with a DisplayHDR 1000 rating excellent for consuming HDR content.
DisplayHDR 1400
This is the highest HDR certification. The requirements for a DisplayHDR 1400 monitor are:
- Wide color gamut, with 99 percent coverage of BT.709 and 95 percent coverage of DCI-P3
- Deeper contrast than DisplayHDR 1000
- Very high peak brightness
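The tiers above can be summarized by their headline peak-brightness requirements, since each tier's name doubles as its nit floor. A deliberately simplified sketch (real certification also tests color gamut, bit depth, rise time, and dimming behavior):

```python
# Simplified: match a monitor's peak brightness to the highest DisplayHDR
# tier whose nit requirement it meets. The tier name is also its nit floor.
# Real VESA certification checks many more criteria than brightness alone.
TIERS = [400, 500, 600, 1000, 1400]

def highest_tier(peak_nits: float) -> str:
    qualified = [t for t in TIERS if peak_nits >= t]
    return f"DisplayHDR {max(qualified)}" if qualified else "uncertified"

print(highest_tier(650))   # -> DisplayHDR 600
print(highest_tier(1450))  # -> DisplayHDR 1400
print(highest_tier(350))   # -> uncertified
```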
DisplayHDR True Black 400, 500, and 600
As noted above, True Black certifications are meant for self-emissive display technologies. On OLED screens there is little point in measuring local dimming, black levels, or contrast ratio, because pixels can switch off entirely, making black levels effectively perfect and contrast effectively infinite. What can still be measured is brightness, so the DisplayHDR True Black 400, 500, and 600 tiers are based on peak brightness levels.
Is HDR good for gaming?
In this section, we want to focus specifically on PC gaming. HDR has been a hot topic in the gaming world for many years now, but the short answer is: not quite yet.
There are several explanations for this; however, the two main reasons HDR has not paired well with PC gaming in the past are gaming software and monitor capability.
When it comes to gaming software, the problem stems from static metadata. Although some HDR formats offer dynamic metadata, namely HDR10+ and Dolby Vision, the monitors that support these features sit in the higher-to-premium price range, making them less common in the PC gaming world. Static metadata also creates a consistency problem for game developers: the same game looks different from screen to screen.
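The consistency problem with static metadata can be sketched numerically. With HDR10, one maximum-luminance value is set for an entire title, so dim scenes are squeezed through the same curve as the brightest ones; dynamic metadata re-derives the curve per scene. A toy illustration, where a naive linear scale stands in for a real tone-mapping curve:

```python
# Toy illustration: map scene luminance (nits) onto a 600-nit display.
# Static metadata scales every scene by one title-wide peak; dynamic
# metadata rescales per scene, so dim scenes keep more of their detail.
DISPLAY_PEAK = 600.0

def tone_map(scene_nits, reference_peak):
    # Naive linear stand-in for a real tone-mapping curve.
    return [min(DISPLAY_PEAK, n * DISPLAY_PEAK / reference_peak) for n in scene_nits]

dim_scene = [50.0, 100.0, 200.0]
title_peak = 4000.0                        # one value for the whole film (HDR10)

print(tone_map(dim_scene, title_peak))     # static: crushed to 7.5-30 nits
print(tone_map(dim_scene, max(dim_scene))) # dynamic: spread across 150-600 nits
```

The static curve wastes most of the display's range on highlights that never appear in the dim scene, which is exactly the inconsistency developers complain about.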
Moving on to monitor capabilities, this is the main reason HDR has been unfavorable in gaming: gaming software has progressed technologically, whereas HDR monitor hardware has lagged behind. However, new technology is now hitting the market that may close the divide between HDR and PC gaming.
This technology is QD-OLED, a quantum-dot variant of OLED. The hope in the gaming world is that the switch to OLED will drive better HDR performance, as mini-LED did before it. In addition, OLED monitors are becoming more affordable than the monitors offering the most advanced HDR. So, with the arrival of OLED in the gaming world, there are high hopes for HDR.
Does HDR affect image quality?
Yes, HDR does affect image quality, but for the better. It is important to note that HDR has nothing to do with resolution; the two are independent, which is why they are so often paired, as in 4K HDR. Resolution goes unaffected, while an HDR-compatible monitor will most likely give you a screen full of vibrant colors with excellent contrast. The only caveat is that HDR only benefits HDR content, so make sure the content you plan to stream is available in HDR before buying an HDR monitor.
Did we answer all of your questions? We hope so! Feel free to check back on our site for more content relating to monitor specs and news.