HDR Gaming And The PC: It’s Complicated

There was a time when all you had to worry about with an LCD display was whether you cared enough to pay extra for a monitor with an IPS panel. Well, that and its size. And resolution. And maybe its native colour depth. And brightness. And contrast. And pixel response. And inputs. OK, it was never that simple. But it’s certainly not getting any simpler: the last few years have added further unfathomables including frame syncing, higher refresh rates, new display interconnects and the 4K standard.

Now there’s more for you to worry about in the form of HDR. Or should that be UHD Premium? Or Rec. 2020? Or BT.2100? Maybe SMPTE 2084 or HDR10? Whatever, it’s mainly about colours, lots and lots of lovely colours. This is already a big thing in HDTVs. It’s coming to the PC. But what’s it all about and is there any chance of making sense of what is, currently, a bit of a mess?

HDR stands for high dynamic range and in this context, that ‘range’ refers to the range of colours a display can generate. The net result is massively more visual pop and vibrancy. It’s a bit like the HDR modes smartphone cameras have sprouted of late, which blend multiple exposures to capture more of a scene’s range. HDR isn’t about more pixels. It’s about making each individual pixel work harder and look better.

The confusing bit right from the get-go is that HDR capability isn’t simply synonymous with a display’s colour depth, nor with the number of colours you can calculate from that per-channel bit depth.

You’ll probably have heard of things like ‘native 8-bit’ displays. Maybe you know that the latest UHD Premium HDTVs support 10-bit colour or even 12-bit colour. Well, that’s not actually the same as HDR capability. For a flavour of the complexity here, try this excerpt on the new Rec. 2100 HDR standard (which is part of the UHD Premium HDTV standard, more on which in a mo’) from Wikipedia:

“Rec. 2100 defines the high dynamic range (HDR) formats. The HDR formats are Hybrid Log-Gamma (HLG) which was standardized as ARIB STD-B67 and the Perceptual Quantizer (PQ) which was standardized as SMPTE ST 2084. HDR10 uses PQ, a bit depth of 10-bits, and the Rec. 2020 color space. UHD Phase A defines HLG10 as HLG, a bit depth of 10-bits, and the Rec. 2020 color space and defines PQ10 as PQ, a bit depth of 10-bits, and the Rec. 2020 color space.”

Asus MG279Q: Once the messiah of monitors, now utterly outmoded?

Sorry, what? In layman’s terms the distinction to grasp is that HDR as a feature in a display involves both the number of colours on offer and the intensity of light – the latter being that range from the deepest blacks to the brightest highlights. To achieve a proper HDR display using conventional LCD technology, you need both lots of colours and a backlight that can be controlled across a wide range of intensities, including brightnesses at least three times higher than a conventional PC monitor.
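
To put rough numbers on that intensity range: photographers measure dynamic range in stops, each stop being a doubling of brightness. Here’s a quick sketch comparing a typical SDR monitor (the 300 and 0.3 cd/m2 figures are my assumptions) against the 1,000 and 0.05 cd/m2 figures from the HDR checklist later in this piece:

```python
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in stops, i.e. doublings of luminance."""
    return math.log2(peak_nits / black_nits)

# Typical SDR PC monitor (assumed: ~300 cd/m2 peak, ~0.3 cd/m2 black)
print(f"SDR: {dynamic_range_stops(300, 0.3):.1f} stops")    # ~10.0 stops

# UHD Premium style HDR target (1,000 cd/m2 peak, 0.05 cd/m2 black)
print(f"HDR: {dynamic_range_stops(1000, 0.05):.1f} stops")  # ~14.3 stops
```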

The backlight bit is simple enough, conceptually – you need a backlight that’s both brighter and also capable of detailed modulation. So it’s upping the colour fidelity that I’ll primarily focus on. More colours are better. Better as in more realistic, more vibrant, more, well, dynamic. There is, of course, a limit to all of this. And that limit is the human eye.

In simple terms, display technology is constantly striving for and indeed converging with that limit, what you might call the acuity of the human eye. Put another way, the human eye has a dynamic range of its own, whether that’s for detail, brightness or colour. Once you have a display that matches that capability, further advances get you nowhere. Humans simply won’t be able to discern them.

Apple cleverly hooked into this notion with its ‘Retina’ displays. The idea with Apple’s Retina displays is that if the pixels are sufficiently close together then their density when viewed by the human eye matches or exceeds the density of the photoreceptor cells in the retina. More specifically the pixel density will match the fovea, which is a small area of the retina that is densely packed with receptor cells and enables the most detailed central area of human vision, known as foveal vision, funnily enough.

Achieve that and you have a display that can generate shapes and curves and forms and details that are indistinguishable from real-world objects. The limitation is then the human eye, not the display.

When it comes to the impact of pixel density, much depends on the distance twixt eye and screen. Apple’s rough definition for its handheld Retina displays was 300 pixels per inch viewed at 10-12 inches. In fact, human vision can probably discern up to 900 pixels per inch at that distance. Likewise, I’m not sure if Apple really sticks to any particular definition for its Retina displays these days. But the concept is obvious enough.
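
If you fancy checking the sums, the relevant figure is pixels per degree of visual angle, which folds pixel density and viewing distance into one number. A quick sketch follows; note the ~60 pixels-per-degree threshold for 20/20 vision is a commonly quoted assumption, not gospel:

```python
import math

def pixels_per_degree(ppi: float, distance_inches: float) -> float:
    """Pixels spanning one degree of visual angle at a given viewing distance."""
    return ppi * distance_inches * math.tan(math.radians(1))

# Apple's handheld Retina yardstick: 300 ppi viewed at 10-12 inches
print(f"{pixels_per_degree(300, 10):.0f} ppd")  # ~52 ppd
print(f"{pixels_per_degree(300, 12):.0f} ppd")  # ~63 ppd
# 20/20 acuity is often taken as one pixel per arcminute, i.e. ~60 ppd
```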

When it comes to colour perception, however, forget viewing distances and pixel pitches. Instead, it’s all about colour space or the range of colours in question. Again, the two important metrics are the range of colours a display can produce and the range of colours the human eye can discern. When the former matches the latter, you have as good a display as we humans can discern.

The relevant target here is known as Pointer’s Gamut. That’s a set of colours that includes every colour that can be reflected off a real-world surface and seen by the human eye. God knows how that is calculated or indeed how it maps to the fact that the ability to sense colour varies fairly dramatically from one person to another. Moreover, being based on reflected light, Pointer’s Gamut doesn’t include colours that only exist as emitted light, neon signs and the like.

Battlefield 1 is set to be the first HDR PC game. Er, I think…

Whatever, what really matters about Pointer’s Gamut is that it’s much larger than the standard colour gamuts, or ranges of colours, that PC monitors are typically designed to achieve. And as it happens, the full UHD Premium specification includes a colour space known as Rec. 2020 that very nearly covers 100% of Pointer’s Gamut. For the record, UHD is often used interchangeably with 4K, the latter being simply a resolution or number of pixels. But UHD is actually much broader than that and covers everything from resolutions to colours and dynamic range.

Anyway, the most common colour space most PC monitors aim to achieve is sRGB. That only covers a little more than two thirds of the colours of Pointer’s Gamut. Very likely the monitor you are looking at right now can’t even achieve that.
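
For a crude sense of scale, you can compare the areas of the gamut triangles on the CIE 1931 xy chromaticity diagram. It’s a blunt metric (it ignores luminance entirely), but this sketch, using the published sRGB and Rec. 2020 primaries, gives a flavour of the gap:

```python
def triangle_area(points):
    """Shoelace formula for the area of a gamut triangle in CIE xy space."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1*(y2 - y3) + x2*(y3 - y1) + x3*(y1 - y2)) / 2

# Published chromaticity coordinates of the red, green and blue primaries
srgb    = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

a_srgb, a_2020 = triangle_area(srgb), triangle_area(rec2020)
print(f"Rec. 2020 covers ~{a_2020 / a_srgb:.1f}x the xy area of sRGB")  # ~1.9x
```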

So, your monitor probably can’t achieve sRGB, which is a significantly smaller colour space than Pointer’s Gamut, which in turn doesn’t actually encompass every colour the human eye can perceive. In other words, the range of colours your monitor can display is, quite frankly, pants.

If you want to put numbers on all of this, actually you can. Displays that can natively achieve the sRGB standard have 8-bit per channel colour depth. There are, of course, three colour channels, and if you do the maths, that works out at a little over 16 million colours.

To achieve the Rec. 2020 standard, at least 10-bit per channel colour is required, or just over a billion colours. Rec. 2020 also includes 12-bit colour, which works out at a staggering 68 billion colours. However you slice it, it’s way more colours than whatever monitor you are using right now.
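
Checking that arithmetic is one line of maths: two to the power of the bit depth, cubed across the three channels.

```python
def colours(bits_per_channel: int) -> int:
    """Total colours from three channels at a given per-channel bit depth."""
    return (2 ** bits_per_channel) ** 3

print(f"{colours(8):,}")   # 16,777,216 (~16.7 million)
print(f"{colours(10):,}")  # 1,073,741,824 (~1.07 billion)
print(f"{colours(12):,}")  # 68,719,476,736 (~68.7 billion)
```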

UHD displays with that near-Pointer’s Rec. 2020 capability are the latest big thing in HDTVs. And that same technology is coming to PC monitors. Unfortunately, it will likely come in many confusing forms. Already, there are inconsistencies in the monitor market with terminology like ‘4K’ and ‘UHD’. Exactly how PC monitor makers are going to deal with marketing Rec. 2020 support as opposed, perhaps, to full HDR support, which itself is open to interpretation, I have no earthly idea. Will they all be sold as ‘HDR’ panels? Or ‘wide gamut’ panels? It’s all very early days.

Where things become further complicated involves the technologies needed to achieve UHD colour depths in practice. HDMI 2.0, for instance, can’t do the full 12-bit per channel at 60fps and 4K resolution. I think DisplayPort 1.4 can; the rough bandwidth sums a little further down bear that out. Indeed, the list of things your display probably needs to be considered to have basic HDR capability is almost overwhelming:

– HDMI 2.0a or DisplayPort 1.4
– 10-bit per channel colour
– Ability to display very nearly all of the Rec. 2020 or DCI P3 colour spaces
– At least 1,000 cd/m2 brightness combined with black levels below 0.05 cd/m2

Currently, the closest thing I am aware of in terms of a catch-all that tells you a display has at least some HDR capability are the ‘HDR10’ and ‘Dolby Vision’ standards, both of which cover resolution, colour depth and brightness. But I’m not necessarily expecting them to be adopted by monitor manufacturers. Of course, with the latest HDMI standards supported on some PC video cards, there’s always the option of getting in early by using an HDTV as a monitor.

What’s for sure is that these massive new colour depths will be competing with high refresh rates for available bandwidth to your display – more colours means more data sent to your display. But higher refresh rates mean more data, too. And there’s only so much bandwidth. In other words, a display that does it all – 120Hz-plus, adaptive-sync, HDR, the lot – isn’t coming any time soon. And when it does arrive, your existing video card almost certainly won’t cope!
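
You can sketch those bandwidth sums easily enough. Raw uncompressed video is just pixels × frames × bits per pixel, ignoring blanking overhead. Compare that against the roughly 14.4 Gbit/s of usable data rate HDMI 2.0 offers and the roughly 25.9 Gbit/s of DisplayPort 1.4 (both approximate figures after encoding overhead, and before DisplayPort’s optional stream compression):

```python
def video_gbps(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    """Raw RGB video bandwidth in Gbit/s, ignoring blanking overhead."""
    return width * height * hz * bits_per_channel * 3 / 1e9

# Approximate usable data rates after encoding overhead (assumed round figures)
links = {"HDMI 2.0": 14.4, "DisplayPort 1.4": 25.92}

for hz, bits in ((60, 12), (60, 10), (120, 10)):
    need = video_gbps(3840, 2160, hz, bits)
    fits = [name for name, rate in links.items() if rate >= need]
    print(f"4K {hz}Hz {bits}-bit needs ~{need:.1f} Gbit/s -> fits: {fits or 'none'}")
```

Which is why 4K 60Hz HDR over HDMI 2.0 ends up leaning on chroma subsampling, and why the do-it-all display is further off than it looks.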

Remember the good old days when G-Sync versus FreeSync was all you needed to know?

Then there’s the content side of things. There’s very little HDR or wide-gamut video content out there. It was only in 2014 that the Blu-ray standard was even updated to support 10-bit per channel colour. The exception, however, is games. Potentially, at least. HDR gaming is not a new notion and in many ways, games have been ready and waiting for HDR displays to catch up.

In fact, many games have been internally rendering in HDR form and then compressing that down to SDR formats for output for years. More recently, HDR capability has been a big part of the sales pitch for the latest gaming console refresh from Sony and Microsoft, although I’m not sure how good the actual implementation is at this early stage.
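
That ‘compressing down’ step is known as tone mapping. As a flavour of the idea, here’s a minimal sketch using the classic Reinhard operator, one of many such curves (the scene values are invented for illustration):

```python
def reinhard_tonemap(luminance_hdr: float) -> float:
    """Classic Reinhard operator: squashes [0, inf) HDR luminance into [0, 1)."""
    return luminance_hdr / (1.0 + luminance_hdr)

# Scene-referred luminance values a game engine might compute (arbitrary units)
for hdr in (0.05, 0.5, 1.0, 4.0, 50.0):
    sdr_8bit = round(reinhard_tonemap(hdr) * 255)  # quantise to 8-bit output
    print(f"HDR {hdr:>6} -> 8-bit {sdr_8bit}")
```

With an HDR display on the end of the pipe, the engine can relax that squashing step and send something far closer to its internal values to the screen.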

As for PC games, my understanding is that there’s nothing that will actually output in HDR currently, but that Battlefield 1 will support it at launch. Feel free to correct me! However, if HDR monitors become a big thing, I expect HDR patches for existing games could well become the norm. Put it this way: of any platform, the PC is arguably the best placed to make full use of HDR as soon as compliant displays appear.

Of course, you’ll also need a compliant video card to achieve HDR on the PC. For Nvidia GPUs, that means Maxwell or Pascal families (GTX 960, GTX 980, GTX 1070, GTX 1080, etc). For AMD, its Radeon R9 300 Series cards can do HDR at 60Hz up to 2,560 by 1,600 pixels. For full 4K 60Hz HDR output, only the latest ‘Polaris’ boards such as the RX 480 can pull it off.

It’s also worth mentioning that OLEDs will do the HDR thing a bit differently and will only add to the complexity when it comes to buying and configuring a system for HDR games or movies. All of which means it’s about as confusing a technology as I can recall and it will take several years for all the competing standards to shake out.

TL;DR
HDR is coming to PC displays and PC gaming, and it will mean brighter, more vibrant image quality. But you’ll probably need both a new monitor and a new graphics card, and competing standards will make choosing hardware very complicated.
