Thursday, 21/9/2017 | 12:01 UTC+0

Understanding HDR: cameras and displays

HDR. You may have come across this term several times, perhaps even in the course of just today. It seems it is everywhere now and everyone is talking about it. But what is HDR? It’s in your TV, it’s in your phone. It may even be in your camera but is that even the same? And what even is color gamut? What do people mean when they say “whites are whiter and blacks are blacker” and who is this Dolby Vision person? It’s time to find all that out.

Understanding Dynamic Range

To understand HDR or High Dynamic Range, we must first understand dynamic range.

Dynamic range, in any context, is the difference between the highest and the lowest value of a quantity. While the term is used in many fields, the dynamic range we will be discussing today pertains to light.

The dynamic range of an optical system is the difference between the highest and lowest value or intensity of light it can detect. The wider this range, the more detail the system can capture. A system with a particularly wide range is called a high dynamic range system.
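Dynamic range is usually quoted as a ratio between the brightest and darkest detectable levels, expressed in photographic stops (doublings of light) or decibels. As a minimal sketch, with illustrative numbers only (the 4096:1 ratio below is an assumption, not a figure from any particular sensor):

```python
import math

def dynamic_range(max_level, min_level):
    """Express a max/min light-intensity ratio in photographic
    stops (doublings) and in decibels."""
    ratio = max_level / min_level
    stops = math.log2(ratio)          # each stop doubles the light
    decibels = 20 * math.log10(ratio)
    return stops, decibels

# Hypothetical sensor that distinguishes intensity levels from 1 to 4096
stops, db = dynamic_range(4096, 1)
print(f"{stops:.0f} stops, {db:.1f} dB")  # 12 stops, 72.2 dB
```

A wider-range system simply pushes that ratio up: two more stops means it can detect four times the brightest-to-darkest spread.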

The human eye has a reasonably wide dynamic range. With our eyes, we can look at a scene and see the details in the brightly lit and the dimly lit areas with relative ease. It’s only when a scene has an intensely bright object that our eyes have to adjust by narrowing the iris, at which point they can only really see the bright object, and everything in the darker areas of the scene fades away. If our eyes had a wider dynamic range, we wouldn’t have to squint at bright objects and could see them comfortably. Similarly, we wouldn’t have to strain our eyes so much in the dark, whereas animals such as mice can comfortably see in far less light than we can.

Image credit: LifePixel

Now let’s apply the same logic to a camera system. Just like with the eye, the dynamic range of a camera system is the highest and lowest values of light the system can capture at any given moment. Cameras with wide dynamic range are obviously better but they also tend to be more expensive. Conversely, cheaper cameras or those that are physically smaller (smartphone cameras, for example) generally have worse dynamic range.

In cameras, dynamic range matters even more than it does for our eyes. With our eyes, we only look at and focus on one object at a time. Even if we take in the entire scene, our eyes are focused on the object in the center, so a poorly lit corner doesn’t really matter; as soon as we shift our gaze there, our eyes adjust. A photograph is different. We can choose to look at any part of the picture, but because the image was captured permanently with one set of parameters, it won’t adjust simply because you decide to look at a different point later.

For this reason, a wide dynamic range is a much sought-after feature in cameras. A high-quality camera system should be able to expose correctly for the bright as well as the dark areas of the scene. The image sensor on a good camera can capture enough detail in both the bright and the dark areas of the image. All this light information is often stored in the RAW file, which can later be used to bring out the details in the highlights (the brightest parts of the image) and the shadows (the darkest parts of the image) by turning down the former and increasing the latter.

However, there is a limitation to this method and a camera can only capture so much detail in one go in all areas of the image. This is where tone mapping comes in.

Understanding the HDR mode in your phone’s camera

We have all come across this button in our phone’s camera app. Pretty much every phone these days has an HDR option and most of us just choose to leave it on or on Auto.

This HDR mode is actually a misnomer; the underlying technique is closer to tone mapping. It creates an image that has detail in the brightest as well as the darkest areas of the scene, either by processing a single high-quality image or, more commonly, by capturing multiple images at different exposures and combining them.

In the latter method, the photographer first sets the exposure (essentially the brightness level) of the image low and takes a shot. Then several more shots are taken, gradually increasing the exposure while keeping the camera steady. You now have a set of images: the low-exposure shots have great detail in the brightly lit areas but the darker parts are completely black, while the high-exposure shots have great detail in the darker areas but the bright parts are blown out. You can probably see where this is going. The photographer then puts all these images in an image editor and superimposes them, which creates a final image that has detail in both the dark and the bright areas.
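The merging step above can be sketched in a few lines. This is a deliberately naive version of exposure fusion, not the algorithm any particular camera app uses: each pixel is weighted by how well-exposed it is (close to mid-gray), then the bracket is averaged with those weights.

```python
import numpy as np

def merge_exposures(images):
    """Naive exposure fusion: weight each pixel by how well-exposed it is
    (close to mid-gray), then take the weighted average across the bracket.
    `images` is a list of grayscale float arrays scaled to [0, 1]."""
    stack = np.stack([np.asarray(img, dtype=float) for img in images])  # (n, H, W)
    # Pixels near 0.5 get high weight; blown-out or crushed pixels get almost none.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0)   # normalise weights across the bracket
    return (weights * stack).sum(axis=0)

# Toy 2x2 "bracket": underexposed, normal, and overexposed views of one scene
dark   = np.array([[0.02, 0.40], [0.55, 0.90]])
normal = np.array([[0.10, 0.70], [0.95, 1.00]])
bright = np.array([[0.30, 0.98], [1.00, 1.00]])
fused = merge_exposures([dark, normal, bright])
```

Real implementations (e.g. Mertens-style exposure fusion) also weight by contrast and saturation and blend across image pyramids, but the core idea is the same: take each region from the shot where it was best exposed.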

Image credit: Klaus Herrmann

Our modern smartphone cameras do all of this for us automatically. They take a bunch of images at different exposures and combine them to create the “HDR” image. Some may instead capture a single image and just stretch the shadows and bring down the highlights to achieve a similar effect. But none of this is true HDR.

You see, even though the image has more detail in the shadows and highlights, that detail has been artificially squeezed in. Most of us don’t have wide dynamic range monitors or displays, so all the high dynamic range content has to be compressed to fit the limited dynamic range of our screens. And because the image isn’t naturally high dynamic range yet still shows detail in shadows and highlights, it tends to look unnatural and overprocessed.

With a true high dynamic range display, you would be able to see the details in the highlights and shadows of the aforementioned RAW image easily, but because most of us don’t have an HDR display, we have to artificially bring down the highlights and pull up the shadows to match the limited dynamic range of our screens.
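That compression step has a long history in graphics under the name global tone mapping. As a minimal sketch (this is the classic Reinhard operator, a standard textbook technique, not what any specific phone uses), unbounded scene luminance can be squeezed into the 0–1 range an SDR display can show:

```python
import numpy as np

def reinhard_tonemap(luminance):
    """Compress unbounded HDR luminance into [0, 1) using the classic
    Reinhard global operator: L_out = L / (1 + L). Dark values pass
    through almost unchanged; bright values are squeezed hard."""
    L = np.asarray(luminance, dtype=float)
    return L / (1.0 + L)

# Scene luminances spanning five orders of magnitude all land in [0, 1)
hdr = np.array([0.01, 0.5, 1.0, 10.0, 1000.0])
sdr = reinhard_tonemap(hdr)
print(sdr)  # [0.0099  0.3333  0.5  0.9091  0.999 ]
```

Notice how a 1000:1 difference between the two brightest inputs becomes a difference of less than 0.1 at the output: the detail survives, but only barely, which is exactly why compressed “HDR” photos can look flat or artificial on SDR screens.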

Understanding an HDR display

You must have seen television manufacturers claiming HDR support on their latest 4K televisions. Even smartphones are now starting to ship with HDR displays. The first one was the ill-fated Samsung Galaxy Note7 last year but since then we have had the Galaxy S8 and the Galaxy Tab S3, the Xperia XZ Premium and Xperia XZ1, the LG G6 and V30 and most recently, the iPhone X. So what is an HDR display?

An HDR display has three advantages over a standard dynamic range display (let’s just call it SDR, even though that’s not an official term):

  • Wider dynamic range
  • Higher peak brightness
  • Wider color gamut

The first and the second are closely related. The display is capable of showing more detail in the brighter and darker areas of the screen. This is where the often touted but seldom explained “whites are whiter and blacks are blacker” adage comes in.

When you look at HDR content (more on this later) on an HDR display and compare it side by side with SDR content on an SDR display, you will notice that the brighter areas of the image are brighter. At the same time, you can actually see more detail there. For example, if there is a shot of someone standing next to a window with bright light coming in, the side of the face facing the window will look overexposed on the SDR display and will just appear white. The same spot on the HDR display will look brighter but even through that you will be able to see the texture and details on the skin, without the bright area just looking like a white glowing spot.

The same goes for the shadows. In a dark night shot on an SDR display, some areas of the image, such as hair or a dark jacket, will just appear black, but on the HDR display you will be able to make out more detail and see the texture.

This is where the high brightness helps. The increased brightness elevates the whole image and lets you see more detail. You may ask why the same can’t be done on an SDR display, but on an SDR display, increasing the brightness just washes out the image without adding any detail.

The third aspect is the wider color gamut. Our eyes can see a certain range of colors. Unfortunately, due to various restrictions in transmitting data, whether over television broadcast or the internet, the images we see on our screens use a significantly smaller subset of colors than what our eyes can see. With a wider color gamut, we effectively increase the range of colors the image can contain. It’s still not close to the limits of what our eyes can see, but it’s still better than an SDR image.

What this means is that images appear more lifelike as you can now see a wider range of colors on your screen. A tomato in real life looks intensely red and vivid but bland on screen because the display and the format simply did not have enough range of colors to reproduce the object accurately. With HDR, it will look closer to the real-life version, if not quite the same.

To clarify, a wider gamut does not mean more saturated colors. It’s not the same as increasing the saturation on your display. Increasing the saturation simply increases the intensity of the colors you already have; it does not show you more of them. A wider color gamut lets you see more shades of color, which increasing saturation cannot achieve. This is the difference between an oversaturated display and a wide color gamut display.
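One way to make this concrete: some colors a wide-gamut display can produce literally cannot be encoded in the standard sRGB gamut at all. As a sketch using the published Rec. 2020 red primary and the standard XYZ-to-sRGB matrix, converting that color into sRGB yields channel values outside the valid 0–1 range, i.e. a color sRGB has no way to represent:

```python
import numpy as np

# Standard XYZ -> linear-sRGB conversion matrix (IEC 61966-2-1)
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def xy_to_srgb(x, y, Y=1.0):
    """Convert a chromaticity (x, y) at luminance Y to linear sRGB values."""
    X = x / y * Y
    Z = (1 - x - y) / y * Y
    return XYZ_TO_SRGB @ np.array([X, Y, Z])

# The Rec. 2020 red primary (x=0.708, y=0.292): a deeply saturated red
# that a wide-gamut display can show but sRGB cannot encode.
rgb = xy_to_srgb(0.708, 0.292)
print(rgb)  # green and blue channels come out negative -> outside sRGB
```

Negative (or greater-than-one) channel values mean the color sits outside the sRGB triangle; an SDR pipeline has to clip or desaturate it, which is why the on-screen tomato looks duller than the real one.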

We now know the display side of the story but there is the content side as well, which we will discuss below.

Understanding HDR formats

An HDR display is only an HDR display if it is showing HDR content. Without that it just goes back to being a really good SDR display.

HDR content is currently available in two major formats, HDR-10 and Dolby Vision. While these are referred to as formats, they use existing codecs such as H.264 or HEVC and existing containers such as MP4 or MOV, with additional metadata in the file that identifies it to HDR-capable systems. An HDR format file played back on an SDR display will look flat, with low contrast and washed-out color, as the system cannot display light and color information outside of its own range.

HDR-10 is an open industry standard created by the Consumer Technology Association whereas Dolby Vision is a proprietary standard created by Dolby. Think of HDR-10 as USB-C and Dolby Vision as Lightning and you should get the picture.

HDR-10 is the most commonly used format because it is free to use and does the job. Everything that claims to support HDR uses HDR-10 while some also support Dolby Vision in addition to HDR-10.

To be HDR compliant, the content has to be mastered in a certain way. While existing content can be tweaked in post-production to be HDR-ready, just like with 3D, the best way to get HDR content is to record it that way. Now, there aren’t any dedicated HDR cameras out there, but the high-end video cameras used by professionals, such as RED, ARRI or Blackmagic cameras, capture enough dynamic range and color information by default that the footage can easily be converted into HDR video. While editing, this footage would normally be compressed to fit the narrow dynamic range and color gamut of SDR publishing, but for HDR the colorist can leave much more of the dynamic range and color information in the final edit. Depending upon which format they choose to go with, it can then be mastered in HDR-10 or Dolby Vision.

The unique thing about Dolby Vision is that Dolby has full control of the entire pipeline of the content, right from working with the content creators for mastering it to where and how it is displayed. With HDR-10 you can mess around slightly with your video settings but with Dolby Vision, the settings are locked to what Dolby wants you to see. With Dolby Vision, the visual settings are dynamically altered for every scene using preset metadata for the best image quality. Dolby also has significantly higher requirements for hardware, with displays having to meet certain color and brightness requirements that are higher than HDR-10.

With such tight control of the proceedings and high minimum requirements, the quality is generally higher on Dolby Vision content but it also results in less content in general. It’s also why we got about eight phones with HDR in a span of a year but only one with Dolby Vision, and why only a handful of expensive, absolute top of the line televisions have Dolby Vision.

Acquiring HDR content

The two main ways to acquire HDR content today are Blu-ray discs and streaming services. Blu-ray, specifically 4K Blu-ray, is where you will get the best quality. Blu-ray discs carry video at far higher bitrates than any streaming service, often alongside lossless audio, and are simply the best way to enjoy your movies or television shows.

However, what most people will end up using is streaming services, especially since that is the only option available on mobile. Here, companies like Netflix, Amazon and YouTube rule. Netflix, in particular, has one of the largest libraries of HDR content on the internet. Netflix is also the only one to have HDR-10 as well as Dolby Vision content.

Amazon would be a close second along with Hulu. YouTube recently started supporting HDR content and Google also added some HDR movies to its Google Play Movies services. This week, Apple also threw its hat in the 4K HDR ring with the announcement of the Apple TV 4K, iPhone X and iTunes 4K content.

However, even with these many services at your disposal, the amount of HDR content is still limited. Not all the content on the aforementioned services is in HDR. Netflix even makes you pay for its four-screen plan to access its 4K HDR library, even if you are the only person using it. Then, depending upon your region, much of this already limited library could be further restricted. Some of these services won’t offer HDR in your region at all.

As such, getting your hands on HDR content right now isn’t easy. However, things are slowly improving and as HDR hardware becomes more accessible, the content situation should improve as well.

As an aside, there is also a third way of getting HDR content, and that is in games. Currently, PC, PS4 and Xbox One support HDR games that have all the aforementioned advantages of HDR video. However, there are currently no HDR games on the mobile platform.

Wrapping it up

In summary, HDR is all about increasing the quality of your content. Previous advancements in video technology were primarily about increasing the resolution but HDR is where the advancement happens on the pixel level, meaning it’s less about more pixels and more about better pixels. The brighter, more dynamic and vibrant HDR image is far more obvious even to the novice eye than a resolution bump, which may or may not be obvious depending upon your visual acuity or distance from the screen.

It must be said that it still depends upon two major factors: the quality of the HDR panel and the mastering of the content. Cheaper HDR televisions are obviously nowhere near as good as the most expensive ones and, just like with 3D, some HDR content is clumsily mastered with absurd colors and contrast to make it pop more.

However, because of the requirements of HDR, even a half-decent HDR panel will automatically be better than an SDR panel. As for smartphones, considering only flagship phones currently have it, you can expect the displays to be good in general. And if you choose to go with Dolby Vision, you can be especially sure of the quality because of all the work Dolby has put in.

And it’s worth repeating that HDR displays and the technology behind them have nothing to do with the “HDR” button in your phone’s camera app; the former is true HDR, while the latter is largely a marketing label.

I hope this helped clear some of the confusion surrounding HDR. In the coming days we will be seeing a lot more of this technology but for once, it is actually useful and not a gimmick so I personally will be looking forward to it.