15.6 4k lcd panel hdr 10bit quotation

Native 4K resolution. The 4K 12G-SDI single link supports up to 4096 x 2160 60p, with 2 x 12G-SDI signal inputs and outputs (auto-detected 6G/3G/HD/SD-SDI), 1 x HDMI 2.0 input and 1 x SDI SFP+ module input cage.

Konvision KUM 4K, 8K and KVM-6X series monitors support HDR display. Selectable HDR modes include PQ (ST 2084) and HLG with the Rec. 2020 color gamut, reproducing a greater dynamic range of luminance for an extremely high level of picture quality.

Konvision KUM 4K, 8K and KVM-6X series monitors support a variety of EOTF curve conversions applicable to broadcast-industry and digital-film standards, with a large preset selection of HDR log, SDR log and gamma curves so they can pair precisely with camera systems.

4K HDR waveform. SDI and HDMI inputs support waveform, vectorscope and histogram displays. When luminance reaches or exceeds a preset value, the over-exposed areas are marked in red (waveform alarm).

15.6 4k lcd panel hdr 10bit quotation

Supports 8192 × 4320 8K signals, including 4320p at 23.98, 24, 25, 29.97, 30, 50, 59.94 and 60 fps. With advanced image processing, the 8K HDR monitor reproduces a strikingly lifelike image.

Konvision 8K and 4K monitors offer HDR and SDR side-by-side comparison. This function lets customers compare the difference between HDR and SDR on the same screen, revealing more picture detail and color in a scene.

15.6 4k lcd panel hdr 10bit quotation

The best 4K monitors are now almost essential tools for visual design work. In fact, for those working with video, 4K is almost becoming a minimum now that 8K video is emerging. 4K, also known as Ultra HD (UHD), refers to a resolution of 3,840 x 2,160 pixels. That's four times as many pixels as full HD (FHD), and that increased pixel density makes for a much cleaner, sharper image with more detail and texture on any screen from 27 inches and above.

The good news is that the best 4K monitors are a lot more affordable than they were a few years ago. Because UHD has now become more standard, both for media consumption as well as professional use, there are a lot more options on the market today. The flip side of this is that it's now perhaps harder than ever to work out which is the best 4K monitor for your needs.

To help with that, we've made our own pick of the best 4K monitors based on our own reviews, recommendations from working creatives and a full comparison of their specs. We've weighed up the pros and cons of each screen and evaluated them for build and ergonomics, image quality, features, colour support and accuracy, brightness and connectivity (learn more about how we test and review at Creative Bloq).

We"ve also considered value for money in order to recommend options for different needs and budgets. The best 4K monitors in terms of image quality are still very expensive – demanding professionals with a big enough budget will want to go for the Asus ProArt PA32UC-K or Eizo ColorEdge CG319X, but we haven"t placed these at the very top of our list because their price makes them simply unaffordable for many people. When factoring in value, cheaper 4K monitors like the Dell S3221QS and Samsung U28E590D end up ranking higher since we think they meet most people"s needs and an accessible price.

If you"re looking for the best 4K monitor specifically for video work, see our pick of the best monitors for video editing. And while some of the screens below can support daisy chaining for a dual monitor setup, you might also want to consider one of the best ultrawide monitors if you want more space to work with.

Pro features for under a grand put this at the top of our list of the best 4K monitors. Following in a long tradition of fine displays from Dell, the UltraSharp U3219Q offers full sRGB coverage, 95 per cent DCI-P3 and exceptional colour uniformity, making it ideal for all types of creative, including those working in photo and video.

Dell takes second place on our pick of the best 4K monitors too, but this time with a much more affordable display. So OK, perhaps it's not exactly cheap, but it's a lot more affordable than most good 4K monitors. It looks stunning too. Dell monitors aren't exactly known for their sleek looks, but the curved S3221QS is a lot more elegant than its name, standing out from all those black and grey business monitors with its white back and base.

Professional 4K monitors can be massively expensive (just see the stunning Eizo ColorEdge CG319X and the Asus ProArt PA32UC-K below), but there are now some fantastic UHD screens that strike a great balance between specs and price. Our favourite, certainly for photography, is the BenQ SW321C PhotoVue. When we reviewed it, we found it to be the perfect (reasonably) affordable 32in 4K monitor for photo editing in terms of performance and usability.

The Eizo ColorEdge CG319X is the connoisseur's choice in high-quality displays. Eizo displays are a very familiar sight in professional video and photography studios – and this 31-inch 4K monitor, with a 10-bit display and 24-bit colour look-up table, is a stunner. The CG319X also boasts one feature that sets it apart from competing high-end 4K screens: 4096 x 2160 resolution rather than 3840 x 2160. This reflects the slightly wider 4K standard used in digital video production.

For a more compact 4K monitor, this 28-inch Samsung 4K display follows very closely on the heels of the Dell 4K S3221QS at number 2 on our list of the best 4K monitors above in terms of value. It lacks pro features, but there's still 100 per cent support for the sRGB colour space, a high 300 cd/m² brightness level and support for 60Hz 4K.

The MateView is firmly pitched at working creatives. Its IPS panel is capable of displaying 100 per cent of the sRGB colour gamut, and 98 per cent of the DCI-P3 video colour space. It also has a maximum brightness level of 500 nits, and a 1200:1 contrast ratio. The sleek, slim-bezel design is a nice addition too, and we found the touch-sensitive smart bar to be an ergonomic way to control the monitor. It's available at a tempting price (although there are some stock issues in the US), and we reckon it's a solid choice of monitor for creatives.

Compared with some of the pricey high-end colour-accurate Eizo and Asus screens above, the Philips Brilliance 328P (another really catchy name, right?) is an excellent alternative, as it’s great value for money while still offering solid visuals. It’s a 31.5-inch IPS panel with measured 99 per cent sRGB and 73 per cent AdobeRGB coverage, a thin-bezel design and a few extras such as a pop-up webcam that works when the built-in USB hub is connected.

Photographers and videographers who need high resolution may also want to consider this more compact LG option, which offers a 98% DCI-P3 colour gamut, great colour accuracy and typical brightness of 540 nits. The stand is height-and-tilt adjustable so you can find the right position for you, and the Thunderbolt 3 port supports 4K Daisy Chain so you can set this up with another monitor.

Viewsonic has a few colour accurate displays on the market, and the VP2785-4K is the most high-end model in its catalogue. It's a 27-inch 4K IPS screen, sporting 100 per cent sRGB and quoted 99 per cent AdobeRGB coverage. It’s a bit fiddly to put together, requiring a screwdriver to attach the panel to the stand, but the overall design is extremely svelte, with a thin and light build, near edge-to-edge screen, and only a small bezel at the bottom that accommodates touch-sensitive controls.

With a 14-bit LUT, 700:1 contrast ratio and 375-nit brightness, the picture quality of the VP2785-4K won't disappoint, although it doesn't quite deliver the same eye-popping colours as the most high-end 4K displays money can buy.

What is a 4K monitor?

4K, also known as ultra-high definition or UHD, is a measure of a screen's resolution. 4K monitors have a resolution of 3840 x 2160 pixels, compared with 1920 x 1080 for full high definition (FHD). This means they have four times as many pixels as FHD. What that means in practice is that images look sharper and tighter, which is great for viewing HD video and higher-quality video game graphics.

Do I need a 4K monitor?

There are a couple of main reasons you might benefit from a 4K monitor. Firstly, for your own enjoyment. Even if you'll be using your screen purely for entertainment purposes rather than for work, 4K offers a notably sharper picture, which can enhance the enjoyment of watching films and series and playing games. That said, when it comes to PC gaming, 4K UHD resolution is very demanding, and many will find that the improvement in image quality isn't worth the drop in performance.

The other reason you might want one of the best 4K monitors is for work. If you work in any visual creative area, a 4K screen can improve your experience by allowing you to see your work in more definition. If you work in video, you'll almost certainly need at least a 4K monitor since 4K video has almost become the norm. If you're producing 4K video for a client, then you really need to be able to view it in 4K while you work.

Is a 4K monitor worth it?

Today, for most professionals, it's almost certainly worth investing in one of the best 4K monitors. They're still more expensive than 1080p displays, but they have come down in price a lot as they've become standard, and they're now so much more affordable than they were that it's no longer such a big decision.

While 4K doesn"t make a lot of sense for a small monitor, from 27-inches and up, it makes a huge difference from FHD that will be immediately apparent to anyone. Gary Heiting, an optometrist and senior editor of the website AllAboutVision, even says that the increased screen resolution can reduce the risk of eye strain, so working in 4K over long sessions can be more comfortable even if you don"t need to produce 4K video.Is my computer compatible with a 4K monitor?You might hope that buying one of the best 4K monitors will automatically improve your viewing experience, but it"s important to know that not every laptop or PC can support 4K. Most recent PCs or Macs should have no problem displaying 4K resolution, but it"s a good idea to check your screen"s recommended display resolution before you buy a new screen. We have a guide to screen resolution that may help.

To check your device, right-click your desktop and select “Screen Resolution”. Under display settings, you'll find a range of screen recommendations. If 3840×2160 is listed, you can be confident that your computer will indeed support a 4K monitor. If not, then you'll need to upgrade your computer as well as your monitor if you want to enjoy 4K video rendering.
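If you would rather script the check, here is a minimal Python sketch (Windows-only, and my own illustration rather than anything from this guide) that reads the primary display's current mode through the Win32 API. Note that it reports the mode currently in use, not every mode the hardware supports:

```python
# Minimal sketch: query the primary display's current resolution on Windows.
import ctypes

user32 = ctypes.windll.user32
user32.SetProcessDPIAware()      # avoid DPI-virtualised (scaled) metrics
w = user32.GetSystemMetrics(0)   # SM_CXSCREEN: width of the primary display
h = user32.GetSystemMetrics(1)   # SM_CYSCREEN: height of the primary display
print(f"Current mode: {w} x {h}")
print("Running at 4K UHD or higher:", w >= 3840 and h >= 2160)
```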

There"s also the issue of ports. You’ll need to make sure your PC has either an HDMI 2.0 port or DisplayPort 1.4 port that can support 4K since earlier versions of these ports do not. Your CPU And GPU also have an impact on your device"s ability to run 4K, because 4K is more demanding.

If your device uses Intel integrated graphics, you'll want at least a 4th-generation (Haswell) Core processor. If you have Ivy Bridge or earlier, you'll need a recent graphics card installed (if you're unsure, you can check your processor at ark.intel.com to find out what the motherboard or integrated CPU graphics is capable of).

How do I choose the best 4K monitor?

You can now go 4K without spending a fortune, but the best 4K monitors can still be very expensive if you want pro-level calibration and the best colour accuracy. If you're going to be using your screen for any kind of colour work, then you want precise colour accuracy. Most entry-level 4K monitors actually do a fairly decent job, but the best 4K monitors for designers will have full coverage of the AdobeRGB or DCI-P3 colour space.

After colour, size is obviously another major factor in choosing the best 4K monitor for you. The most popular choice is usually 27 inches, but 32-inch screens are becoming more common. If you're looking for a display specifically for image editing, then make sure you see our roundup of the best monitors for photo editing for more options.

You"ll also want to check what ports a monitor has before you buy it. The two cheapest options in our list of the best 4K monitors don"t have a USB-C connection, something that many creatives will want for hooking up devices quickly and easily. Most monitors have DisplayPort and HDMI ports, but this can"t be taken as given either – the LG Ultrafine 24MD4KL is well kitted out with USB-C and the faster Thunderbolt 3 ports, but skips the older ports.

15.6 4k lcd panel hdr 10bit quotation

You might be wondering: “But my TV turns on its HDR modes and games look better.” This is indeed true – HDR is a collection of different pieces that, when working together, create the HDR effect. Your PC is sending the WCG (Wide Color Gamut)/BT.2020 metadata as well as other information to the TV, which triggers its HDR mode, but the PC is still only sending an 8-bit signal.

However, if you want to output true 10-bit, then you’ll need to step down to a YUV422 signal. Again, not the end of the world. At normal TV viewing distances (and even on 4K monitors) it is very difficult to tell the difference between 4:4:4 and 4:2:2.

The recommended setting in this case is YUV422 video, at 10-bit, for both SDR and HDR. This will make the switch seamless and does not require you to do any extra work.
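To see why the 4:2:2 compromise is so hard to spot, here is a toy numpy sketch (my own illustration with made-up pixel values, using BT.709 conversion coefficients): luma stays at full resolution while each horizontal pair of chroma samples is averaged, and the round-trip error stays small because the eye-critical luma channel is untouched.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    # Full-range BT.709 RGB -> YCbCr conversion matrix
    m = np.array([[ 0.2126,  0.7152,  0.0722],
                  [-0.1146, -0.3854,  0.5000],
                  [ 0.5000, -0.4542, -0.0458]])
    return rgb @ m.T

def subsample_422(ycc):
    y, cb, cr = ycc[..., 0], ycc[..., 1], ycc[..., 2]
    # Average each horizontal pair of chroma samples (the "2" in 4:2:2),
    # then repeat them back out to full width, as a display would.
    cb2 = cb.reshape(cb.shape[0], -1, 2).mean(axis=2).repeat(2, axis=1)
    cr2 = cr.reshape(cr.shape[0], -1, 2).mean(axis=2).repeat(2, axis=1)
    return np.stack([y, cb2, cr2], axis=-1)

frame = np.random.rand(4, 8, 3)     # stand-in frame; width must be even
ycc = rgb_to_ycbcr(frame)
err = np.abs(subsample_422(ycc) - ycc).max()
print(f"worst-case channel error after the 4:2:2 round trip: {err:.3f}")
# Luma error is exactly zero; only the chroma channels are approximated.
```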

15.6 4k lcd panel hdr 10bit quotation

Ok, so you might have a hard time finding that last one. Color bit-depth is often hidden on the specs page or described in some obscure way. However, bit depth is becoming an increasingly important metric for comparing projectors that claim the ability to reproduce wide color gamut (WCG) and high dynamic range (HDR) content. In fact, it may actually tell you more about a projector's potential image quality than its contrast, pixel resolution, or even color accuracy ratings—all of which can be varied based on display modes or focusing accuracy.

Fortunately, there is a simple way for any serious video enthusiast to download and view 10-bit test patterns to help assess their display. All 4K UHD Blu-ray players have built-in 10-bit per color graphics capability for playing back 4K UHD Blu-ray movies—all of which are stored in 10-bits per color HEVC format video. Most of these 4K UHD Blu-ray players and a few 4K media players, including the Roku 4K HDR, have a USB input that enables them to play back animated 10-bit per color test targets that have been saved in 10-bit HEVC format.

If you"d like to see how your own projector handles 10-bit signals, you can download the 10-bit per color animated test target you see below (Figure 3), created by In-Depth Focus Labs, from ProjectorCentral.com. The spinning wheels display a 10-bit grayscale between video levels 0 and 20 on the left, and levels 20 through 100 on the right. Although it should appear as a grayscale image, it is actually a full color pattern containing metadata tags that should automatically turn on the HDR and WCG modes in any HDR10 compatible display.

To view the test pattern on your display, copy it to a USB flash drive and insert the drive into the USB media input on your UHD Blu-ray player. When you play the file from the disc player's built-in media player, it should be recognized by your display as a UHD resolution video with 10-bit bit depth, HDR, and BT.2020 color space.

Unlike a monochrome display, color monitors must form at least three grayscale images that represent the red, green, and blue data channels found in a standard SMPTE color signal. Most 3-chip projectors, whether using LCD, LCoS, or DLP imaging chips, start by using the data from each of the incoming R, G, and B data channels to form associated grayscale images. These are then illuminated by red, green, and blue lights (created by filtering a white light or using color LEDs or lasers) to form an overlapping full color image on screen (Figure 4).

Below (Figure 5) are examples of the 10-bit circular HDR grayscale target cited above as it should appear when properly processed at 10-bit depth (top), and with obvious banding as a result of being processed with only 8-bit or 9-bit depth (bottom). You can clearly see the banding steps in the darkest part of the test pattern, and more subtly, in the brighter part of the pattern.

In real world content on most 8-bit per color displays, you might perceive bit-depth banding issues in the transitions of light levels and colors in a sunset, or in the different hues of blue in a sky. Other bit-depth artifacts can be seen around the edges of objects, such as the transition between a planet in outer space and the halo of light surrounding it, or when one saturated color ends and another begins. Instead of a smooth tonal transition, you see a line or edging effect. For example, in the illustration below, shot in 4K HDR with 10-bit color depth, compare the out-of-focus, violet-tinged flowers behind the butterfly. The top frame in Figure 7 shows the out-of-focus flowers as they should appear with proper 10-bit processing. Below that is the frame processed at 8 bits per color.
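To make the banding mechanism concrete, here is a tiny numpy sketch (my own, not from the article) of what truncating a smooth 10-bit ramp to 8-bit precision does to the number of distinct grey levels:

```python
import numpy as np

ramp10 = np.arange(1024)          # one sample per 10-bit code value
ramp8 = (ramp10 >> 2) << 2        # keep only 8 bits of precision (drop 2 LSBs)
print(len(np.unique(ramp10)))     # 1024 distinct levels: a smooth gradient
print(len(np.unique(ramp8)))      # 256 distinct levels: each band is 4x wider
```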

For more than a decade, advanced photographers, videographers, and film directors have been aware of the advantages of capturing and processing color images and video with a minimum of 10-bits per color (30-bits per pixel). The RAW modes on all DSLR cameras store still photos in 10- or even 12-bits per color, and affordable 4K camcorders now have similar capabilities. On the computer side, every Mac currently sold has at least 10-bits per color graphics capability, as do the majority of PCs, image and video editing programs, and 4K or higher-resolution monitors used for image editing and advanced gaming.

However, it wasn"t until 4K UHD Blu-ray movies and players became available, enabling the distribution of high dynamic range (HDR) and wide color gamut (WCG) content to a home audience, that 10-bits per color became an important feature for both flat panel TV"s and projectors. Before that, the marketing of displays and projectors had concentrated on increased resolution and in some cases, improved color accuracy and extended color gamut reproduction. In 2015, 10-bits per color became the minimum acceptable color standard when the CEA released its minimum guidelines for HDR10-compatible displays and projectors, which included a 10-bit requirement under the HDR10 Media Profile. Here are the parameters:EOTF: SMPTE ST2084

The simplicity of the CEA definition may have created more confusion among consumers than it eliminated. A deeper read shows that all a projector or display has to do in order to claim "HDR10-compatibility" is accept an HDR content signal containing 10-bit-per-color data that's stored using BT.2020 color space coordinates and includes appropriate HDR metadata tags. But HDR10-compatible displays and projectors are not required to maintain 10-bits per color from input to output, or even reproduce any wide gamut colors outside the standard dynamic range (SDR) Rec. 709 color space. That loophole was intentional, and left the door open for more affordable and "older-technology" 8-bit displays that are limited to Rec. 709 color gamuts (or slightly more) to be re-engineered to accept HDR and wide gamut color content from 4K UHD Blu-ray players without choking.

The TV industry has always prioritized backwards compatibility, and in this case it can be done with some internal processing tricks on the display or projector side, or within a computer or stand-alone media player. The result is that some displays with limited bit-depth capabilities are labeled as HDR-capable, but don't really meet the criteria or deliver the full image quality benefits of 10-bit HDR displays.

Here"s how it typically works for an 8-bits per color display claiming to be HDR-compatible: When an incoming 10-bit HDR movie signal is detected, a front end processor in the display downsamples the signal to 8-bits per color data, or creates dithered 10-bit colors. Next, the display applies a reverse HDR or HLG curve adjustment to counter the EOTF 2084 contrast curve applied during the HDR mastering process. A color look up table (LUT) is then applied to scale all the wide gamut colors the display can"t reproduce to the closest in-gamut colors that it can reproduce. Additional image tweaks may include selective saturation, contrast, and blurring adjustments to minimize posterization and banding artifacts.

The result on the screen lands in between an SDR 8-bit image and a 10-bit HDR image. You may still see some wide gamut colors in the 8-bit display output, as 10-bits per color is not required to create many of the DCI-P3 gamut colors that fall outside the smaller Rec. 709 standard color gamut. However, no reasonably affordable 8-bit display or projector can achieve 100% coverage of the DCI-P3 wide gamut color space used to master and color grade 4K UHD Blu-ray movies, and 10-bit or higher color is required to achieve the additional colors found in the full BT.2020 color gamut.

Projectors and flat-panel TVs with true 10-bit processing and the improved image quality it enables are out there and more affordable than you might think. But they're competing with some "HDR10-compatible" models that claim all sorts of HDR advantages yet don't reveal their 8-bit limitations until you see their output on screen, or learn about it in a product review. The lesson? If you're in the market for a new projector, make sure you do your homework.

Just curious. So practically speaking, how does this relate to an old projector like the Optoma HD80, which claims 10-bit color processing when it obviously doesn't support HDR?

Jason, the manufacturers don't always make this information immediately available in the spec sheets we use to create the database. But we are discussing how we might be able to incorporate this information. At this point, pretty much all new 4K displays ought to be able to do full 10-bit processing, though I suppose some budget models might not.

Great, thanks for this. I wish there were more calibration/test patterns available on movies/devices/projectors. Trying to play a 4K HDR disc through an Xbox One X on my projector is just hoping for the best; none of those things have any great test patterns to help you out. Things will get better, but right now it's... not good.

Nitin, the only Dolby Vision projectors we know of on the planet are the Christie digital cinema projectors built expressly for Dolby Vision movie theaters. There could be a couple of possible reasons for this, all conjecture on my part and not from anything I've heard: 1) the cost of licensing it for a projector is prohibitive for projector manufacturers, or 2) what I think is the more likely reason, Dolby simply doesn't have a version of this technology to license to projector makers. Dolby has been very interested in making sure the technology is well executed so that viewers have a positive view of it, and unlike in Dolby Vision theaters – where they have direct knowledge of the screen size, material and lighting conditions being used, and get involved in the installation and tuning of these systems (which I think are dual-projector) – they can't know how it's going to look in home theaters. Given that HDR10 is essentially an open-source technology, they can't sell licences for something they can't be sure looks demonstrably better, or at least is executed to a level they consider satisfactory. They can do that with TVs – but not projectors.

Do you believe we will ever get 36-bit colour projectors? If so, when would you guesstimate we would have such things? Which technology is best capable of delivering it in future – laser, LCD etc?

I"m curious about the use of temporal dithering for the many panels that are true 8 bit panels but want to "show" 10 bit colors. With all manufacturers that I investigate, I cannot find what their native panels are. What I am looking for is a modern day true 10 bit projector, but I don"t know where to look. Many say they perform 10 bit color processing, but that does not necessarily mean the panels are 10 bit. I"d really love to hear your thoughts on this. Thank you for the great article.

15.6 4k lcd panel hdr 10bit quotation

This is a subtle perk to the DisplayHDR certification. Every version of DisplayHDR enforces some minimum level of color gamut and bit depth, and better certifications are fairly rigorous. DisplayHDR 500 and above requires a color gamut that spans 90% of the DCI-P3 standard, and DisplayHDR 1400 bumps that up to 95%. That’s excellent, and you don’t even need HDR content to see this difference. Anything mastered for a wide color gamut will do.

I’ve mostly touched on monitors so far. This guide is about HDR on your PC, after all. Still, you might wonder how HDR differs between monitors and televisions. Maybe you’re even batty enough to consider a 48-inch OLED for your gaming rig.

You can throw everything I've said about DisplayHDR out the window. Televisions don't participate in DisplayHDR certification. This can lead to rather terrible HDR in the most affordable televisions. The $250 Insignia you'll find on the floor at Best Buy? It's hot garbage (but you probably knew that).

However, competition between TV makers means solid HDR is available even in mid-tier televisions like the TCL 6-Series or Hisense H-Series. Most televisions have the brightness, contrast, color gamut, and bit depth to provide a very noticeable improvement over SDR. The best HDR televisions, like Samsung’s QN90A or Vizio’s P-Series, can broil your retinas. It’s intense.

Televisions are more confusing in one area: the competition between HDR standards. Everything supports HDR10, but there’s also HDR10 Plus, Dolby Vision HDR, HLG, and Advanced HDR. Each alternative is backed by an interested company or organization.

There are exceptions. A small handful of laptops, like some Lenovo ThinkPad X1 models and Dell XPS systems, offer Dolby Vision. You'll find plenty of content, from streaming services to PC games, that supports an additional HDR standard if you have a compatible display.

For now, though, PC users shouldn’t pay the HDR format war any mind. Support for HDR formats aside from HDR10 is far too spotty and haphazard to be worth your time.

HDR support was added to HDMI with the HDMI 2.0a specification, and it came to DisplayPort with the DisplayPort 1.4 specification. These arrived in 2015 and 2016, respectively, so your PC likely has one of these display connections if it was sold in 2016 or later.

There's some technical nuance here. Cable bandwidth has changed over time, so older cables might lack the bandwidth to handle HDR at the resolution and refresh rate of your display. Then there's the trouble of cables built with sub-par materials.

Still, you can expect most HDMI or DisplayPort cables in your drawer to work, though our guide to the best HDMI cables can help if they don’t. Also, an HDR monitor will include an appropriate cable in the box.

What about USB-C? Many laptops have a USB-C port that includes DisplayPort alternate mode. This will typically be DisplayPort 1.4 or 2.0 certified, which means it can handle HDR. However, unlike with HDMI and DisplayPort, the cable used is important. That's because DisplayPort is an optional extra, not a mandatory part of the USB-C specification. Make sure to buy a cable that claims DisplayPort alternate mode support, or just go with a Thunderbolt 4 cable, which should be compatible.

Thunderbolt always supports DisplayPort but originally released with DisplayPort 1.2 support. This was upgraded to DisplayPort 1.4 in 2018, so more modern Thunderbolt 3 connections will support HDR. Thunderbolt 4 always supports DisplayPort 1.4, so it always supports HDR. Because DisplayPort is a mandatory part of the specification, you don't have to worry about buying the right cable (assuming, of course, that the cable is built to spec).

You can turn on HDR in Windows 10 with the flip of a toggle, but this only enables HDR in Windows 10 itself. It doesn't suddenly make your photos, videos, or games HDR. The content needs to be made for HDR from scratch.

Windows plans to add an Auto HDR feature in a future update. This will automatically turn HDR on when required. For now, though, you must turn HDR on manually. Do this before viewing any HDR content.

HDR is simple to use in PC games that support it. You will find an HDR setting in the game’s display or graphics options. Flicking it on enables HDR in the game. You may also find additional options like HDR calibration or support for additional HDR formats, like Dolby Vision. The details will of course vary from game to game but, in general, switching HDR on is all you need to do.

Streaming services are complex. Licensing issues, which I won't get into here, mean that Windows 10 doesn't ship with the HEVC codecs required to view streaming HDR. I know, I know. It's weird. You must download the HEVC Video Extensions for 99 cents from the Microsoft Store.

The streaming service must also support HDR. Requirements for this differ depending on whether you're using the service through a web browser or through a Windows 10 app. Check your streaming provider for details. Some streaming services don't support HDR on PC at all: HBO Max is one example.

HDR content saved to your hard drive is less obtuse. All you need is an HDR video file and a video player that supports HDR. The Movies and TV app in Windows 10 can handle this if you have the HEVC Video Extensions installed. VLC, a popular third-party media player, also supports HDR. VLC is popular because you can add a free HEVC codec to it, bypassing the need to buy HEVC extensions from Microsoft. (VLC also lets you play DVDs in Windows 10 for free.)

I wish this guide could be shorter. It's a bit nutty to think you need a 2,000-word essay to explain how HDR works on a PC. Still, if you want the short-and-sweet of it, here's my recommendation for what you need to achieve great HDR on a PC.

My personal go-to for HDR testing is Microsoft Flight Simulator, an absolutely stunning game that hugely benefits from the tech. A great HDR display is the only way to experience the searing glare of a desert sunset or the lonely glow of flight instruments in a pitch-black cockpit.

15.6 4k lcd panel hdr 10bit quotation

You may also find it helpful to read these two other posts: How to use an OLED in Post Production and a detailed round up of 4K Video Editing Monitors.

These OLED TVs deliver a large screen, ranging from 55″ to 83″, with perfect blacks, a wide colour gamut and the ability to display both SDR and HDR content. And best of all, they come with a consumer price tag.

Often colorists may also have a smaller, more expensive and more accurate display in front of their control panel as their main point of reference, but if both displays don’t line up together exactly you can get into the troublesome situation of the client asking “Which one should I be looking at?” i.e. what can I trust?

If you can afford it, the consensus seems to be that Flanders Scientific is the most affordable of the high-end options, though their flagship 3000-nit 10-bit 4K HDR monitor, the XM310K, will set you back $25,000.

The Flanders Scientific DM170 drops down to a bargain price of $3,495 for a 17″ 1920 x 1080 10-bit LCD display. For comparison, the cheapest FSI monitor is the 2021 AM211, a 21.5″ HD 8-bit monitor for $1,995.

Display Resolution – You want to be able to monitor the kind of footage you regularly work with at its full resolution. If you’re often working with 4K footage then you’ll want a 4K monitor. If you’re only ever delivering HD then a 1920 x 1080 monitor will do just fine. (See my 2018 update below for more on this!)

Contrast Ratio – This will probably make the biggest difference to your perception of the images on display. Glossy displays tend to have a higher contrast ratio than matte displays. According to chapter 2 of Alexis Van Hurkman’s Color Correction Handbook 2nd Ed. (paraphrasing here) for an LCD display 1400:1 (glossy) or 1100:1 (matte) or better, is a good ball park. For OLED 5000:1 is a good ball park.

Black Levels – Having deep blacks is what colorists are always looking for, not muddy grey ones. Deep gorgeous blacks with plenty of detail still in them. Partly this impacts on your perceived contrast and partly it’s a sign of a good display panel. OLED panels beat LCD in this and the contrast department.

Brightness – SDR (Standard Dynamic Range) is mastered to a 100-nit brightness range. HDR is usually mastered to 1000 or 4000 nits. True HDR reference monitors are incredibly expensive. My focus in this post is on SDR use-cases.

It’s worth noting that very few displays these days offer the 4096 x 2160 true 4K resolution, and the price bump to monitors that do doesn’t seem to be worth it.

In 2016 I bought the LG 31″ 4K 10-bit monitor (LG 31MU97-Z) and have LOVED using it every day since then. I can’t even begin to calculate how many hours I have stared at this screen!

“You have pure 8-bit, you have 8-bit+FRC. Now there are different types of FRC; there’s high-speed switching between the bit value over and under, and this can happen both in a spatial and a temporal state. So it allows you to, through this rapid switching, get a perceived higher bit depth than the panel may actually have.
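A toy simulation of the temporal flavour of FRC described in that quote might look like the following (my own sketch, assuming a simple 4-frame cycle; real panels use more elaborate spatio-temporal patterns):

```python
import numpy as np

def frc_frames(code10, n_frames=4):
    # Split a 10-bit code into an 8-bit base code plus 2 fractional bits,
    # then show the higher code for `frac` of every 4 frames.
    base, frac = divmod(code10, 4)
    return np.array([base + (1 if f < frac else 0) for f in range(n_frames)])

frames = frc_frames(514)           # a value between 8-bit codes 128 and 129
print(frames, frames.mean() * 4)   # [129 129 128 128] -> time-average 514.0
```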

The ASUS ProArt series has an impressive spec, with the higher-end, mini-LED HDR models having an equally impressive price tag. When it comes to finding a more affordable option in the range, it appears that the ProArt PA329C (2019) is currently the best bet.

Importantly, the PA329C supports hardware calibration, a 14-bit LUT and the ability to store custom colour profiles on the monitor. It comes with a VESA DisplayHDR 600 certificate.

For context, the 2020 Dell Ultrasharp UP3221Q, a 4K UHD display with a true 10-bit, mini-LED backlit panel delivering 1000 nits of HDR-ready peak brightness, costs close to $4,000/£3,600.

To be clear, there is the UP2720Q (2019), which has a 10-bit panel, 250 nits of brightness and greater colour accuracy, at about $1,600; and then there is the U2720Q (2020), which has an 8-bit+FRC panel, 450 nits of peak brightness and very slightly lower colour accuracy, at about $700. You can compare their specifications here.

For our purposes, EIZO don’t make an ‘affordable’ 32″ model, with the CG319X coming in at close to $6k, while the latest Eizo ColorEdge Prominence CG3146 HDR reference monitor will set you back over $30k.

From my research there are two that seemed the most promising: the 2021 Z27xs G3 4K DreamColor and the 2017 DreamColor Z31x Studio, both of which can be calibrated with an external probe.

Expensive if you can still find it at about $2,500, this is a true 4K 4096 x 2160, 10-bit monitor. Whether it’s still worth the money today, given other ways to spend that kind of money, is highly questionable!

I’m writing this on my (now discontinued) LG 31MU97B 10bit 4K (4096 x 2160) monitor, which I have loved using for the past few years. From my experience, LG monitors and OLED TVs are superb.

I wanted to include them here as a potential nod to the future and it will be interesting to see how they compare to other similarly priced mini-LED monitors designed for professional HDR use. (See next section below)

For the money, the 32BN67U-B looks like a great deal. It doesn’t have the fancy stand of the 32UN880-B or the ability to connect USB peripherals, but it does have a 32″ display, a true 10-bit panel and a UHD 3840 x 2160 resolution.

At this price point, the ASUS ProArt PA279C (2020) – approx $500/£500 – and the Z27xs G3 4K DreamColor (2021) – approx $700/£600 – are also considerations, but they both have 8-bit+FRC panels and a much smaller 27″ display.

When it comes to stepping up to an ‘affordable’ and reliable HDR monitor, right now you’re still looking at several thousand pounds/dollars. The technology is rapidly improving and the prices slowly falling but we’re not there yet.

In these two videos colorist Kevin Shaw gives his first impressions of the ASUS PA27UCX-K and the Dell UltraSharp UP3221Q. One thing to take into consideration with an HDR monitor is just how much power they consume!

With mastering of HDR movies sitting at around 1000 or 4000 nits, consumer OLEDs can’t match this, reaching only 650-700 nits in their brightest areas, whilst also being hampered by ABL (automatic brightness limiting, to protect the panel), which brings their full-screen brightness down to around 100-150 nits.

Examples of IO boxes would be the Blackmagic Design UltraStudio Monitor 3G (approx $115 – HD video – full tech specs) and the older, more expensive AJA T-Tap (approx $295 – HD video – full tech specs). If you want 4K I/O you’ll need to jump up to the more comprehensive BMD UltraStudio 4K Mini.

You would need to take a Rec. 709 10bit video file, edit it in your video editing software maintaining that bit depth and colour space, output that video signal to your external monitor in 10bit and in Rec.709 and view it on a monitor with a 10bit panel, calibrated to Rec.709.

The reason to use a dedicated IO box (like the UltraStudio 4K Mini) is that it gives you a properly managed colour pipeline that bypasses the operating system’s GPU and colour profile settings and gets you straight from the video editing software to your monitor without alteration (unless you’ve got some hardware calibration going on too).

That way, if you know you’ve got a 10bit Rec. 709 video file and you’re outputting it via the IO to a 10bit Rec. 709 calibrated monitor you should be good to go.

In this image from the Blackmagic Design DaVinci Resolve Configuration Guide you can see that they recommend connecting the 2013 Mac Pro via Thunderbolt to an I/O box like the 2020 UltraStudio 4K Mini and from there via HDMI 2.0 to the OLED.

Although the Mac Pro has an HDMI port built in (1.4b UHD), the reason you need to use something like the UltraStudio 4K Mini ($995/£785) in between is so that the video signal goes directly from the software to the monitor and bypasses the GPU (and its drivers) and the operating system ICC profiles.

The need for an external IO box is especially true when working with HDR material, as it requires all the bit depth and bandwidth you can give it; you can read more about this in the ‘What About HDR?’ section of my Colour Management for Video Editors post.

The X-Rite i1 Display Pro Plus is the most recommended low-cost probe, and some of the HDR monitors above, for example the ASUS ProArt, ship with one included. Warren and Stuart discuss working with the X-Rite probe and how to ‘hack’ it here. But the safest bet is to buy a Rev.B OEM version from LightIllusion. I would recommend reading this entire thread on LiftGammaGain to discover the ins and outs of that.

15.6 4k lcd panel hdr 10bit quotation

BenQ this week introduced its new SW320 display designed specifically for professional photographers and other people who require 4K/UHD resolution, the sRGB and the Adobe RGB color spaces, and support for HDR10 capability. The monitor is calibrated directly at the factory on a per-unit basis.

The BenQ SW320 is a 31.5-inch display featuring a 10-bit IPS panel, which can reproduce 1.07 billion colors and covers 99% of the Adobe RGB, sRGB, and 87% of the DCI-P3 color spaces. The Adobe RGB color space is important for professional photographers that need to edit their photos for publications, whereas the sRGB and the DCI-P3 color spaces are crucial for video editors and animation designers who do post-production work. One of the interesting features of the SW320 is the ability to display content in different color spaces simultaneously side-by-side in PIP/PBP modes (two inputs are required).

It is worth noting that when it comes to DCI-P3, the SW320 covers 87% of the color space, which is below the 98%-99% covered by numerous displays aimed at video professionals. Meanwhile, the BenQ SW320 supports 10-bit HDR along with a 14-bit 3D LUT (look-up table) and is calibrated to DeltaE ≤ 2 in both Adobe RGB and sRGB. The support for HDR10 with proper blending accuracy is needed for those who work on adding HDR to various content, including photos and videos. BenQ cites that users involved in post-production for UHD movies (for streaming services or for Ultra HD Blu-ray media) will take advantage of the SW320.
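For readers unfamiliar with the DeltaE figure quoted above, here is a minimal sketch of the idea using the simple CIE76 formula (the source does not say which DeltaE variant BenQ uses, and the patch values below are hypothetical):

```python
import math

def delta_e76(lab1, lab2):
    # CIE76: Euclidean distance in CIELAB space; below ~2 is hard to see.
    return math.dist(lab1, lab2)

measured = (50.0, 10.2, -20.5)    # hypothetical measured patch (L*, a*, b*)
reference = (50.0, 11.0, -21.0)   # hypothetical target value
print(f"dE76 = {delta_e76(measured, reference):.2f}")   # ~0.94, within spec
```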

As for the other specifications of the BenQ SW320, they look pretty standard for a high-quality 4K display: a 3840×2160 resolution with a 60 Hz refresh rate, 350 nits typical brightness, 1000:1 static contrast, 5 ms response time in fast mode and 178° viewing angles. The monitor uses CCFL backlighting with a brightness uniformity feature to ensure consistency. As for input/output capabilities, the display is equipped with one HDMI 2.0a, one DisplayPort 1.4 as well as one mDP 1.4 header (all of them support HDCP 2.2, required for various content). In addition, the SW320 is equipped with a dual-port USB 3.0 hub and a card reader that may be useful for photographers.

Just like many other professional displays, the SW320 monitor has an adjustable stand that allows rotating the panel clockwise or counter-clockwise to view the screen in portrait orientation. The BenQ SW320 comes pre-calibrated, just like competing devices, with users able to further calibrate it using the appropriate equipment.

BenQ plans to start selling its SW320 professional display in January 2017. Pricing is unknown, but since we are talking about a high-end monitor with its own peculiarities (support for HDR and Adobe RGB), expect an appropriate price.

15.6 4k lcd panel hdr 10bit quotation

You will have heard the term HDR talked about more and more in the TV market over the last few years, with the major TV manufacturers launching new screens supporting this feature. It’s become the ‘next big thing’ in the TV space, with movies and console games being developed to support this new technology. Now HDR has started to be more widely adopted in the desktop monitor space as well, and we are starting to see increasing talk of HDR support. This will provide support for HDR gaming from PCs and consoles, as well as movies and multimedia. We thought it would be useful to take a step back and look at what exactly HDR is, what it offers you, how it is implemented and what you need to be aware of when selecting a display for HDR content. We will try to focus here more on the desktop monitor market than get too deep into the TV market, since that is our primary focus here at TFT Central.

Trying to put this in simple terms, ‘High Dynamic Range’ refers to the ability to display a more significant difference between bright parts of an image and dark parts of an image. This is of significant benefit in games and movies, where it helps create more realistic images and helps preserve detail in scenes where otherwise the contrast ratio of the display may be a limiting factor. On a screen with a low contrast ratio or one that operates with a “standard dynamic range” (SDR), you may see detail in darker scenes lost, where subtle dark grey tones become black. Likewise, in bright scenes you may lose detail as bright parts are clipped to white, and this only becomes more problematic when the display is trying to produce a scene with a wide range of brightness levels at once. NVIDIA summarizes the motivation for HDR nicely in three points: “bright things can be really bright, dark things can be really dark, and details can be seen in both”. This helps produce a more ‘dynamic’ image, hence the name. These images are significantly different, providing richer and more ‘real’ images than standard-range displays and content.

The term HDR has been more broadly associated with a range of screen improvements in terms of marketing material, offering not only the improvements in contrast between bright and dark areas of an image but also improved colour rendering and a wider colour space. So when talking about HDR in the display market the aim is to produce better contrasts between bright and dark, as well as more colourful and lifelike image.

Linked to HDR is the term ‘High Dynamic Range Rendering’ (HDRR), which describes the rendering of computer graphics scenes by using lighting calculations done in high dynamic range. As well as the contrast ratio benefits we’ve already discussed in the introduction, HDR rendering is also beneficial in how it helps preserve light in optical phenomena such as reflections and refractions, as well as transparent materials such as glass. In Standard Dynamic Range rendering, very bright light sources in a scene (such as the sun) are capped at 1.0 (white). When this light is reflected the result must then be less than or equal to 1.0. However, in HDR rendering, very bright light sources can exceed the 1.0 brightness to simulate their actual values. This allows reflections off surfaces to maintain realistic brightness for bright light sources.
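A minimal sketch of this idea (my own illustration, using the simple Reinhard operator as one common tone-mapping choice) shows how SDR clipping destroys the distinction between bright sources, while HDR rendering keeps their relative intensities:

```python
import numpy as np

scene = np.array([0.02, 0.5, 1.0, 4.0, 16.0])  # linear luminance; sun-lit values exceed 1.0
sdr = np.clip(scene, 0.0, 1.0)                 # SDR pipeline: bright values clip to white
hdrr = scene / (1.0 + scene)                   # Reinhard: compresses, never clips
print(sdr)    # [0.02 0.5  1.   1.   1.  ] -> 4.0 and 16.0 become indistinguishable
print(hdrr)   # [0.02 0.33 0.5  0.8  0.94] -> relative brightness is preserved
```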

A typical desktop monitor based on a TN Film or IPS technology panel can offer a real-life static contrast ratio of around 800 – 1200:1, while a VA technology panel can range between 2000 and 5000:1 commonly. The human eye can perceive scenes with a very high dynamic contrast ratio, around 1 million:1 (1,000,000:1). Adaptation to altering light is achieved in part through adjustments of the iris and slow chemical changes, which take some time. For instance, think about the delay in being able to see when switching from bright lighting to darkness. At any given time, the eye’s static range is smaller, at around 10,000:1. However, this is still higher than the static range of most display technologies including VA panels and so this is where features like HDR are needed to extend that dynamic range and deliver higher active contrast ratios.

One area of the HDR market that is still quite murky is content standards – the way in which content is produced and delivered to compatible displays. There are two primary standards that you will hear most about: HDR10 and Dolby Vision. We won’t go into endless detail here, but Dolby Vision is considered to offer superior content as it supports dynamic metadata (the ability to adjust content on a dynamic basis, frame by frame) and 12-bit colour. However, it is proprietary and carries a licensing fee, and it also originally required additional hardware for playback, so it was more expensive to support. HDR10, on the other hand, only supports static metadata and 10-bit colour, but it is an open-source option which has therefore been more widely adopted so far. Microsoft and Sony, for instance, have adopted HDR10 for their modern games consoles. It is also the default standard for Ultra HD Blu-ray discs.

You don’t need to worry that this is going to turn in to another HD DVD vs. Blu Ray war, as actually despite the varying content standards it is relatively easy for a display to support multiple formats. In the TV market it is quite common to see screens which will support both Dolby Vision and HDR10, as well as often the other less common standards like Hybrid Log Gamma (HLG) and Advanced HDR.

Samsung have more recently started to drive development of the so-called HDR10+ standard which is designed to address some of the earlier implementation shortcomings, adding things like dynamic metadata for instance. On the other side of things Dolby Vision has recently moved their standard entirely to software, removing some of the complications with hardware requirements and associated additional costs.

When it comes to viewing differently rendered content, you need a display which supports the relevant standard. HDR10 compatible displays are very common and that is very widely supported content. Dolby Vision is less common, although some TV sets will advertise support and include it for those who want to use Dolby Vision encoded content. The monitor market seems to be focused on HDR10 for the time being though and we have yet to see a screen advertised for Dolby Vision content support. It will presumably only be a matter of time.

When selecting a display you may want to consider the source of your HDR material, and the format that is designed to output. Whether that is an HDR-compatible Blu-Ray player, streaming video service like Amazon Prime Video, games console or PC game.

In more recent times, to try and overcome some of the ongoing contrast ratio limitations of LCD displays you will often hear the term “Local Dimming” used by manufacturers. This local dimming can be used to dim the screen in “local” parts, or zones, dimming regions of the screen that should be dark, while keeping the other areas bright as they should be. This can help improve the apparent contrast ratio and bring out detail in darker scenes and shadow content. Local dimming is the foundation for producing an HDR experience on a display.

Where local dimming is used, one area you might want to be concerned about is how fast the local dimming can respond. Screens aimed at HDR gaming for instance need a very responsive local dimming backlight, to account for rapid content changes and to ensure the dimming can keep up with the likely high frame rates and refresh rate of the display. If the local dimming is not fast enough, it can lead to obvious “blooming” and other issues. The speed of the local dimming is independent of the type of backlight used, but the more zones there are, the potentially more complex it is for the manufacturer to get the backlight dimming response times fast enough and consistent enough. When you throw in variable refresh rate (VRR) technologies like NVIDIA G-Sync and AMD FreeSync, the operation of local dimming backlights may become more complex still. Manufacturers are unlikely to list the speed or responsiveness of their local dimming backlights, so you will have to rely on third party independent testing (like ours) to examine how a local dimming backlight performs.

This method of local dimming is what has been implemented in most desktop monitors so far. It is not overly expensive or complex to introduce and offers a level of local dimming that allows for HDR to be promoted and marketed. It is fairly typical to see an 8-zone edge lit backlight being used so far in desktop monitors. For instance the Samsung C32HG70 we have reviewed features a local dimming backlight in this format.

A more optimal way to deliver this local dimming on an LCD screen is via a “Full-Array Local Dimming (FALD)” backlight system (pictured above), where an array of individual LEDs behind the LCD panel are used to light the display, as opposed to using any kind of edge-lit backlight. In the desktop monitor market, edge-lighting is by far the most common method, but there are some screens available already that feature a FALD.

It would be ideal for each LED to be controlled individually, but in reality with LCD screens they can only be split into separate “zones” and locally dimmed in that way. Each zone is responsible for a certain area of the screen, although objects smaller than the zone (e.g. a star in the night sky) will not benefit from the local dimming and so may look somewhat muted. The more zones, and the smaller these zones are, the better control there is over the content on the screen.
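As a rough sketch of that zone behaviour (my own toy example, not any particular vendor's algorithm), the snippet below drives each backlight zone from the brightest pixel it covers, which is why a single star lights up its entire zone rather than just its own pixel:

```python
import numpy as np

frame = np.zeros((8, 16))   # toy luminance frame, values 0..1
frame[1, 3] = 1.0           # a single bright "star" in a dark sky
# Split the 8x16 frame into a 4x4 grid of 2x4-pixel zones, take each zone's max.
zones = frame.reshape(4, 2, 4, 4).max(axis=(1, 3))
print(zones)                # only the zone containing the star is driven bright
```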

There are some drawbacks of implementing full-array backlights. They are firstly far more expensive to utilise than a simple edge-lit backlight, and so expect to see retail costs of these supporting displays very high. For example the first desktop HDR display featuring a full array backlight system (384 zones) to be released (and which we have reviewed) was the 27″ Dell UP2718Q which has a current retail price of around £1400 GBP. The full-array backlight can’t be blamed on its own for this high price, as this display offers other high end and expensive features like 4K resolution, a wide gamut LED backlight, hardware calibration etc. However, you can bet that the use of a full-array backlight system with 384 zones is a large part of the production costs, and reason for the high retail price.

A step beyond a FALD backlight is Mini LED. Mini LED offers much smaller chip sizes than normal LED and so can allow panel manufacturers to offer far more local dimming zones than even the current/planned FALD backlights that we’ve seen so far. Those FALD backlights have been limited to around 384 dimming zones on a typical 27 – 32″ model. The new mini LED backlight systems will support more than 1000 zones and also allow even higher peak brightness as well. They will also facilitate thinner screen profiles compared to FALD screens.

Mini LED was originally talked about being used initially in gaming screens for HDR benefits, but the latest plans from AU Optronics who are investing in Mini LED talk more about them being used for professional grade displays. We have already seen the 32″ Asus ProArt PA32UCX professional display announced featuring a 1000+ zone Mini LED backlight, although release date could still be some way off. A step beyond Mini LED is Micro LED, offering even smaller zones and a higher number as a result. Those are even further away and we mention them only for reference here.

It’s actually quite complicated to achieve an HDR output from a PC at the moment, and something you should be aware of before jumping straight into a modern HDR screen. You will need to ensure you have a compatible operating system for a start. The latest Windows 10 versions will support HDR, but on many systems you will see some odd behaviour from your monitor when it is connected. The image can look dull and washed out as a result of the OS forcing HDR on for everything. HDR content should work fine (if you can achieve it – more in a moment!) and provide a lovely experience with the high dynamic range and rich colours as intended. However, normal everyday use looks wrong with the HDR option turned on.

Windows imposes a brightness limit of 100 cd/m2 on the screen so that bright content like a Word document or Excel file doesn’t blind you with the full 1000 cd/m2 capability of the backlight. That has a direct impact on how the eye perceives the colours, reducing how vivid and rich they would normally look. It also attempts to map common sRGB content to the wider gamut colour space of the screen, causing some further issues.

Sadly, Windows currently struggles to turn HDR on/off automatically when it detects HDR content, so for now it’s probably a case of needing to toggle the option in the settings section (Settings > Display > HDR and Advanced Color > off/on). Windows does seem to behave better when using HDMI connectivity, so you may have more luck connecting over that video interface, where it seems to switch correctly between SDR and HDR content and hopefully negate the need to switch HDR on and off in the Windows settings when you want to use different content. This is not any fault of the display, and perhaps as HDR settles a bit more we will have better OS support emerge. That is a little fiddly in itself, but a current OS software limitation.

Another complexity of HDR content from a PC is graphics card support. The latest NVIDIA and AMD cards will support HDR output and even offer the appropriate DisplayPort 1.4 or HDMI 2.0a+ outputs you need. This will require you to purchase a top end graphics card if you want the full HDR experience, and there are some added complexities around streaming video content and protection which you might want to read up on further. There are graphics cards now available to provide that HDR option from a PC, but they are going to be expensive right now.

Finally, content support is another complex consideration from a PC. HDR movies and video including those offered by streaming services like Netflix, Amazon Prime and YouTube currently won’t work properly from a PC due to complicated protection issues. They are designed to offer HDR content via their relevant apps direct from an HDR TV where the self-contained nature of the hardware makes this easier to control. So a lot of the HDR content provided by these streaming services is difficult or impossible to view from a PC at the moment. Plugging in an external Ultra HD Blu-ray player or streaming device like Amazon Fire TV 4K with HDR support is thankfully simpler as you are removing all the complexities of software and hardware there, as the HDR feature is part of the overall device and solution.

PC HDR gaming is a little simpler, if you can find a title which supports HDR properly and have the necessary OS and graphics card requirements! There are not many HDR PC games around yet, and even those that support HDR in the console market will not always have a PC HDR equivalent. Obviously more will come in time, but it’s a little limited at the time of writing. All in all, it’s a complicated area for PC HDR at the moment.

Thankfully things are a bit simpler when it comes to external devices. The enclosed hardware/software system of an external Ultra HD Blu-ray player or streaming device (Amazon Fire TV 4K HDR etc) make this easy. They will output HDR content easily, you just need to have a display which is capable of displaying it and has a spec capable of supporting the various requirements defined by the HDR content. More on that in a moment.

The other area to consider here is console HDR gaming. Thankfully that part of the gaming market is a bit more mature, and it’s far simpler to achieve HDR thanks to the enclosed nature of the system – no software, graphics card or OS limitations to worry about here. If you have a console which can output HDR for gaming, such as the PS4, PS4 Pro or Xbox One S, then the monitor will support those over the HDMI 2.0a connection.

While HDR content might be created to a certain standard, the actual display may vary in its spec and support of different aspects of the image. You will often see HDR marketed with TV screens and more recently, monitors, but specs and the level of HDR support will vary from one to the next.

To stop the widespread abuse of the term HDR, primarily in the TV market, and a whole host of misleading advertising and specs, the UHD Alliance was set up. This alliance is a consortium of TV manufacturers, technology firms, and film and TV studios. Before this, there were no real defined standards for HDR and no defined specs for display manufacturers to work towards when trying to deliver HDR support to their customers. On January 4, 2016, the Ultra HD Alliance announced their certification requirements for a “true HDR display” in their view, with a focus at the time on the TV market since HDR had not started to appear in the monitor market. This encapsulates the standards defined for “true” HDR support, as well as defining several other key areas manufacturers can work towards if they want to certify a screen overall as “Ultra HD Premium”. This Ultra HD Premium certification spec primarily focuses on two areas: contrast and colour.

There are two options manufacturers can opt for to become certified under this standard, accounting for both LCD and OLED displays. This covers the specific HDR aspect of the certification:

Option 1) A maximum luminance (‘brightness’ spec) of 1000 cd/m2 or more, along with a black level of less than 0.05 cd/m2. This would offer a contrast ratio then of at least 20,000:1. This specification from the Ultra HD alliance is designed for LCD displays and at the moment, is the one we are concerned with here at TFT Central.

Option 2) A maximum luminance of over 540 cd/m2 and a black level of less than 0.0005 cd/m2. This would offer a contrast ratio of at least 1,080,000:1. This specification is relevant then for OLED displays. At the moment, OLED will struggle to produce very high peak brightness, hence this differing spec. While it cannot offer the same high brightness that an LCD display might, its ability to offer much deeper black levels allows for HDR to be practical given the very high available contrast ratio.
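Both contrast figures follow directly from dividing peak luminance by black level, as a quick check confirms:

```python
lcd_contrast = 1000 / 0.05      # Option 1: 20,000:1
oled_contrast = 540 / 0.0005    # Option 2: 1,080,000:1
print(f"{lcd_contrast:,.0f}:1 and {oled_contrast:,.0f}:1")
```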

In addition to the HDR aspect of the certification, several other key areas were defined if a manufacturer wants to earn themselves the Ultra HD Premium certification:

Resolution – Given the name is “Ultra HD Premium”, the display must be able to support a resolution of at least 3840 x 2160. This is often referred to as “4K”, although officially this resolution is “Ultra HD”, and “4K” is 4096 x 2160.

Colour Gamut – As part of this certification, the Ultra HD Alliance stipulates that the display must also offer a wider colour gamut beyond typical standard-gamut backlights. In the TV space, this would need to go beyond the standard sRGB / Rec. 709 colour space (offering 35% of the colours the human eye can see), which can only cover around 80% of the required gamut for the certification. The display needs to support what is referred to in the TV market as the “DCI-P3” cinema standard (54% of what the human eye can see). This extended colour space allows a wider range of colours from the spectrum to be displayed and is 25% larger than sRGB (i.e. 125% sRGB coverage). In fact, it is a little beyond Adobe RGB, which is ~117% sRGB. As a side note, there is an even wider colour space defined, called BT. 2020, which is considered an even more aggressive target for display manufacturers in the future (~76% of what the human eye can see). To date, no consumer displays can reach anywhere near even 90% of BT. 2020, although many HDR content formats use it as a container for HDR content as it is assumed to be future-proof. This includes the common HDR10 format. One to look out for in future display developments.

Displays which officially reach these defined standards can then carry the ‘Ultra HD Premium’ logo, which was created specifically for this cause. You need to be mindful that not all displays will feature this logo, but they may still advertise themselves as supporting HDR. The HDR spec is only one part of this certification, so it is possible for a screen to support HDR in some capacity but not necessarily offer the other additional specs (e.g. maybe it doesn’t have the wider gamut support). Since a screen may be advertised with HDR but not carry the Ultra HD Premium logo, it may be unclear how the HDR is being implemented and whether it can truly live up to the threshold specs that the UHD Alliance came up with. In those situations you may get some of the benefits of HDR content, but not the maximum, full experience intended or defined here. HDR is just one part of the Ultra HD Premium certification.