Color accuracy is paramount in color grading, and many things can affect it, from room lighting and wall color to computer hardware and setup. In this article we discuss what an accurate grading system requires, including bypassing the GPU, display capabilities, and display calibration.
I’ve already discussed what’s required for a color grading room that won’t break the bank. If you missed it, check out our grading room post. Now we need to talk hardware requirements. Because like it or not, accurate grading involves more than just a nice computer and display. And when I say hardware needed for grading with Resolve, we’re not even talking about fancy tools like panels. The items discussed here are just to get us grading accurately.
Modern computers and displays are very sophisticated. A MacBook Pro will auto-adjust the display based on ambient light. The GPU might boost contrast for a better image. You get it. All super cool stuff for our viewing experience. Lethal for accurate grading. To get your system dialed in for color grading, we're going to look at three things: Display Capabilities, GPU, and Display Calibration.
We discussed the CIE chromaticity diagram and how color spaces have chromaticity coordinates on the diagram called a color gamut for the color space. If that didn’t make sense, definitely check out our intro to color lesson as it will clear the fog.
Color gamuts also define what hardware can do. For example, does a particular grading display support Rec. 709, DCI-P3, Rec. 2020, and so on? With modern filmmaking, our Resolve timeline is typically DaVinci Wide Gamut (DWG). DWG is a very large color space that can encompass High Dynamic Range (HDR) color spaces like Rec. 2020.
How we monitor and deliver a film might be different than our timeline color space. For example, I often monitor in Rec. 709 which is a standard dynamic range (SDR) color space. Resolve takes my DWG timeline and any corrections done with HDR tools and graciously displays them as Rec. 709. Many of the 4K videos you see on YouTube are in fact Rec. 709, including the videos I have linked up here.
You might be thinking, "Why not monitor in HDR?" One reason: the hardware required for quality HDR monitoring is PRICEY. And if you're an indie filmmaker watching costs, this is a big deal. Another reason? Not everyone has HDR-capable displays yet, so you can't deliver in HDR only. Eventually we'll get there! And again, if you have questions on how SDR differs from HDR, see our intro to color post.
All of that to say: whatever color space you decide to grade in, your display must support it 100%. Manufacturers tell you which color spaces a display supports, and how fully. For example, my grading display is an ASUS ProArt PA247V. According to ASUS, it has 100% coverage of Rec. 709. This display is adequate for my grading: it fully covers the Rec. 709 color gamut and also supports gamma 2.4, which deals with luminance. Other Rec. 709 displays may report only 98% coverage, or only sRGB support. Whether you monitor your grade in Rec. 709 or Rec. 2020, your display must provide 100% coverage of the color space.
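To make those coverage percentages concrete: on the CIE xy diagram, a gamut is the triangle formed by a color space's red, green, and blue primaries, and "coverage" compares a display's triangle against the reference. Here's a rough Python sketch using the standard Rec. 709 and DCI-P3 chromaticity coordinates. Real measurement tools clip to the intersection of the two triangles; this simple area ratio is just to build intuition.

```python
# Rough illustration of gamut size on the CIE 1931 xy diagram.
# These are the standard published chromaticity coordinates.
REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B primaries
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

def triangle_area(pts):
    """Shoelace formula for the area of a triangle of primaries."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Rec. 709 sits inside DCI-P3, so the area ratio approximates how much
# of P3 a Rec. 709-only display can reproduce (true coverage tools
# measure the intersection area instead).
ratio = triangle_area(REC709) / triangle_area(DCI_P3)
print(f"Rec. 709 is roughly {ratio:.0%} the size of DCI-P3")
```

Run it and you'll see Rec. 709 is only about three quarters the size of DCI-P3, which is why a "100% Rec. 709" display can still be far from covering wider gamuts.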
Displays will report their support for a color space, but cheap displays are still not going to give you a good image. For example, a display might claim 100% gamut coverage but fall short on contrast ratio. For Rec. 709 you want a 2000:1 contrast ratio, and many Rec. 709 displays are only 1000:1. The price goes up for 2000:1 displays. And that's just SDR. HDR has significant requirements to see a grade accurately. Netflix, for example, requires their HDR grading to be done on displays with a 200,000:1 contrast ratio (see their specs). Many who grade in HDR use displays from Flanders Scientific that cost thousands of dollars.
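Contrast ratio itself is simple arithmetic: peak white luminance divided by black luminance, both measured in nits. A quick sketch, using illustrative nit values (these numbers are assumptions for the example, not measurements of any particular display):

```python
# Contrast ratio = white luminance / black luminance, both in nits (cd/m^2).
# The nit values below are illustrative assumptions, not real measurements.

def contrast_ratio(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

# SDR grading typically targets around 100 nits of peak white, so the
# difference between 1000:1 and 2000:1 comes down to how dark the
# panel's blacks can get.
print(contrast_ratio(100, 0.10))  # a common budget panel: 1000:1
print(contrast_ratio(100, 0.05))  # what Rec. 709 grading wants: 2000:1
```

This is why deep blacks matter so much: at the same peak brightness, halving the black level doubles the contrast ratio.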
So that’s displays in a nutshell. As you can imagine, there’s a lot more to this than what we’ve discussed. The distilled version is you need to decide whether you’re monitoring and delivering in SDR or HDR and then purchase a display that truly supports this in both color gamut and luminance (contrast ratio).
For the rest of this article we'll assume monitoring and delivering in Rec. 709 gamma 2.4. Now let's assume we've purchased a capable display: 100% coverage of the Rec. 709 color gamut, support for gamma 2.4, and a 2000:1 contrast ratio. We're ready to go!
Not quite yet.
This new display is still connected to the computer’s GPU (graphics processing unit). Whatever you see on your laptop or desktop display is handled by the computer’s GPU. As discussed, modern GPUs are doing things like adding contrast, adjusting color temperature, stuff like that.
So why does this matter?
Let's say I'm grading, but in the low light of the grading room, my MacBook Pro warms up the image. Now I'm grading an image whose color temperature has already been tweaked warmer. This is going to throw my grade WAY OFF. I might unconsciously cool things down to compensate for what the GPU did to the image. Then when I look at my movie on another display, it'll look like Minority Report (which was a very cool look for THAT movie).
So what’s the answer? There are certain things you can turn off in your computer to try to keep the GPU from doing stuff, but you never have a 100% guarantee that it’s not messing with things. So the safest bet is to get products from Blackmagic Design to bypass your GPU entirely.
The UltraStudio Monitor 3G plugs into a Thunderbolt port on your computer and has an HDMI out to your grading display. You enable it in DaVinci Resolve, and Resolve sends the signal out through the UltraStudio Monitor 3G, 100% bypassing your GPU. NICE. This ensures your grade is not adjusted by the computer in any way.
Now something to know, the UltraStudio Monitor 3G is HD only. This is fine for me as my 24″ Rec. 709 grading display is HD. If you want to monitor in 4K you need the UltraStudio 4K Mini or one of their DeckLink cards. So however you go about it, you must bypass your GPU for accurate grading.
Again, you still might be thinking, "Why monitor in HD?" Remember, pixel count has nothing to do with color space. If you're monitoring in Rec. 709 you don't need 4K. However, it can be easier to find a 4K display that supports a 2000:1 contrast ratio, which would be more accurate. In that case you'll need a DeckLink card or the UltraStudio 4K Mini instead of the UltraStudio Monitor 3G. Either one will work.
This is where things can get a little dicey. So what is display calibration? In a very simple way, it’s like wheel alignment on a vehicle. All displays, regardless of cost, must be calibrated periodically to correct drift and maintain their accuracy.
When you order a new display, it's calibrated at the factory. But after 20 hours or so of use, the colors will drift and it's no longer 100% accurate. It'd be like shooting a rifle with a scope that's not dialed in: it doesn't matter how well you aim, you will NOT hit the target because the scope is off. Likewise, if our monitor shifts, it doesn't matter how well we grade.
What’s the solution?
First, know there are two types of displays: displays that hold their own calibration data, and displays like my ASUS PA247V that rely on software calibration stored on your computer (or a LUT box). A hardware-calibrated display costs more, plus you have to pay someone to come in and calibrate it for you.
Software calibration allows you to calibrate your own display using a colorimeter. You simply tell the calibration software what to calibrate to (Rec. 709, for example), place the colorimeter on your display, and the software runs thousands of color swatches on the screen, which are scanned by the device. It then performs a lot of calculations based on what the display is doing versus what it should be doing, and creates an ICC profile. Then when you use the display, the ICC profile gives your OS the corrections.
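Conceptually, here's what that software is doing, boiled down to one dimension: compare how the display actually responds to a ramp of test values against the target response, then solve for a correction curve. A toy Python sketch, where the display's "measured" native gamma of 2.2 is an assumed value purely for illustration (a real colorimeter measures thousands of full-color patches, not just a gray ramp):

```python
# Toy 1D version of display calibration: "measure" the display's actual
# response, compare with the target response, and build a correction.
# NATIVE_GAMMA is an assumed stand-in for what a colorimeter would find.
import math

NATIVE_GAMMA = 2.2   # assumed measured behavior of the display
TARGET_GAMMA = 2.4   # what we want for Rec. 709 grading

def measured_luminance(signal: float) -> float:
    """Stand-in for a colorimeter reading, normalized 0..1."""
    return signal ** NATIVE_GAMMA

def correction(signal: float) -> float:
    """Pre-distort the signal so the display lands on the target curve."""
    return signal ** (TARGET_GAMMA / NATIVE_GAMMA)

# Verify: the corrected signal, shown on the native display, matches
# the target gamma 2.4 response at every test level.
for s in (0.1, 0.25, 0.5, 0.75, 1.0):
    assert math.isclose(measured_luminance(correction(s)), s ** TARGET_GAMMA)
print("correction maps native gamma 2.2 onto target gamma 2.4")
```

The real process does this across the full 3D color volume, which is why it produces a profile (or LUT) rather than a single exponent.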
There’s an issue with ICC profiles when it comes to grading in DaVinci Resolve. They still rely on the GPU and OS of your computer. And they may not be fully supported by Resolve. So what’s the solution? Creating a 3D LUT file that’s used with your Blackmagic Design setup. The 3D LUT stores the required calculations for you and allows the corrected signal to go to your display without the GPU involved.
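For reference, the 3D LUT that Resolve and most calibration tools exchange is usually a plain-text .cube file: a header line plus one RGB triplet per grid point, with the red index varying fastest. Here's a minimal Python sketch that writes an identity LUT, one that changes nothing; calibration software writes the same format, just with corrected values at each grid point:

```python
# Minimal sketch of the .cube 3D LUT format. This writes an *identity*
# LUT (output == input); a calibration LUT holds corrected values.

LUT_SIZE = 17  # common grid sizes are 17, 33, or 65 points per axis

def identity_cube(size: int) -> str:
    lines = [f"LUT_3D_SIZE {size}"]
    # .cube convention: red index varies fastest, then green, then blue.
    for b in range(size):
        for g in range(size):
            for r in range(size):
                lines.append(
                    f"{r/(size-1):.6f} {g/(size-1):.6f} {b/(size-1):.6f}"
                )
    return "\n".join(lines)

with open("identity.cube", "w") as f:
    f.write(identity_cube(LUT_SIZE))
```

Load a file like this in Resolve and the image passes through untouched; a calibration LUT instead bends each grid point so the corrected signal reaches the display with no GPU or OS involvement.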
But here’s the catch: Normal calibration software DOES NOT create 3D LUT files. Only ICC profiles. You have to use software like Calman Studio from Portrait Displays (pricey) or DisplayCAL which is free but more difficult to use. I cover how to use DisplayCAL here:
DisplayCAL does not run on newer versions of macOS. There is a forked version that supports Python 3, but I have yet to test it in detail. The other software consideration is Calman Studio, but it comes with a price tag.
We've talked about SDR and HDR. About display capabilities. You now know to bypass the GPU. And your display must have regular calibration to stay accurate. So what is the least expensive way to do all of this? For macOS, the cheapest route is the ASUS ProArt PA247V (or a similar 100% Rec. 709 display), the Blackmagic Design UltraStudio Monitor 3G, a colorimeter, and DisplayCAL for calibration.
All of that, including the display, will cost about $650. That's significantly less than the display alone for an HDR-capable setup. Is this budget setup ideal for everything? No. But for those of us putting indie films online? It's great.
Again, if you want to monitor in 4K vs HD, you can do that via a Blackmagic Design DeckLink card (with an external PCIe enclosure) and a 4K display that supports Rec. 709. And of course, to monitor in HDR things change even more.
Hopefully this helps you get started correctly with your grading! This seems like a lot, because it is. But once you have it dialed in, you’ll have a solid solution to make your films look incredible.
Our online film school designed for directors can show you how. Find out more below!
Write & Direct is your inciting incident. It's your catalyst in the journey to becoming a filmmaker. There's nothing else quite like it. Enroll today or find out more below: