Introduction
I am Aby Aneja, an aspiring film director in the sports and adventure segment. I have always been fascinated by visuals in cinema driven by color; films like The Batman, Blade Runner, Joker, and Dune are my inspirations and masterpieces to watch if you are stimulated by color. From the beginning of my career, I wanted my videos to look like movies. It took me five years to make that possible, but achieving just that was not enough; I wanted more than a ‘film look’. In my journey of creating, directing, and post-producing brand content, which is usually meant to be viewed on social media, I have been trying to simulate the mechanics of high-end cinema on low-end projects. Color grading the way they do it in the movies is expensive to pull off: you need a pro colorist and a legit post-production studio with high-end monitors. But there is a lot that can be done sitting at home on the couch with a MacBook. I wanted to explore the possibility of widening the scale of the visual experience in content created for YouTube with limited resources. To make it happen, I integrated HDR (High Dynamic Range) into my color grading workflow in DaVinci Resolve Studio and combined it with a film emulation process in an ACES pipeline. I basically ‘home-cooked cinema in HDR’.
Color Spaces and Pipelines
When I started to grade, I had no idea about color pipelines, color spaces, color gamuts, and gamma. The basics of grading start with understanding the relative size of each color space: Rec709/sRGB < P3 < Rec2020 (Rec709 and sRGB share the same primaries, so their gamuts are the same; they differ in their transfer function). Gamma is the transfer function that maps code values to displayed light, and it largely determines the perceived contrast and black level of an image. The standard values are Gamma 2.2, used for internet deliveries; Gamma 2.4, used for broadcasting on TV; and Gamma 2.6, used for cinema projection. A color pipeline is a framework of inputs and outputs for the color process, based on the camera the footage was shot with and the device people are going to see it on. ACES is an example of a color pipeline that is widely used in movies. It’s also important to know the codec and color space of the footage coming out of the camera. Most of the content you see on YouTube is delivered in SDR in the Rec709 color space. On the other hand, Amazon Prime, Netflix, and Apple TV now stream content in HDR/Dolby Vision, which uses wider color spaces like Rec2020/P3. Rec709 is the smallest of them all and the most widely used color space out there.
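As a rough way to see that size ordering, here is a small Python sketch of my own (an illustration, not part of any grading tool): each gamut is a triangle on the CIE 1931 xy chromaticity diagram, so we can compare the triangle areas with the shoelace formula, using the published xy primaries of each standard.

```python
# Compare gamut "sizes" as triangle areas on the CIE 1931 xy diagram.
# Primaries are the published xy coordinates for each standard; note
# Rec709 and sRGB share the same primaries, hence one entry for both.

def triangle_area(p):
    """Shoelace formula for the area of a triangle given 3 (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

PRIMARIES = {
    "Rec709/sRGB": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "P3-D65":      [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec2020":     [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

for name, prims in PRIMARIES.items():
    print(f"{name:12s} area = {triangle_area(prims):.4f}")
```

Running this prints the areas in increasing order, Rec709/sRGB < P3-D65 < Rec2020, which is exactly the size ranking above.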
Experimenting with Wider Color Spaces
For my personal and professional deliveries on YouTube, I wanted to explore a color space bigger than Rec709, find out whether people could actually see more colors with a wider color space, and figure out whether consumer devices could play it back. Initially, I was grading in the Rec709 color space and outputting to Rec709 for YouTube deliveries. Then I started experimenting with sRGB, which shares the same gamut as Rec709; however, the 2.2 gamma natively associated with sRGB made the contrast of my visuals look washed out. I also tried working in P3 D60/D65 at Gamma 2.4, which has a beautiful contrast ratio, but outputting it to YouTube didn’t work, since YouTube converts everything to Rec709 at Gamma 2.2. (It is possible to see P3 colors on an iPhone when the file is AirDropped directly to the phone.) That established that YouTube doesn’t support a color space bigger than Rec709 or a gamma higher than 2.2 in SDR. Meanwhile, I kept my research open until I found Rec2020, the biggest color space, used in HDR, which YouTube does support.
YouTube's Gamma Discrepancy: Gamma Shift?
The YouTube player has a native gamma value close to 2.2. Editing software like DaVinci Resolve and Final Cut, on the other hand, defaults to gamma 2.4, which is meant for content broadcast on TV. The difference is that 2.2 renders lighter black levels while 2.4 renders darker, more contrasty blacks. When rendering for YouTube, grading and exporting at 2.4 will give you a nicer image for sure, but the moment it’s uploaded, YouTube displays it at roughly 2.2, leaving it pale, faded, and less contrasty. To fix this on their own platform, Apple came up with Rec709-A (‘A’ stands for Apple) as a gamma tag: using it in the masters displays the same gamma across all Apple devices. Rec709-A can only be used in SDR and is close to gamma 2.3. If you like the look of gamma 2.4, there is simply no way to play 2.4 gamma back faithfully on YouTube in SDR. HDR, however, has its own transfer functions; whether you choose HLG or PQ, the black levels stay deep and play back correctly on YouTube.
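The shift is easy to reproduce numerically. Here is a minimal sketch, assuming idealized pure power-law displays (a simplification of what player software actually does): a dark pixel encoded for a 2.4-gamma display but decoded by a player assuming 2.2 comes out brighter, which reads as washed-out shadows.

```python
# Minimal model of the gamma shift: encode for a 2.4 display,
# decode at the player's assumed 2.2, and the shadows lift.

def encode(linear, gamma):
    """Scene-linear value -> code value for a pure power-law display."""
    return linear ** (1.0 / gamma)

def decode(code, gamma):
    """Code value -> displayed light for a pure power-law display."""
    return code ** gamma

shadow = 0.05                 # a dark linear value the colorist intended
code = encode(shadow, 2.4)    # mastered assuming a 2.4 display
shown = decode(code, 2.2)     # but the player decodes at ~2.2
print(f"intended {shadow:.4f}, displayed {shown:.4f}")  # displayed is brighter
```

The displayed value works out to 0.05^(2.2/2.4), which is always larger than 0.05 in the shadows: exactly the pale, less contrasty look described above.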
Introducing ACES and Film Emulation in SDR
Meanwhile, still color grading in SDR, I was introduced to the ACES color space and built an ACES pipeline for delivering content on YouTube. Movies like Oblivion, The Legend of Tarzan, and Guardians of the Galaxy were graded in an ACES pipeline. Introducing ACES into my workflow was the first step towards building a legit, industry-standard post-production pipeline. Grading in ACES is different from grading in Rec709: ACES has a far wider color gamut, and grading tools like the color wheels respond with more flexibility. The next step is integrating film emulation into ACES, a more scientific approach to deciding the look of a film based on stocks like Agfa, Fuji, and Kodak. This is where the Dehancer Pro plug-in comes in, making digital footage look like it was shot on a film camera by simulating the process of shooting on negative stock and then printing it, digitally. The third step is outputting to the final color space, which in SDR was Rec709, so the pipeline converts the footage from the biggest color space (ACES) down to the smallest (Rec709). It didn’t make sense to me to shoot with a camera that captures 10-bit 4:2:2 at 4K and deliver in Rec709, wasting a huge amount of the color spectrum, only for people to watch it compressed at 720p in a browser, which is a bummer. So I came up with the idea of integrating HDR into the ACES pipeline to produce and deliver content in a wider color space like Rec2020 for any small- to high-budget project on YouTube, and it changed the game for me forever.
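To see concretely what “wasting a huge amount of the color spectrum” means, here is a hedged sketch (the helper function name is my own): converting a fully saturated Rec2020 green into Rec709 through CIE XYZ lands outside the 0–1 Rec709 cube, so a Rec709 delivery has to clip it away. The matrices are the published RGB-to-XYZ matrices for each standard, both with a D65 white point.

```python
import numpy as np

# A saturated Rec2020 color expressed in Rec709 goes out of gamut:
# negative components mean "not representable", and get clipped on delivery.

REC709_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                          [0.2126, 0.7152, 0.0722],
                          [0.0193, 0.1192, 0.9505]])

REC2020_TO_XYZ = np.array([[0.6370, 0.1446, 0.1689],
                           [0.2627, 0.6780, 0.0593],
                           [0.0000, 0.0281, 1.0610]])

def rec2020_to_rec709(rgb):
    """Convert linear Rec2020 RGB to linear Rec709 RGB via CIE XYZ."""
    return np.linalg.inv(REC709_TO_XYZ) @ (REC2020_TO_XYZ @ np.asarray(rgb))

pure_2020_green = [0.0, 1.0, 0.0]
rgb709 = rec2020_to_rec709(pure_2020_green)
print(rgb709)  # negative R and B components: outside the Rec709 gamut
```

Every pixel like that in a wide-gamut grade is information the Rec709 master simply cannot carry, which is the argument for delivering in Rec2020 instead.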
Why HDR?
HDR gives you an opportunity to widen the scale of the visual experience for viewers consuming content on their phones, tablets, or MacBooks. With an HDR screen, you can view content mastered for 500 to 2000 nits of brightness. In HDR there is a shift in contrast: the image has deeper black levels that are not achievable in SDR. People are mistaken when they think HDR is all about watching content at 2000 nits; hardly any element in a frame is lit above 1000 nits apart from the sun, explosions, or fire, and it’s blinding to see highlights pushed past that. Instead, HDR is more about the black levels, which make your image come alive and give it a deep texture.
Integrating HDR
I integrated a basic HDR workflow, outputting content to the Rec2020 color space, and fell in love with what it did to the footage. Rec2020 is the widest color space in use today; however, no monitor in the world can yet display 100% of its colors, so in practice coverage is limited to the P3 gamut. The transfer function associated with Rec2020 in HDR is ST2084 (PQ), usually mastered at a peak brightness of 1000 nits. There are a few HDR formats to choose from while grading: HDR10+, Dolby Vision, and basic HDR. Basic HDR is free and easy to use, whereas Dolby Vision requires a Dolby license. Then comes deciding the transfer function of the HDR project: HLG (Hybrid Log-Gamma, Rec2100) or PQ (Rec2020/ST2084), each of which gives a slightly different result while grading. Combining ACES, film emulation using Dehancer Pro, and basic HDR grading in Rec2020 PQ made my visuals look close to a Dolby Vision movie. Uploading to YouTube with colors and gamma intact was another challenge: YouTube supports HDR content, which means it supports Rec2020, and rendering the project as ProRes 422 in 10-bit from DaVinci Resolve matched the colors and gamma of my export 100% with the YouTube upload.
Consumption Formats?
The vast majority of people consume content on their phones; apart from the minority watching on computers, most are on Apple devices with HDR screens reaching up to 2000 nits of brightness and supporting the DCI-P3 color gamut. Amazon Prime, Netflix, and Apple TV deliver content in the P3 color space in SDR and in Dolby Vision/PQ in HDR. Since all my content is meant to be delivered on YouTube, it made sense to understand it more deeply: YouTube basically works in Rec709 for SDR and Rec2020/2100 for HDR. That is an opportunity to produce content in Rec2020 HDR and actually make use of the HDR screens in the devices available today.
Grading in HDR with a MacBook Pro
The M1 MacBook Pro comes with an HDR screen, color-calibrated display profiles like sRGB and Rec709 for internet deliveries, and an XDR display reaching 1600 nits of brightness and covering 100% of the P3 gamut. Grading SDR and HDR with these profiles has become much easier. There are a few ways to grade HDR in DaVinci Resolve Studio. The easiest is to use DaVinci YRGB color science, set the project's input color space to Rec2020 with ST2084 1000-nit gamma, output to the same, grade in that pipeline, and stop worrying about backwards compatibility with SDR; hardly anyone will watch your video on an old Lenovo tablet when most viewers are on iPhones and Macs with HDR screens, and even TVs have HDR these days. Another way is to use a color-managed workflow with Rec2100 HLG as the gamma input and output. Keep the MacBook's display profile on the default Apple XDR Display (P3). Export your projects in ProRes 422 with the respective color space and gamma tags, and the HDR metadata will be encoded automatically.
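For comparison, the Rec2100 HLG option mentioned above uses a different transfer curve from PQ. A minimal sketch of the HLG OETF from ITU-R BT.2100: square-root for the darker part of the range and logarithmic for highlights, which is what keeps HLG roughly watchable on SDR displays too.

```python
import math

# HLG OETF (ITU-R BT.2100): scene-linear signal in 0..1 -> encoded signal.
# Constants a, b, c are the published BT.2100 values.

A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e):
    """Encode a scene-linear value e (0..1) with the HLG OETF."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)           # square-root segment (shadows/mids)
    return A * math.log(12 * e - B) + C   # logarithmic segment (highlights)

print(hlg_oetf(1 / 12))  # 0.5: the crossover point between the two segments
print(hlg_oetf(1.0))     # ~1.0: peak signal
```

Unlike PQ, HLG is scene-referred with no absolute nit values baked in, which is one reason the two options grade slightly differently, as noted earlier.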
Conclusion
It’s far more complicated to produce content in HDR; however, it’s worth it to see your work in a bigger color space that makes the visuals come alive. I like the idea of HDR since it puts the brightness of today's playback devices, up to 2000 nits, to real use. There has been a lot of talk about how HDR failed; in my opinion, it’s got what it takes to move us to the next level of content creation and filmmaking.
Here is a new video delivered in HDR on YouTube
How to watch HDR content? Step 1: Tap the YouTube logo at the bottom and 'Open YouTube App' to watch it in 2160p 4K HDR. Step 2: Turn your iPhone/Mac brightness to max to enjoy the 1200 to 1500 nits of peak HDR luminance.
Left Behind | HDR, Travel Music Video
Hanle, at an altitude of 14,000 ft, is an extremely remote part of Ladakh in Northern India, in the middle of nowhere. It has the widest landscape in all of Ladakh, with beautiful grass fields and water streams clustered around the area. It is also a dark-sky sanctuary with low light pollution, where you can see clear stars and the Milky Way with the naked eye; India has built its highest astronomical observatory in Hanle. This video is very special to me since I had envisioned visiting Hanle in 2022, and it became a reality in July 2023, at the time of the Nomadic Festival happening down there. Creatively, the video is about me and my friend Harsh going on a bike trip from Leh to Hanle and capturing moments expressing freedom.
Featuring travel and shooting partner - Harshwardhan Singh Rathore
Gears
Panasonic S5 + Sigma 45mm F2.8 + 20-60mm kit lens, iPhone 14 Plus
Post-production: colored in DaVinci Resolve. Pipeline: ACES / Rec2020 HDR 1000 nits.
Getting Possessed | HDR, Cinematic Piece
A piece shot on the Lumix S5 in 4:2:0 10-bit HD H.264 Long-GOP, graded in HLG.