Preliminary results of tests using Prelight, Lattice & Resolve

I’ve just published this to CML but I thought it would be of interest here as well…

I’m sharing these results on the basis that they are a first look, and all the software used is either in beta or in the process of being updated!

The workflow is as follows:

The .ari file was loaded into ARC and output as a LogC Wide Gamut JPEG; this is to try to simulate what would happen if we were using the camera output directly. In reality I will use a BMD Mini Thunderbolt-to-SDI adapter for input from camera to Prelight.

The JPEG that resulted from the above process was loaded into Prelight and a very simple grade applied. This was then output as a LUT in two forms: the first was grade only, the second IDT, RRT, Grade & ODT combined.

The same JPEG was then loaded into Resolve in ACEScc mode with an IDT and ODT, the complete LUT from Prelight was applied, and the result rendered out to a TIFF. I then set Resolve to the Alexa IDT and sRGB ODT and rendered the output to a TIFF again.

I then measured the results in grey and RGB; I also tried the full look loaded in Lattice.

Prelight Grade

Grey 155 158 159

Red 142 29 38

Green 12 135 47

Blue 1 43 130

Lattice Look

Grey 156 156 159

Red 147 28 38

Green 16 133 47

Blue 1 42 130

Resolve rendered complete package

Grey 152 158 159

Red 147 27 36

Green 12 143 36

Blue 1 42 129

Resolve rendered LUT only

Grey 154 156 158

Red 145 28 36

Green 1 143 47

Blue 1 43 129

I think these results are pretty much within the margin of error of measuring them with the Mac colour tool!
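To make the margin-of-error claim easy to check, here is a minimal Python sketch (using the patch values exactly as posted above) that computes the largest per-channel deviation across the four renders:

```python
# Measured RGB patch values from the four renders posted above.
readings = {
    "Prelight":     {"grey": (155, 158, 159), "red": (142, 29, 38),
                     "green": (12, 135, 47),  "blue": (1, 43, 130)},
    "Lattice":      {"grey": (156, 156, 159), "red": (147, 28, 38),
                     "green": (16, 133, 47),  "blue": (1, 42, 130)},
    "Resolve full": {"grey": (152, 158, 159), "red": (147, 27, 36),
                     "green": (12, 143, 36),  "blue": (1, 42, 129)},
    "Resolve LUT":  {"grey": (154, 156, 158), "red": (145, 28, 36),
                     "green": (1, 143, 47),   "blue": (1, 43, 129)},
}

def patch_spread(patch):
    """Largest per-channel difference across all four renders for one patch."""
    values = [render[patch] for render in readings.values()]
    return max(max(channel) - min(channel) for channel in zip(*values))

spreads = {p: patch_spread(p) for p in ("grey", "red", "green", "blue")}
print(spreads)
```

Running this shows grey and blue agree within a handful of code values, while the green patch has the widest spread, so that patch is the one worth re-measuring if anyone wants to tighten the comparison.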

I’ll be creating a webpage of these results in the next day or so, complete with the images used and generated.


Thanks Geoff. I’m adding some copy below (paraphrased from your CML page) because I think it helps provide some context for what you’re trying to do:

Preliminary results of ACES workflow on low budget productions

These tests document the process of creating LUTs for both on-set monitoring and producing rushes, using ACEScc in a very simple and inexpensive way.

Best,

Steve

Thanks Steve, just to clarify things…

I am trying to find the simplest way for a Cinematographer to get his intent onto the screen in a reliable and predictable way.

To me ACES is the clear solution, it’s a question of standards, reference levels etc.

I know that there are other ways of doing what I’m doing, Lightspace CMS for one, but I’m trying to find a clear and easy way for any Cinematographer, regardless of budget, to get the look he desires onto low cost monitors on set, through rushes and as a guide for final grade.

I think this works well this way…

http://www.cinematography.net/ACES-Test-low-budget.html


Geoff, thanks for publishing the results of your test.
Although the link worked the first time I visited the CML website, it is not working anymore. I tried it with Safari and Chrome on my Mac and through my iPhone. Same result.

Hmm, it’s working here, I’ll talk to Network Solutions and see if there’s an issue.

Although it’s working on my iPhone now, it is still not working on my computer.
Here is what I get when trying to access the page:
“Safari can’t open the page ‘ACES workflow on low budget productions’ because Safari can’t establish a secure connection to the server www.cinematography.net”

Geoff, thanks once again for sharing the results of your test.

Unlike the static workflow you are using in this test, the hardware configuration you are intending to use on-set requires more than a single BMD UltraStudio Mini Recorder or Mini Monitor. It requires both: one to capture the live video signal or grab stills (the Mini Recorder), and the Mini Monitor to send the signal to the monitor chain. The downside of this configuration is that it uses both Thunderbolt ports of a laptop or desktop computer, because neither of the UltraStudio Minis has a Thunderbolt loop-through. This leaves us with no option for connecting data-management Thunderbolt I/O devices on-set. Unless, of course, we are using two computers, as I do when possible: one for data management (backup and dailies) and a dedicated one for colour management.

The reason for using a single computer is that very often the location doesn’t permit us to bring the full DIT cart or the desktop computer on-set. To overcome this, a video interface that integrates both input and output in one device is recommended, like the BMD UltraStudio 4K Extreme, leaving one Thunderbolt port free. I don’t recommend the UltraStudio 4K because of the noise (average 77 dB SPL at 5 feet) its internal turbine produces to dissipate heat; for this reason that unit cannot be used near or on set when recording sound. The UltraStudio 4K Extreme is a better choice.

In addition, a LUT box is required for monitoring the live grade going out to the director’s and video village monitors.
It’s important to note that a LUT box with a 33×33×33 grid is needed for an ACES and/or HDR workflow; otherwise we are going to get constant interpolation banding if the LUT box or the camera’s internal LUT server only supports a 17×17×17 LUT size. This includes scenes with high luminosity, like sunsets, blown-out doors and windows, and specular highlights.
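The grid-size point can be illustrated with a toy 1D example. This is only a stand-in for the 3D case, and the curve `x ** 0.3` is an assumed steep shadow-region tone curve (not any real camera or ACES transform): sample it at 17 and at 33 points and compare the worst-case linear-interpolation error.

```python
# Toy illustration of LUT grid density: a steep tone curve sampled at
# 17 vs 33 points, evaluated with linear interpolation (a 1D stand-in
# for the trilinear interpolation a 3D LUT box performs).
def curve(x):
    return x ** 0.3  # assumed steep shadow-region curve, illustration only

def lut_error(size, samples=10001):
    """Worst-case error of a size-point LUT evaluated by linear interpolation."""
    grid = [i / (size - 1) for i in range(size)]
    table = [curve(g) for g in grid]
    worst = 0.0
    for i in range(samples):
        x = i / (samples - 1)
        # Find the segment containing x and interpolate linearly within it.
        j = min(int(x * (size - 1)), size - 2)
        t = x * (size - 1) - j
        approx = table[j] + t * (table[j + 1] - table[j])
        worst = max(worst, abs(approx - curve(x)))
    return worst

err17, err33 = lut_error(17), lut_error(33)
print(f"17-point worst error: {err17:.3f}, 33-point worst error: {err33:.3f}")
```

The 17-point table shows a noticeably larger worst-case error, concentrated where the curve is steepest, which is exactly where banding becomes visible on a display.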

For the new cameras supporting an internal LUT server, like the Panasonic 35 and ARRI Alexa XT, the workaround is to bypass the internal LUT server and use a LUT box with the 33×33×33 size.
The downside of this workflow is that we lose the automatic metadata, travelling with the files, that records which looks have been applied. So we need to keep manual track of this information through timecode and proper labelling within Prelight, LiveGrade or whatever application we use on-set.

Here is a workaround for the Sony F65 related to CDL.
“Sets whether to apply ASC CDL in the check box. Selects an imported CCC file or CDL file. The SDI1 system supports “ACES1.0-Rec.709” settings that perform color conversion using a built-in 3D LUT. The F65 employs a 3D LUT with 17×17×17 lattice to obtain a contour line signal for areas of smoothly varying luminous intensity. The recording signal is not affected, allowing the processing to be improved using a color grading tool that performs color conversion employing a higher-degree 3D lattice grid.”

I’m wondering why in your test you are using LUTs instead of ACES CDL, or a combination of LUT + CDL, to facilitate further changes and the interchange of metadata in post between facilities or applications?

LiveGrade allows the user to combine CDLs and a LUT in Simple Mode, and to stack multiple CDLs using ACES CDL Advanced mode. However, only one saturation value can be exported from the CDL stack.
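For reference, the ASC CDL discussed here is just three per-channel operations (slope, offset, power) plus one global saturation, which is why only a single saturation survives export and why some find it coarse. A minimal sketch of the published math (Rec.709 luma weights; clamping behaviour varies between implementations):

```python
def apply_cdl(rgb, slope, offset, power, sat):
    """ASC CDL: per channel, out = clamp(in * slope + offset) ** power,
    then a single global saturation applied around Rec.709 luma."""
    graded = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = max(v * s + o, 0.0)  # clamp negatives before the power function
        graded.append(v ** p)
    r, g, b = graded
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return tuple(luma + sat * (c - luma) for c in graded)

# An identity CDL (slope 1, offset 0, power 1, sat 1) leaves a pixel untouched:
print(apply_cdl((0.5, 0.25, 0.1), (1, 1, 1), (0, 0, 0), (1, 1, 1), 1.0))
```

Note there is no separate shadow/mid/highlight control anywhere in that formula, which is the limitation raised in the replies below.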

Yeah, but once again, this is a workflow/method for people on low-budget shoots who are highly unlikely to have a DIT for on-set grading.
The solution is for creating a usable LUT that will emulate ACES plus a grade on set.
It will be likely created in prep or in the evening in a hotel room.
On set it’ll simply be loaded into something like a Bolt and the desired image transmitted to everyone.
I really don’t like the CDL approach; it’s far too limiting. In FilmGrade in Prelight I can make a simple overall colour change with Sat and then go to shadows/mids/highlights and make a quick tweak to the midrange.
It’s easy to get the look I want, working in a language I understand, and BaseGrade takes this further. CDL makes me use Power, Shift, Bump & Grind; it’s not how I talk about pictures, nor any of my crew.
Prelight works the way I do and gives me both a combined LUT and a simple LUT.
It’s simple!
It’s fast!
It doesn’t need more bodies, which is especially useful in a situation where your 2nd AC is your “unloader”.

K.I.S.S

The beauty of digital democratization is that it offers us tons of options to fit different production situations.

Thanks for the tips on your workflow.

By the way, I’m a beta tester of Prelight too. The application looks promising.
I haven’t used it in production because FilmLight advises against using the application in actual production. However, I’m waiting for the final release.

It is not necessary to have a live input to Prelight, and although you can add a Thunderbolt Mini Monitor and pass the graded live image to a monitor, I would always use a LUT box for video village, as looping through Prelight adds a delay, and if the laptop has an issue, everybody loses their picture. A LUT box keeps the LUT even if the computer goes down.

If you’re doing this in prep, you can shoot an actual image and import that into Prelight. Geoff, your step of going through the ARRI RAW converter and creating a JPEG is not necessary. You could have used the .ari file in Prelight directly.

If your LUT box is a BoxIO, you can grab live images with that. My preferred approach is to have the live signal going through the BoxIO on Prelight channel A, grab frames using that, and create a look on channel B in Prelight, monitoring on an SDI monitor via a BMD Mini Monitor. When I have a look everybody is happy with, I push that to the LUT box on channel A.

I don’t normally have a Mini Recorder sending a live feed to Prelight. It is not implemented yet, but I understand that Prelight will be able to use the image grabbing and timecode reading ability of the BoxIO to allow logging without needing a live feed.

I tested this when I had a camera here, I don’t at the moment, and it worked fine.

The JPEG was so that I could easily post the original file as well as the rendered images; otherwise I’d get complaints. You can’t please all the people :slight_smile:

I’m running a CML camera evaluation next week and I’ll try and see if I have time to fit in tests of this workflow with all the cameras.

Sorry. I was unclear. The thing which is not yet implemented is reading timecode from the BoxIO when logging in Prelight. Currently you need a direct SDI feed from the camera, such as a Mini Recorder, to do this. Although you can now log approximate timecodes with Prelight’s internal free running timecode.

For those using the Blackmagic Design UltraStudio 4K:
The loud fan noise of the unit mentioned in this thread has been fixed by updating to the recent Blackmagic Desktop Video 10.9.5 on macOS Sierra.

Hi Geoff.
First of all, thanks for your test. I just wanted to make sure that the order of the ACES components is reported correctly. I’m pretty sure that FilmLight Prelight does things in the right order, and that this is what you meant while writing, but the correct order would be:

  • Input Transform (legacy name: IDT)
  • Grade
  • Output Transform (combination of legacy RRT then ODT; nothing should be applied between the two, otherwise ACES interoperability is broken).

(Not “IDT, RRT, Grade & ODT combined” as it might be interpreted).
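The ordering can be made concrete with a sketch: if each stage is a function, the ACES chain is a straight composition with the grade strictly between the Input Transform and the Output Transform. The stage bodies below are placeholders for illustration, not real ACES math:

```python
# Placeholder stages -- stand-ins for the real ACES transforms, used
# only to show where the grade sits in the chain.
def input_transform(x):    # legacy IDT: camera encoding -> ACES
    return x * 2.0         # placeholder

def grade(x):              # creative grade, applied in ACES space
    return x + 0.1         # placeholder

def output_transform(x):   # legacy RRT then ODT, treated as one block:
    return min(x, 1.0)     # nothing may be inserted between the two

def aces_chain(x):
    # Input Transform -> Grade -> Output Transform, in that order.
    return output_transform(grade(input_transform(x)))

print(aces_chain(0.3))
```

Because the Output Transform is the last, non-splittable block, a grade can never legitimately sit between the RRT and ODT.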

Yes, Prelight does do things in the correct order. It’s just an incompetent user who lists things in the wrong order :slight_smile:

I’m in the process of producing a whole bunch of LUTs to use in on-set monitoring that will emulate the ACES workflow.

This is being done as part of my ACES workshops with the NSC and the next one, on the 24th January, will have many camera systems in a studio with various IO systems and grading setups.
We’ll be testing the current ACES emulations that I’ve done with the NSC, which are for Alexa, CLog2, SLog3 & VLog; there are also two experimental ones that include a 0.85 contrast setting for Alexa & CLog2.
All of these are based around ACEScct.

I’ve received a very kind offer of help from Nick Shaw and I’m sure he’ll make sure that my enthusiasm doesn’t run past my ability!!

I am looking forward to testing with Geoff at the workshop tomorrow, having a range of cameras and LUT boxes available. I have prepared a selection of LUTs in different forms (EE and LL, to use the ARRI LUT generator nomenclature) and will test and document which is the correct version to use in each LUT implementation.


Geoff, the link is finally working here too. Thanks!

It’s an HTTPS address, well, it is now :slight_smile:
I’ll be uploading all the files that Nick and I presented at the workshop in the very near future.
I’m trying to collect all the relevant data on input and output levels of various LUT boxes before I do so that I can include that info as well.


Hi. First-time poster here. I’m trying to live monitor in Resolve. The colour space is ACEScct, the input is BMD Film 4.6, and the output is Rec.709.
I’m using an UltraStudio Mini Monitor and Mini Recorder combo to take input from a URSA Mini Pro into a MacBook Pro and out to an FSI monitor. The signal path works fine, but the resulting image is super red and oversaturated.

Hi Jonathan,

Welcome to the forum.
DaVinci Resolve’s Resolve Live doesn’t support ACES. You will need LiveGrade or Prelight, or other live-grading software that already supports ACES. Another downside of Resolve Live is that it currently doesn’t support LUT boxes.