Interactive wireframe of a parametric ODT concept

Below is the interactive wireframe presented at the ACES ODT Virtual Working Group Meeting.

It has a few minor bugs, and the specific options were all intended primarily as a conversation starter, but I'm just wondering what everyone thinks of the overall concept.

Thanks
Alex


We briefly discussed this already, and it seems to be a great base. Some initial things I would be keen to discuss:

  • Presets for various types of displays:
    • Projector
    • Computer Monitor
    • TV
    • VR HMD
    • AR HMD
    • (i)Phone
    • You name it…
  • Extended viewing conditions and appearance modeling support (critically important for HDR and VR HMDs).
  • Metadata carriage once the ODT is generated; this ties in with the archival mention from this morning.
  • Low- and high-end rolloff control (see the toy sketch after this list).
  • Spline evaluation, and how the splines are constrained while still exhibiting smooth derivatives. Talking about this makes the RRT itch super hard; I will keep that for the next meeting :wink:
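
For concreteness on the rolloff point, here is a toy sketch of the kind of C1-continuous shoulder I have in mind. It is entirely hypothetical, not a proposal for the actual curve:

```python
def soft_shoulder(x, knee=0.8, limit=1.0):
    """Toy C1 highlight rolloff (hypothetical, not from the wireframe):
    identity below `knee`, then a hyperbolic shoulder that asymptotes to
    `limit` while matching the unit slope at the join, so the first
    derivative stays continuous."""
    if x <= knee:
        return x
    t = (x - knee) / (limit - knee)
    # t / (1 + t) rises from 0 with unit derivative in t, so the overall
    # slope at the knee is exactly 1 and tends to 0 as x grows.
    return knee + (limit - knee) * (t / (1.0 + t))
```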

Cheers,

Thomas

Keep in mind it’s the display, the viewing conditions, and the encoding that dictate the ODT. All need to be specified for each display type.

We’ve been looking closely at ACESclip to determine if the current spec meets these needs or not.

This requires a more detailed conversation, but it may have archival implications.

Absolutely :slight_smile:

I was not necessarily thinking about “live” controls, but parameters you set upon generating the ODT, akin to the Display Max Luminance, Display Min Luminance and 18% Grey Luminance.
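
To make that concrete, here is a minimal sketch of such generation-time parameters. The field names are hypothetical, only paraphrasing labels like those in the wireframe, not its actual API:

```python
from dataclasses import dataclass

@dataclass
class ODTParameters:
    """Hypothetical generation-time parameters for a parametric ODT;
    the field names only paraphrase the wireframe's labels."""
    display_max_luminance: float = 1000.0  # cd/m^2
    display_min_luminance: float = 0.005   # cd/m^2
    grey_18_luminance: float = 15.0        # output target for 18% scene grey, cd/m^2
```

You would fill these in once when baking the ODT; they would not be live controls at playback time.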

Overall I like the concept. I think it is a much better way of covering all the possible ODTs than simply providing an official stock ODT for every possible permutation. Even just for simple stuff like the fact that there is no P3 D65 ODT in the repo: with this, making one would be easy.

I notice HLG is included under “Display EOTF”. What is the proposed ACES approach to HLG, given that, per the BBC/BT.2100 spec, HLG is scene referred, with picture rendering applied by the display? That clearly doesn’t fit the way ACES is designed to work. Do you simply treat it as display referred, and use an inverse of the HLG EOTF of the display used for grading (e.g. the “HLG Reference EOTF” from BT.2100 with Lw = 1000 cd/m^2 for an X-300), making HLG “Mastering Display Referred”, as Charles Poynton has always said HLG is? (I’m inclined to agree with him.)
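
For what it’s worth, the display dependence is easy to see in the reference HLG system gamma, which BT.2100/BT.2390 define as a function of nominal peak luminance. A trivial illustration (my own snippet, nothing from the wireframe):

```python
import math

def hlg_system_gamma(L_W):
    """Reference HLG system gamma per BT.2100/BT.2390 for a display
    with nominal peak luminance L_W in cd/m^2."""
    return 1.2 + 0.42 * math.log10(L_W / 1000.0)

print(hlg_system_gamma(1000.0))  # 1.2, e.g. a 1000 cd/m^2 X-300 mastering display
```

So inverting the EOTF of a 1000 cd/m^2 grading display bakes a specific gamma = 1.2 rendering into the signal, which is exactly what makes it “mastering display referred”.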

That, I think, is the big win. Give people the ability to create their own ODTs.

Yeah, the best way we’ve been able to incorporate HLG to date is to use the existing ST-2084 ODTs and convert to HLG using the methods specified in BT.2100 and BT.2390-2.
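
Roughly, the chain looks like this. This is my own sketch of the BT.2100 maths, assuming the BT.2390 1000 cd/m^2 interchange, not code from the actual ODTs:

```python
import numpy as np

# BT.2100 PQ (ST 2084) constants.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

# BT.2100 HLG OETF constants.
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * np.log(4 * A)

def pq_eotf(E_p):
    """PQ signal [0, 1] -> absolute display light in cd/m^2."""
    V = np.asarray(E_p, dtype=np.float64) ** (1 / M2)
    return 10000 * (np.maximum(V - C1, 0) / (C2 - C3 * V)) ** (1 / M1)

def hlg_oetf(E):
    """HLG scene-linear [0, 1] -> non-linear signal [0, 1]."""
    E = np.asarray(E, dtype=np.float64)
    return np.where(E <= 1 / 12,
                    np.sqrt(3 * np.maximum(E, 0)),
                    A * np.log(np.maximum(12 * E - B, 1e-12)) + C)

def hlg_inverse_eotf(F_D, L_W=1000.0):
    """Display light (RGB, cd/m^2) -> HLG signal: undo the reference OOTF
    (system gamma applied on luminance), then apply the OETF, per BT.2100."""
    F_D = np.asarray(F_D, dtype=np.float64)
    gamma = 1.2 + 0.42 * np.log10(L_W / 1000.0)
    Y_d = 0.2627 * F_D[..., 0] + 0.6780 * F_D[..., 1] + 0.0593 * F_D[..., 2]
    Y_s = (np.maximum(Y_d, 1e-12) / L_W) ** (1 / gamma)
    E_s = F_D / (L_W * Y_s[..., None] ** (gamma - 1))
    return hlg_oetf(E_s)

def pq_to_hlg(rgb_pq, L_W=1000.0):
    """Decode PQ to display light, clip at the interchange peak, then
    re-encode through the HLG inverse EOTF at the same nominal peak."""
    return hlg_inverse_eotf(np.minimum(pq_eotf(rgb_pq), L_W), L_W=L_W)
```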

That’s what I suspected. I remain unconvinced by the use of scene referred HLG.


I know that is a point of disagreement. Although it is not specified in ITU-R BT.709 itself, ISO 22028 notes that ITU-R BT.709 is scene referred as well, though there is a footnote about how it is treated in many cases as output referred. The idea, in the case of both 709 and 2020, is that they are scene referred because both are specified as camera encodings.

I think most have come to accept 709 as output referred because of the dynamic range and encoding-primary limitations imposed by the encoding and the workflows associated with it.