Ok, good then. Just feels a bit weird to use an output as an input. Labels ¯\_(ツ)_/¯.
Yes and no. Mostly no in my case. If it looks right on an iPad display (regular, Retina and P3 Retina) I'm good. P3 Retinas visually match what I'm seeing on my NEC MultiSync, which is hardware-set to sRGB at D65; the rest is calibrated with an i1 Display Pro probe.
The good thing about this: if it matches on one iPad, it'll match on virtually all iPads of that type. I'm matching to P3 and living with whatever the earlier ones do, since they're being phased out. Well, except that new cheap one, which has the old display, no coating and no P3 gamut.
That's how my system is set up. I had it on wide gamut for a while, but whenever someone doesn't embed a colour profile, everything looks garish, and pure red or green tones physically hurt to look at. After a year of that my tolerance was gone; I caved, set the display to sRGB on the hardware side, and calibrated everything to the sRGB 2.1 spec.
QuickTime's gamma issues are a fun and sordid tale indeed. When I write a ProRes 4444 file (which isn't really RGB), I get pretty much the same result as when writing a JPEG. So once I got the JPEG right, it wasn't much of a stretch to get the ProRes to look right too. And once that looks right, Compressor does a good job of keeping the colours close to the ProRes intermediate.
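For anyone curious, the sordid part largely boils down to two transfer functions disagreeing: footage encoded with the Rec.709 camera OETF but displayed as if it were sRGB comes out darker, especially in the mid-tones. A quick stdlib-only sketch (the 18% grey value is just an illustration, not anything from my pipeline):

```python
def srgb_encode(linear: float) -> float:
    """sRGB transfer function (IEC 61966-2-1)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def rec709_oetf(linear: float) -> float:
    """Rec.709 camera OETF (ITU-R BT.709)."""
    if linear < 0.018:
        return 4.5 * linear
    return 1.099 * linear ** 0.45 - 0.099

# 18% mid-grey: sRGB encodes it near 0.461, Rec.709 near 0.409,
# so a file tagged with the wrong transfer shifts visibly in the mids.
print(round(srgb_encode(0.18), 3), round(rec709_oetf(0.18), 3))
```

Same linear light, two noticeably different code values, which is why an untagged or mis-tagged QuickTime looks washed out or crushed depending on which way the mistake goes.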
Which is one of the reasons I'm testing with JPEGs: faster iteration, since QuickTime Player doesn't reload a file when it gets overwritten. The other reason is that some parts actually end up as a JPEG sequence with a separate alpha channel.
Thank you for taking all that time BTW.