LUT confusion

It's simple, really.

Now, with the latest version of eyeon Fusion, working with camera LUTs couldn't be easier. The profiles are all included. My new workflow is very simple:

  1. Simple Log to Lin workflow in Fusion
  2. Using stacked LUTs in Fusion
  3. Using grading artists' LUT files in Fusion
  4. Log maths: converting between gain and f-stops

Alas, while working on The Fades for the BBC, things were not so obvious. It proved impossible to find the correct LUT for the Sony F35 camera used by the production team, so in the end we did it all by eye. The maths can help, but there's a saying in VFX… 'if it looks right, it is right'.

I wrote this post in the middle of the confusion, and it outlines some of the issues we were having. Of course, once the production had finished, the correct LUT files were found.

Random notes from the past …

A LUT is needed when working on digital film images. For example, the RAW frames from the Sony F35 camera are normally encoded in Sony's S-Log colour space, and a LUT converts the image into a new colour space. If you view log images without the correct LUT, they can appear either very flat and washed out, or very dark.

You can convert when you capture data from the camera. BUT if you are working in 10-bit space then you need to keep the data in a log format. If you convert from log to linear without increasing the bit depth of the file, you lose data, and this may show up as artifacts when the images are graded. As humans, we're more sensitive to darker tones than to highlights. Log colour space pushes more data into the darker areas of the image, where it's needed to create a finer gradient of tone.
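A rough way to see this in numbers (a sketch of the idea, not Sony's actual pipeline): count how many of the 1024 code values in a 10-bit file sit below middle grey (18% scene light) under a plain linear encoding versus the S-Log curve from Sony's white paper.

```python
import math

def slog(t):
    # S-Log forward curve from the Sony white paper.
    # t = scene light, where 1.0 = 100% input light.
    return (0.432699 * math.log10(t + 0.037584) + 0.616596) + 0.03

codes = 1024  # 10-bit file

# How many code values land below middle grey under each encoding?
linear_codes = sum(1 for c in range(codes) if (c / (codes - 1)) < 0.18)
log_codes = sum(1 for c in range(codes) if (c / (codes - 1)) < slog(0.18))

print(linear_codes)  # linear encoding: roughly 18% of the codes
print(log_codes)     # S-Log: roughly double that, so shadows get finer steps
```

S-Log maps 18% grey to around code value 0.36, so about twice as many 10-bit steps describe the shadows and midtones compared with a straight linear encoding.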

DDC/CI is a protocol that allows the graphics card to talk to the monitor. This lets you set up hardware LUTs, which are faster than software LUTs. In theory this means that my GeForce card will talk to my NEC MultiSync LCD2490WUXi2, which supports 12-bit internal programmable lookup tables (LUTs) for calibration. Alas, support for this monitor is almost dead, so I've never got it to work.

A viewer LUT is used to display the image in linear colour space. Bear in mind that all monitors apply their own gamma to the image. This could be Rec709, sRGB, Adobe RGB, or something else. Wikipedia is typically detailed but unhelpful… 'sRGB uses the ITU-R BT.709 primaries, the same as are used in studio monitors and HDTV,[1] and a transfer function (gamma curve) typical of CRTs. This specification allowed sRGB to be directly displayed on typical CRT monitors of the time, a factor which greatly aided its acceptance.'

The Maths.

To get from the RAW Sony camera data to something I can view sensibly, I simply need to:
Convert S-Log to linear, then linear to the monitor colour space.

So we need a camera LUT AND a monitor LUT to view the image correctly … (and a properly calibrated monitor…)

In the Sony F35 white paper, the formula representing the S-Log curve is as follows:

y = (0.432699 * log10(t + 0.037584) + 0.616596) + 0.03

where t ranges from 0 to 10.0, representing 0 to 1000% input light level to a camera.
Multiply y by 100 to get the percentage.

The reverse curve is expressed as follows.

Y = Power(10.0, ((t - 0.616596 - 0.03) / 0.432699)) - 0.037584

where t has a range of 0 to 1.09, representing the camera output code of 0 to 109%.
Multiply Y by 100 to get the percentage.
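The two curves above transcribe directly into code. This is a sketch in Python (function names are my own, not from the white paper); since the second formula is the inverse of the first, encoding then decoding should hand back the original value.

```python
import math

def slog_encode(t):
    """Scene light t (1.0 = 100% input light) -> S-Log code value (0..1.09)."""
    return (0.432699 * math.log10(t + 0.037584) + 0.616596) + 0.03

def slog_decode(y):
    """S-Log code value y (0..1.09) -> scene light."""
    return 10.0 ** ((y - 0.616596 - 0.03) / 0.432699) - 0.037584

# Round trip: decode(encode(t)) should return t for any legal input.
for t in (0.0, 0.18, 1.0, 5.0):
    assert abs(slog_decode(slog_encode(t)) - t) < 1e-9
```

As a sanity check, `slog_encode(1.0)` comes out around 0.65, i.e. 100% input light lands at roughly 65% of the S-Log code range, which is why undecoded S-Log footage looks so flat.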

The formulas representing the Rec709 curve, and its inverse, are:

Y = if x <= 0.018 then ( x * 4.5 ) else ( 1.099 * (x^0.45) - 0.099 )

Y = if x <= 0.081 then ( x / 4.5 ) else ( (x + 0.099) / 1.099 ) ^ (1 / 0.45)
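Putting it all together, the viewing chain described above (S-Log to linear, then linear to the monitor colour space) can be sketched like this, assuming a Rec709 display; the clamp and function names are my own additions, not part of either spec:

```python
import math

def slog_to_linear(y):
    # Inverse S-Log curve from the Sony white paper.
    return 10.0 ** ((y - 0.616596 - 0.03) / 0.432699) - 0.037584

def rec709_encode(x):
    # Rec709 transfer curve: linear segment below 0.018, power curve above.
    return x * 4.5 if x <= 0.018 else 1.099 * x ** 0.45 - 0.099

def rec709_decode(y):
    # Inverse Rec709 curve.
    return y / 4.5 if y <= 0.081 else ((y + 0.099) / 1.099) ** (1 / 0.45)

def view(code):
    """S-Log code value -> Rec709 display value, via scene-linear."""
    lin = max(0.0, min(1.0, slog_to_linear(code)))  # clamp to display range
    return rec709_encode(lin)

# Middle grey sits around 0.36 in S-Log; through the viewing transform
# it lands around 0.41 on a Rec709 display.
```

This is exactly what the camera LUT plus monitor LUT pair does in one step, just written out as maths so you can see where the numbers go.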