ultrahdr: Correct hdr white nits for linear input
The HDR white nits value for linear input was being set to the SDR
white nits value, which causes huge losses during gain map
compression. For linear inputs, the peak white nits value is now set
to the HLG peak white nits value.
Bug:
Test: ./ultrahdr_sample_app -p inp_p010.yuv -w 1920 -h 1080
Change-Id: I635d7d4cb7d2d3de9b2b24a59b175e05fbb3c189
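Below is a minimal sketch, not the actual libultrahdr implementation,
of why hdr_white_nits == kSdrWhiteNits degrades the gain map: the
per-pixel log2 gain is normalized by log2(maxContentBoost), where
maxContentBoost is assumed here to be hdr_white_nits / kSdrWhiteNits,
so a boost of 1 collapses every pixel to the same code value. The
encodeGain signature, constants, and 8-bit quantization are
assumptions for illustration only.

#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

// Hypothetical constants mirroring the ones referenced in jpegr.cpp.
constexpr float kSdrWhiteNits = 100.0f;
constexpr float kHlgMaxNits = 1000.0f;

// Sketch of gain-map encoding: the per-pixel log2 gain is normalized
// by log2(maxContentBoost) before being quantized to 8 bits.
uint8_t encodeGain(float hdrNits, float sdrNits, float maxContentBoost) {
  float gain = 1.0f;
  if (sdrNits > 0.0f) gain = hdrNits / sdrNits;
  float logBoost = log2f(maxContentBoost);
  // With maxContentBoost == 1 (hdr_white_nits == kSdrWhiteNits),
  // logBoost is 0 and every pixel collapses to the same code value.
  float normalized = (logBoost > 0.0f) ? log2f(gain) / logBoost : 0.0f;
  normalized = std::clamp(normalized, 0.0f, 1.0f);
  return static_cast<uint8_t>(normalized * 255.0f + 0.5f);
}

int main() {
  float hdrNits = 800.0f, sdrNits = 100.0f;
  // Old behavior for linear input: hdr_white_nits == kSdrWhiteNits.
  printf("boost=1:  code=%u\n",
         encodeGain(hdrNits, sdrNits, kSdrWhiteNits / kSdrWhiteNits));
  // New behavior: hdr_white_nits == kHlgMaxNits, a boost of 10.
  printf("boost=10: code=%u\n",
         encodeGain(hdrNits, sdrNits, kHlgMaxNits / kSdrWhiteNits));
}

Under these assumptions, an 800-nit HDR highlight over a 100-nit SDR
base encodes to code value 0 with a boost of 1 (all gain information
lost), but to a usable code value of 230 with a boost of 10.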
diff --git a/libs/ultrahdr/jpegr.cpp b/libs/ultrahdr/jpegr.cpp
index dc439d7..55333b3 100644
--- a/libs/ultrahdr/jpegr.cpp
+++ b/libs/ultrahdr/jpegr.cpp
@@ -817,10 +817,13 @@
map_data.reset(reinterpret_cast<uint8_t*>(dest->data));
ColorTransformFn hdrInvOetf = nullptr;
- float hdr_white_nits = kSdrWhiteNits;
+ float hdr_white_nits;
switch (hdr_tf) {
case ULTRAHDR_TF_LINEAR:
hdrInvOetf = identityConversion;
+ // Note: this will produce clipping if the input exceeds kHlgMaxNits.
+ // TODO: TF LINEAR will be deprecated.
+ hdr_white_nits = kHlgMaxNits;
break;
case ULTRAHDR_TF_HLG:
#if USE_HLG_INVOETF_LUT