
Regarding Dolby's tonemap, HDR10+ to DV conversion? #86

@boohboot:

> Regarding Dolby's tonemap, how did you solve the problem for the HDR10+ to DV conversion?

The HDR10+ to Dolby Vision conversion simply adds L1 metadata: the min/max/avg brightness of a shot.
That data is already present in the HDR10+ metadata, just on a different scale (0 to 100 000 instead of 0 to 4095).
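The rescaling can be sketched roughly as follows. This is a minimal illustration, not the tool's actual code: it assumes the HDR10+ values are linear luminance in 0.1-nit steps (so 100 000 = 10 000 nits) and that the DV L1 values are 12-bit PQ (SMPTE ST 2084) codes; both function names are illustrative.

```python
# Sketch: rescale an HDR10+ brightness value (0..100000, assumed linear
# luminance in 0.1-nit steps) to a 12-bit PQ code (0..4095) as carried in
# Dolby Vision L1 metadata. Constants are the SMPTE ST 2084 (PQ) ones.

M1 = 2610 / 16384          # PQ inverse-EOTF exponents
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def nits_to_pq(nits: float) -> float:
    """Linear luminance (cd/m2) -> normalized PQ signal in [0, 1]."""
    y = min(max(nits / 10000.0, 0.0), 1.0)
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

def hdr10plus_to_l1(value: int) -> int:
    """HDR10+ scale value (0..100000) -> 12-bit L1 code (0..4095)."""
    nits = value / 10.0        # assumption: 0.1-nit steps
    return round(nits_to_pq(nits) * 4095)

# The two ends of the HDR10+ scale map to the ends of the 12-bit range.
print(hdr10plus_to_l1(100000))  # -> 4095
print(hdr10plus_to_l1(0))       # -> 0
```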

Proper HDR10+ requires a Bezier curve to adjust the mapping for the target display, and there's no public algorithm for generating it.
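For context, *evaluating* such a curve once its parameters exist is straightforward; the missing piece is how to generate the knee point and anchors. A minimal sketch of the evaluation step, assuming an explicit Bernstein-basis curve with normalized inputs, a knee point `(kx, ky)`, and intermediate anchors in [0, 1] with implicit endpoints P0 = 0 and PN = 1 (names and exact conventions are illustrative, not taken from the spec text):

```python
from math import comb

def bezier_tonemap(x: float, knee: tuple[float, float], anchors: list[float]) -> float:
    """Evaluate an explicit Bernstein-basis Bezier tone curve.

    x       : normalized input luminance in [0, 1]
    knee    : (kx, ky) knee point; below it the curve is linear
    anchors : intermediate control points P1..P(N-1), each in [0, 1];
              P0 = 0 and PN = 1 are implicit.
    """
    kx, ky = knee
    if x < kx:
        # Linear segment from the origin up to the knee point.
        return x * ky / kx if kx > 0 else 0.0
    n = len(anchors) + 1
    t = (x - kx) / (1 - kx)
    pts = [0.0] + anchors + [1.0]
    # Bernstein polynomial of degree n over the control points.
    b = sum(comb(n, k) * t**k * (1 - t)**(n - k) * pts[k] for k in range(n + 1))
    return ky + (1 - ky) * b

# The curve passes through the knee point and through (1, 1):
print(bezier_tonemap(0.3, (0.3, 0.25), [0.4, 0.7, 0.9]))  # -> 0.25
print(bezier_tonemap(1.0, (0.3, 0.25), [0.4, 0.7, 0.9]))  # -> 1.0
```

The hard part, which is what the comment above is about, is choosing the knee point and anchor values per shot and per target display; that selection algorithm is not public.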

Dolby also specifies multiple parameters that adjust the tonemapping, and these can't be directly translated.
They encode creative intent, whereas HDR10+ metadata is generated purely algorithmically.

I don't know how HDR10+ displays would react to bogus data, because I don't own one.

This isn't something I'm willing to invest time into, because I have no way to test for myself.

Originally posted by @quietvoid in #37 (comment)

What about converting DV to HDR10+ profile A, which doesn't require Bezier curve data?
