
[Windows] Support output to HDR monitors #94496

Open

DarkKilauea wants to merge 1 commit into master from rendering/hdr-output

Conversation

@DarkKilauea (Contributor) commented Jul 18, 2024

Implements: godotengine/godot-proposals#10817 for Windows.

Overview

This PR enables Godot to output to HDR-capable displays on Windows, allowing it to produce brighter images and more vibrant colors than SDR mode allows.

Testing/Sample project: https://github.com/DarkKilauea/godot-hdr-output

HDR (higher bit depth image, may not display correctly on all browsers):
godot-hdr-output_hdr

SDR:
godot-hdr-output_sdr

Sponza (HDR, higher bit depth image, may not display correctly on all browsers):
godot-hdr-output_sponza_hdr

Sponza (SDR):
godot-hdr-output_sponza_sdr

Supported Platforms:

  • Windows

Supported Graphics APIs:

  • Vulkan
  • D3D12

Supported HDR Formats:

  • HDR10
  • scRGB (Linear)

Features:

  • APIs for fetching HDR display capabilities from attached displays.
  • Request an HDR-capable swap chain for a window at runtime.
  • Automatic luminance matching to the display for the main window on launch.
  • Support for preferring a 16-bit-per-channel swap chain for better blending and reduced color banding.
  • Editor automatically updates to use HDR when changed in project settings.

Quirks:

  • Getting HDR display information on Windows requires Godot to be compiled with D3D support.

Follow up work:

  • Support Android
  • Support macOS with Metal
  • Add more error checking when requesting HDR output
  • Write official docs going over creating content for HDR displays and how to enable HDR output.
  • Create an official demo application.

Open Questions:

  • How should tonemap settings be configured to keep 3D scenes within the capabilities of the user's display?

Usage

Project Settings

  1. Enable display/window/hdr/enabled:
    image
  2. Enable rendering/viewport/hdr_2d:
    image
  3. Restart the editor as requested to enable the HDR framebuffer. Future changes to display/window/hdr/enabled will not require a restart.
  4. Adjust your tonemap settings to extend the brightness of your scene into the HDR range of your display (switching to Reinhard with a whitepoint of 0.5 and exposure of 3.0 works well). For 2D content, use colors that exceed 1.0 for a channel.

Runtime

  1. First, check that HDR is available for your platform, render driver, and display:
func _is_hdr_supported(screen: int) -> bool:
	return DisplayServer.has_feature(DisplayServer.FEATURE_HDR) \
		&& RenderingServer.get_rendering_device().has_feature(RenderingDevice.SUPPORTS_HDR_OUTPUT) \
		&& DisplayServer.screen_is_hdr_supported(screen);

func _ready() -> void:
	var screen := get_window().current_screen;
	var hdr_supported := _is_hdr_supported(screen);
	if hdr_supported:
		print("HDR is supported on this screen!");
	else:
		print("HDR is not supported on this screen.");
  2. Next, if HDR is supported, we can enable it on our current window:
if hdr_supported:
	var window := get_window();
	# Enable HDR render buffers for the current viewport, which allow for color values to exceed 1.0f.
	window.use_hdr_2d = true;
	# Request HDR output to the display.
	window.hdr_output_enabled = true;
	# Set the brightness of SDR content to match the desktop.
	window.hdr_output_reference_luminance = DisplayServer.screen_get_sdr_white_level(screen);
  3. Adjust your tonemap settings to extend the brightness of your scene into the HDR range of your display (switching to Reinhard with a whitepoint of 0.5 and exposure of 3.0 works well). For 2D content, use colors that exceed 1.0 for a channel. A script sketch follows.
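For reference, the tonemap adjustment in this step could look something like the following in a script (a minimal sketch assuming a WorldEnvironment sibling node; these are the standard Environment tonemap properties, but double-check the enum name against your Godot version):

func _apply_hdr_tonemap_settings() -> void:
	var env: Environment = $WorldEnvironment.environment
	env.tonemap_mode = Environment.TONE_MAPPER_REINHARDT # Reinhard tonemapper
	env.tonemap_white = 0.5
	env.tonemap_exposure = 3.0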

Help Needed

Please give this a test, either with the linked sample project or with your own projects, and give feedback. Specifically, I'm looking for input on how easy this feature was to use and whether you encountered any issues with your particular display, OS, or driver configuration.

@Calinou (Member) commented Jul 19, 2024

I gave this a quick test locally (on Windows 11 23H2 + NVIDIA 560.80 + LG C2 42"), it works as expected. This is encouraging to see, I've been wanting this for a while 🙂

I'll need to look into building more extensive scenes and getting tonemapped screenshots/videos out of this. 2D HDR also needs to be tested thoroughly.

Remember that JPEG XL or AVIF for images and AV1 for videos are a must for HDR, as other formats can only store SDR data. You may need to embed those in ZIP archives and ask users to preview them in a local media player, as GitHub doesn't allow uploading those formats and browsers often struggle displaying HDR correctly.

I noticed some issues for now:

  • Having RTX HDR enabled will mess with the HDR that is enabled in the editor. It will continuously enable and disable itself whenever you make any input in the editor (and disable itself after being idle for a second). This is also an issue on master with HDR disabled.
  • HDR Max Luminance affects both 2D (UI) and 3D rendering. Is that intended?
  • The HDR editor setting is not applied instantly when you change it, even though the demo project shows a working example of it being toggled at runtime. You can update the viewport's status based on editor settings here:
    void EditorNode::_update_from_settings() {
  • There doesn't appear to be a paperwhite setting you can use to adjust UI brightness. This is typically offered in games to prevent the UI from being too bright. Using a paperwhite value around 200 nits is common, since a lot of OLED displays cap out at that brightness level in SDR. Either way, this should be exposed in the project settings, and the documentation should recommend exposing this setting to the player (just like HDR peak luminance). A sketch of the idea follows this list.
    • There should also be a way for unshaded materials to base themselves on paperwhite, so that Sprite3D and Label3D used for UI purposes are not overly bright in HDR. I suppose this would be a BaseMaterial3D property or a shader render mode.
      • In the interest of compatibility, we may not be able to enable this by default in Sprite3D due to VFX usage (where HDR display can be intended), but for Label3D, we may be able to safely default to this.
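To illustrate the idea (hypothetical code, not from this PR), scaling composited UI against a dedicated paperwhite level could look something like this:

// Hypothetical sketch: map 2D/UI white to a dedicated paperwhite level,
// independent of the 3D scene's tonemap range. In a space where 1.0 equals
// the reference (SDR) white, scaling by paperwhite / reference makes full
// white UI render at ui_paperwhite nits.
vec3 apply_ui_paperwhite(vec3 ui_color, float ui_paperwhite, float reference_luminance) {
	return ui_color * (ui_paperwhite / reference_luminance);
}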

See the settings exposed by the Control HDR mod for an example of a best-in-class HDR implementation (related video):

control_hdr_mod_settings.mp4

Interesting, that UI seems to use the term "paperwhite" in a different way, and has a dedicated setting for the brightness of UI and HUD elements.

@DarkKilauea force-pushed the rendering/hdr-output branch 2 times, most recently from 88beb60 to 8df131d on July 19, 2024
@DarkKilauea (Contributor, Author)

I gave this a quick test locally (on Windows 11 23H2 + NVIDIA 560.80 + LG C2 42"), it works as expected. This is encouraging to see, I've been wanting this for a while 🙂

Thanks for taking a look!

I noticed some issues for now:

* Having RTX HDR enabled will mess with the HDR that is enabled in the editor. It will continuously enable and disable itself whenever you make any input in the editor (and disable itself after being idle for a second). This is also an issue on `master` with HDR disabled.

* See [[4.3 Beta 3] Strange editor brightness and colors caused by RTX Dynamic Vibrance affecting the editor #94231](https://github.com/godotengine/godot/issues/94231). We should see if we can forcibly disable RTX HDR and RTX Dynamic Vibrance for the editor using a NVIDIA profile. I haven't seen options for those in NVIDIA Profile Inspector so far.

Odd that NVIDIA's RTX HDR doesn't detect the HDR color space and avoid messing with the final swap chain buffer. Auto-HDR in Windows 11 appears to avoid interfering with Godot when HDR is enabled. Updating the NVIDIA profile may be outside the scope of this PR and would be best done in a more focused PR.

* HDR Max Luminance affects both 2D (UI) and 3D rendering. Is that intended?

For the initial draft, yes, everything is mapped using the same tonemapper. However, we should map UI elements to a different brightness to avoid them being too bright. For now, that can be worked around by dimming the brightness of any UI elements via the theme, but I would like to fix that in this PR.

* The HDR editor setting is not applied instantly when you change it, even though the demo project shows a working example of it being toggled at runtime. You can update the viewport's status based on editor settings here: https://github.com/godotengine/godot/blob/ff8a2780ee777c2456ce42368e1065774c7c4c3f/editor/editor_node.cpp#L356

I haven't looked into configuring the editor to use HDR yet. I will do so after I figure out how to properly tone map UI elements; if you enable HDR in the editor now, the UI is a little unpleasant.

* There doesn't appear to be a paperwhite setting you can use to adjust UI brightness. This is typically offered in games to prevent the UI from being too bright. Using a paperwhite value around 200 nits is common, since a lot of OLED displays cap out at that brightness level in SDR. Either way, this should be exposed in the project settings, and the documentation should recommend exposing this setting to the player (just like HDR peak luminance).

Agreed, UI elements and other 2D elements should probably be mapped to a different brightness curve. I'll probably have to figure out where in the engine 3D and 2D elements are composited together and perform the tone mapping there.

  * There should also be a way for unshaded materials to base themselves on paperwhite, so that Sprite3D and Label3D used for UI purposes are not overly bright in HDR. I suppose this would be a BaseMaterial3D property or a shader render mode.

    * In the interest of compatibility, we may not be able to enable this by default in Sprite3D due to VFX usage (where HDR display can be intended), but for Label3D, we may be able to safely default to this.

That might be outside of the scope of this PR. I'm not sure how I would indicate that certain 3D elements need to be mapped using a different brightness curve once they are all combined into the same buffer. It would be similar to trying to avoid sRGB mapping certain rendered elements.

For now, this can be worked around by decreasing the brightness of the color of these elements.

See the settings exposed by the Control HDR mod for an example of a best-in-class HDR implementation (related video):
control_hdr_mod_settings.mp4

Interesting, that UI seems to use the term "paperwhite" in a different way, and has a dedicated setting for the brightness of UI and HUD elements.

Baldur's Gate 3 and Cyberpunk 2077 also have really nice HDR settings menus. I've been basing some of this work off their approach, though modifying contrast and brightness I'm leaving up to Environment since those effects are already there.

Thanks again for your comments! I'll add some TODO items to the description for tracking.

@Jamsers commented Aug 28, 2024

Can you use any Godot project to test this PR? Bistro-Demo-Tweaked and Crater-Province-Level both use physical light units, with luminosity values for light sources as close to reference as possible (i.e. the sun at noon is 100000 lux; the moon at midnight is 0.3 lux).

I'd love to help test this PR but unfortunately I don't have HDR hardware ☹️

@alvinhochun (Contributor)

I recently got a monitor that supports "fake HDR" (DisplayHDR 400), so I thought I could give this a try, but on Intel UHD 620 it prints "WARNING: HDR output requested but no HDR compatible format was found, falling back to SDR." and doesn't display in HDR. I kind of expected this since it is using Vulkan, but I'm a bit surprised it works for you, even in windowed mode no less. I guess there is some special handling in the NVIDIA driver?

Anyway, adding HDR output to D3D12 should be trivial and I might give it a try. (No promises!)


Shall we also consider implementing HDR display for the Compatibility renderer? I am not sure if native OpenGL can do HDR, but it is very possible to implement on Windows with the help of ANGLE and some manual setup.

@fire (Member) commented Aug 28, 2024

This needs a rebase on master, but I have an HDR display (https://www.dell.com/en-ca/shop/alienware-34-curved-qd-oled-gaming-monitor-aw3423dw/apd/210-bcye/monitors-monitor-accessories).

I can help test.

@DarkKilauea (Contributor, Author)

Can you use any Godot project to test this PR? Bistro-Demo-Tweaked and Crater-Province-Level both use physical light units, with luminosity values for light sources as close to reference as possible (i.e. the sun at noon is 100000 lux; the moon at midnight is 0.3 lux).

I'd love to help test this PR but unfortunately I don't have HDR hardware ☹️

You should be able to test with any scene, though keep in mind that the realistic light units will not map directly to the brightness of the display. Consumer desktop displays typically don't go much above 1000 nits on the high end, which is far too dim to simulate sunlight. Values from the scene will be mapped to a range fitting within the max luminosity set for the window.

@DarkKilauea force-pushed the rendering/hdr-output branch from b2bd1a1 to 728912f on August 29, 2024
@alvinhochun (Contributor)

Here are the changes to get Rec. 2020 HDR output on D3D12: master...alvinhochun:godot:hdr-output-d3d12

@alvinhochun (Contributor)

HDR (blown out a bit, looks better on an HDR display): image

SDR: image

The over-exposure in your screenshot is expected, but the colours are oversaturated because it is missing a colour space conversion. The colours need to be converted from BT.709 primaries to BT.2020 primaries. This is how it should look with the correct colours:

image

The conversion may be done with something like this:

diff --git a/servers/rendering/renderer_rd/shaders/color_space_inc.glsl b/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
index 3583ee8365..76305a8a3c 100644
--- a/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
+++ b/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
@@ -19,6 +19,15 @@ vec3 linear_to_st2084(vec3 color, float max_luminance) {
        // max_luminance is the display's peak luminance in nits
        // we map it here to the native 10000 nits range of ST2084
        float adjustment = max_luminance * (1.0f / 10000.0f);
+       color = color * adjustment;
+
+       // Color transformation matrix values taken from DirectXTK, may need verification.
+    const mat3 from709to2020 = mat3(
+          0.6274040f, 0.0690970f, 0.0163916f,
+          0.3292820f, 0.9195400f, 0.0880132f,
+          0.0433136f, 0.0113612f, 0.8955950f
+       );
+       color = from709to2020 * color;

        // Apply ST2084 curve
        const float c1 = 0.8359375;
@@ -26,7 +35,7 @@ vec3 linear_to_st2084(vec3 color, float max_luminance) {
        const float c3 = 18.6875;
        const float m1 = 0.1593017578125;
        const float m2 = 78.84375;
-       vec3 cp = pow(abs(color.rgb * adjustment), vec3(m1));
+       vec3 cp = pow(abs(color.rgb), vec3(m1));

        return pow((c1 + c2 * cp) / (1 + c3 * cp), vec3(m2));
 }

@DarkKilauea force-pushed the rendering/hdr-output branch from 728912f to 56d27a6 on August 31, 2024
@ArchercatNEO (Contributor)

I must admit my experience with HDR before trying to implement it in Wayland was zero. I haven't actually tested either PR, played an HDR-capable game, or owned an HDR monitor. I just kept hearing about people wanting HDR in Wayland and that Godot hadn't implemented HDR yet, so when the protocol was finally merged I decided I could try my hand at implementing it. My perspective on how developers/users should do HDR, then, is that ideally they shouldn't; instead, Godot and the system compositor should have a solid enough implementation that developer effort and user calibration aren't required. That doesn't mean I think HDR settings menus need to be eliminated (user configuration should always take priority over what we can do at the engine level), but I'd like to have an implementation where it isn't required for Godot games to look good on HDR monitors.

@allenwp (Contributor) commented Mar 8, 2025

Ok, the reason I asked is that I think it would be most helpful to hear your thoughts in the context of specific behaviours that you have experienced when trying out this PR on Windows, but if you don't have the hardware on hand to test this, then it sounds like this might not be an option.

@DarkKilauea (Contributor, Author)

When screen luminance is enabled the hdr api is mostly disabled (just raises an error and returns) but I don't see an actual fallback for missing values

The warnings are mostly there to let the developer know that setting luminance values has no effect if they've delegated handling luminance to Godot.

The choice has been distilled down to:

  1. Let Godot handle luminance for you, automatically configuring the tonemapper to expand the range to the user's display. This is enabled by default.
  2. Take control yourself: provide the luminance range and take on the responsibility that entails to keep it updated as the system configuration changes or the window moves around. This option enables "mastering" scenarios where the developer does not want Godot adjusting the image on different displays. (A script sketch of this option follows.)
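For illustration, option 2 might look roughly like this (a sketch only; hdr_output_enabled and hdr_output_reference_luminance appear in this PR's usage section, but use_screen_luminance and hdr_output_max_luminance as window properties are assumptions here):

var window := get_window()
window.hdr_output_enabled = true
# Assumed window-level toggle mirroring the display/window/hdr/use_screen_luminance
# project setting; take over luminance handling from Godot.
window.use_screen_luminance = false
window.hdr_output_reference_luminance = 200.0 # nits
window.hdr_output_max_luminance = 1000.0 # nits; assumed property name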

I'm not understanding why this is incomplete. What scenarios are not supported?

@DarkKilauea (Contributor, Author)

In both situations, when display/window/hdr/enabled is false, the window will be sdr unless the developer enables hdr on the window manually.
The difference is that automatic sdr -> hdr no longer requires hdr_output on windows to be true. This is specifically for implementations where the difference between prefers_hdr and supports_hdr exists, which means explicit hdr must be different from implicit automatic hdr.

Oh, I think I see a source of confusion here. The project settings only affect the main window and some viewports in the Editor to enable seamless authoring. It does not control "HDR" globally or enable it to be used. The developer can always ignore the project settings and go enable HDR on their main window and it will work the same way in game.

Like many project settings in Godot, it is primarily there as a convenience for getting the most common scenarios set up.

@ArchercatNEO (Contributor)

It does not control "HDR" globally or enable it to be used

When I read main.cpp I saw a GLOBAL_GET for that setting which, if true, enabled hdr on the main window. Created windows then inherit their hdr status from the main window at some point. Did I misunderstand what the code change in main.cpp does? If the project setting doesn't actually enable automatic hdr for the exported game, then I am suggesting that it do that.

@DarkKilauea force-pushed the rendering/hdr-output branch from f49a1f6 to d68a6be on March 8, 2025
@ArchercatNEO (Contributor)

I'm not understanding why this is incomplete. What scenarios are not supported?

Yeah, that was probably me thinking too fast. There is the footgun of disabling screen_luminance without setting values, which then defaults to project settings, but that's a stretch. Essentially the only thing my suggestion would add is being a bit simpler (in my opinion), as it allows removing the system_luminance project setting while still keeping the benefits. There's also the case where only some of the settings are set by the developer and others by the system; I admit this seems a bit strange of a use case. Mainly it's doing the same thing but in a way that seemed a bit simpler to me. If it doesn't seem simpler to you, then we could probably just drop the change.

@DarkKilauea force-pushed the rendering/hdr-output branch from d68a6be to fa2fa6d on March 8, 2025
@DarkKilauea (Contributor, Author)

A couple quick updates:

  1. Changed default for reference_luminance to 200 as suggested by allenwp
  2. Renamed screen_get_max_average_luminance to screen_get_max_full_frame_luminance for clarity.

@Jamsers commented Mar 9, 2025

My perspective on how developers/users should do HDR, then, is that ideally they shouldn't; instead, Godot and the system compositor should have a solid enough implementation that developer effort and user calibration aren't required.

This is misguided - unfortunately HDR isn't just some feature that you can flip on to make your SDR game prettier - it's a different presentation that fundamentally requires developer attention. You can literally see this in @allenwp's updates: you can't just take ACES/AgX SDR tonemapping output and map it to HDR - it fundamentally requires a different tonemapper. This, among many other things, is a big reason why even AAA studios often get their HDR wrong.

In addition, proper game HDR usually requires the compositor (along with the display itself) to butt out and leave the game engine alone. Any interference tends to cause issues. (especially since the HDR stack is still quite brittle even nowadays)

Trying to make it so that users don't have to configure HDR is a good goal - but even then, that's achieved by making sure the engine can reliably get correct information about the display's colorspace and max nits - so that the developer can then implement functionality to take care of HDR configuration for the user.

You can't divorce the developer's hand from HDR - they're fundamentally going to have to get their hands dirty to implement HDR - so the engine's responsibility is really just to provide the developer with the tools needed for them to do their job.

What you seem to want is an auto-magic SDR to HDR feature. The good news is that does exist. Windows offers it as Auto-HDR, NVIDIA offers it as RTX HDR. However... if Wayland/Linux wants this, it's on them to implement it on their side. This is actually a compositor/driver side feature, not a game engine feature.

You also seem to want an auto-magic HDR to SDR feature. As I mentioned before, no one has implemented this for games because it doesn't make much sense - if the engine knows it's outputting to an SDR screen, why wouldn't it just use its SDR pipeline? The output will be much higher quality than a converted HDR to SDR image. That being said, if for whatever reason you still really want this, this can also be implemented as a compositor/driver side feature. The only existing implementations to reference are in video players like VLC, since auto-magic HDR to SDR is only useful for media with static, baked data, that won't necessarily have SDR data, like video and images.

@ArchercatNEO (Contributor) commented Mar 9, 2025

This is actually a compositor/driver side feature, not a game engine feature.

I would have an easier time accepting this if the automatic sdr -> hdr features didn't exist. Auto HDR and RTX HDR are compositor-side, yes, but why wouldn't we be able to do precisely what they do ourselves? Also, what about use_screen_luminance? That is explicitly about releasing control of hdr to the engine so that it may do sdr -> hdr. Do you propose we remove this feature because for good hdr the developer must be involved?

You also seem to want an auto-magic HDR to SDR feature.

There are 2 ways I could interpret this. Godot renders something in hdr and then tonemaps to sdr internally, which we already do. When importing hdr image formats, I believe Godot just clips the hdr to an sdr range.

I could also interpret it as: we send hdr to the compositor and expect the compositor to convert it to sdr for the monitor. This is precisely a compositor feature, like you said, and the Wayland spec says we may depend on the compositor to have this feature. We wouldn't automatically do hdr output on sdr monitors because that's not efficient, but Wayland has this feature, so I don't see why we should actively prevent the developer from doing it on Wayland.

Automatic hdr wouldn't do it on sdr monitors, and we can just add documentation saying that explicit hdr (rather than the implicit, automatic hdr) should be exposed as settings to the user, at which point it's up to the user to decide if hdr happens on their sdr monitor or not. If Windows doesn't have this feature, then it doesn't, and we aren't required to allow it.

From the original proposal godotengine/godot-proposals#10817

While Godot internally renders in HDR when using the Forward+ and Mobile rendering methods, it currently does not support outputting to an HDR display in a way that preserves the dynamic range.

The way I interpret this is that Godot has been doing hdr rendering for a long time and has been tonemapping hdr to sdr for a long time. Is this the automagic hdr -> sdr you were talking about? Maybe the rendering pipeline changed since the proposal was made?

@Jamsers commented Mar 9, 2025

Auto HDR and RTX HDR are compositor-side, yes, but why wouldn't we be able to do precisely what they do ourselves?

Well actually yes, we could do it. Could be a nice feature for devs that don't want to do the legwork for proper HDR.

I will say that the fundamentals of Godot's HDR will have to be finished and released first because this feature will require that to even work in the first place. But after all's said and done, we could create a separate proposal/PR to implement this.

Also, what about use_screen_luminance? That is explicitly about releasing control of hdr to the engine so that it may do sdr -> hdr. Do you propose we remove this feature because for good hdr the developer must be involved?

There are 2 ways I could interpret this. Godot renders something in hdr and then tonemaps to sdr internally, which we already do. When importing hdr image formats, I believe Godot just clips the hdr to an sdr range.

The way I interpret this is that Godot has been doing hdr rendering for a long time and has been tonemapping hdr to sdr for a long time. Is this the automagic hdr -> sdr you were talking about? Maybe the rendering pipeline changed since the proposal was made?

I think there's some confusion here... I apologize. This is a common pitfall when talking about HDR. Let me clarify.

There is "HDR" in the "Half Life 2 Lost Coast" sense - this is talking about how games, internally, represent the scene with HDR data, so that they can do fancy stuff (at least for the 2000s) like bloom and eye adaptation. We're not talking about that, although it is related. Regardless of whether the engine supports and outputs HDR or not, internally, the engine represents the scene in "HDR" either way, pre-tonemapping. This is how it's been since the 2000s.

When we talk about HDR in the here and now, we're talking about HDR displays, and the pipeline from game output to display. The HDR10 spec basically. (And potentially HDR10+ and Dolby Vision)

So when I refer to an "auto-magic SDR to HDR feature" like Windows Auto-HDR, what that does is, the compositor takes the SDR output of a game that has no HDR capabilities, or has explicitly turned off its HDR capabilities, and does realtime inverse tone mapping on that output. The compositor then takes the result of that and presents it to the display as HDR data.

When I refer to an "auto-magic HDR to SDR feature", this is taking HDR data and tonemapping that down to SDR in realtime, then presenting that SDR data to the display. This can currently only be found on video players (AFAIK), but theoretically how it could work with games is, let's say we're on a system that doesn't have HDR capabilities. If we have a game that has HDR capabilities, the compositor can "lie" to the game and say the system has HDR capabilities. The game then sends out HDR output to the compositor. The compositor will then take that HDR output, tonemap it down to SDR in realtime, and present that SDR data to the display.

I apologize if it's still not clear after this, I fully admit HDR is honestly a mess and confusing as hell, and developers are only now starting to grok all of it. There's a reason why a whole decade after the standard was introduced, only now are game developers starting to embrace it.

@Jamsers commented Mar 9, 2025

So, on a game that doesn't support HDR:

In engine:

  1. scene is rendered in "HDR"
  2. render is tonemapped to SDR
  3. output to compositor

In compositor:

  1. compositor receives SDR output
  2. compositor outputs SDR data to display

If HDR display is connected and HDR is enabled for the OS:

  1. compositor receives SDR output
  2. compositor composites this SDR output within the HDR desktop/buffer
  3. compositor outputs HDR data to display

(You can just composite SDR images into an HDR buffer with no tonemapping or conversion required, because SDR is a subset of HDR)

On a game that does support HDR:

In engine:

Engine asks compositor/OS if system supports HDR.

If compositor answers no, engine will just do the SDR pipeline.

If compositor answers yes:

  1. scene is rendered in "HDR"
  2. render is tonemapped to HDR.
  3. output to compositor

In compositor:

  1. compositor receives HDR output
  2. compositor outputs HDR data to display

@Jamsers commented Mar 9, 2025

This is where our current work lies:

If compositor answers yes:

  1. scene is rendered in "HDR"
  2. render is tonemapped to HDR.

"But wait a minute, this doesn't make sense! It's already in HDR, so why does it need to tonemap to HDR?"

Ah, therein lies the rub. The "HDR" representation that engines use internally is actually higher range than current HDR display standards! In addition, as I'm sure you know, HDR displays range from 400 nits to 1000+ nits. Tonemapping is still required, and this is where values like use_screen_luminance are used. When the engine asks the compositor/OS if the system supports HDR, if it's answered yes, the engine will also ask for the display's max nits, and the display's color space. The tonemapper will then tonemap specifically for the max nits and color space reported by the system, not to a generic, fixed HDR target like movies or images do.

"Then what's the point of the HDR options offered in the game's settings menu?"

User control over the tonemapper behavior, basically. For example, maybe, even if your TV supports 1000 nits, you only want to use 400 nits of brightness range. Maybe the TV is lying to the system about its actual capabilities (some older cheapo TVs did this). Maybe you prefer the TV's tonemapper - for example, you disable HGiG mode, set the game to output full 1000 nits even if the TV only supports 600 nits, and leave it to the TV to tonemap that down as it sees fit. Maybe you want to choose between HDR10+ and Dolby Vision.

"Where would the auto-magic SDR to HDR feature lie in all this, if we were to implement it?"

In engine:

Engine asks compositor/OS if system supports HDR.

If compositor answers no, engine will just do the SDR pipeline.

If compositor answers yes, and developer enabled the auto-magic SDR to HDR feature:

  1. scene is rendered in "HDR"
  2. render is tonemapped to SDR
  3. SDR render is inverse tonemapped to HDR
  4. output to compositor

In compositor:

  1. compositor receives HDR output
  2. compositor outputs HDR data to display

@Zamundaaa

I know the spec says that the OS compositor /should/ correctly convert between formats, but I have absolutely no faith that every single OS compositor will. I'm not confident /any/ of them will get it right the first time

On HDR displays, Windows 11 still gets sRGB wrong, and many HDR displays are terrible themselves, so I understand your hesitation, but things work differently on Wayland. Colorspaces are very strictly defined, and conversions between them are simple mathematical formulae that so far not a single compositor has gotten wrong.

Preventing conversions is a good thing in general, targeting the compositor's preferred colorspace is absolutely the right move for performance reasons, but assuming that the entire rest of the stack is completely broken and that you need to attempt to work around it before even seeing a single actual issue is a terrible approach.

You won't even always have a choice - in many cases, KDE Plasma will tell you to target a colorspace that uses the native primaries and whitepoint of the display + a gamma 2.2 transfer function with HDR headroom, rather than some standard HDR colorspace. Sometimes this is for compositor-side optimizations, but sometimes it's also just how the display works: The Steam Deck OLED for example uses a conventional gamma 2.2 transfer function.

Unless you add support for that, using Wayland instead of Vulkan to tell the compositor that you're targeting this special color space (which would be cool tbf, and could prevent color conversions on SDR displays as well), you're just gonna have to trust the compositor to convert whatever colorspace you use to what it actually needs.

hdr on sdr monitors working on kwin is not enough, can we be confident all compositors will have a sufficiently robust implementation?

I think the above answers that question, but to expand on that, I would recommend you to detect if you should prefer HDR by checking max_luminance > reference_luminance. If the compositor can make HDR work nicely on a given screen, no matter how that works under the hood, it'll just tell you by sending matching luminance values in the preferred image description.

Allowing the user or developer to manually enable HDR even when that's not the case might be nice to test the compositor's tone mapping capabilities, but isn't something you really need to care about.

Yes we should prefer dynamic metadata to static metadata when setting hdr luminances. Dynamic metadata being system/screen luminance and hdr api. Static metadata being the project settings and bt.2408 recommendations.

That's not right. Static metadata - or more specifically the part of it that's called the mastering display information - describes the display the content was made for.

On Wayland, the preferred image description tells you about the display to target, so assuming you tonemap for that, you should also use those same values as the static HDR metadata, which tells the compositor that you're targeting the display you're supposed to, and that it shouldn't do any tone mapping.

If you don't target the system provided values, then the luminance ranges the developer manually set up should be used - representing the display they optimized the game for - and the compositor will tonemap it to the display it actually gets presented on.

Dynamic metadata would give the compositor more information about the actually used brightness ranges in a given scene, which may be lower than what the target display is capable of. As this is for better tonemapping, which you want to avoid in most cases anyways, and Wayland doesn't even have an extension for dynamic HDR metadata yet, you really don't have to worry about it right now.

The "HDR" representation that engines use internally is actually higher range than current HDR display standards!

I think it's more helpful there to talk about the meaning of the brightness values rather than just their range. The internal data is normally scene-referred, it represents the brightness that things would have in the real world (assuming a "realistic" scene), and the output data takes into account the light adaptation of the viewer, and as you said of course the capabilities of the display.

@DarkKilauea force-pushed the rendering/hdr-output branch from fa2fa6d to b5a3369 on March 10, 2025
@DarkKilauea (Contributor, Author)

Applied some of the tonemapping changes that @allenwp was playing around with since they seem to improve the result significantly.

Also gave me a chance to refactor the tonemap shader to remove parameters I was no longer using.

@allenwp (Contributor) commented Mar 10, 2025

Applied some of the tonemapping changes that @allenwp was playing around with since they seem to improve the result significantly.

Also gave me a chance to refactor the tonemap shader to remove parameters I was no longer using.

Thanks! I wanted to adapt your glow changes to this approach to review how it would behave with the different glow modes and a "Linear" tonemapper. Sometimes glow is used to simulate a physical effect (fog glow around a bright point of light), other times it is used as a psychological/perception trick to make part of an image appear brighter than the display can reproduce, or it can even be used as a stylistic effect with no physical or psychological basis.

I will reiterate that the intent behind this approach is to demonstrate that it is not possible to adapt any of the existing filmic tonemapper equations to HDR, as the contrast curve is not correctly scaled. Because Reinhard does not have a "toe", it is reasonable to adapt it to HDR using this approach, but it breaks when white < (max_luminance / ref_luminance), so it likely makes the most sense to simply create a new HDR Reinhard tonemapper that has a white = max(white, max_luminance / ref_luminance); statement, alongside the new filmic HDR tonemappers when they are introduced.

(A good example of when this breaks is setting Reinhard white to 1.0, which by definition should be identical to Linear. Another example would be setting white to 2.0, which is a reasonable value if you are targeting the Mobile rendering method: you will notice that the contrast and brightness between HDR and SDR is very different on all of the tonemappers. With higher white values, the difference in contrast is more subtle, but still incorrect in HDR mode.)
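For concreteness, here is a sketch of that HDR Reinhard variant, built on the extended Reinhard curve (illustrative only, not the PR's shader code):

// Sketch of an HDR-aware extended Reinhard curve per the suggestion above.
// Input is scene-referred, normalized so 1.0 == reference (SDR) white.
vec3 tonemap_reinhard_hdr(vec3 color, float white, float max_luminance, float reference_luminance) {
	// Clamp the whitepoint so it never falls below the display headroom
	// (max_luminance / reference_luminance), where the curve would break.
	white = max(white, max_luminance / reference_luminance);
	return color * (1.0 + color / vec3(white * white)) / (1.0 + color);
}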

@ArchercatNEO (Contributor)

Allowing the user or developer to manually enable HDR even when that's not the case might be nice to test the compositor's tone mapping capabilities, but isn't something you really need to care about.

Very well. I still believe separating "explicit" hdr (the actual scripting hdr api) and "implicit" hdr (the project setting) would be more flexible (and would allow us to distinguish monitors which both support and prefer hdr from those which support hdr but do not prefer it, likely because of a compositor feature), but since this is my primary use case, if we decide we really don't want this and nobody has a different use case for making the distinction, then my suggestions can be ignored.

That's not right. Static metadata - or more specifically the part of it that's called the mastering display information - describes the display the content was made for.

Yeah, my bad. I shouldn't have used "static metadata" and "dynamic metadata" here since they already have standard definitions in this context. I was using "dynamic" for runtime luminances, i.e. luminances we got while the game was running. The luminance values in this category were the "system/screen" luminances, which for most situations should be the ones we prefer, and values decided either in a settings menu or from something like a colorspace picker in an artistic app. For the settings menu, the loss of performance seems fine to me since the user explicitly enabled the setting, and the artistic app needs that kind of control for mastering and such. Really, my version of "dynamic" luminances just meant the system luminances in most cases, plus luminances from some other source not decided by the developer, the user, or an image's hdr metadata. I also said "static metadata" because it didn't change at runtime, which was also a bad move; it's just the bt.2408 luminances and the project settings. Although, thinking about it more, maybe we could provide mastering display information somewhere, maybe even from the project settings themselves. But I won't push too much on that; if it turns out mastering display metadata is useful, we could return to it in a later PR.

@ArchercatNEO (Contributor)

Well, I have some code examples of what my Wayland implementation is, what I'd like it to be, and why it can't be implemented like that with the current state of this PR. Maybe it will make the use case clearer, maybe it won't.

Current

bool DisplayServerWayland::screen_is_hdr_supported(int p_screen) const {
  ...
  // as per zamundaaa's suggestion
  return wayland_thread.supports_hdr() && (screen->color_profile.max_luminance > screen->color_profile.sdr_white);
}

void DisplayServerWayland::_window_update_hdr_state(WindowID p_window) {
     ...
    bool hdr_preferred = window->preferred_color_profile.max_luminance > window->preferred_color_profile.sdr_white;
    bool hdr_enabled = rendering_context->window_get_hdr_output_enabled(p_window);
    bool hdr_desired = wayland_thread.supports_hdr() && hdr_preferred && hdr_enabled;
}

(the hdr state is just hdr_desired)

The thing I would like to draw your attention to is that, to actually enable hdr on a window, we check if the window's preferred profile meets the max_luminance > sdr_white criteria, but for screen_is_hdr_supported we check if the screen's profile meets the criteria. These are not guaranteed to be the same (and I don't know how close they are on how many compositors). In theory they should be pretty close, which is why I decided we could be slightly inconsistent here. The big problem is that if they actually differ, we could have something like a broken hdr settings menu that's very hard to track down.

What I would like to use

bool DisplayServerWayland::screen_is_hdr_supported(int p_screen) const {
  ...
  // as per the wayland spec
  return wayland_thread.supports_hdr();
}

void DisplayServerWayland::_window_update_hdr_state(WindowID p_window) {
     ...
    bool system_hdr = GLOBAL_GET("display/window/hdr/enabled");
    bool hdr_preferred = window->preferred_color_profile.max_luminance > window->preferred_color_profile.sdr_white;
    bool hdr_enabled = rendering_context->window_get_hdr_output_enabled(p_window);
   
    if (system_hdr && hdr_preferred && wayland_thread.supports_hdr()) {
       //use screen luminances
    } else if (hdr_enabled && wayland_thread.supports_hdr()) {
       //use developer-set luminances
    } else {
       //disable hdr
    }
}

Why can I not do this? It's because the current implementation makes all windows request hdr, turning most of this into just if (true) { hdr }, which clearly is not efficient. If it did work, it would solve the consistency problem: it is the developer using screen_supports_hdr and setting hdr_enabled, so it's good that the conditions are exactly the same. The other benefit is that we can also automatically enable hdr on certain monitors like we already do. This is why I would like display/window/hdr/enabled to not call window_set_hdr_output_enabled at startup. That change would make this implementation possible.

There are 2 other alternatives for the consistency problem:

  • Use the main window's preferred profile inside screen_supports_hdr. This solves the inconsistency if the screen the main window is on is the only one we care about, but that isn't really what we should be doing.
  • Use the screen profile in _window_update_hdr_state. This one I'm also very against. When using "screen" luminances, the Wayland implementation uses the window's preferred profile, not the screen profile. Checking against screen luminance while using preferred luminance seems like even more inconsistent behavior.

If I still haven't convinced anyone, chances are the inconsistency won't be a problem. I cannot guarantee that it won't be, but I don't predict many compositors making the preferred profile really different from the screen profile very often (though now that I think about it more, compositors can put windows across several screens, which could make this quite a bit more complicated).

@allenwp (Contributor) commented Mar 13, 2025

Regarding ACES 2 support: I had a discussion with one of the members of ACES about how things should be handled with an operating system that scales its SDR content when displaying it in an HDR signal, instead of pinning SDR content at exactly 100 nits.

https://community.acescentral.com/t/aces-2-in-game-engines-variable-reference-max-luminance/5734

The summary is: there is no recommendation on how to deal with this and it would need to be researched. Regardless, I think I came up with a way to best handle this by using max_luminance and reference_luminance values, but it is yet to be tested. I first need to write an ACES 2 reference implementation in glsl that could accept these parameters and then figure out how to simplify and approximate it.

I could say that a truly correct ACES 2 implementation would force reference_luminance to 100, but I believe this has a notable downside of creating a game that has a very different brightness between SDR and HDR output.

I could go even further and say that Godot should always force reference_luminance to 100, full stop, which would allow ACES 2 to be absolutely correct in HDR and more obviously incorrect in SDR (because SDR would appear brighter on most systems). I don't think I agree with this; it's better to just let the developer/player set their reference_luminance to 100 if they want ACES HDR to be absolutely "correct". We simply don't have the ability to make ACES SDR game output correct on Windows when Windows is in HDR output mode.

@DarkKilauea force-pushed the rendering/hdr-output branch from b5a3369 to 137b53c on March 15, 2025
@DarkKilauea force-pushed the rendering/hdr-output branch from 137b53c to e112064 on March 21, 2025

LONG result = DisplayConfigGetDeviceInfo(&sdr_white_level.header);
if (result == ERROR_SUCCESS) {
data->sdrWhiteLevelInNits = (float)(sdr_white_level.SDRWhiteLevel / 1000) * 80;
Contributor

Suggested change
data->sdrWhiteLevelInNits = (float)(sdr_white_level.SDRWhiteLevel / 1000) * 80;
data->sdrWhiteLevelInNits = (float)sdr_white_level.SDRWhiteLevel / 1000 * 80;

This suggestion corrects the reading of the window SDR white level to match the behaviour described in the documentation:
https://learn.microsoft.com/en-us/windows/win32/api/wingdi/ns-wingdi-displayconfig_sdr_white_level

(Current PR code incorrectly performs integer division instead of float division.)

@allenwp (Contributor) commented Mar 26, 2025

I had a conversation with David Sena, an experienced rendering engineer, at GDC, and he presented a really great idea for how to improve the workflow for a game that supports SDR and HDR output: previewing HDR and SDR at the same time when lighting a scene and configuring exposure/tonemapping. (Thanks, David!)

I thought about how this sort of feature might work in Godot and realized that the two viewports for HDR and SDR already exist in Godot: simply use the "Camera Preview" feature in the editor and run the scene (F6). The only remaining feature is a way to make the HDR setting different between the scene that is running in its own process and the editor. My proposal is to add a new HDR button to the Game workspace:

image

This HDR button would ideally match the state of the scene instance that is running and, importantly, will not affect the project setting that is used by the editor. This way, you can very quickly and easily light a scene and adjust your environment settings while previewing the look in both SDR and HDR.

(Edit: I should also add that this feature can be implemented in a separate PR if desired, although I think this would be really great to have in the first Godot release that supports HDR output.)

Edit 2: This goes way outside of the scope of this PR and should exist in a separate proposal, but I'm mentioning it here because it doesn't make sense to make a proposal until HDR output is merged... Some people might be interested in being able to preview what sRGB TVs look like with the SDR version of the game when Windows HDR is enabled. This can be done by applying this to the final render of the game:

// Simulate sRGB display
color = linear_to_srgb(color); // sRGB encoding
color = pow(color, vec3(2.2)); // linearize like a reference sRGB display would

(This video has more information about why linearization performed by an sRGB display does not match the inverse of sRGB nonlinear encoding.)

@allenwp (Contributor) commented Mar 28, 2025

I have begun testing with multi-display setups and have discovered some bugs. I'll post them here and can elaborate more on them later as needed. Once I get a development environment set up, I can debug these myself, if and when I have some time.

System info: Windows 11, NVIDIA GPU + AMD integrated graphics.

Issue 1: Crash when running editor or game on secondary display.

This happens when primary display is plugged into NVIDIA and secondary display is plugged into AMD integrated graphics. Both have HDR enabled. The crash happens when I move the window onto the secondary display or when I start Godot on this secondary display. Simply changing the AMD HDMI output to be the primary display resolves the crash.

Issue 2: HDR output nonfunctional on AMD output

When my AMD integrated graphics output is the primary display, HDR output does not function. This is the case regardless of whether I have a secondary display using my NVIDIA output. Interestingly, if I toggle HDR output on while the window is on my secondary NVIDIA output, I can move the window back to my AMD output and HDR output works on the AMD output until I toggle HDR output off and on again.

Issue 3: get_window().hdr_output_reference_luminance does not update immediately

I've noticed that when I change the Windows "SDR content brightness" slider and have "use screen luminance" enabled, Godot only responds to the change in reference luminance after I move or resize the Godot window. Maybe this is a reasonable behaviour, but I figured I'd mention it anyway. Edit: Actually, it seems that Mac maps its system brightness setting to this sort of screen reference luminance in its "EDR" paradigm, so Godot will need to adjust its reference luminance in realtime to support this behaviour on Mac OS with didChangeScreenParametersNotification. Maybe there's a notification like this for Windows as well?

</member>
<member name="display/window/hdr/use_screen_luminance" type="bool" setter="" getter="" default="true">
If [code]true[/code], uses the screen's luminance as reported by the display server to set luminance for the window.
If information is missing for the display, will fall back to luminance values set in project settings.
@allenwp (Contributor) commented Mar 28, 2025

If information is missing for the display, will fall back to luminance values set in project settings.

Is this still true? I'm having difficulty finding the code that does this...

I believe this sort of behaviour, if it were implemented, would be problematic from a user-experience perspective because it would overload the luminance project settings with two uses. Instead, hardcoding fallback screen luminance values into Godot, per platform, is preferable.

Before I realized that this behaviour might not have been implemented yet, I wrote out some thoughts that I'll paste here for reference:

Fallback luminance values can be better determined by Godot engine developers than the vast majority of Godot users because most users will not dedicate sufficient research time. Additionally, users who do perform sufficient research will likely come to the same conclusion as Godot developers.

By simply hardcoding reasonable fallback screen luminance values based on operating system documentation and testing, the user will not need to do this sort of research themselves. The hardcoded fallback values must be per-host; another operating system might use a different value. This will be entirely invisible to the user and I can’t imagine a scenario where the user would have a valid need to know whether the screen luminance was successfully retrieved from the OS or whether the fallback is used; if they want control over the luminance, they can simply disable “use screen luminance”.

Suggested values (a struct sketch follows this list):

  • Max Luminance: 1000 nits (ACES authors all fixed dynamic range content with this max luminance by default, so it's probably a safe bet.)
  • SDR White Level / Reference Luminance: 240 nits (I've tried three computers with Sony TVs and found that Windows always defaults to this. I will be getting a new Asus HDR ProArt monitor soon and will report back with what it defaults to...) The docs say: "On desktop HDR monitors, SDR reference white levels are typically set to around 200 nits."
  • Min Luminance: 0 nits
  • Full frame luminance: whatever reference luminance is set to might be a safe default...
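As an illustration only (names invented for this sketch, not code from the PR), those per-platform fallbacks could be grouped like this:

struct FallbackScreenLuminance {
	float max_luminance = 1000.0f; // nits; matches the ACES fixed-range default
	float min_luminance = 0.0f; // nits
	float reference_luminance = 240.0f; // nits; common Windows SDR white level
	float max_full_frame_luminance = 240.0f; // nits; mirrors reference luminance
};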

Contributor

Values set here

https://github.com/godotengine/godot/blob/e1120643efe3cd729d295c0a91e2fa1cbc683303/main/main.cpp#L3345C1-L3348C74

Looking through this PR, the state of the "screen" luminance and "user" luminance are the same. Enabling screen luminance will clobber user luminance values, but if screen luminance has no values then there's nothing to clobber project settings with, so they'll be used.

Since one other person has brought this up, I will make a suggestion as to the place of the project settings: they seem to fit mastering metadata more closely than anything else. If user values are provided (e.g. through settings) or system values are provided (e.g. screen luminance), then it seems more logical to prefer those. Therefore we should consider removing project settings from the fallback chain of an exported project. I will suggest 2 variations in case either seems interesting. "User" luminance here is defined as luminances that were set by the scripting API; most likely they implement HDR settings menus configured by users.

Keeping use_screen_luminance

enabled: recommendation < "screen" luminance
disabled: recommendation < user luminance

It would also be helpful to document whether enabling/disabling screen_luminance should be expected to preserve user values or not (i.e. does disabling screen_luminance restore user values). The Windows implementation doesn't; the Wayland implementation does.

Removing screen luminance

Since we should always have a fallback for luminances when hdr is enabled, we could also fall back to screen luminance if user luminance is missing.

recommendation < screen luminance < user luminance

User luminance is considered missing if it is equal to 0. I have heard the suggestion that min_luminance may be removed as we don't use it, and reference/max luminances of 0 are clearly wrong, so this seems a somewhat sensible check. Developers would explicitly clobber their user values, which may be a disadvantage, but since they wanted screen luminances it is sensible to expect they don't need them anymore.

It may also be helpful to decide whether the project settings should have a meaning at runtime or not. On Wayland we are actually able to cooperate with the compositor and describe mastering luminances, but I'm not aware of this being possible on Windows.

@allenwp (Contributor) commented Mar 28, 2025

Values set here

https://github.com/godotengine/godot/blob/e1120643efe3cd729d295c0a91e2fa1cbc683303/main/main.cpp#L3345C1-L3348C74

This is not what I was referring to: this code only executes when not using screen luminance. My comment was in regards to when "use screen luminance" is true and screen information cannot be obtained.

I think the current behaviour of this PR is good, in regards to "clobbering" the luminance values when toggling on and off the "use screen luminance" setting in a game. If the developer would prefer to keep player settings when "use screen luminance" is toggled on and off, they can simply store those player settings alongside other player settings of the game.

@allenwp (Contributor) commented Mar 28, 2025

@ArchercatNEO did you know that "screen reference luminance" in this PR is SDR White Level, which is a Windows user setting, set for each connected HDR display, that takes priority over screen values?
image

I expect that a lot of your concerns would disappear if you're able to try out this PR and test how it behaves on Windows with an HDR monitor.


uint32_t path_count = 0;
uint32_t mode_count = 0;
if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &path_count, &mode_count) == ERROR_SUCCESS) {
Contributor

This might need error handling for the (unlikely) case where the result != ERROR_SUCCESS. In that case, return 240.0 // Default Windows SDR white level in nits and maybe log an error message.
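One possible shape for that fallback (a sketch; it assumes the enclosing function returns the SDR white level in nits, and uses Godot's existing unlikely() and ERR_PRINT macros):

uint32_t path_count = 0;
uint32_t mode_count = 0;
if (unlikely(GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &path_count, &mode_count) != ERROR_SUCCESS)) {
	ERR_PRINT("Failed to query display configuration; assuming default SDR white level.");
	return 240.0; // Default Windows SDR white level in nits.
}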

#endif // D3D12_ENABLED
}

static BOOL CALLBACK _MonitorEnumProcSdrWhiteLevel(HMONITOR hMonitor, HDC hdcMonitor, LPRECT lprcMonitor, LPARAM dwData) {
Contributor

This function needs error handling throughout, wherever a call can return something other than ERROR_SUCCESS. In that case, set data->sdrWhiteLevelInNits = 240.0 // Default Windows SDR white level in nits and maybe log error messages where it makes sense.

@allenwp (Contributor) commented Mar 28, 2025

I spent some time prototyping different approaches to an HDR settings menu that might exist in a Godot game using this PR.

HDR.Settings.Menu.DEBUG.2025-03-28.16-53-02.mp4

I found that it was easy to implement most of the menu styles, except that a number of the styles require one modification to this PR: separate use_screen_max_luminance and use_screen_reference_luminance settings.

Note: I used the term "Brightness" instead of "reference luminance" because this is the correct player-facing term. In my opinion, it does not make sense for a game to present the term "reference luminance" to the player when all it really means is just the overall brightness of the game when using HDR output. I am indifferent on which term is used by Godot scripting/project settings.

Style A - Simple
Separate brightness and max luminance controls that default to screen values.

Style B - Advanced
Same as "Style A - Simple", but with nits values always presented to the player, even when they have not been customized.

Style C - Ignore Screen Lum.
Screen luminance is entirely ignored. For games designed to be played in a home cinema or where the viewing environment's brightness is known in advance.

Style D - Screen Luminance Toggle
One toggle switch that controls whether screen luminance is used or custom luminance is used for both brightness and max luminance.

Style E - Screen Luminance (Saved)
Same as "Style D - Screen Luminance Toggle", except the player's previous brightness and max luminance values are restored when they turn off "Use screen luminance".

Style F - Automatic Simple
"Use screen luminance" is automatically disabled when the user changes either brightness or max luminance. "Use screen luminance" is automatically enabled when the user presses the "Reset" button. My personal favourite style.

Another Style Idea (not in video)
Because the reference luminance value is simply the SDR white level Windows setting, the Brightness control could be omitted entirely and reference luminance would always match screen reference luminance. Only max luminance would be configurable through the game.

I have a pretty strong preference towards Style F - Automatic Simple because it makes for a very simple GUI that integrates naturally with other game settings that I could imagine. For this GUI to function correctly, it needs separate use_screen_max_luminance and use_screen_reference_luminance settings. A sketch of this style follows.
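For example, "Style F - Automatic Simple" could be wired up roughly like this (a sketch assuming the separate use_screen_reference_luminance and use_screen_max_luminance flags proposed above existed on Window, which they do not yet; hdr_output_max_luminance is likewise an assumed property name):

func _on_brightness_changed(nits: float) -> void:
	var window := get_window()
	window.use_screen_reference_luminance = false # custom value takes over
	window.hdr_output_reference_luminance = nits

func _on_max_luminance_changed(nits: float) -> void:
	var window := get_window()
	window.use_screen_max_luminance = false
	window.hdr_output_max_luminance = nits

func _on_reset_pressed() -> void:
	var window := get_window()
	window.use_screen_reference_luminance = true # return to screen-provided values
	window.use_screen_max_luminance = true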
