[Windows] Support output to HDR monitors #94496
base: master
Conversation
I gave this a quick test locally (on Windows 11 23H2 + NVIDIA 560.80 + LG C2 42") and it works as expected. This is encouraging to see, I've been wanting this for a while 🙂 I'll need to look into building more extensive scenes and getting tonemapped screenshots/videos out of this. 2D HDR also needs to be tested thoroughly.

Remember that JPEG XL or AVIF for images and AV1 for videos are a must for HDR, as other formats can only store SDR data. You may need to embed those in ZIP archives and ask users to preview them in a local media player, as GitHub doesn't allow uploading those formats and browsers often struggle to display HDR correctly.

I noticed some issues for now:
See the settings exposed by the Control HDR mod for an example of a best-in-class HDR implementation (related video): control_hdr_mod_settings.mp4

Interesting, that UI seems to use the term "paperwhite" in a different way, and has a dedicated setting for the brightness of UI and HUD elements.
Thanks for taking a look!
Odd that NVIDIA's RTX HDR doesn't detect the HDR color space and avoid messing with the final swap chain buffer. Auto-HDR in Windows 11 appears to avoid messing with Godot when HDR is enabled. Updating the NVIDIA profile may be outside the scope of this PR and would be best done in a more focused PR.
For the initial draft, yes, everything is mapped using the same tonemapper. However, we should map UI elements to a different brightness to avoid them being too bright. For now, that can be worked around by dimming the brightness of any UI elements via the theme, but I would like to fix that in this PR.
I haven't looked into configuring the editor to use HDR yet. I will do so after I figure out how to properly tone map UI elements; if you enable HDR in the editor now, the UI is a little unpleasant.
Agreed, UI elements and other 2D elements should probably be mapped to a different brightness curve. I'll probably have to figure out where in the engine 3D and 2D elements are composited together and perform the tone mapping there.
That might be outside of the scope of this PR. I'm not sure how I would indicate that certain 3D elements need to be mapped using a different brightness curve once they are all combined into the same buffer. It would be similar to trying to avoid sRGB mapping certain rendered elements. For now, this can be worked around by decreasing the brightness of the color of these elements.
Baldur's Gate 3 and Cyberpunk 2077 also have really nice HDR settings menus. I've been basing some of this work off their approach, though modifying contrast and brightness I'm leaving up to Environment since those effects are already there. Thanks again for your comments! I'll add some TODO items to the description for tracking. |
Can you use any Godot project to test this PR? Bistro-Demo-Tweaked and Crater-Province-Level both use physical light units, and use values as close to reference as possible for the luminosity of light sources (e.g. the sun at noon is 100,000 lux, the moon at midnight is 0.3 lux). I'd love to help test this PR but unfortunately I don't have HDR hardware.
I recently got a monitor that supports Anyway, adding HDR output to D3D12 should be trivial and I might give it a try. (No promises!) Shall we also consider implementing HDR display for the compatibility renderer? I am not sure if native OpenGL can do HDR, but it is very possible to implement on Windows with the help of ANGLE and some manual setting up. |
This needs a rebase on master, but I have an HDR display (https://www.dell.com/en-ca/shop/alienware-34-curved-qd-oled-gaming-monitor-aw3423dw/apd/210-bcye/monitors-monitor-accessories). I can help test.
You should be able to test with any scene, though keep in mind that the realistic light units will not map directly to the brightness of the display. Consumer desktop displays typically don't go much above 1000 nits on the high end, which is far too dim to simulate sunlight. Values from the scene will be mapped to a range fitting within the max luminosity set for the window. |
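To illustrate roughly what that mapping means, here is a deliberately simplified sketch (not this PR's actual shader code; the function and parameter names are made up for illustration):

```cpp
#include <algorithm>

// Illustrative sketch: scene-referred values (nits) can be orders of magnitude brighter
// than any display, so exposure plus a compression curve squeeze them into the peak
// luminance configured for the window before output encoding.
float map_scene_to_display_nits(float scene_nits, float exposure, float max_luminance) {
	float exposed = scene_nits * exposure;         // e.g. 100000-lux sunlight scaled down by exposure
	float compressed = exposed / (1.0f + exposed); // placeholder Reinhard-style curve in [0, 1)
	return std::min(compressed * max_luminance, max_luminance); // never exceed the window's peak (~1000 nits on consumer displays)
}
```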
Here are the changes to get Rec. 2020 HDR output on D3D12: master...alvinhochun:godot:hdr-output-d3d12 |
The over-exposure in your screenshot is expected, but the colours are oversaturated because it is missing a colour space conversion. The colours need to be converted from BT.709 primaries to BT.2020 primaries. This is how it should look with the correct colours: The conversion may be done with something like this:

```diff
diff --git a/servers/rendering/renderer_rd/shaders/color_space_inc.glsl b/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
index 3583ee8365..76305a8a3c 100644
--- a/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
+++ b/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
@@ -19,6 +19,15 @@ vec3 linear_to_st2084(vec3 color, float max_luminance) {
// max_luminance is the display's peak luminance in nits
// we map it here to the native 10000 nits range of ST2084
float adjustment = max_luminance * (1.0f / 10000.0f);
+ color = color * adjustment;
+
+ // Color transformation matrix values taken from DirectXTK, may need verification.
+ const mat3 from709to2020 = mat3(
+ 0.6274040f, 0.0690970f, 0.0163916f,
+ 0.3292820f, 0.9195400f, 0.0880132f,
+ 0.0433136f, 0.0113612f, 0.8955950f
+ );
+ color = from709to2020 * color;
// Apply ST2084 curve
const float c1 = 0.8359375;
@@ -26,7 +35,7 @@ vec3 linear_to_st2084(vec3 color, float max_luminance) {
const float c3 = 18.6875;
const float m1 = 0.1593017578125;
const float m2 = 78.84375;
- vec3 cp = pow(abs(color.rgb * adjustment), vec3(m1));
+ vec3 cp = pow(abs(color.rgb), vec3(m1));
return pow((c1 + c2 * cp) / (1 + c3 * cp), vec3(m2));
}
```
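As an aside, one quick sanity check for a BT.709 → BT.2020 matrix like the one above: assuming both colour spaces share the D65 white point, each row should sum to roughly 1.0 so that BT.709 white maps to BT.2020 white. A small standalone check using the same values (note the GLSL `mat3` above is column-major, so these are the transposed rows):

```cpp
#include <cmath>
#include <cstdio>

int main() {
	// Rows of the BT.709 -> BT.2020 matrix quoted above (transposed from the column-major GLSL mat3).
	const double rows[3][3] = {
		{ 0.6274040, 0.3292820, 0.0433136 },
		{ 0.0690970, 0.9195400, 0.0880132 },
		{ 0.0163916, 0.0113612, 0.8955950 },
	};
	for (int i = 0; i < 3; i++) {
		double sum = rows[i][0] + rows[i][1] + rows[i][2];
		// Each row summing to ~1.0 means white (1,1,1) is preserved by the conversion.
		printf("row %d sum = %.7f (%s)\n", i, sum, std::fabs(sum - 1.0) < 1e-4 ? "ok" : "check");
	}
	return 0;
}
```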
I must admit my experience with HDR before trying to implement it in Wayland was zero. I haven't actually tested either PR, played an HDR-capable game, or owned an HDR monitor. I just keep hearing about people wanting HDR in Wayland and that Godot hadn't implemented HDR yet, so when the protocol was finally merged I decided I could try my hand at implementing it. My perspective on how developers/users should do HDR, then, is that ideally they shouldn't: instead, Godot and the system compositor should have a solid enough implementation that developer effort and user calibration aren't required. That doesn't mean I think HDR settings menus need to be eliminated; user configuration should always take priority over what we can do at the engine level, but I'd like to have an implementation where it isn't required for Godot games to look good on HDR monitors.
Ok, the reason I asked is that I think it would be most helpful to hear your thoughts in the context of specific behaviours that you have experienced when trying out this PR on Windows, but if you don't have the hardware on hand to test this, then it sounds like this might not be an option. |
The warnings are mostly there to let the developer know that setting luminance values has no effect if they've delegated handling luminance to Godot. The choice has been distilled down to:
I'm not understanding why this is incomplete. What scenarios are not supported? |
Oh, I think I see a source of confusion here. The project settings only affect the main window and some viewports in the editor to enable seamless authoring. They do not control "HDR" globally or enable it to be used. The developer can always ignore the project settings and enable HDR on their main window, and it will work the same way in game. Like many project settings in Godot, it is primarily there as a convenience for getting the most common scenarios set up.
When I read |
Yeah, maybe that was probably me thinking too fast. There is the footgun of disabling screen_luminance without setting values defaulting to project settings, but that's a stretch. Essentially the only thing my suggestion would add is being a bit simpler (in my opinion), as it allows removing the
A couple quick updates:
This is misguided - unfortunately HDR isn't just some feature that you can flip on to make your SDR game prettier - it's a different presentation that fundamentally requires developer attention. You can literally see this in @allenwp's updates: you can't just take ACES/AgX SDR tonemapping output and map it to HDR - it fundamentally requires a different tonemapper. This, and many other things, is a big reason why even AAA studios often get their HDR wrong.

In addition, proper game HDR usually requires the compositor (along with the display itself) to butt out and leave the game engine alone. Any interference tends to cause issues (especially since the HDR stack is still quite brittle even nowadays).

Trying to make it so that users don't have to configure HDR is a good goal - but even then, that's achieved by making sure the engine can reliably get correct information about the display's colorspace and max nits - so that the developer can then implement functionality to take care of HDR configuration for the user. You can't divorce the developer's hand from HDR - they're fundamentally going to have to get their hands dirty to implement HDR - so the engine's responsibility is really just to provide the developer with the tools needed for them to do their job.

What you seem to want is an auto-magic SDR to HDR feature. The good news is that does exist. Windows offers it as Auto-HDR, NVIDIA offers it as RTX HDR. However... if Wayland/Linux wants this, it's on them to implement it on their side. This is actually a compositor/driver side feature, not a game engine feature.

You also seem to want an auto-magic HDR to SDR feature. As I mentioned before, no one has implemented this for games because it doesn't make much sense - if the engine knows it's outputting to an SDR screen, why wouldn't it just use its SDR pipeline? The output will be much higher quality than a converted HDR to SDR image. That being said, if for whatever reason you still really want this, it can also be implemented as a compositor/driver side feature. The only existing implementations to reference are in video players like VLC, since auto-magic HDR to SDR is only useful for media with static, baked data that won't necessarily have SDR data, like video and images.
I would have an easier time accepting this if the automatic SDR -> HDR features didn't exist. Auto-HDR and RTX HDR are compositor-side, yes, but why wouldn't we be able to do precisely what they do ourselves? Also, what about use_screen_luminance? That is explicitly about releasing control of HDR to the engine so that it may do SDR -> HDR. Do you propose we remove this feature because, for good HDR, the developer must have been involved?
There are 2 ways I could interpret this. Godot renders something in HDR and then tonemaps to SDR internally, which we already do. When importing HDR image formats I believe Godot does just clip the HDR to an SDR range. I could also interpret it as: we send HDR to the compositor and expect the compositor to be able to convert it to SDR for the monitor. This is precisely a compositor feature like you said, and the spec for Wayland says that we may depend on the compositor to have this feature.

We wouldn't automatically do HDR output on SDR monitors because that's not efficient, but Wayland has this feature, so I don't see why we should actively prevent the developer from doing it on Wayland. Automatic HDR wouldn't do it on SDR monitors, and we can just add documentation saying that explicit HDR (rather than the implicit, automatic HDR) should be exposed as settings to the user, at which point it's up to the user to decide if HDR happens on their SDR monitor or not. If Windows doesn't have this feature, then it doesn't, and we aren't required to allow it.

From the original proposal godotengine/godot-proposals#10817
The way I interpret this is that Godot has been doing HDR rendering for a long time and has been tonemapping HDR to SDR for a long time. Is this the automagic HDR -> SDR you were talking about? Maybe the rendering pipeline changed since the proposal was made?
Well actually yes, we could do it. Could be a nice feature for devs that don't want to do the legwork for proper HDR. I will say that the fundamentals of Godot's HDR will have to be finished and released first because this feature will require that to even work in the first place. But after all's said and done, we could create a separate proposal/PR to implement this.
I think there's some confusion here... I apologize. This is a common pitfall when talking about HDR. Let me clarify.

There is "HDR" in the "Half Life 2 Lost Coast" sense - this is talking about how games, internally, represent the scene with HDR data, so that they can do fancy stuff (at least for the 2000s) like bloom and eye adaptation. We're not talking about that, although it is related. Regardless of whether the engine supports and outputs HDR or not, internally, the engine represents the scene in "HDR" either way, pre-tonemapping. This is how it's been since the 2000s.

When we talk about HDR in the here and now, we're talking about HDR displays, and the pipeline from game output to display. The HDR10 spec basically. (And potentially HDR10+ and Dolby Vision.)

So when I refer to an "auto-magic SDR to HDR feature" like Windows Auto-HDR, what that does is: the compositor takes the SDR output of a game that has no HDR capabilities, or has explicitly turned off its HDR capabilities, and does realtime inverse tone mapping on that output. The compositor then takes the result of that and presents it to the display as HDR data.

When I refer to an "auto-magic HDR to SDR feature", this is taking HDR data and tonemapping that down to SDR in realtime, then presenting that SDR data to the display. This can currently only be found in video players (AFAIK), but theoretically how it could work with games is: let's say we're on a system that doesn't have HDR capabilities. If we have a game that has HDR capabilities, the compositor can "lie" to the game and say the system has HDR capabilities. The game then sends out HDR output to the compositor. The compositor will then take that HDR output, tonemap it down to SDR in realtime, and present that SDR data to the display.

I apologize if it's still not clear after this, I fully admit HDR is honestly a mess and confusing as hell, and developers are only now starting to grok all of it. There's a reason why a whole decade after the standard was introduced, only now are game developers starting to embrace it.
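For readers unfamiliar with inverse tone mapping, a deliberately naive sketch of the SDR → HDR expansion idea might look like this (illustrative only; Auto-HDR and RTX HDR use far more sophisticated, content-aware curves):

```cpp
#include <algorithm>

// Naive sketch: treat the SDR value as the output of a simple Reinhard curve x/(1+x),
// invert it, then anchor the result around the SDR white level and clamp to the
// display's reported peak. Real compositor features do much more than this.
float expand_sdr_to_nits(float sdr_linear, float sdr_white_nits, float peak_nits) {
	float clamped = std::min(sdr_linear, 0.99f);  // avoid division by zero at exactly 1.0
	float expanded = clamped / (1.0f - clamped);  // inverse of x / (1 + x)
	float nits = expanded * sdr_white_nits;       // e.g. ~200 nits SDR white
	return std::min(nits, peak_nits);             // never exceed the display's peak
}
```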
So, on a game that doesn't support HDR:

In engine:

In compositor:

If HDR display is connected and HDR is enabled for the OS:

(You can just composite SDR images into an HDR buffer with no tonemapping or conversion required, because SDR is a subset of HDR.)

On a game that does support HDR:

In engine: Engine asks compositor/OS if system supports HDR. If compositor answers no, engine will just do the SDR pipeline. If compositor answers yes:

In compositor:
This is where our current work lies:
"But wait a minute, this doesn't make sense! It's already in HDR, so why does it need to tonemap to HDR?" Ah, therein lies the rub. The "HDR" representation that engines use internally is actually higher range than current HDR display standards! In addition, as I'm sure you know, HDR displays range from 400 nits to 1000+ nits. Tonemapping is still required, and this is where values like "Then what's the point of the HDR options offered in the game's settings menu?" User control over the tonemapper behavior, basically. For example, maybe, even if your TV supports 1000 nits, you only want to use 400 nits of brightness range. Maybe the TV is lying to the system about its actual capabilities (some older cheapo TVs did this). Maybe you prefer the TV's tonemapper - for example, you disable HGiG mode, set the game to output full 1000 nits even if the TV only supports 600 nits, and leave it to the TV to tonemap that down as it sees fit. Maybe you want to choose between HDR10+ and Dolby Vision. "Where would the auto-magic SDR to HDR feature lie in all this, if we were to implement it?" In engine: Engine asks compositor/OS if system supports HDR. If compositor answers no, engine will just do the SDR pipeline. If compositor answers yes, and developer enabled the auto-magic SDR to HDR feature:
In compositor:
|
On HDR displays, Windows 11 still gets sRGB wrong, and many HDR displays are terrible themselves, so I understand your hesitation, but things work differently on Wayland. Colorspaces are very strictly defined, and conversions between them are simple mathematical formulae that so far not a single compositor has gotten wrong. Preventing conversions is a good thing in general, targeting the compositor's preferred colorspace is absolutely the right move for performance reasons, but assuming that the entire rest of the stack is completely broken and that you need to attempt to work around it before even seeing a single actual issue is a terrible approach. You won't even always have a choice - in many cases, KDE Plasma will tell you to target a colorspace that uses the native primaries and whitepoint of the display + a gamma 2.2 transfer function with HDR headroom, rather than some standard HDR colorspace. Sometimes this is for compositor-side optimizations, but sometimes it's also just how the display works: The Steam Deck OLED for example uses a conventional gamma 2.2 transfer function. Unless you add support for that, using Wayland instead of Vulkan to tell the compositor that you're targeting this special color space (which would be cool tbf, and could prevent color conversions on SDR displays as well), you're just gonna have to trust the compositor to convert whatever colorspace you use to what it actually needs.
I think the above answers that question, but to expand on that, I would recommend you to detect if you should prefer HDR by checking Allowing the user or developer to manually enable HDR even when that's not the case might be nice to test the compositor's tone mapping capabilities, but isn't something you really need to care about.
That's not right. Static metadata - or more specifically the part of it that's called the mastering display information - describes the display the content was made for. On Wayland, the preferred image description tells you about the display to target, so assuming you tonemap for that, you should also use those same values as the static HDR metadata, which tells the compositor that you're targeting the display you're supposed to, and that it shouldn't do any tone mapping. If you don't target the system provided values, then the luminance ranges the developer manually set up should be used - representing the display they optimized the game for - and the compositor will tonemap it to the display it actually gets presented on. Dynamic metadata would give the compositor more information about the actually used brightness ranges in a given scene, which may be lower than what the target display is capable of. As this is for better tonemapping, which you want to avoid in most cases anyways, and Wayland doesn't even have an extension for dynamic HDR metadata yet, you really don't have to worry about it right now.
I think it's more helpful there to talk about the meaning of the brightness values rather than just their range. The internal data is normally scene-referred, it represents the brightness that things would have in the real world (assuming a "realistic" scene), and the output data takes into account the light adaptation of the viewer, and as you said of course the capabilities of the display. |
Applied some of the tonemapping changes that @allenwp was playing around with since they seem to improve the result significantly. Also gave me a chance to refactor the tonemap shader to remove parameters I was no longer using. |
Thanks! I wanted to adapt your glow changes to this approach to review how it would behave with the different glow modes and a "Linear" tonemapper. Sometimes glow is used to simulate a physical effect (fog glow around a bright point of light), other times it is used as a psychological/perception trick to make part of an image appear brighter than the display can reproduce, or it can even be used as a stylistic effect with no physical or psychological basis. I will reiterate that the intent behind this approach is to demonstrate that it is not possible to adapt any of the existing filmic tonemapper equations to HDR, as the contrast curve is not correctly scaled. Because Reinhard does not have a "toe", it is reasonable to adapt it to HDR using this approach, but it breaks when (A good example of when this breaks is setting Reinhard |
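For reference, the "Reinhard with a white point" curve under discussion is usually written like this (a sketch of the standard extended Reinhard formula, not this PR's shader code):

```cpp
// Extended Reinhard: maps luminance L so that L == whitepoint lands exactly on 1.0.
// With a very large whitepoint it approaches plain Reinhard L / (1 + L); pulling the
// whitepoint down reshapes the shoulder of the curve.
float tonemap_reinhard_extended(float L, float whitepoint) {
	return L * (1.0f + L / (whitepoint * whitepoint)) / (1.0f + L);
}
```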
Very well. I still believe separating "explicit" HDR (the actual scripting HDR API) and "implicit" HDR (the project setting) would be more flexible (and allow us to distinguish between monitors which both support and prefer HDR from those which support HDR but do not prefer it, likely because of a compositor feature), but since this is my primary use case, if we decide we really don't want this and nobody has a different use case for making the distinction, then my suggestions can be ignored.
Yeah, my bad. I shouldn't have used "static metadata" and "dynamic metadata" here since they already have standard definitions in this context. I was using dynamic for runtime luminances, i.e. luminances we got while the game was running. The luminance values in this category were the "system/screen" luminances, which for most situations should be the ones we prefer, and values decided either in a settings menu or from something like a colorspace picker in an artistic app. For the settings menu the loss of performance seems fine to me since the user explicitly enabled the setting, and the artistic app needs that kind of control for mastering and such. Really, my version of "dynamic" luminances just meant the system luminances in most cases, plus luminances we got from some other source not decided by the developer, the user, or an image's HDR metadata. I also said "static metadata" because it didn't change at runtime, which was also a bad move. It's just the BT.2408 luminances and the project settings. Although thinking about it, maybe we could provide mastering display information somewhere, maybe from the project settings themselves even. But I won't push too much on that; if it turns out mastering display metadata is useful we could possibly return to it in a later PR.
Well, I have some code examples of what my Wayland implementation is, what I'd like it to be, and why it can't be implemented like that with the current state of this PR. Maybe it will make the use case clearer, maybe it won't.

Current:

```cpp
bool DisplayServerWayland::screen_is_hdr_supported(int p_screen) const {
	...
	// as per zamundaaa's suggestion
	return wayland_thread.supports_hdr() && (screen->color_profile.max_luminance > screen->color_profile.sdr_white);
}

void DisplayServerWayland::_window_update_hdr_state(WindowID p_window) {
	...
	bool hdr_preferred = window->preferred_color_profile.max_luminance > window->preferred_color_profile.sdr_white;
	bool hdr_enabled = rendering_context->window_get_hdr_output_enabled(p_window);
	bool hdr_desired = wayland_thread.supports_hdr() && hdr_preferred && hdr_enabled;
}
```

(the hdr state is just

The thing I would like to draw your attention to is how we actually enable HDR on a window: we check if a window's preferred profile meets the

What I would like to use:

```cpp
bool DisplayServerWayland::screen_is_hdr_supported(int p_screen) const {
	...
	// as per the wayland spec
	return wayland_thread.supports_hdr();
}

void DisplayServerWayland::_window_update_hdr_state(WindowID p_window) {
	...
	bool system_hdr = GLOBAL_GET("display/window/hdr/enabled");
	bool hdr_preferred = window->preferred_color_profile.max_luminance > window->preferred_color_profile.sdr_white;
	bool hdr_enabled = rendering_context->window_get_hdr_output_enabled(p_window);

	if (system_hdr && hdr_preferred && wayland_thread.supports_hdr()) {
		// use screen luminances
	} else if (hdr_enabled && wayland_thread.supports_hdr()) {
		// use developer-set luminances
	} else {
		// disable hdr
	}
}
```

Why can I not do this? It's because the current implementation makes all windows request HDR, turning most of this into just

There are 2 other alternatives for the consistency problem:
If I still haven't convinced anyone, chances are the inconsistency probably won't be a problem. I cannot guarantee that it won't be a problem, but I don't predict many compositors making the preferred profile really different from the screen profile very often (though now that I think a bit more, compositors can place windows across several screens, which could make this quite a bit more complicated).
Regarding ACES 2 support: I had a discussion with one of the members of ACES about how things should be handled with an operating system that scales its SDR content when displaying it in an HDR signal, instead of pinning SDR content at exactly 100 nits. https://community.acescentral.com/t/aces-2-in-game-engines-variable-reference-max-luminance/5734 The summary is: there is no recommendation on how to deal with this and it would need to be researched. Regardless, I think I came up with a way to best handle this by using I could say that a truly correct ACES 2 implementation would force I could go even further to say that Godot should always force |
Co-authored-by: Alvin Wong <[email protected]>
```cpp
LONG result = DisplayConfigGetDeviceInfo(&sdr_white_level.header);
if (result == ERROR_SUCCESS) {
	data->sdrWhiteLevelInNits = (float)(sdr_white_level.SDRWhiteLevel / 1000) * 80;
```

Suggested change:

```diff
-	data->sdrWhiteLevelInNits = (float)(sdr_white_level.SDRWhiteLevel / 1000) * 80;
+	data->sdrWhiteLevelInNits = (float)sdr_white_level.SDRWhiteLevel / 1000 * 80;
```
This suggestion corrects reading of window SDR white level to match behaviour described in the documentation:
https://learn.microsoft.com/en-us/windows/win32/api/wingdi/ns-wingdi-displayconfig_sdr_white_level
(Current PR code incorrectly performs integer division instead of float division.)
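A concrete example of the difference (per the linked documentation, `SDRWhiteLevel` is stored so that a value of 1000 corresponds to 80 nits; the value below is illustrative):

```cpp
#include <cstdio>

int main() {
	// A display whose SDR white level is 200 nits reports SDRWhiteLevel = 2500.
	unsigned long SDRWhiteLevel = 2500;

	float wrong = (float)(SDRWhiteLevel / 1000) * 80; // integer division first: 2 * 80 = 160 nits
	float right = (float)SDRWhiteLevel / 1000 * 80;   // float division: 2.5 * 80 = 200 nits

	printf("integer division: %g nits, float division: %g nits\n", wrong, right);
	return 0;
}
```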
I had a conversation with David Sena, an experienced rendering engineer, at GDC and he presented a really great idea for improving the workflow of a game that supports SDR and HDR output: previewing HDR and SDR at the same time when lighting a scene and configuring exposure/tonemapping. (Thanks, David!)

I thought about how this sort of feature might work in Godot and realized that the two viewports for HDR and SDR already exist in Godot: simply use the "Camera Preview" feature in the editor and run the scene (F6). The only remaining feature is a way to make the HDR setting different between the scene that is running in its own process and the editor. My proposal is to add a new HDR button to the Game workspace: This HDR button would ideally match the state of the scene instance that is running and, importantly, will not affect the project setting that is used by the editor. This way, you can very quickly and easily light a scene and adjust your environment settings while previewing the look in SDR and HDR.

(Edit: I should also add that this feature can be implemented in a separate PR if desired, although I think this would be really great to have in the first Godot release that supports HDR output.)

Edit 2: This goes way outside of the scope of this PR and should exist in a separate proposal, but I'm mentioning it here because it doesn't make sense to make a proposal until HDR output is merged... Some people might be interested in being able to preview what sRGB TVs look like with the SDR version of the game when Windows HDR is enabled. This can be done by applying this to the final render of the game:

```glsl
// Simulate sRGB display
color = linear_to_srgb(color); // sRGB encoding
color = pow(color, vec3(2.2)); // linearize like a reference sRGB display would
```

(This video has more information about why linearization performed by an sRGB display does not match the inverse of sRGB nonlinear encoding.)
I have begun testing with multi-display setups and have discovered some bugs. I'll post them here and can elaborate more on them later as needed. Once I get a development environment set up, I can debug these myself, if and when I have some time. System info: Windows 11, NVIDIA GPU + AMD integrated graphics.

Issue 1: Crash when running editor or game on secondary display. This happens when the primary display is plugged into NVIDIA and the secondary display is plugged into AMD integrated graphics. Both have HDR enabled. The crash happens when I move the window onto the secondary display or when I start Godot on this secondary display. Simply changing the AMD HDMI output to be the primary display resolves the crash.

Issue 2: HDR output nonfunctional on AMD output. When my AMD integrated graphics output is the primary display, HDR output does not function. This is the case regardless of whether I have a secondary display using my NVIDIA output. Interestingly, if I toggle HDR output on while the window is on my secondary NVIDIA output, I can move the window back to my AMD output and HDR output works on the AMD output until I toggle HDR output off and on again.

Issue 3: I've noticed that when I change the Windows "SDR content brightness" slider and have "use screen luminance" enabled, Godot only responds to the change in reference luminance after I move or resize the Godot window. Maybe this is a reasonable behaviour, but I figured I'd mention it anyway. Edit: Actually, it seems that macOS maps its system brightness setting to this sort of screen reference luminance in its "EDR" paradigm, so Godot will need to adjust its reference luminance in realtime to support this behaviour on macOS with
</member>
<member name="display/window/hdr/use_screen_luminance" type="bool" setter="" getter="" default="true">
	If [code]true[/code], uses the screen's luminance as reported by the display server to set luminance for the window.
If information is missing for the display, will fall back to luminance values set in project settings. |
> If information is missing for the display, will fall back to luminance values set in project settings.

Is this still true? I'm having difficulty finding the code that does this...
I believe this sort of behaviour, if it was implemented, would be problematic from a user experience side of things because this would make the luminance project settings overloaded with two uses. Instead, hardcoding fallback screen luminance values into Godot, per-platform, is preferable.
Before I realized that this behaviour might not have been implemented yet, I wrote out some thoughts that I'll paste here for reference:
Fallback luminance values can be better determined by Godot engine developers than the vast majority of Godot users because most users will not dedicate sufficient research time. Additionally, users who do perform sufficient research will likely come to the same conclusion as Godot developers.
By simply hardcoding reasonable fallback screen luminance values based on operating system documentation and testing, the user will not need to do this sort of research themselves. The hardcoded fallback values must be per-host; another operating system might use a different value. This will be entirely invisible to the user and I can’t imagine a scenario where the user would have a valid need to know whether the screen luminance was successfully retrieved from the OS or whether the fallback is used; if they want control over the luminance, they can simply disable “use screen luminance”.
Suggested values:
- Max Luminance: 1000 nits (ACES authors all fixed dynamic range content with this max luminance by default, so it's probably a safe bet.)
- SDR White Level / Reference Luminance: 240 nits (I've tried three computers with Sony TVs and found that Windows always defaults to this. I will be getting a new Asus HDR ProArt monitor soon and will report back with what it defaults to...) The docs say: "On desktop HDR monitors, SDR reference white levels are typically set to around 200 nits."
- Min Luminance: 0 nits
- Full frame luminance: whatever reference luminance is set to might be a safe default...
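A rough sketch of what those hardcoded per-host fallbacks could look like (names and structure are illustrative only, not this PR's actual code):

```cpp
// Illustrative sketch: per-host fallback luminance values used when "use screen luminance"
// is enabled but the display server cannot report anything for the display.
struct FallbackLuminance {
	float max_luminance;        // nits
	float reference_luminance;  // SDR white level, nits
	float min_luminance;        // nits
	float full_frame_luminance; // nits
};

FallbackLuminance get_fallback_luminance_for_host() {
#if defined(WINDOWS_ENABLED)
	// Windows has defaulted its SDR white level to ~240 nits on the displays tested above;
	// 1000 nits is a common peak-luminance mastering target.
	return { 1000.0f, 240.0f, 0.0f, 240.0f };
#else
	// Other platforms could pick different values based on their own documentation/testing.
	return { 1000.0f, 200.0f, 0.0f, 200.0f };
#endif
}
```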
Values set here
Looking through this PR, the state of the "screen" luminance and "user" luminance are the same. Enabling screen luminance will clobber user luminance values, but if screen luminance has no values then there's nothing to clobber project settings with, so they'll be used.

Since one other person has brought this up, I will make some suggestions as to the place of the project settings: it seems like these project settings fit more closely to mastering metadata than anything else. If user values are provided (e.g. through settings) or system values are provided (e.g. screen luminance) then it seems more logical to prefer those. Therefore we should consider changing the fallback chain of an exported project to remove project settings from it. I will suggest 2 variations in case either seems interesting. "User" luminance here is defined as luminances that were set by the scripting API. Most likely they are the implementation of HDR settings menus which are configured by "users".

Keeping use_screen_luminance
- enabled: recommendation < "screen" luminance
- disabled: recommendation < user luminance

It would also be helpful to document whether enabling/disabling screen_luminance should be expected to preserve user values or not (as in, does disabling screen_luminance restore user values). The Windows implementation doesn't, the Wayland implementation does.

Removing screen luminance
Since we should always have a fallback for luminances when HDR is enabled, we could also fall back to screen luminance if user luminance is missing.
- recommendation < screen luminance < user luminance

User luminance is considered missing if it is equal to 0. I have heard suggestions that min_luminance may be removed as we don't use it, and reference/max luminances of 0 are clearly wrong, so this seems a somewhat sensible check. Developers do explicitly clobber their user values, which may be a disadvantage, but since they wanted screen luminances it is sensible to expect they don't need them anymore.

It may also be helpful to decide if the project settings should have a meaning at runtime or not. On Wayland we are actually able to cooperate with the compositor and describe mastering luminances, but I'm not aware of this being possible for Windows.
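To make the "Removing screen luminance" variant concrete, the resolution order could look roughly like this (a sketch with illustrative names; 0 is treated as "unset" as suggested above):

```cpp
// Sketch of the proposed fallback chain: user (scripting API) luminance wins,
// then the screen-reported luminance, then a hardcoded recommendation (e.g. a BT.2408-style 1000 nits).
float resolve_max_luminance(float user_max_luminance, float screen_max_luminance) {
	const float recommended_max_luminance = 1000.0f;
	if (user_max_luminance > 0.0f) {
		return user_max_luminance;   // explicitly set via the scripting API (e.g. an HDR settings menu)
	}
	if (screen_max_luminance > 0.0f) {
		return screen_max_luminance; // reported by the display server
	}
	return recommended_max_luminance;
}
```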
Values set here
This is not what I was referring to: this code only executes when not using screen luminance. My comment was in regards to when "use screen luminance" is true and screen information cannot be obtained.
I think the current behaviour of this PR is good, in regards to "clobbering" the luminance values when toggling on and off the "use screen luminance" setting in a game. If the developer would prefer to keep player settings when "use screen luminance" is toggled on and off, they can simply store those player settings alongside other player settings of the game.
@ArchercatNEO did you know that "screen reference luminance" in this PR is SDR White Level, which is a Windows user setting, set for each connected HDR display, that takes priority over screen values?
I expect that a lot of your concerns would disappear if you're able to try out this PR and test how it behaves on Windows with an HDR monitor.
```cpp
uint32_t path_count = 0;
uint32_t mode_count = 0;
if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &path_count, &mode_count) == ERROR_SUCCESS) {
```
This might need error handling for the (unlikely) case where the result is `!= ERROR_SUCCESS`. In that case, `return 240.0` (the default Windows SDR white level in nits) and maybe log an error message.
#endif // D3D12_ENABLED
}
static BOOL CALLBACK _MonitorEnumProcSdrWhiteLevel(HMONITOR hMonitor, HDC hdcMonitor, LPRECT lprcMonitor, LPARAM dwData) { |
This function needs error handling for results `!= ERROR_SUCCESS` throughout. In those cases, set `data->sdrWhiteLevelInNits = 240.0` (the default Windows SDR white level in nits) and maybe log error messages where it makes sense.
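The suggested fallback pattern might look roughly like this inside the existing callback (a sketch against the PR's variable names, not a drop-in patch; the logging call is just one option):

```cpp
// Sketch: if querying the SDR white level fails, fall back to Windows' default of 240 nits
// instead of leaving the value uninitialized.
LONG result = DisplayConfigGetDeviceInfo(&sdr_white_level.header);
if (result == ERROR_SUCCESS) {
	data->sdrWhiteLevelInNits = (float)sdr_white_level.SDRWhiteLevel / 1000 * 80;
} else {
	data->sdrWhiteLevelInNits = 240.0f; // Default Windows SDR white level in nits.
	ERR_PRINT(vformat("Failed to read SDR white level for monitor (error %d), falling back to 240 nits.", (int)result));
}
```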
I spent some time prototyping different approaches to an HDR settings menu that might exist in a Godot game using this PR.

HDR.Settings.Menu.DEBUG.2025-03-28.16-53-02.mp4

I found that it was easy to implement most of the menu styles, except that a number of the styles require one modification to this PR: separate

Note: I used the term "Brightness" instead of "reference luminance" because this is the correct player-facing term. In my opinion, it does not make sense for a game to present the term "reference luminance" to the player when all it really means is just the overall brightness of the game when using HDR output. I am indifferent on which term is used by Godot scripting/project settings.

Style A - Simple
Style B - Advanced
Style C - Ignore Screen Lum.
Style D - Screen Luminance Toggle
Style E - Screen Luminance (Saved)
Style F - Automatic Simple
Another Style Idea (not in video)

I have a pretty strong preference towards Style F - Automatic Simple because it makes for a very simple GUI that integrates naturally with other game settings that I could imagine. For this GUI to function correctly, it needs separate
Implements: godotengine/godot-proposals#10817 for Windows.
Overview
This PR enables the ability for Godot to output to HDR capable displays on Windows. This allows Godot to output brighter images than allowed in SDR mode and with more vibrant colors.
Testing/Sample project: https://github.com/DarkKilauea/godot-hdr-output
HDR (higher bit depth image, may not display correctly on all browsers):

SDR:

Sponza (HDR, higher bit depth image, may not display correctly on all browsers):

Sponza (SDR):

Supported Platforms:
Supported Graphics APIs:
Supported HDR Formats:
Features:
Quirks:
Follow up work:
Open Questions:
Usage
Project Settings
- `display/window/hdr/enabled`:
- `rendering/viewport/hdr_2d`:
- `display/window/hdr/enabled` will not require a restart.
- (`0.5` and exposure of `3.0` works well). For 2D content, use colors that exceed 1.0 for a channel.

Runtime
- (`0.5` and exposure of `3.0` works well). For 2D content, use colors that exceed 1.0 for a channel.

Help Needed
Please give this a test, either with the linked sample project or with your own projects, and give feedback. Specifically I'm looking for input on how easy this feature was to use and if you encountered any issues with your particular display, OS, or driver configuration.