[DRAFT] Yet another implementation of XR_META_environment_depth (using global shader uniforms) #273
This is an alternative to the existing PRs implementing XR_META_environment_depth (e.g. #230 and #232), with the notable advantage that I've managed to get a test project successfully doing occlusion (the others don't quite make it all the way there):
com.oculus.vrshell-20250220-145856-0.mp4
(Please ignore my very messy desk :-))
The implementation works by taking the data from Meta's depth API, stashing it in global shader uniforms, and then having a normal Godot shader check against that data. The shader is based on the example shader in Meta's sample project for the depth API.
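To illustrate the approach, here's a minimal sketch of what such an occlusion shader could look like. Note this is an assumption-laden sketch, not the actual shader from this PR: the global uniform names (`META_ENVIRONMENT_DEPTH_TEXTURE` and the per-eye reprojection matrices) are hypothetical placeholders for whatever globals the implementation actually registers.

```glsl
// Hedged sketch of depth-based occlusion in a Godot 4 spatial shader.
// Uniform names below are ASSUMED, not the ones this PR defines.
shader_type spatial;
render_mode unshaded;

// Depth texture from Meta's depth API, one layer per eye.
global uniform highp sampler2DArray META_ENVIRONMENT_DEPTH_TEXTURE;
// Reprojection from world space into the depth camera's clip space, per eye.
global uniform mat4 META_ENVIRONMENT_DEPTH_PROJECTION_VIEW_LEFT;
global uniform mat4 META_ENVIRONMENT_DEPTH_PROJECTION_VIEW_RIGHT;

void fragment() {
	// Reconstruct this fragment's world-space position.
	vec4 world_pos = INV_VIEW_MATRIX * vec4(VERTEX, 1.0);

	// Pick the reprojection matrix for the eye being rendered.
	mat4 reproject = (VIEW_INDEX == VIEW_MONO_LEFT)
		? META_ENVIRONMENT_DEPTH_PROJECTION_VIEW_LEFT
		: META_ENVIRONMENT_DEPTH_PROJECTION_VIEW_RIGHT;

	// Project into the depth camera's clip space and derive UVs.
	vec4 depth_clip = reproject * world_pos;
	vec3 depth_ndc = depth_clip.xyz / depth_clip.w;
	vec2 depth_uv = depth_ndc.xy * 0.5 + 0.5;

	// Compare the fragment's depth against the real-world depth sample;
	// discard fragments that sit behind real-world geometry.
	float env_depth = texture(META_ENVIRONMENT_DEPTH_TEXTURE,
		vec3(depth_uv, float(VIEW_INDEX))).r;
	float frag_depth = depth_ndc.z * 0.5 + 0.5;
	if (frag_depth > env_depth) {
		discard;
	}

	ALBEDO = vec3(1.0);
}
```

The key design point is that, because the depth data lives in global shader uniforms, any material in the scene can opt into occlusion just by sampling them, with no per-material plumbing from the extension code.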
However:
(Would potentially fix #133)