# Gamepads in inline mode #1347
I believe the specification does not prevent browsers from supporting this; this is an issue for implementors.
@Manishearth - The specification doesn't currently allow gamepads in inline mode based on this section:
This is a proposal to relax the restrictions for "inline" mode only in the event the developer requests actual controllers, but also to trigger a permissions check like the "immersive-*" modes do, to preserve the privacy that is the reason inline doesn't currently support it.
The specification supports accessing the gamepads directly via `navigator.getGamepads()`.

As for the ability to get input sources: input sources are only useful if you have reference spaces to compare them against, and no current XR system supports inline with floor/local/etc. spaces. I would be open to relaxing that constraint when the feature is requested, though I'd want signals from implementers that they're interested in doing this.
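For readers less familiar with the Gamepad API path being referred to here, a minimal sketch of that direct-access route follows; whether an XR controller ever shows up in this list at all is left to the browser, which is the crux of this thread.

```js
// Minimal sketch of the plain Gamepad API path mentioned above.
// Whether XR controllers appear here at all is an implementor decision.
window.addEventListener('gamepadconnected', (event) => {
  console.log('gamepad connected:', event.gamepad.id);
});

function pollGamepads() {
  for (const pad of navigator.getGamepads()) {
    if (!pad) continue;                       // unused slots are null
    const pressed = pad.buttons.map((b) => b.pressed);
    const axes = pad.axes;                    // thumbstick / touchpad values
    // ...feed `pressed` and `axes` into whatever input handling the page has...
  }
  requestAnimationFrame(pollGamepads);
}
requestAnimationFrame(pollGamepads);
```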
So even in "inline" mode, this would be allowed by the spec? My understanding is that because they (the existing controllers) are used as "pointers" in "inline" mode, they can't really be used as standard gamepads, since they effectively act like a mouse pointer. I assume this is why both modes currently behave the way they do.
Not necessarily; did you see what I am using the game inputs for? It is a flat 2D canvas, so I don't need anything other than the inputs. I do eventually want access to the pointing data and where a click occurs, because of games like X-COM or point-and-click games like King's Quest and Return to Zork. That code is something I'll probably add to my VR framework next week, but for a large number of games where a mouse isn't used, the plain gamepad data is all I need.
A little confused: isn't that what this is, a feature request to allow relaxing the constraint only if the developer needs it? Please note, this decision impacts me very little now; I've already learned enough WebGL to be dangerous 😀 and built my own mini XR framework to deal with all the issues that I ran into. I am requesting this to make the way a lot smoother for all the other non-GL web developers to get their 2D content into XR worlds easily. Gamepad data is critical in many cases, and right now the roadblocks to getting it are massive when, imho, it could be trivially simplified.
What do you mean by "the inputs"? That's what I'm trying to get at here.
No, I am saying that I am open to relaxing the current constraint that code cannot request the gamepads feature in an inline session. When I say "feature is requested" I'm talking about WebXR features, not this feature request.
@Manishearth - So in my case I'm using the Oculus/Meta Quest 1 & 2. (I will probably support other headsets later, but I only have a Quest 1/2, so I can only test that it works with them at this point.)
The inputs being buttons 0-6 and the axis controls on both controllers. I don't need the default pointer/click mouse emulation for the browser; I literally need the controllers exactly as I get them in "immersive-vr" mode, and I need to use all the inputs to remap to controls in the game. If you have an Oculus you can check my site, and you can see that 3 games now work in VR, as I finished my remapping code so I can bring games in much more easily.

The first couple of games are simple side scrollers, so both controllers basically act the same, meaning you only have to use one controller to play the game: the axis controls movement, and the buttons control jumping/firing/etc. However, with Magic Carpet for instance (still working on making the controls feel good, which is why it isn't listed as VR yet), the left controller's axis is used to point/turn the carpet, the right controller's axis is mapped to moving forward/backwards and strafing side to side, the trigger buttons are mapped identically, but the A/B/X/Y buttons all do something different.

I basically need the VR controllers as-is, just like I get them in "immersive-vr" mode, with the gamepad object that I can query for the buttons/axes. I will also eventually need the pose data, as I believe that is needed to figure out where the pointer is, because some games like Return to Zork/King's Quest are mouse games where you point and click, and I'd like to make the pointing controller work effectively as a mouse... Does that help you understand?
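For context, a rough sketch of the immersive-vr path described above, assuming a session and render loop already exist; `remapToGameControls` is a hypothetical placeholder for the poster's own remapping code, not an actual API.

```js
// Sketch only: assumes `session` is an active immersive-vr XRSession with
// rendering already wired up. `remapToGameControls` is a hypothetical helper.
session.requestAnimationFrame(function onXRFrame(time, frame) {
  session.requestAnimationFrame(onXRFrame);

  for (const source of session.inputSources) {
    if (!source.gamepad) continue;            // gamepad data is optional per input source
    const hand = source.handedness;           // 'left' or 'right'
    const axes = source.gamepad.axes;         // thumbstick axes
    const buttons = source.gamepad.buttons.map((b) => b.pressed);
    remapToGameControls(hand, axes, buttons); // e.g. left stick turns, right stick moves/strafes
  }

  // Pose data for pointer-style (mouse-like) games would come from something like:
  // const pose = frame.getPose(source.targetRaySpace, referenceSpace);
});
```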
These are all exposed through the Gamepad API directly, yes? It sounds to me like the problem is that the profile mappings aren't always the same (implementors are free to do this). The profile string exposed by XRInputSource does seem useful here. I think it would be reasonable to allow the gamepad feature to be requested, permissioned, in inline contexts. I would like to have signals from implementors that they want to allow this, though.
In "inline" mode, I assume they cannot currently be exposed as gamepads in `navigator.getGamepads()`. This feature request is to allow me, the web developer, to tell the WebXR system (& browser specifically) that I do not want the default browser behavior in "inline" mode.
Not really a profile issue (afaik); the issue is that the defaults are the current expected behavior, and changing them would break any existing sites.
👍
I'm talking about `navigator.getGamepads()` (and we've already had discussions at the spec level that this is intended behavior: XR input sources can be regular Gamepads too).
I appreciate the time you are taking to understand the issue; hopefully this will finally clarify it. 😀
Oh, I'm very aware of that API; I use it elsewhere. That was where I first started my bunny trail into this whole mess, and I ended up having to make my own very minimalist XR framework just to get access to the gamepad data. If you see my opening comments on this issue, I mention I tried using that API. Here is the breakdown for the current Oculus browser on a Quest 1 & 2:
I assume the gamepad axes and buttons are NOT exposed in anything outside of a WebXR "immersive-vr" session.
Okay, I see. Yeah, that is not something the spec can fix; that's an implementor concern (see immersive-web/webxr-gamepads-module#19 for some prior discussion), and it's something we want to leave up to the implementor. From the point of view of the spec, exposing these controllers through the Gamepad API is already allowed. If we do relax the requirement on requesting the gamepad feature in inline sessions, whether to actually expose the data there would still be up to the implementor.
Yep, I realize that -- the spec is optional, but this at least gives a place for this to actually be codified as a possibility, just like other optional features.
What I'm trying to say is that it is already a possibility via the Gamepad API. (In general I'm reluctant to add something to the spec unless implementors give a signal they are interested in supporting it, because otherwise we have no basis on which to spec it.)
Ok, let's talk specifics, because one of us is confused as to how this actually works.

### Terms

First, let's define some terms to keep things easier to discuss (at least for me), since the terms have been used somewhat interchangeably up to now.
### Objective

What is the flow (or potential flow) for a browser to allow the XRController to be used as if it were a pure gamepad, where all buttons and all axes are reported to either of the gamepad APIs, the XRController has no default "browser" actions, and the XRController is no longer emulating a mouse on a standard web site?

### Current Behavior

Unless you activate an XR "immersive-vr" session, the controller gamepad data is not exposed at all; the controllers only act as a pointer/mouse emulator for the browser.

### Issues
### Summary

With the current Gamepad API, there does NOT seem to be a good way for either the user or the developer to signal intent in a way that both meets the rules to activate the device and makes sure we don't break any existing sites. We can't assume that me simply tying into the Gamepad API is enough of a signal. So technically, under the current existing APIs, there appears to be NO good way for this class of WebXR device to be used as a general gamepad on a normal 2D site using any of the existing API calls.

### Proposed Solution(s)
Ah, I think I understand now. There's an implicit assumption in your post that the reason implementations do not currently expose XR gamepads via `navigator.getGamepads()` is that there is no good way to signal intent; it's not clear to me that this is actually the main blocker for implementors.
I was assuming these would only be accessible via the WebXR side of things.
No, it can be done upstream too; we have done this in the past. It's just a matter of where things are supposed to belong. Advice on how to handle XR gamepad controllers using the regular Gamepad API could live in either spec.
I don't think this is necessary, or that it solves the problem, since I think what you are actually looking for is a way to hint to the UA that "no, really, please include XR controllers in the gamepads API even if they're being used for regular input too". That does solve the ambiguity problem, though as I mentioned before it's unclear if that's the main blocker for implementors supporting this.
This bit is unlikely to ever happen; I do not think we will ever let inline sessions take full control of XRControllers. We can let them accept button presses but it would security-wise be rather bad to let them completely take over.
Yeah, this is my preferred solution as mentioned before; however I am still reluctant to go down this path without a signal from implementors.

/agenda Talk about gamepads and also local/viewer spaces in inline sessions
In the case of the non-WebXR experience, the Quest browser doesn't have access to the controllers. Instead, the system service that draws the environment is the one that gets their input.
So I think the WebXR-side design for this is: allow the gamepad feature to be requested for inline sessions, gate it behind the same kind of permissions check used for immersive sessions, and only then populate the Gamepad objects on the session's input sources.
However, we are getting into tricky design questions and we still don't have UAs who wish to expose this, and I'm wary of doing the above without an actual implementor committing to a mitigation strategy: I don't want to build castles in the sky. So my preference is to potentially hone this design but not add it to the spec until an implementor registers interest. |
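Purely as an illustration of the design sketched in the comment above (nothing here is in the spec today, and `'gamepad'` is not currently a valid feature descriptor for inline sessions), the page-side request might look roughly like this:

```js
// Hypothetical: illustrates the proposed design only; inline sessions cannot
// request a 'gamepad' feature today, and no permissions flow exists for it.
// Assumes this runs inside some async setup function.
const session = await navigator.xr.requestSession('inline', {
  optionalFeatures: ['gamepad'],   // would trigger a permission check like immersive modes
});

session.requestAnimationFrame(function onFrame(time, frame) {
  session.requestAnimationFrame(onFrame);
  for (const source of session.inputSources) {
    if (source.gamepad) {
      // Under the proposal, buttons/axes would be populated here even inline,
      // instead of the controllers only acting as a pointer.
    }
  }
});
```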
@NathanaelA, since we're contemplating deprecating inline sessions, is it ok if I close this?
So I've been working on a site that basically is a flat 2d canvas that I'm showing in VR.
With an Oculus you can see it here (specifically a headset w/ controllers is needed) ... https://vr.gamers.center
The issue is that as a traditional web developer, I don't want to have to become a WebGL expert to access the controller gamepads.
I first thought let's try fullscreen mode. Looks great, but no controllers or gamepads from the VR controllers are exposed, either via XRInputSources or via the normal browser `navigator.getGamepads()` system. Then I wired up the WebGL code for "inline" mode; it also looked great, but again the exact same issue with no access to the VR controller gamepads. Finally I tried to use "immersive-vr" with the "dom-overlay" mode, and of course it isn't supported on the Oculus Browser.

As a developer who just wanted to get my simple 2D canvas content into VR, both of these (fullscreen or inline) made it easy to see my content. BUT I also need to use the gamepad parts of the controllers, and the current spec leaves no easy options. Long story short, to get around this I had to code about 400 lines of JS to activate `immersive-vr` mode, create a QuadLayer, and then pipe the canvas contents into the QuadLayer every frame, all so I could access the gamepads. That is a LOT of extra work that the Oculus has to do every frame (and the developer has to program) simply to access the gamepads.

I propose allowing a new `"requiredFeatures": ["gamepad"]` option for `inline` mode, which would do the same permissions check to verify access. Then it would expose the gamepad on the controllers in inline mode just like it does in immersive-vr. Simple change, but it would allow standard 2D web content to easily use the controllers and web devs to easily get access.
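For reference, a heavily condensed sketch of the workaround described above (an immersive-vr session plus a quad layer fed from a 2D canvas), assuming the WebXR Layers module is available; the canvas names `glCanvas` and `canvas2d`, the sizes, and the transform are illustrative assumptions, not taken from the poster's actual code.

```js
// Condensed sketch of the immersive-vr + QuadLayer workaround. `glCanvas` and
// `canvas2d` are assumed to be an XR-compatible WebGL canvas and the flat 2D
// game canvas. Must be called from a user gesture (e.g. a button click).
async function enterVR() {
  const gl = glCanvas.getContext('webgl2', { xrCompatible: true });
  const session = await navigator.xr.requestSession('immersive-vr', {
    requiredFeatures: ['layers'],
  });
  const space = await session.requestReferenceSpace('local');
  const binding = new XRWebGLBinding(session, gl);

  const quad = binding.createQuadLayer({
    space,
    viewPixelWidth: 1280,
    viewPixelHeight: 720,
    width: 1.6,                                // meters
    height: 0.9,
    transform: new XRRigidTransform({ x: 0, y: 1.3, z: -2 }),
  });
  session.updateRenderState({ layers: [quad] });

  session.requestAnimationFrame(function onFrame(time, frame) {
    session.requestAnimationFrame(onFrame);

    // Copy the flat 2D canvas into the quad layer's texture every frame.
    const sub = binding.getSubImage(quad, frame);
    gl.bindTexture(gl.TEXTURE_2D, sub.colorTexture);
    gl.texSubImage2D(gl.TEXTURE_2D, 0, 0, 0, gl.RGBA, gl.UNSIGNED_BYTE, canvas2d);

    // Only because of this detour do the controller gamepads become available:
    for (const source of session.inputSources) {
      if (source.gamepad) {
        // source.gamepad.buttons / .axes -> remap to the game's controls
      }
    }
  });
}
```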