OculusXR Quest Hands to OpenXR Hands migration
Hey guys - I'm currently trying to migrate my Normcore app, which once relied on sharing Quest hands over the OVRPlugin, to XRHands with the OpenXR plugin in Unity VR. Has anyone else performed this migration? I've been trying to modify my shared avatar so it uses the XRHands sample, and I'm getting a bit tripped up by how the model and rig interact with the hand model. Currently I'm in a situation where my hand model obviously has all the wrong transforms - i.e. avatar nightmare fuel... Does anyone have advice on how I could successfully switch over to the XRHands rig and make that work?
Hi @Mark S. I'm kapa.ai, your friendly support bot. I will try to answer your question by searching through the knowledge base. :wave:
----------------------
I'm sorry, but the provided knowledge sources do not contain specific information about migrating from OculusXR Quest Hands (OVRPlugin) to OpenXR XRHands in the context of Normcore, nor do they cover best practices for updating hand models, rigs, or synchronizing hand tracking data using XRHands with Normcore.
The documentation does cover how to set up avatars and synchronize head and hand transforms using Normcore's built-in avatar system, and it mentions that you can customize the avatar prefab and the transforms that are tracked and synchronized. However, it does not provide details on integrating XRHands or handling the differences in hand rigging between OVRPlugin and OpenXR XRHands. If you need to synchronize custom hand data, you may want to look into the Synchronizing Custom Data guide, but it does not specifically address XRHands or hand rig migration.
If you need more detailed guidance on this migration, you may need to consult Unity's XRHands and OpenXR documentation, or seek advice from the Normcore community directly, as the official docs do not currently cover this scenario.
----------------------
Unfortunately we’re still using the ancient APIs on our own titles. I’ve been dreading having to do this 🙁
Yeah, I avoided it forever as well--just now facing the music.
The experience of migrating Unity apps across versions at least has the side benefit of showing you the limits of what AI can do for you, and therefore underscoring the value of humans in the loop. Fat chance AI can help with this.

So far I've actually gotten through most of it. The last part where I'm tripping up is the realtime transforms on the hands and the deserialization method you need for your networked data to avoid turning the hand meshes into chewing gum. I'm just about to start looking at the code in XRHandSkeletonDriver and XRHandTrackingEvents to see if I can get my hand pose sync and skeleton serializer to play nicely with them. So far it looks like my hands got caught in a blender...

Fixed it. The skinned mesh renderer does not order the joints it is rigged with in the same order as the bones that show up under XRHand. If you reorder the joints in the bones array, the hand is no longer scrambled. The new ordering is: 0, 1, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 2, 20, 21, 22, 23, 24, 25, 3, 4, 5, 6, 7, 8, 9. You can figure this out by looking at the names of the transforms in the SkinnedMeshRenderer's bones array, which lets you enumerate them. I remapped them during deserialization and my hand looks clean again, so I'm leaving this hint here in case it helps when you get around to it. This ordering seems most impactful in the initialization step. Note that the ordering on the right hand is slightly different, so you have to do separate reorderings for the left and right hands.
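A minimal sketch of that remap, assuming the joint poses arrive over the network in XRHand joint order and need to be reordered to match the SkinnedMeshRenderer's bones array. The class and method names here are hypothetical, the index table is the left-hand ordering quoted above, and whether the map runs in this direction (or its inverse), and whether you write local or world poses, depends on how your serializer packs the joints:

```csharp
using UnityEngine;

public class HandJointRemapper : MonoBehaviour
{
    // Left-hand index table from the hint above: bones[k] is driven by
    // receivedPoses[LeftHandBoneOrder[k]]. Verify the direction against the
    // transform names in skinnedMeshRenderer.bones; the right hand needs its
    // own table.
    static readonly int[] LeftHandBoneOrder =
    {
        0, 1, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 2,
        20, 21, 22, 23, 24, 25, 3, 4, 5, 6, 7, 8, 9
    };

    [SerializeField] private SkinnedMeshRenderer hand;

    // Called from the deserialization path with the joint poses as they
    // arrived over the network, in XRHand joint order.
    public void ApplyReceivedPoses(Pose[] receivedPoses)
    {
        Transform[] bones = hand.bones;
        int count = Mathf.Min(bones.Length, LeftHandBoneOrder.Length);
        for (int k = 0; k < count; k++)
        {
            Pose pose = receivedPoses[LeftHandBoneOrder[k]];
            bones[k].localPosition = pose.position;
            bones[k].localRotation = pose.rotation;
        }
    }
}
```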
dang. nice work figuring that out
Thank you. Looks like I've almost finished the job... one issue left, and I'm having a hard time understanding why it's happening.
My first avatar into the scene shares hand pose, position, and rotation correctly, so you see the avatar with hands from the second runtime. The second avatar into the scene seems to only share hand pose and rotation, while the position gets locked to the floor directly below the head. Its hands render correctly locally, but they're stuck on the floor in the first runtime.
Any hints about what could be going wrong with this type of asymmetric failure where it works correctly on one side but not the other, and only position is failing?
you might be running some code on remote avatars that's overriding the position. add a check for isOwnedLocallyInHierarchy to make sure you're only running code locally
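Something like this, as a minimal sketch: a MonoBehaviour next to the avatar's RealtimeView that only pushes locally tracked hand data when the view is owned locally (HandPoseSync and DriveHandFromLocalTracking are hypothetical names, not Normcore API):

```csharp
using Normal.Realtime;
using UnityEngine;

public class HandPoseSync : MonoBehaviour
{
    private RealtimeView _realtimeView;

    private void Awake()
    {
        _realtimeView = GetComponent<RealtimeView>();
    }

    private void Update()
    {
        // Only the owning client should drive the hand from local tracking;
        // remote copies just render whatever arrives over the network.
        if (_realtimeView == null || !_realtimeView.isOwnedLocallyInHierarchy)
            return;

        DriveHandFromLocalTracking();
    }

    private void DriveHandFromLocalTracking()
    {
        // ... apply XRHands joint poses to the local rig here ...
    }
}
```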
Yep, I think that was it. I think my serializer was forcing the root transform to Vector3.zero. I added a parent and set that transform instead, and, uh... it seems to work now.
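For anyone following along, a rough sketch of that workaround under my assumptions: keep the serialized hand root at local zero and move a wrapper parent instead, so the synced position isn't clobbered when the serializer resets the root (the class and field names are hypothetical):

```csharp
using UnityEngine;

public class HandRootWrapper : MonoBehaviour
{
    [SerializeField] private Transform handRoot;     // root the serializer forces to Vector3.zero
    [SerializeField] private Transform trackedWrist; // wrist pose from local hand tracking

    private void LateUpdate()
    {
        // Drive the parent (this transform); the child handRoot stays at local zero
        // so the serializer's reset no longer pins the hand to the floor.
        transform.SetPositionAndRotation(trackedWrist.position, trackedWrist.rotation);
        handRoot.localPosition = Vector3.zero;
        handRoot.localRotation = Quaternion.identity;
    }
}
```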
Which is actually a pretty significant milestone, at least for me... XRRig conversion done!
If there's any encouragement I can offer about that task: it actually went a lot faster than I expected (just a few days), so it may not be as bad as you think before you start it.
Hi @Mark S. I'm interested in how you implemented hand tracking with Normcore. As I was asking here https://discord.com/channels/393839515074297858/1407614460725821501/1408307596825464893, I'm struggling with it: it seems like the skeleton data isn't being converted the way it should be. I've used that GitHub project as a base and adapted it to the Oculus and Normcore versions compatible with 2022, but something is scrambling the entire rig, and I'm not sure if it's related to the old code or to OpenXR Hands.