OVR Hand Tracking implementation for VR
Hi. I've been trying to integrate Normcore into a VR app that uses the default OVR rigs, with locomotion included. I'm using Unity 2022.3.4. The only requirement is that players can see the other players' hand tracking and their positions in the scene; no interactions are needed. I've been struggling to implement the hand tracking since there isn't much information about it, but I've seen examples and I want to know what I'm missing. Thanks in advance.
4 Replies
Hi @H.N.K. Master. I'm kapa.ai, your friendly support bot. I will try to answer your question by searching through the knowledge base. :wave:
----------------------
To implement hand tracking with Normcore in a VR app using OVR Rigs (such as those provided by Oculus Integration), and to synchronize hand positions across players, you should use Normcore’s avatar system, which is designed to handle head and hand tracking out of the box.
Key Steps
1. Use RealtimeAvatarManager and RealtimeAvatar
- Add a RealtimeAvatarManager component to your scene, on the same GameObject as your Realtime component.
- Set the localAvatarPrefab property to an avatar prefab that includes a RealtimeAvatar component on its root. This prefab should have references to the transforms for the head and both hands (a sketch of assigning the prefab from code follows after this list).
- When a player joins, Normcore will instantiate this prefab for each user and synchronize the head and hand positions automatically across the network. No custom networking code is needed for basic position/rotation sync.
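Here is a minimal sketch of assigning that prefab from code rather than the inspector. The AvatarSetup class and _ovrHandAvatarPrefab field are placeholder names; only localAvatarPrefab is the actual RealtimeAvatarManager property described above.

```csharp
using Normal.Realtime;
using UnityEngine;

// Hypothetical setup helper: assigns the avatar prefab before connecting.
// In most projects this is simply set on RealtimeAvatarManager in the inspector.
public class AvatarSetup : MonoBehaviour {
    [SerializeField] private GameObject _ovrHandAvatarPrefab; // prefab with a RealtimeAvatar on its root

    private void Awake() {
        // RealtimeAvatarManager lives on the same GameObject as the Realtime component.
        var avatarManager = GetComponent<RealtimeAvatarManager>();
        avatarManager.localAvatarPrefab = _ovrHandAvatarPrefab;
    }
}
```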
2. Integrate with OVR Hand Tracking
- In your avatar prefab, set up the hand objects to be driven by the OVR hand tracking system locally.
- The RealtimeAvatar component will take the local hand transforms and synchronize them to other clients (see the sketch after this list).
- Other clients will see the hand positions and rotations as sent over the network.
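If you want to drive the avatar's hand objects from the OVR rig yourself, one possible sketch is below. LocalHandDriver and _trackedHandAnchor are hypothetical names; the only Normcore API it relies on is RealtimeView.isOwnedLocallyInHierarchy, so that only the owning client writes the pose.

```csharp
using Normal.Realtime;
using UnityEngine;

// Hypothetical local driver: copies an OVR-tracked hand anchor onto this avatar hand
// transform on the owning client only. RealtimeAvatar then syncs the transform as usual.
public class LocalHandDriver : MonoBehaviour {
    [SerializeField] private Transform _trackedHandAnchor; // e.g. LeftHandAnchor on the OVR camera rig
    private RealtimeView _realtimeView;

    private void Awake() {
        _realtimeView = GetComponentInParent<RealtimeView>();
    }

    private void Update() {
        // Remote copies of the avatar receive their pose over the network instead.
        if (_realtimeView == null || !_realtimeView.isOwnedLocallyInHierarchy) return;
        transform.SetPositionAndRotation(_trackedHandAnchor.position, _trackedHandAnchor.rotation);
    }
}
```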
3. No Need for Custom Hand Sync Code
- As long as your avatar prefab is set up correctly, Normcore will handle the synchronization of hand positions and rotations for you. You do not need to write custom code to send hand tracking data.
4. Meta Avatars (Optional)
- If you want to use Meta Avatars (which support hand tracking and more advanced features), you can use the "Meta Avatar Player" prefab provided by the Normcore Meta XR package. This prefab is already set up for head, hand, and voice sync, and will automatically use the user's system preferences for hand tracking if available. See the Meta Avatars setup guide for details.
Example Reference
The RealtimeAvatar component is responsible for synchronizing the head and hand positions and active states. When it's first instantiated, RealtimeAvatar is given a set of transforms to track for the root of the prefab, the head, and the hands from RealtimeAvatarManager's localPlayer field, which is set in the editor inspector (see the XR Avatars reference).
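If you'd rather wire those references from code than the inspector, a minimal sketch (assuming the root/head/leftHand/rightHand field names on localPlayer match your Normcore version) could look like this. LocalPlayerWiring and the anchor fields are placeholder names.

```csharp
using Normal.Realtime;
using UnityEngine;

// Hypothetical wiring helper: points the localPlayer transforms at the OVR rig's anchors
// before connecting. Normally these are simply dragged in via the inspector.
public class LocalPlayerWiring : MonoBehaviour {
    [SerializeField] private Transform _centerEyeAnchor;
    [SerializeField] private Transform _leftHandAnchor;
    [SerializeField] private Transform _rightHandAnchor;

    private void Awake() {
        var avatarManager = GetComponent<RealtimeAvatarManager>();
        avatarManager.localPlayer.root      = transform;        // XR rig root
        avatarManager.localPlayer.head      = _centerEyeAnchor;
        avatarManager.localPlayer.leftHand  = _leftHandAnchor;
        avatarManager.localPlayer.rightHand = _rightHandAnchor;
    }
}
```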
All avatar customization settings are automatically imported from the user's system preferences. Head tracking, hand tracking, finger tracking, lip sync, and voice chat using Normcore's RealtimeAvatarVoice all work automatically (see the Meta Avatars setup guide).
What You Need to Check
- Your avatar prefab must have a RealtimeAvatar component and references to the head and hand transforms.
- The hand transforms should be updated locally by OVR Hand Tracking.
- The prefab must be in a Resources folder and referenced by the RealtimeAvatarManager.
- You do not need to manually send hand tracking data; Normcore handles this.
If you follow these steps and use the provided prefabs/components, you should see other players' hand positions and head positions synchronized in your scene. If you need more advanced hand data (like finger curls), you may need to extend the avatar model and sync custom data, but for basic hand position/rotation, the built-in system is sufficient.
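For the finger-curl case, one hedged approach (similar in spirit to community projects like Quest-hands-for-Normcore, not a drop-in solution) is to pack the OVRSkeleton bone rotations into a byte array on a custom model and apply them to the remote hand's bones. HandBonesModel, HandBonesSync, _localSkeleton, and _remoteBones are illustrative names; verify the model/RealtimeProperty syntax against your Normcore version.

```csharp
using System;
using Normal.Realtime;
using Normal.Realtime.Serialization;
using UnityEngine;

[RealtimeModel]
public partial class HandBonesModel {
    // Packed local rotations for every tracked bone, 4 floats per bone, sent unreliably.
    [RealtimeProperty(1, false)] private byte[] _boneRotations;
}

// Rough sketch: the owning client writes OVRSkeleton bone rotations into the model,
// remote clients read them back and apply them to their copy of the hand skeleton.
public class HandBonesSync : RealtimeComponent<HandBonesModel> {
    [SerializeField] private OVRSkeleton _localSkeleton; // driven by OVR hand tracking on the owner
    [SerializeField] private Transform[] _remoteBones;   // remote hand bones, same order as OVRSkeleton.Bones

    private void Update() {
        if (model == null) return;

        if (realtimeView.isOwnedLocallyInHierarchy)
            WriteLocalBones();
        else
            ApplyRemoteBones();
    }

    private void WriteLocalBones() {
        if (_localSkeleton == null || !_localSkeleton.IsInitialized || !_localSkeleton.IsDataValid) return;

        var bones = _localSkeleton.Bones;
        var data  = new byte[bones.Count * 4 * sizeof(float)];
        for (int i = 0; i < bones.Count; i++) {
            // Local rotations keep the data independent of where each avatar sits in the scene.
            Quaternion q = bones[i].Transform.localRotation;
            Buffer.BlockCopy(new[] { q.x, q.y, q.z, q.w }, 0, data, i * 4 * sizeof(float), 4 * sizeof(float));
        }
        model.boneRotations = data; // consider throttling; this allocates and sends a new array every frame
    }

    private void ApplyRemoteBones() {
        byte[] data = model.boneRotations;
        if (data == null || data.Length == 0) return;

        int boneCount = data.Length / (4 * sizeof(float));
        var floats = new float[boneCount * 4];
        Buffer.BlockCopy(data, 0, floats, 0, data.Length);

        for (int i = 0; i < boneCount && i < _remoteBones.Length; i++) {
            int f = i * 4;
            _remoteBones[i].localRotation = new Quaternion(floats[f], floats[f + 1], floats[f + 2], floats[f + 3]);
        }
    }
}
```

Whatever the exact packing, both clients must index the bones in the same order and apply local (not world) rotations; a mismatch there typically shows up as a badly deformed hand.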
If you have a specific issue (e.g., hands not appearing, transforms not syncing), please provide more details and I can help troubleshoot further.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
Kapa.ai is still learning and improving, please let me know how I did by reacting below.
Maybe I phrased my question wrong; my issue is with finger tracking. I've already managed to connect the base hands as the tutorial describes. After some research, I managed to adapt some code from https://github.com/absurd-joy/Quest-hands-for-Normcore, and it at least syncs, but the hand itself (using OpenXR Hands) deforms badly, as if the OVRSkeleton isn't getting synchronized with the correct bones. I've run out of ideas for how to implement it.
you should ask this guy how he did it https://discord.com/channels/393839515074297858/1404008447581753426/1405515293127868470