Meta's Movement SDK with Normcore
Can somebody help me set up full-body networked custom avatars with the Movement SDK and Normcore?
The Movement SDK works locally, but I can't get it to work with Normcore.
I found only this paragraph on the Movement SDK's GitHub:
Hello, we can provide some advice for setting up Movement SDK for multiplayer. We suggest networking the final pose after retargeting and constraints, and optimizing the serialized pose before sending it. Additionally, consider implementing interpolation when applying the received pose to the receiving client. You can remove all other components except for the one responsible for deserializing the data on the player's side and simply update the character with the received pose. The pose data should be compressed and sent at a reasonable update rate, using networking optimization techniques. An article that explains some techniques can be found here: https://gafferongames.com/post/snapshot_compression/. For information about how to use Photon Fusion 2, please refer to their tutorials: https://doc.photonengine.com/fusion/current/tutorials/host-mode-basics/overview. To summarize, the host performs all logic to make the character move, while the client has only the skinned mesh that is driven by the result of the host's work.
But I don't know how to get the final pose data or how to send it through Normcore (I guess via custom RealtimeComponents).
Hi @Asghamer. I'm kapa.ai, your friendly support bot. I will try to answer your question by searching through the knowledge base. :wave:
----------------------
To set up full-body networked custom avatars with the Movement SDK and Normcore, you would need to create a custom RealtimeComponent subclass that synchronizes the final pose data. However, the specific details of how to get the final pose data from the Movement SDK are not provided in the knowledge sources.
Here's a general guide on how to set up networked meta avatars with Normcore:
1. Install the Normcore Meta XR package. Make sure you've installed Normcore and have configured all of Meta's platform prerequisites. Once Normcore is set up in your project, you'll be able to install the Normcore Meta XR UPM package. source (https://normcore.io/documentation/guides/meta/meta-avatars-setup)
2. Create a Meta Avatar prefab from scratch. Create an empty Game Object in the scene with a MetaAvatar component on it. MetaAvatar includes a button that will create all of the components needed by the Meta Avatar SDK. source (https://normcore.io/documentation/guides/meta/meta-avatars-customization)
3. Set up Meta Avatars. Import the "Meta Avatars" sample that comes with the Normcore Meta XR UPM package. Add the "AvatarSDKManagerMeta" prefab to the scene. Add your App Key to the Realtime component and hit Play. source (https://normcore.io/documentation/guides/meta/meta-avatars-setup#set-up-meta-avatars)
However, please note that Normcore uses metadata on the prefab in order to connect RealtimeComponents on all clients, so you cannot add RealtimeComponents at runtime. source (https://normcore.io/documentation/realtime/common-questions#can-i-add-realtimecomponents-at-runtime)
For more advanced configuration and usage, you can check out the official Meta Avatar SDK documentation. source (https://normcore.io/documentation/guides/meta/meta-avatars-customization)
If you're still encountering issues after checking these points, I recommend reaching out to the Normcore community on their Discord for further assistance. source (https://normcore.io/documentation/essentials/common-questions#does-normcore-support-x-y-z)
Please note that the specific details on how to get the final pose data from the Movement SDK and how to send it through Normcore are not provided in the knowledge sources. You might need to refer to the Movement SDK's documentation or ask for help in their community.
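As a general starting point for the "get the final pose data" part, here is an untested sketch. It assumes the Movement SDK has already written the retargeted, constrained pose to the character's bone Transforms by the time LateUpdate runs; the `PoseCapture` name and the execution-order assumption are hypothetical, so check the Movement SDK's own update timing:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: samples the character's skeleton after retargeting and
// constraints have written the final pose to the bone Transforms.
public class PoseCapture : MonoBehaviour
{
    [SerializeField] private Transform root;                   // skeleton root of the custom character
    private readonly List<Transform> _bones = new List<Transform>();

    private void Awake()
    {
        // Cache the bone hierarchy once; the order must be identical on every client.
        root.GetComponentsInChildren(_bones);
        _bones.Remove(root);                                    // root is synced separately in world space
    }

    // Called by the networking component on the owning client to read this frame's final pose.
    public void ReadPose(List<Quaternion> localRotations, out Vector3 rootPosition, out Quaternion rootRotation)
    {
        rootPosition = root.position;
        rootRotation = root.rotation;
        localRotations.Clear();
        foreach (var bone in _bones)
            localRotations.Add(bone.localRotation);
    }

    // Called on remote clients to apply a received pose to the skinned mesh.
    public void ApplyPose(IReadOnlyList<Quaternion> localRotations, Vector3 rootPosition, Quaternion rootRotation)
    {
        root.SetPositionAndRotation(rootPosition, rootRotation);
        for (int i = 0; i < _bones.Count && i < localRotations.Count; i++)
            _bones[i].localRotation = localRotations[i];
    }
}
```

Caching the bone list once and keeping the same ordering on every client matters, because the networked data ends up being a flat list of rotations with no bone names attached.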
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
They are not Meta Avatars.
I need to know how to get the final pose data or if there is any other way to do it.
@Normcore Assistant This is a reply from a Movement SDK developer.
Guide me through the compression, sending, receiving, and decompression parts of the process.
P.S. I am using a custom character, not a Meta Avatar.
Let me check the knowledge base for you. One moment! :books:
----------------------
To compress, send, receive, and decompress the pose data with Normcore, you would create a custom RealtimeComponent subclass that synchronizes the pose. Here's a general guide on how to do it (an illustrative code sketch covering steps 1, 2, and 4 follows the list):
1. Create a RealtimeModel: You need to create a RealtimeModel that represents the pose data you want to synchronize. This could be the position and rotation of all the bone transforms in the skeleton. Each field in the model should be a primitive type that can be easily serialized and transmitted to other clients. source (https://normcore.io/documentation/room/realtimemodel)
2. Create a RealtimeComponent: Create a RealtimeComponent subclass that uses the RealtimeModel you created. This component will be responsible for synchronizing the pose data between the model and the game object. source (https://normcore.io/documentation/realtime/realtimecomponent)
3. Compress and Decompress Data: The pose data should be compressed before sending and decompressed after receiving to save bandwidth. The specific method of compression depends on your needs. You might want to look into quantization, which is a common method for compressing 3D pose data. The article at https://gafferongames.com/post/snapshot_compression/ explains some techniques for snapshot compression.
4. Send and Receive Data: Normcore automatically synchronizes the RealtimeModel across all clients. When you update the model in the RealtimeComponent, Normcore will automatically send the changes to all other clients. When a client receives the changes, Normcore will automatically update the model, and the RealtimeComponent will apply the changes to the game object. source (https://normcore.io/documentation/realtime/realtimecomponent)
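As an illustration of steps 1, 2, and 4, here is a minimal, untested sketch assuming Normcore 2's model compiler. The `BodyPoseModel`, `BodyPoseSync`, and `BonePacker` names are hypothetical, and the ownership check (`realtimeView.isOwnedLocallyInHierarchy`) and supported property types (Vector3, Quaternion, byte[]) should be verified against your Normcore version:

```csharp
using System.Collections.Generic;
using Normal.Realtime;
using Normal.Realtime.Serialization;
using UnityEngine;

// Step 1: a RealtimeModel holding the final pose. Properties are unreliable because
// the pose is streamed continuously and only the latest value matters.
[RealtimeModel]
public partial class BodyPoseModel
{
    [RealtimeProperty(1, false)] private Vector3    _rootPosition;
    [RealtimeProperty(2, false)] private Quaternion _rootRotation;
    [RealtimeProperty(3, false)] private byte[]     _boneData;      // quantized bone rotations
}

// Steps 2 and 4: a RealtimeComponent that writes the local player's final pose into the
// model and applies received poses on remote clients. Normcore handles the transport.
public class BodyPoseSync : RealtimeComponent<BodyPoseModel>
{
    [SerializeField] private PoseCapture _poseCapture;              // hypothetical helper from the earlier sketch
    private readonly List<Quaternion> _rotations = new List<Quaternion>();

    private void LateUpdate()
    {
        if (model == null)
            return;

        if (realtimeView.isOwnedLocallyInHierarchy)
        {
            // Local player: sample the pose after retargeting/constraints and write it to the model.
            _poseCapture.ReadPose(_rotations, out var rootPos, out var rootRot);
            model.rootPosition = rootPos;
            model.rootRotation = rootRot;
            model.boneData     = BonePacker.Pack(_rotations);       // quantization sketch further below
        }
        else if (model.boneData != null)
        {
            // Remote players: unpack the latest received pose and apply it to the skeleton.
            // Interpolating between the last two received poses would smooth the result further.
            BonePacker.Unpack(model.boneData, _rotations);
            _poseCapture.ApplyPose(_rotations, model.rootPosition, model.rootRotation);
        }
    }
}
```

Both components need to live on the player prefab that Realtime instantiates, since RealtimeComponents cannot be added at runtime (see the earlier note).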
Please note that this is a general guide and might need to be adjusted based on your specific needs and the details of the Movement SDK. If you encounter any issues, I recommend reaching out to the Normcore community on their Discord for further assistance. source (https://normcore.io/documentation/essentials/common-questions#does-normcore-support-x-y-z)
----------------------
@Normcore Assistant Does Normcore support compression and decompression?
Let me check the knowledge base for you. One moment! :books:
----------------------
Normcore is designed to be as bandwidth and CPU efficient as possible. However, the specific details on whether Normcore supports built-in data compression and decompression are not provided in the knowledge sources. source (https://normcore.io/documentation/room/common-questions#are-model-change-events-more-expensive-than-rpcs)
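If you do need compression, one common approach is to quantize the pose yourself before writing it to the model. Below is a simple, illustrative `BonePacker` (the hypothetical helper referenced in the sketch above) that packs each quaternion component into a signed 16-bit integer, roughly halving the payload; the "smallest three" scheme from the gafferongames article linked earlier would shrink it further:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Untested sketch: quantizes each quaternion component from [-1, 1] to a signed
// 16-bit integer, so each bone costs 8 bytes instead of 16.
public static class BonePacker
{
    public static byte[] Pack(IReadOnlyList<Quaternion> rotations)
    {
        var data = new byte[rotations.Count * 8];
        for (int i = 0; i < rotations.Count; i++)
        {
            Quaternion q = rotations[i];
            WriteShort(data, i * 8 + 0, q.x);
            WriteShort(data, i * 8 + 2, q.y);
            WriteShort(data, i * 8 + 4, q.z);
            WriteShort(data, i * 8 + 6, q.w);
        }
        return data;
    }

    public static void Unpack(byte[] data, List<Quaternion> results)
    {
        results.Clear();
        if (data == null) return;
        for (int offset = 0; offset + 8 <= data.Length; offset += 8)
        {
            var q = new Quaternion(
                ReadShort(data, offset + 0),
                ReadShort(data, offset + 2),
                ReadShort(data, offset + 4),
                ReadShort(data, offset + 6));
            results.Add(q.normalized);          // re-normalize to undo quantization error
        }
    }

    private static void WriteShort(byte[] data, int offset, float component)
    {
        short value = (short)Mathf.RoundToInt(Mathf.Clamp(component, -1f, 1f) * short.MaxValue);
        data[offset]     = (byte)value;
        data[offset + 1] = (byte)(value >> 8);
    }

    private static float ReadShort(byte[] data, int offset)
    {
        short value = (short)(data[offset] | (data[offset + 1] << 8));
        return value / (float)short.MaxValue;
    }
}
```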
For more advanced configuration and usage, you can check out the official Normcore documentation. source (https://normcore.io/documentation/essentials/what-is-normcore)
If you're still encountering issues after checking these points, I recommend reaching out to the Normcore community on their Discord for further assistance. source (https://normcore.io/documentation/essentials/common-questions#does-normcore-support-x-y-z)
----------------------
Normcore wasn't able to handle all the bone data. Implemented it with Fusion 2 successfully.