Hi @xulipa. I'm kapa.ai, your friendly support bot. I will try to answer your question by searching through the knowledge base. :wave:
----------------------
Hmm, I don't know enough to give you a confident answer yet. The provided knowledge sources do not contain information on how to sync walking sounds in NormalVR or Unity. However, in general, you would need to trigger sound effects based on the player's movement or animation states. This could involve using Unity's animation events or checking the player's velocity.
For more specific guidance, I recommend checking out the NormalVR documentation (https://normcore.io/documentation) or Unity's documentation on audio (https://docs.unity3d.com/Manual/Audio.html) and animation events (https://docs.unity3d.com/Manual/AnimationEventsOnImportedClips.html).
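For example, here is a minimal sketch of the animation-event approach (the class, method, and clip names are illustrative): add an animation event at each frame of the walk cycle where a foot lands and point it at a method like this on the same GameObject:

```csharp
using UnityEngine;

// Illustrative sketch: plays a footstep clip whenever the walk animation
// fires an animation event named "OnFootstep". Attach this to the same
// GameObject as the Animator and an AudioSource.
[RequireComponent(typeof(AudioSource))]
public class FootstepAnimationEvents : MonoBehaviour
{
    public AudioClip footstepClip;

    private AudioSource _audioSource;

    private void Awake()
    {
        _audioSource = GetComponent<AudioSource>();
    }

    // Called by animation events placed on the walk/run clips
    // at the frames where each foot touches the ground.
    public void OnFootstep()
    {
        _audioSource.PlayOneShot(footstepClip);
    }
}
```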
----------------------
@Normcore Assistant I want every player to hear walking sounds when someone is moving. Should I, for example, just instantiate a prefab for every step with a sound clip attached to it?
Let me check the knowledge base for you. One moment! :books:
----------------------
While the provided knowledge sources do not specifically address syncing walking sounds, they do provide information on how Normcore handles audio and prefabs.
In Normcore, you can use the RealtimeAvatarVoice component for handling audio. This component works on any realtime prefab as long as the RealtimeView component has an owner. It uses the ownership of the RealtimeView to determine if it should be sending audio for this prefab or receiving audio to play back (source: https://normcore.io/documentation/realtime/xr/voicechat).
However, RealtimeAvatarVoice is primarily designed for voice chat, not sound effects. For walking sounds, you might want to consider a different approach.
One possible approach could be to attach an AudioSource to your player prefab and play a walking sound whenever the player moves. You could then use Normcore's Realtime API to synchronize the player's movement and the playing of the sound across all clients.
Here's a basic example of how you might do this in Unity (note: this is a conceptual sketch that assumes a CharacterController-driven player prefab with an AudioSource attached; the names are illustrative and it may not work as-is):
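```csharp
using UnityEngine;

// Conceptual sketch: plays a footstep on a timer while the locally controlled
// player is moving. Attach to the player prefab next to the CharacterController
// and an AudioSource. If remote copies of the prefab are moved by a
// RealtimeTransform rather than a CharacterController, this particular script
// only produces sound on the controlling client.
[RequireComponent(typeof(CharacterController), typeof(AudioSource))]
public class LocalPlayerFootsteps : MonoBehaviour
{
    public AudioClip footstepClip;
    public float minSpeed = 0.5f;      // metres per second before footsteps start
    public float stepInterval = 0.45f; // seconds between footsteps

    private CharacterController _controller;
    private AudioSource _audioSource;
    private float _stepTimer;

    private void Awake()
    {
        _controller  = GetComponent<CharacterController>();
        _audioSource = GetComponent<AudioSource>();
    }

    private void Update()
    {
        // Ignore vertical movement (jumping/falling) when measuring speed.
        Vector3 horizontalVelocity = _controller.velocity;
        horizontalVelocity.y = 0f;

        _stepTimer -= Time.deltaTime;
        if (_controller.isGrounded && horizontalVelocity.magnitude > minSpeed && _stepTimer <= 0f)
        {
            _audioSource.PlayOneShot(footstepClip);
            _stepTimer = stepInterval;
        }
    }
}
```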
Remember, any prefab used with Normcore must be instantiated with Realtime.Instantiate() to ensure it is instantiated on all clients (source: https://normcore.io/documentation/realtime#prefabs).
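For example (an illustrative sketch; with Normcore's default settings the prefab must live in a Resources folder, and the exact Realtime.Instantiate overloads vary between Normcore versions):

```csharp
using Normal.Realtime;
using UnityEngine;

public class PlayerSpawner : MonoBehaviour
{
    // Reference to the Realtime instance in the scene.
    [SerializeField] private Realtime _realtime;

    private void Awake()
    {
        // Wait until we've connected to the room before instantiating.
        _realtime.didConnectToRoom += DidConnectToRoom;
    }

    private void DidConnectToRoom(Realtime realtime)
    {
        // "Player" is the name of a prefab in a Resources folder.
        // This creates the instance on all clients connected to the room.
        Realtime.Instantiate("Player", Vector3.zero, Quaternion.identity,
                             ownedByClient: true, useInstance: _realtime);
    }
}
```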
For more specific guidance, I recommend checking out the NormalVR documentation (https://normcore.io/documentation) or Unity's documentation on audio (https://docs.unity3d.com/Manual/Audio.html).
----------------------
@Normcore Assistant but just playing a sound on a prefab shouldn't work, right? Because RealtimeTransform doesn't take care of syncing the audio; it only handles position, scale, etc.
Let me check the knowledge base for you. One moment! :books:
----------------------
You're correct. The RealtimeTransform component in Normcore is used to synchronize the position, rotation, and scale of a GameObject, not audio (source: https://normcore.io/documentation/realtime/realtimetransform).
For synchronizing audio like walking sounds across multiple clients, you would need to use a different approach. One possible method could be to use a combination of Realtime.Instantiate() and RealtimeAvatarVoice.
Here's a conceptual example:
1. Create a prefab for the walking sound with an AudioSource component and a RealtimeAvatarVoice component.
2. When a player takes a step, use Realtime.Instantiate() to create an instance of the walking sound prefab. This will ensure that the sound is instantiated on all clients (source: https://normcore.io/documentation/realtime#prefabs).
3. The RealtimeAvatarVoice component on the prefab will handle sending the audio data to all clients (source: https://normcore.io/documentation/realtime/xr/voicechat).
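For instance, here is a minimal sketch of steps 1 and 2 (the class and prefab names are illustrative; it assumes the walking-sound prefab lives in a Resources folder, has a RealtimeView, and has an AudioSource set to Play On Awake with the footstep clip assigned so each client hears the clip locally when the prefab appears; exact Realtime.Instantiate overloads vary between Normcore versions):

```csharp
using System.Collections;
using Normal.Realtime;
using UnityEngine;

// Illustrative sketch: spawns a short-lived, networked "FootstepSound" prefab
// at the player's feet on every step, then cleans it up once the clip is done.
public class FootstepSpawner : MonoBehaviour
{
    [SerializeField] private Realtime _realtime;       // Realtime instance in the scene
    [SerializeField] private float _clipLength = 0.5f; // length of the footstep clip

    // Call this whenever the local player takes a step
    // (e.g. from an animation event or a movement script).
    public void SpawnFootstep()
    {
        // "FootstepSound" is assumed to be a prefab in a Resources folder with a
        // RealtimeView and an AudioSource set to Play On Awake.
        GameObject sound = Realtime.Instantiate("FootstepSound",
                                                transform.position,
                                                Quaternion.identity,
                                                ownedByClient: true,
                                                useInstance: _realtime);
        StartCoroutine(DestroyAfter(sound, _clipLength));
    }

    private IEnumerator DestroyAfter(GameObject sound, float delay)
    {
        yield return new WaitForSeconds(delay);
        if (sound != null)
            Realtime.Destroy(sound); // removes the instance on all clients
    }
}
```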
Remember, this is a conceptual example and may require additional setup or modification to work in your specific project. For more detailed guidance, I recommend checking out the NormalVR documentation (https://normcore.io/documentation) or Unity's documentation on audio (https://docs.unity3d.com/Manual/Audio.html).
----------------------
RealtimeAvatarVoice it is
just do it locally if it's based on walking speed
calculate how fast a remote client is moving and whether it should play the walking sound, no need to sync it and waste bandwidth
Ohh, so you mean I should grab the transforms of other players' avatars?
just add a script and audio source to the avatar prefab that plays sound
Great idea!
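A minimal sketch of that local approach (the names are illustrative; it assumes the avatar prefab has an AudioSource and that its position is already kept in sync, e.g. by a RealtimeTransform, so the same script produces footsteps for both local and remote copies without sending any audio):

```csharp
using UnityEngine;

// Illustrative sketch: estimates how fast this avatar is moving from its
// transform (which RealtimeTransform keeps in sync for remote copies) and
// plays footsteps locally whenever it's walking. Nothing extra is networked.
[RequireComponent(typeof(AudioSource))]
public class AvatarFootsteps : MonoBehaviour
{
    public AudioClip footstepClip;
    public float minSpeed = 0.5f;      // metres per second before footsteps start
    public float stepInterval = 0.45f; // seconds between footsteps

    private AudioSource _audioSource;
    private Vector3 _lastPosition;
    private float _stepTimer;

    private void Awake()
    {
        _audioSource = GetComponent<AudioSource>();
        _lastPosition = transform.position;
    }

    private void Update()
    {
        // Estimate horizontal speed from how far the avatar moved this frame.
        Vector3 delta = transform.position - _lastPosition;
        delta.y = 0f;
        float speed = delta.magnitude / Mathf.Max(Time.deltaTime, 0.0001f);
        _lastPosition = transform.position;

        _stepTimer -= Time.deltaTime;
        if (speed > minSpeed && _stepTimer <= 0f)
        {
            _audioSource.PlayOneShot(footstepClip);
            _stepTimer = stepInterval;
        }
    }
}
```

Setting the AudioSource's Spatial Blend to 3D makes the footsteps positional, so remote players' steps come from where their avatars actually are.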