Syncing Sounds?
Hey everyone 🤟🏻 In our multiplayer soccer game, we need to synchronize sound clips and some values needed for those sounds to be played (such as the power of a kick, or a random index from a bunch of audio clips). Most of the settings (such as pitch/volume over power, and the list of possible random audio clips for a single action) are already set in a ScriptableObject; so all I need is the name of the sound to be played, a random index (if there are multiple clips), and the power of the action.
We've experienced a similar issue with animations, and our solution was to synchronize a set of so-called "Animation Packages". Whenever an animation is triggered by a player, an anim package (consisting of an animationType enum, an intensity value, and a selector value) is added to the realtime set, and the other players "simulate" that animation on their side.
My question to y'all is whether it sounds logical to pass around that data just to play a one-shot audio clip.
I'm also considering using the values from the animation sync, the player's rigidbody speed, plus a float variable called "actionPower" which can have a value between 0 and 1 (though this option leaves me with only one action power at a time). That way I can safely assume that if a player plays an animation, they are most likely performing the action of that animation. Anyways, what do you guys think about it? Has anyone experienced a similar challenge before?
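(Editor's note: a sound-package model mirroring the animation packages described above could look roughly like this. This is a sketch, not the poster's actual code; the names `SoundPackageModel`, `_soundType`, `_clipIndex`, and `_actionPower` are hypothetical, and the attribute/namespace usage assumes Normcore's model compiler conventions.)

```csharp
using Normal.Realtime;
using Normal.Realtime.Serialization;

// Hypothetical model for one sound event. Normcore's model compiler generates
// public properties from these private [RealtimeProperty] fields. Entries like
// this could be added to a RealtimeSet so other clients can play the sound.
[RealtimeModel]
public partial class SoundPackageModel {
    [RealtimeProperty(1, true)] private int   _soundType;   // which sound to play (enum index)
    [RealtimeProperty(2, true)] private int   _clipIndex;   // random index when an action has multiple clips
    [RealtimeProperty(3, true)] private float _actionPower; // 0–1, drives the pitch/volume curves in the ScriptableObject
}
```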
8 Replies
Hi @mstfmrt07. I'm kapa.ai, your friendly support bot. I will try to answer your question by searching through the knowledge base. :wave:
----------------------
Hmm, I don't know enough to give you a confident answer yet. However, based on the information provided, it seems like you're on the right track with your approach to synchronizing sound clips and related values.
In terms of synchronizing custom data, Normcore provides a rich API for creating your own realtime components. You can create a custom realtime component and model to hold and synchronize the data you need for your sound clips. This could include the name of the sound to be played, a random index, and the power of the action.
You can refer to the Normcore documentation on synchronizing custom data (https://normcore.io/documentation/realtime/synchronizing-custom-data) for more detailed information on how to create and use custom realtime components and models.
As for your idea of using values from animation sync, player's rigidbody speed, and an "actionPower" variable, it seems like a viable approach. However, without more specific information or context, it's hard to say for sure.
I would recommend checking out the Normcore documentation (https://normcore.io/documentation) and guides (https://normcore.io/documentation/guides) for more information and examples that might be relevant to your situation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
<@208292457979314176> kapa.ai is still learning and improving, please let us know how it did by reacting below
I'm aware my question still needs some extra details, but I didn't want to write a long text here. I'm not quite satisfied with the AI's answer; maybe @Max you can give me a little clue :)
I appreciate you guys using the bot and support channel 🙏 When it responds and is uncertain, the answer gets flagged on our end as a possible area of improvement for our docs.
Don't be afraid to hit the bot with details if you need to in the future; whether it can assist or not, there will always be a team member checking responses. And yeah 😄 I'm sure Max will be able to assist you further!
it’s a good question! In our own titles we typically play sounds in response to another event that’s synchronized already, so we haven’t had the need to synchronize sounds like this directly
ie in Nock, we’re playing sounds in collision events on all clients. In cases where a client is lagging behind, it means the sounds more closely match what they’re seeing, even if it’s rubberbanding
that said, the RealtimeSet approach does work if you'd like to have explicit events for sounds. I've also seen folks instantiate prefabs with an audio source that destroy themselves automatically once finished. I think both are valid
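(Editor's note: the self-destroying audio prefab idea could be sketched like this. The component name is made up; it assumes the prefab is spawned with `Realtime.Instantiate` and that only the owning client calls `Realtime.Destroy`, which removes the object on all clients.)

```csharp
using Normal.Realtime;
using UnityEngine;

// Hypothetical component for a realtime-instantiated one-shot audio prefab:
// plays its AudioSource on spawn, then the owner destroys the object once
// the clip has finished, which despawns it everywhere.
[RequireComponent(typeof(AudioSource))]
public class OneShotRealtimeAudio : MonoBehaviour {
    private void Start() {
        var source = GetComponent<AudioSource>();
        source.Play();

        // Only the owning client schedules the destroy.
        if (GetComponent<RealtimeView>().isOwnedLocallyInHierarchy)
            Invoke(nameof(DestroySelf), source.clip.length);
    }

    private void DestroySelf() => Realtime.Destroy(gameObject);
}
```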
Hey, thanks to both of you for your quick answers! Yeah, I'm sure the bot will improve as we continue using it, and the forum-like support channel idea is very nice! I was struggling before when searching through text channels; now it is way more organized and can serve as an archive for future users. 👌🏻
About Max's answer: yeah, playing sounds on collision events would probably solve part of our problem, namely the ball. For the player, I'm leaning toward using already-synchronized data (i.e. animation triggers, player speed), I'm just not sure whether they will be enough for all the possible scenarios. For example: a shot fake will trigger the shoot animation, but the shoot action won't be performed. However, I will keep thinking about it. Thank you again for the inspiration!
sure thing! let me know what you end up with
I agree with Max that you should play sounds based on other already-synced events, but if you want a quick and dirty method you could Realtime.Instantiate prefabs with the sounds on them and then Realtime.Destroy them after the sound is done playing. Another method would be to have an array of sounds, sync a realtime set with a position and the sound to play, and have an audio manager play the clip using AudioSource.PlayClipAtPoint
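(Editor's note: the audio-manager method from the reply above could look roughly like this. The class name, the `OnSoundEvent` hook, and the wiring to the realtime set are assumptions; `AudioSource.PlayClipAtPoint` is the stock Unity API.)

```csharp
using UnityEngine;

// Hypothetical audio manager: every client keeps the same clip array, and when
// a new entry appears in the synced set (e.g. via the set's model-added event),
// it plays the referenced clip locally at the synced position.
public class SoundEventManager : MonoBehaviour {
    [SerializeField] private AudioClip[] clips; // must be indexed identically on every client

    // Call this from the realtime set's "model added" callback.
    public void OnSoundEvent(int clipIndex, Vector3 position, float power) {
        // Scale volume by the action power (clamped to 0–1).
        AudioSource.PlayClipAtPoint(clips[clipIndex], position, Mathf.Clamp01(power));
    }
}
```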