Syncing music
Is it possible to play an AudioSource clip so that all clients hear it?
There is a recipe in the documentation (Recipe: Timers + Synchronized Playback) that says this method can be used to sync audio; however, I could not get it to work.
I'm trying to sync a float with the playback time, but with no success.
Solution:
We're still refining it. But here's the current state of RealtimeTimer if you want to throw this in your project. Maybe throw it in a different namespace so it doesn't conflict when we ship this in Normcore.
Hi @juanje89. I'm kapa.ai, your friendly support bot. I will try to answer your question by searching through the knowledge base. :wave:
----------------------
Yes, it is possible to synchronize audio playback across all clients in Normcore. The key is to use a synchronized clock to drive playback. This clock is set to match the room server's clock exactly, with no added latency. Once connected to a room, the room time is available via the `realtime.room.time` property.
Here is a basic example of how you can use this property to create a synchronized stopwatch:
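A minimal sketch of such a stopwatch, assuming the model/component pattern from the Timers + Synchronized Playback recipe (the names `StopwatchModel`, `Stopwatch`, and `_startTime` are illustrative):

```csharp
using UnityEngine;
using Normal.Realtime;

[RealtimeModel]
public partial class StopwatchModel {
    // Room time at which the stopwatch was started. 0.0 means "not started yet".
    [RealtimeProperty(1, true)]
    private double _startTime;
}

public class Stopwatch : RealtimeComponent<StopwatchModel> {
    public void StartStopwatch() {
        // Store the current room time; every client shares this clock.
        model.startTime = realtime.room.time;
    }

    private void Update() {
        if (model == null || model.startTime == 0.0) return;

        // Because room time is synchronized, this value matches on all clients.
        double elapsed = realtime.room.time - model.startTime;
        Debug.Log($"Elapsed: {elapsed:0.00}s");
    }
}
```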
In this example, `realtime.room.time` is used to calculate the elapsed time since the stopwatch was started. You can use a similar approach to synchronize the playback time of your audio source.
However, please note that Normcore uses metadata on the prefab in order to connect RealtimeComponents on all clients. Therefore, you cannot add RealtimeComponents at runtime.
If you're still having trouble, it might be helpful to review the Normcore documentation (https://normcore.io/documentation/guides/recipes/timers-and-synchronized-playback) on this topic.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
<@759006240646103041> kapa.ai is still learning and improving, please let us know how it did by reacting below
@kapa.ai how would you use this to synchronize playback of an audio file in Unity?
@juanje89 we’ll see if kapa responds here, but in short, I believe you can set an AudioClip/AudioSource playback position. You’d use the time property here to set it. We’re also about to ship a proper timer component that will help make this easier.
@Tom any idea why kapa isn’t responding to mentions?
@Normcore Assistant Hello?
Let me check the knowledge base for you. One moment! :books:
----------------------
Hello! How can I assist you today?
----------------------
oh are there two kapa users in here? can we remove the one that doesn't work?
@Normcore Assistant how would you use this to synchronize playback of an audio file in Unity?
----------------------
To synchronize the playback of an audio file in Unity using Normcore, you can use a similar approach to the stopwatch example provided in the documentation. Here's a basic outline of how you might do it:
1. Create a new RealtimeModel to hold the start time of the audio playback. This could be similar to the `StopwatchModel` in the example.
2. Create a new RealtimeComponent that uses this model. This component will start the audio playback and calculate the current playback time.
3. Attach this component to the GameObject that has the AudioSource you want to synchronize. Call `StartPlayback()` to start the audio and synchronize it across all clients.
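The three steps above might be sketched as follows. This is an assumption-laden outline, not shipped Normcore code: `AudioSyncModel`, `AudioSync`, and `StartPlayback()` are illustrative names, and the drift threshold is arbitrary.

```csharp
using UnityEngine;
using Normal.Realtime;

[RealtimeModel]
public partial class AudioSyncModel {
    // Room time at which playback started. 0.0 means "not playing".
    [RealtimeProperty(1, true)]
    private double _startTime;
}

[RequireComponent(typeof(AudioSource))]
public class AudioSync : RealtimeComponent<AudioSyncModel> {
    private AudioSource _audioSource;

    private void Awake() {
        _audioSource = GetComponent<AudioSource>();
    }

    public void StartPlayback() {
        // Record the shared room time; every client derives its playback
        // position from this single value.
        model.startTime = realtime.room.time;
        _audioSource.Play();
    }

    private void Update() {
        if (model == null || model.startTime == 0.0) return;

        // Position implied by the shared clock (modulo assumes a looping clip).
        float target = (float)((realtime.room.time - model.startTime) % _audioSource.clip.length);

        if (!_audioSource.isPlaying) _audioSource.Play();

        // Only correct large drift; constantly seeking causes audible pops.
        if (Mathf.Abs(_audioSource.time - target) > 0.1f)
            _audioSource.time = target;
    }
}
```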
Please note that this is a simplified example and might not cover all edge cases. For example, it doesn't handle pausing the audio, seeking to a different time, or synchronizing the audio state (playing/paused) across clients. You would need to extend this example to handle those cases.
Also, remember that Normcore uses Unity's audio engine for playback, so all of the Unity audio APIs should work correctly. However, voice chat spatialization does not work on the web because Unity's audio engine, FMOD, does not run on the web. All voice-chat audio is played directly through the browser rather than through Unity (source (https://normcore.io/documentation/platforms/webgl#voice-chat-spatialization-does-not-work-on-the-web)).
For more information, you can refer to the Normcore documentation (https://normcore.io/documentation/guides/recipes/timers-and-synchronized-playback).
----------------------
Oh, great, I'll try this!!
Though the response looks the same as the recipe
This response is good, but PlaybackTime is off. You don't need much on top of the recipe.
@Normcore Assistant How do you set the audio source playback time to match the model? Assume playback was started by another client and the model already has a startTime set.
----------------------
To set the audio source playback time to match the model, you can use the `AudioSource.time` property in Unity. This property gets or sets the current position of the audio clip the audio source is playing, in seconds.
You can calculate the desired playback time from the `startTime` in the model and the current room time. If the audio source is not already playing when the model is replaced (i.e., another client has started playback), you should start it and set the time accordingly.
Here's how you might modify the `AudioSync` class to handle this:
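A sketch of that change, assuming an `AudioSync : RealtimeComponent<AudioSyncModel>` component whose model holds a `startTime` (these names are illustrative, not verbatim Normcore sample code):

```csharp
protected override void OnRealtimeModelReplaced(AudioSyncModel previousModel, AudioSyncModel currentModel) {
    if (currentModel == null) return;

    // If another client already started playback, join mid-clip at the
    // position implied by the shared room clock.
    if (currentModel.startTime != 0.0 && !_audioSource.isPlaying) {
        _audioSource.Play();
        _audioSource.time = (float)((realtime.room.time - currentModel.startTime) % _audioSource.clip.length);
    }
}
```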
In this example, `OnRealtimeModelReplaced` is overridden to check whether the audio should be playing when the model is replaced. If the `startTime` in the model is not zero (indicating that playback has started) and the audio source is not already playing, it starts the audio source and sets the playback time to match the model.
Please note that this is a simplified example and might not cover all edge cases. For example, it doesn't handle pausing the audio, seeking to a different time, or synchronizing the audio state (playing/paused) across clients. You would need to extend this example to handle those cases.
For more information, you can refer to the Normcore documentation (https://normcore.io/documentation/guides/recipes/timers-and-synchronized-playback).
----------------------
This looks good
Great, I'll give it a try!