Syncing music
Is it possible to play an AudioSource clip so that all clients hear it?
There is a recipe in the documentation (Recipe: Timers + Synchronized Playback) that says this sync method can be used to sync audio; however, I couldn't get it to work.
I'm trying to sync a float with the playback time, but with no success.
Solution:
We're still refining it. But here's the current state of RealtimeTimer if you want to throw this in your project. Maybe throw it in a different namespace so it doesn't conflict when we ship this in Normcore.
41 Replies
Hi @juanje89. I'm kapa.ai, your friendly support bot. I will try to answer your question by searching through the knowledge base. :wave:
----------------------
Yes, it is possible to synchronize audio playback across all clients in Normcore. The key is to use a synchronized clock to drive the playback. This clock is set to match the room server's clock exactly with no added latency. Once connected to a room, the room time is available via the realtime.room.time property.
Here is a basic example of how you can use this property to create a synchronized stopwatch:
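A minimal sketch along those lines, using a StopwatchModel/Stopwatch pair as in the recipe (treat this as an illustration rather than the exact sample from the docs):

```csharp
// StopwatchModel.cs — a single synced property holding the room time at which the stopwatch started.
using Normal.Realtime;
using Normal.Realtime.Serialization;

[RealtimeModel]
public partial class StopwatchModel {
    // 0.0 means the stopwatch hasn't been started yet.
    [RealtimeProperty(1, true)]
    private double _startTime;
}
```

```csharp
// Stopwatch.cs
using UnityEngine;
using Normal.Realtime;

public class Stopwatch : RealtimeComponent<StopwatchModel> {
    // Call this on one client to start the stopwatch for everyone in the room.
    public void StartStopwatch() {
        model.startTime = realtime.room.time;
    }

    private void Update() {
        if (model == null || model.startTime == 0.0) return;

        // room.time is synchronized with the room server, so this value matches on every client.
        double elapsed = realtime.room.time - model.startTime;
        Debug.Log($"Stopwatch: {elapsed:F2} s");
    }
}
```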
In this example, realtime.room.time is used to calculate the elapsed time since the stopwatch was started. You can use a similar approach to synchronize the playback time of your audio source.
However, please note that Normcore uses metadata on the prefab in order to connect RealtimeComponents on all clients. Therefore, you cannot add RealtimeComponents at runtime.
If you're still having trouble, it might be helpful to review the Normcore documentation (https://normcore.io/documentation/guides/recipes/timers-and-synchronized-playback) on this topic.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
<@759006240646103041> kapa.ai is still learning and improving, please let us know how it did by reacting below
@kapa.ai how would you use this to synchronize playback of an audio file in Unity?
@juanje89 we’ll see if kapa responds here, but in short, I believe you can set an AudioClip/AudioSource playback position. You’d use the time property here to set it. We’re also about to ship a proper timer component that will help make this easier.
@Tom any idea why kapa isn’t responding to mentions?
@Normcore Assistant Hello?
Let me check the knowledge base for you. One moment! :books:
----------------------
Hello! How can I assist you today?
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
<@497453196797804544> kapa.ai is still learning and improving, please let us know how it did by reacting below
oh is there two kapa users in here? can we remove the one that doesn’t work?
@Normcore Assistant how would you use this to synchronize playback of an audio file in Unity?
Let me check the knowledge base for you. One moment! :books:
----------------------
To synchronize the playback of an audio file in Unity using Normcore, you can use a similar approach to the stopwatch example provided in the documentation. Here's a basic outline of how you might do it:
1. Create a new RealtimeModel to hold the start time of the audio playback. This could be similar to the StopwatchModel in the example.
2. Create a new RealtimeComponent that uses this model. This component will start the audio playback and calculate the current playback time.
3. Attach this component to the GameObject that has the AudioSource you want to synchronize. Call StartPlayback() to start the audio and synchronize it across all clients, as in the sketch below.
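A rough sketch of those three steps, assuming a model named AudioSyncModel and a component named AudioSync (the names and the PlaybackTime helper are illustrative, not an official sample):

```csharp
// AudioSyncModel.cs — holds the room time at which playback started (0.0 = not started).
using Normal.Realtime;
using Normal.Realtime.Serialization;

[RealtimeModel]
public partial class AudioSyncModel {
    [RealtimeProperty(1, true)]
    private double _startTime;
}
```

```csharp
// AudioSync.cs
using UnityEngine;
using Normal.Realtime;

[RequireComponent(typeof(AudioSource))]
public class AudioSync : RealtimeComponent<AudioSyncModel> {
    private AudioSource _audioSource;

    private void Awake() {
        _audioSource = GetComponent<AudioSource>();
    }

    // Playback position implied by the shared room clock.
    public float PlaybackTime {
        get {
            if (model == null || model.startTime == 0.0) return 0f;
            return (float)(realtime.room.time - model.startTime);
        }
    }

    // Call this on one client to start playback for everyone.
    public void StartPlayback() {
        model.startTime = realtime.room.time;
        _audioSource.Play();
    }

    private void Update() {
        if (model == null || model.startTime == 0.0) return;

        // If playback has started but this AudioSource isn't playing yet, jump to the shared position.
        if (!_audioSource.isPlaying && PlaybackTime < _audioSource.clip.length) {
            _audioSource.time = PlaybackTime;
            _audioSource.Play();
        }
    }
}
```

The component and AudioSource would live on a prefab with a RealtimeView (instantiated with Realtime.Instantiate or placed in the scene), and one client calls StartPlayback().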
Please note that this is a simplified example and might not cover all edge cases. For example, it doesn't handle pausing the audio, seeking to a different time, or synchronizing the audio state (playing/paused) across clients. You would need to extend this example to handle those cases.
Also, remember that Normcore uses Unity's audio engine for playback, so all of the Unity audio APIs should work correctly. However, voice chat spatialization does not work on the web because Unity's audio engine, FMOD, does not run on the web. All voice-chat audio is played directly through the browser rather than through Unity (source (https://normcore.io/documentation/platforms/webgl#voice-chat-spatialization-does-not-work-on-the-web)).
For more information, you can refer to the Normcore documentation (https://normcore.io/documentation/guides/recipes/timers-and-synchronized-playback).
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
<@353762082451095552> kapa.ai is still learning and improving, please let us know how it did by reacting below
Oh, great, I'll try this!!
Though the response looks the same as the recipe
This response is good, but PlaybackTime is off. You don't need much on top of the recipe.
@Normcore Assistant How do you set the audio source playback time to match the model? Assume playback was started by another client and the model already has a startTime set.
Let me check the knowledge base for you. One moment! :books:
----------------------
To set the audio source playback time to match the model, you can use the AudioSource.time property in Unity. This property gets or sets the current position of the audio clip that the audio source is playing, in seconds.
You can calculate the desired playback time based on the startTime in the model and the current room time. If the audio source is not already playing when the model is replaced (i.e., when another client has started playback), you should start it and set the time accordingly.
Here's how you might modify the AudioSync class to handle this:
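A sketch only, continuing the hypothetical AudioSync component from above:

```csharp
// Added to the AudioSync sketch above.
protected override void OnRealtimeModelReplaced(AudioSyncModel previousModel, AudioSyncModel currentModel) {
    if (_audioSource == null) _audioSource = GetComponent<AudioSource>();
    if (currentModel == null) return;

    // If another client already started playback before we received this model,
    // start the AudioSource and jump to the position implied by the shared clock.
    if (currentModel.startTime != 0.0 && !_audioSource.isPlaying) {
        double playbackTime = realtime.room.time - currentModel.startTime;
        if (playbackTime < _audioSource.clip.length) {
            _audioSource.time = (float)playbackTime;
            _audioSource.Play();
        }
    }
}
```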
In this example, OnRealtimeModelReplaced is overridden to check if the audio should be playing when the model is replaced. If the startTime in the model is not zero (indicating that playback has started), and the audio source is not already playing, it starts the audio source and sets the playback time to match the model.
Please note that this is a simplified example and might not cover all edge cases. For example, it doesn't handle pausing the audio, seeking to a different time, or synchronizing the audio state (playing/paused) across clients. You would need to extend this example to handle those cases.
For more information, you can refer to the Normcore documentation (https://normcore.io/documentation/guides/recipes/timers-and-synchronized-playback).
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
<@353762082451095552> kapa.ai is still learning and improving, please let us know how it did by reacting below
This looks good
Great, I'll give it a try!
a proper timer with a start and stop event would be fantastic, i'm about to make one atm
Solution
We're still refining it. But here's the current state of RealtimeTimer if you want to throw this in your project. Maybe throw it in a different namespace so it doesn't conflict when we ship this in Normcore.
THANK YOU!
@Tom reputation bot should give me more points because it was written in all caps ;P
lol...
double xp
lmk how it works for you. would love any feedback so we can incorporate it before the feature ships
I have multiple use cases but is it a bad idea to use something like this for say hundreds of objects?
depends on when you start / stop them. This only incurs a network cost when you start, stop, or pause.
If you start 100+ of these on the same frame you may end up with a big packet, but I doubt it will be too too bad
worth trying!
should be offset sweet sounds good!
I get
RealtimeTimer.cs(3,18): error CS0311: The type 'Normal.Realtime.RealtimeTimerModel' cannot be used as type parameter 'TModel' in the generic type or method 'RealtimeComponent<TModel>'. There is no implicit reference conversion from 'Normal.Realtime.RealtimeTimerModel' to 'Normal.Realtime.RealtimeModel'.
I'm running 2.7, would I probably need to update?
hmm no you should be good
does the model have any other errors?
this looks like the autogenerated code isn’t being generated here
and so the model doesn’t have the base class of RealtimeModel
I don't think i've made a new model since updating to 2.7, is there something you need to do to make models compile?
I mean generate
nope, it should be automatic unless you’re on an old version of Unity
2020.3
so a bit
in which case you’ll need to compile it but the project needs to be in a compiling state for the button to appear
let me check
ah yeah roslyn generator support is 2021+
crap... 2021 hard crashes my current game on start 😦
all good. start by adding just the model. then hit the compile button
then add the timer and timer editor files to your project
should work just fine
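For context, a hand-written model looks roughly like this (a sketch, not the contents of the attached RealtimeTimer file); the generated partial is what adds the RealtimeModel base class, which is why CS0311 shows up when generation hasn't run:

```csharp
// RealtimeTimerModel.cs — the hand-written half of the model (field name is illustrative).
using Normal.Realtime;
using Normal.Realtime.Serialization;

[RealtimeModel]
public partial class RealtimeTimerModel {
    [RealtimeProperty(1, true, true)]
    private double _startTime;
}

// The generated half declares "public partial class RealtimeTimerModel : RealtimeModel"
// and adds the startTime accessor and startTimeDidChange event. On Unity 2021+ the Roslyn
// source generator produces it automatically; on older versions it comes from the compile
// button mentioned above.
```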
copy we're back in business!
💪🏻💪🏻
beautiful! implemented and working like a charm! I haven't actually tested it in multiplayer, but since it's activating from the timer change event it should be working.
awesome!
LMAO agreed
I have tried the suggested script and it works when a client plays the AudioSource and then another client connects to the same room; however, existing players in the room do not hear the audio. I believe this happens because OnRealtimeModelReplaced is only executed when the client connects @Normcore Assistant
just add the didChange event parameter to the model
Do I need to subscribe to such event, or does OnRealtimeModelReplaced gets called automatically after adding the third parameter to the model?
sub the event. check the docs for how to sub an event
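Roughly, that looks like this, using the hypothetical AudioSyncModel/AudioSync names from earlier (startTimeDidChange is the event the model compiler generates when the third RealtimeProperty argument is true):

```csharp
// AudioSyncModel.cs — the third argument (true) generates a startTimeDidChange event.
using Normal.Realtime;
using Normal.Realtime.Serialization;

[RealtimeModel]
public partial class AudioSyncModel {
    [RealtimeProperty(1, true, true)]
    private double _startTime;
}
```

```csharp
// In AudioSync: subscribe/unsubscribe so clients already in the room react when startTime changes.
protected override void OnRealtimeModelReplaced(AudioSyncModel previousModel, AudioSyncModel currentModel) {
    if (previousModel != null)
        previousModel.startTimeDidChange -= StartTimeDidChange;

    if (currentModel != null) {
        currentModel.startTimeDidChange += StartTimeDidChange;

        // Also covers the late-joiner case where playback started before we connected.
        SyncToModel();
    }
}

private void StartTimeDidChange(AudioSyncModel model, double startTime) {
    SyncToModel();
}

private void SyncToModel() {
    if (model.startTime == 0.0) return;

    double playbackTime = realtime.room.time - model.startTime;
    if (!_audioSource.isPlaying && playbackTime < _audioSource.clip.length) {
        _audioSource.time = (float)playbackTime;
        _audioSource.Play();
    }
}
```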
Thanks!