beeeen (3w ago)

RealtimeAvatarVoice not creating AudioOutput component for remote clients sometimes

Occasionally, during a scene transition, clients can no longer hear one or more remote clients. While digging into this, I found that when this happens, the local client never creates the AudioOutput component on the remote client's avatar. For example, if Ann and Bob are playing and they go through a scene transition, Bob may no longer hear Ann. In this case, Ann has initialized all of her voice streams correctly, but Bob's client did not correctly initialize the AudioOutput for Ann's avatar on his machine, so Bob can no longer hear Ann.

Android Logcat shows the error: RealtimeAvatarVoice: Unable to find matching audio stream for avatar (clientID: 0, streamID: 0). This message is logged at line 298 (in the decompilation) of RealtimeAvatarVoice, which is hit when audioOutputStream is null. audioOutputStream is assigned a few lines earlier with AudioOutputStream audioOutputStream = room.GetAudioOutputStream(clientID, streamID);. One other thing to note is that the AudioSource component never gets assigned its voice streaming clip; I'm assuming that's because the script fails before reaching that point.

This is Normcore 2.9.1. It's also worth noting that our game loads the scene, does a matchmaking check, and then connects to the room, so the scene loads before the room does. Not sure if that makes a difference. I've attached images of the native error from a build as well as our RealtimeAvatarVoice setup on our avatar.
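Based on the error message and the decompiled line numbers above, the failing path presumably looks something like the sketch below. This is a reconstruction, not actual Normcore source: only the assignment statement and the log string are taken from the description; the surrounding control flow is an assumption.

```csharp
// Reconstructed sketch of the failing path (not actual Normcore source).
AudioOutputStream audioOutputStream = room.GetAudioOutputStream(clientID, streamID);
if (audioOutputStream == null) {
    // This is the branch Bob's client hits after the scene transition: the native
    // API has no stream registered for Ann's (clientID, streamID) pair.
    Debug.LogError($"RealtimeAvatarVoice: Unable to find matching audio stream for avatar (clientID: {clientID}, streamID: {streamID})");
    return;
}
// Only reached on success, which would explain why the AudioSource
// never gets its voice streaming clip assigned.
```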
17 Replies
kapa.ai (3w ago)
Hi @beeeen. I'm kapa.ai, your friendly support bot. I will try to answer your question by searching through the knowledge base. :wave: Sorry @beeeen, I seem to be having a bit of trouble. Don't worry though, I've pinged my support and they will try to fix me up! :technologist: Perhaps try again in 15mins?
maxweisel (3w ago)
Fascinating! The source code is included; it should retry until it gets the audio stream details. I wonder why no audio stream exists in the native API. If there's a third player in the room, can they hear everyone? Or is it possible Client A's audio stream isn't getting created at all?
beeeen (3w ago)
Yes, if player A can't hear player B, then it's totally possible (in our game) for player C to hear both of them, just one of them, or neither. I don't think so; when I look at the public variables on client A, it seems to be doing alright. They look exactly the same as when everything is functioning fine. This doesn't happen in the editor either; I was testing on an Oculus Quest 2. @Normcore Assistant, any ideas about my post and these comments? Are you back online?
kapa.ai (3w ago)
Let me check the knowledge base for you. One moment! :books:

I'm sorry, but I don't have the necessary information to provide an answer to your question. As an AI, I don't have access to your post or comments. I'm here to provide assistance based on the information provided to me. If you have any questions about NormalVR or Unity, feel free to ask!

Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
beeeen (3w ago)
@Normcore Assistant Occasionally, during a scene transition, clients can no longer hear one or more remote clients. While digging into this, I found that when this happens, the local client never creates the AudioOutput component on the remote client's avatar. For example, if Ann and Bob are playing and they go through a scene transition, Bob may no longer hear Ann. In this case, Ann has initialized all of her voice streams correctly, but Bob's client did not correctly initialize the AudioOutput for Ann's avatar on his machine, so Bob can no longer hear Ann. Android Logcat shows the error: RealtimeAvatarVoice: Unable to find matching audio stream for avatar (clientID: 0, streamID: 0). This message is logged at line 298 (in the decompilation) of RealtimeAvatarVoice, which is hit when audioOutputStream is null. audioOutputStream is assigned a few lines earlier with AudioOutputStream audioOutputStream = room.GetAudioOutputStream(clientID, streamID);. One other thing to note is that the AudioSource component never gets assigned its voice streaming clip; I'm assuming that's because the script fails before reaching that point. This is Normcore 2.9.1, and our game loads the scene, does a matchmaking check, and then connects to the room, so the scene loads before the room does. Not sure if that makes a difference. I've attached images of the native error from a build as well as our RealtimeAvatarVoice setup on our avatar.
kapa.ai (3w ago)
Let me check the knowledge base for you. One moment! :books:

Based on the information provided, it seems like the issue might be related to the initialization of the AudioOutput component and the assignment of the AudioOutputStream in the RealtimeAvatarVoice script.

The Normcore documentation mentions that RealtimeAvatarVoice works on any realtime prefab (https://normcore.io/documentation/realtime#prefabs) as long as the RealtimeView component has an owner. Upon instantiation, RealtimeAvatarVoice uses the ownership of the RealtimeView to determine whether it should be sending audio for this prefab or receiving audio to play back. If the component is owned locally, a microphone stream is created; if it is owned by a remote client, an AudioOutput component is created in order to play back audio.

The error message you're seeing suggests that the AudioOutputStream is not being found for the avatar, which could be because the RealtimeView does not have an owner at the time of the scene transition.

One thing you could check is the order of operations in your scene transitions. You mentioned that your game loads the scene, does a matchmaking check, and then connects to the room. It's possible that the scene is being loaded before the RealtimeView has an owner, which could cause the AudioOutput component to not be initialized correctly. You might want to ensure that the room is connected and the RealtimeView has an owner before the scene transition occurs.

As for the AudioSource component not getting assigned its voice streaming clip, this could indeed be a result of the script failing because the AudioOutput component was not initialized correctly.

Please note that these are just suggestions based on the information provided and the Normcore documentation. You might want to reach out to Normcore support for more specific guidance.

Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
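One way to enforce that ordering is to wait for Normcore's connect callback before loading the gameplay scene, if your flow allows it. The sketch below uses Realtime's public Connect method and didConnectToRoom event; the scene name, room name, and where the matchmaking check slots in are placeholders for your own flow, not anything from the thread.

```csharp
using Normal.Realtime;
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch: connect to the room first, then load the gameplay scene once
// didConnectToRoom fires. "GameplayScene" and "MyRoom" are placeholder names.
public class ConnectThenLoad : MonoBehaviour {
    [SerializeField] private Realtime _realtime;

    private void Start() {
        _realtime.didConnectToRoom += DidConnectToRoom;
        // Your matchmaking check would run before this in the real flow.
        _realtime.Connect("MyRoom");
    }

    private void DidConnectToRoom(Realtime realtime) {
        // Avatars instantiated after this point exist in a connected room,
        // so RealtimeAvatarVoice can resolve its AudioOutputStream.
        SceneManager.LoadScene("GameplayScene");
    }
}
```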
maxweisel (3w ago)
I'm thinking more that there's possibly a bug where Client A's message to open the audio stream never makes it to the server. All public vars in C# would look correct, but the bug would be client-side. Have you ever been able to replicate this where Client A creates an audio stream, Client B can't hear it due to this issue, and Client C can hear it? I'm trying to figure out whether I need to look at the native plugin on the send side or the receive side.
beeeen (3w ago)
That's interesting. I'll try listening in on client A's native logs; maybe it returns an error that it can't connect. Yes, that has happened during testing: client A can't be heard by client B, but client C can hear both of them.
maxweisel (3w ago)
It's not going to return an error there, but as long as you've got a valid test that shows it's on the receive side, I can start digging in there.
beeeen (3w ago)
What would be a valid test for you? How can I show that?
maxweisel (3w ago)
I just meant that you tested the Client A/B/C scenario and confirmed B can't hear A while C can hear A. As long as you're sure that's what happened, that's all I need for us to look into it; I just want to make sure we're digging in the right place. Also, out of curiosity, how often can you replicate this bug? If I send you a custom build of Normcore with extra logging, would you be able to get me the logs from a test where this happens?
liamc (3w ago)
I had an issue that sounds exactly like this a long time ago and fixed it in my fork of the voice script. It runs a coroutine for remote players that checks that the audio stream is connected, and reconnects it if not. It doesn't prevent the issue from occurring; it just recovers when it does occur. I've been using this in production for the last ~18 months, but we're on Normcore 2.6.1, so I'm not sure if it's still relevant in 2.10.x.
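liamc's fork itself isn't quoted in the thread, but the watchdog pattern he describes could be sketched roughly as below. Everything here is an assumption: the component name, the health check on the AudioSource clip, and the recovery strategy of destroying and re-adding RealtimeAvatarVoice to force it to re-run its stream lookup; his actual fork may reconnect differently.

```csharp
using System.Collections;
using Normal.Realtime;
using UnityEngine;

// Hypothetical watchdog in the spirit of liamc's fix: for remote avatars, poll
// until voice playback is actually wired up, and force a reconnect if it isn't.
public class VoiceStreamWatchdog : MonoBehaviour {
    private IEnumerator Start() {
        var view = GetComponent<RealtimeView>();
        if (view == null || view.isOwnedLocallyInHierarchy)
            yield break; // The local avatar sends audio; nothing to watch.

        var wait = new WaitForSeconds(1f);
        while (true) {
            yield return wait;
            // If RealtimeAvatarVoice initialized correctly, an AudioSource with a
            // streaming clip exists somewhere under the avatar.
            var source = GetComponentInChildren<AudioSource>();
            if (source != null && source.clip != null)
                continue; // Healthy; keep monitoring in case it breaks later.

            // Recovery (assumption): remove and re-add the voice component so it
            // re-runs its GetAudioOutputStream lookup on the next frame.
            var voice = GetComponentInChildren<RealtimeAvatarVoice>();
            if (voice != null) {
                var host = voice.gameObject;
                Destroy(voice);
                yield return null; // Let the destroy complete before re-adding.
                host.AddComponent<RealtimeAvatarVoice>();
            }
        }
    }
}
```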
maxweisel (3w ago)
Fascinating! Thank you for sharing this. I think this will help me get to the bottom of it.
beeeen (3w ago)
Hi Max, it happens about 1 in 5 times during our scene transition. And yeah, I'd be down to do that.
maxweisel (3w ago)
Let me see if I can implement a fix using liamc's example, and I'll send a build your way to try out.