This repository was archived by the owner on Mar 22, 2022. It is now read-only.
Can you help me solve a mystery? Looking through the Unity sample project, I don't see how the audio track is being played. The scene doesn't have a Unity AudioSource MonoBehaviour, and the code doesn't do anything with audio frames. How is audio getting to the speakers?

Context: in my project, I need to pipe the WebRTC audio track to a spatial audio emitter so that the audio plays at the media player's location in the scene. How can I pipe the audio track to a Unity AudioSource?
Thanks.