To learn how to use audio in your Unity project, see the Audio Overview.
On most platforms, Unity manages audio with FMOD. FMOD relies on threads, which WebGL doesn't support, so Unity WebGL instead uses an implementation based on the Web Audio API, which lets the browser handle audio playback and mixing. As a result, Unity WebGL supports only basic audio functionality.
Note: Google Chrome’s Autoplay policy prevents autoplay of audio and video under certain conditions. For example, your game might be set to autoplay background music soon after it loads, but the music won’t play until the user clicks or taps the page. For more information on how this policy behaves, see Google Chrome’s documentation on the Autoplay policy in Chrome.
Unity WebGL supports the following API classes:
|Class|WebGL support status|
|---|---|
|AudioSource|WebGL supports some APIs. See AudioSource for specific support details.|
|AudioListener|All APIs supported.|
|AudioClip|WebGL supports some APIs. See AudioClip for specific support details.|
|SystemInfo.supportsAudio|The browser provides audio support for WebGL. For this reason, this value is always true.|
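As a minimal sketch, you can query `SystemInfo.supportsAudio` at runtime; in a WebGL build it reports true because the browser provides the audio backend:

```csharp
using UnityEngine;

// Sketch: logs whether the runtime reports audio support.
// In a WebGL build this is always true.
public class AudioSupportCheck : MonoBehaviour
{
    void Start()
    {
        Debug.Log("Audio supported: " + SystemInfo.supportsAudio);
    }
}
```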
The AudioSource API supports basic positional audio playback, including pausing and resuming, distance-based rolloff, pitch setting, and Doppler effects. Unity WebGL supports the following AudioSource APIs:
|API|Description|
|---|---|
|clip|Determines the audio clip that plays next.|
|dopplerLevel|Sets the Doppler scale for the AudioSource.|
|ignoreListenerPause|Allows the AudioSource to ignore AudioListener.pause.|
|ignoreListenerVolume|Makes the AudioSource ignore the AudioListener volume.|
|isPlaying|Returns true if the AudioSource.clip is playing.|
|loop|Allows the application to loop the AudioSource.clip.|
|maxDistance|Sets the maximum distance at which the sound stops attenuating.|
|minDistance|Sets the minimum distance within which the AudioSource.clip plays at full volume. Beyond this distance, the sound starts to attenuate.|
|mute|Mutes the AudioSource.|
|pitch|Sets the pitch of the AudioSource.|
|playOnAwake|Plays the AudioSource on Awake.|
|rolloffMode|Sets how the AudioSource attenuates over distance.|
|time|Sets the playback position in seconds.|
|timeSamples|Sets the playback position in pulse-code modulation (PCM) samples.|
|velocityUpdateMode|Sets whether the AudioSource updates in the fixed or dynamic update loop.|
|volume|Sets the volume of the AudioSource (0.0 to 1.0).|
|PlayOneShot|Plays an AudioClip and scales the AudioSource volume by volumeScale.|
|PlayScheduled|Plays the AudioSource at a time you specify.|
|SetScheduledEndTime|Sets the time at which a scheduled AudioSource.clip stops playing.|
|SetScheduledStartTime|Sets the time at which a scheduled AudioSource.clip starts playing.|
|Stop|Stops playing the AudioSource.clip.|
|UnPause|Unpauses a paused AudioSource.clip.|
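The supported AudioSource APIs above can be combined as in this sketch, which configures positional rolloff and schedules playback on the audio DSP clock. The `clip` field is an assumption; assign any imported AudioClip in the Inspector:

```csharp
using UnityEngine;

// Sketch: positional playback plus scheduled start and end times,
// using only AudioSource APIs that Unity WebGL supports.
public class ScheduledAmbience : MonoBehaviour
{
    public AudioClip clip; // assumed to be assigned in the Inspector

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = clip;
        source.loop = true;
        source.volume = 0.8f;                       // 0.0 to 1.0
        source.rolloffMode = AudioRolloffMode.Linear;
        source.minDistance = 1f;                    // full volume inside this radius
        source.maxDistance = 25f;                   // attenuation stops here

        // Start one second from now on the audio DSP clock,
        // and stop ten seconds after that.
        double startTime = AudioSettings.dspTime + 1.0;
        source.PlayScheduled(startTime);
        source.SetScheduledEndTime(startTime + 10.0);
    }
}
```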
Unity WebGL imports AudioClip files in the AAC format, which browsers widely support. Unity WebGL supports the following AudioClip APIs:
|API|Description|
|---|---|
|length|The length of the AudioClip in seconds.|
|loadState|Returns the current load state of the audio data associated with an AudioClip.|
|samples|The length of the AudioClip in samples.|
|loadType|The load type of the clip. You can set the AudioClip load type in the Inspector.|
|AudioClip.Create|Creates an AudioClip with a name and length you specify. Unity WebGL partially supports this API.|
|AudioClip.SetData|Sets sample data in an AudioClip. Unity WebGL partially supports this API.|
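As a sketch of the two partially supported APIs, the following builds a short sine-wave clip at runtime with `AudioClip.Create` and `AudioClip.SetData`. Because support is partial on WebGL, verify this behavior in your target browsers:

```csharp
using UnityEngine;

// Sketch: generates a one-second 440 Hz mono tone at runtime.
public class ToneGenerator : MonoBehaviour
{
    void Start()
    {
        const int frequency = 44100;            // sample rate
        const float pitch = 440f;               // A4 tone
        int lengthSamples = frequency;          // one second of mono audio

        float[] samples = new float[lengthSamples];
        for (int i = 0; i < lengthSamples; i++)
        {
            samples[i] = Mathf.Sin(2f * Mathf.PI * pitch * i / frequency);
        }

        AudioClip tone = AudioClip.Create("Tone", lengthSamples, 1, frequency, false);
        tone.SetData(samples, 0);

        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.PlayOneShot(tone, 0.5f); // volumeScale of 0.5
    }
}
```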
To use compressed audio with WebGL in Unity, set the AudioClip loadType to one of the following options:
|Compression method|Description|Notes|
|---|---|---|
|CompressedInMemory|Compresses the audio on disk and keeps it compressed after it loads into your application memory.|Compressed audio can cause latency and is less precise during playback, but it uses less memory in your application than decompressed audio. It’s best practice to use CompressedInMemory for long clips, such as background music.|
|DecompressOnLoad|Compresses the audio on disk, similar to CompressedInMemory, and decompresses it when it loads into your application memory.|Decompressed audio uses significantly more memory than compressed audio but has lower latency and more playback flexibility. Use DecompressOnLoad for short clips that need to play with low latency, such as sound effects.|
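You normally set the load type in the Inspector, but it can also be set from an Editor script through the AudioImporter, as in this sketch. The asset path is a placeholder; point it at a real clip in your project:

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only sketch: sets a clip's default load type via its importer.
public static class WebGLAudioImportSettings
{
    [MenuItem("Tools/Set WebGL Audio Load Type")]
    static void SetLoadType()
    {
        string path = "Assets/Audio/Music.mp3"; // placeholder path
        var importer = (AudioImporter)AssetImporter.GetAtPath(path);

        AudioImporterSampleSettings settings = importer.defaultSampleSettings;
        // CompressedInMemory saves memory; DecompressOnLoad lowers latency.
        settings.loadType = AudioClipLoadType.CompressedInMemory;
        importer.defaultSampleSettings = settings;
        importer.SaveAndReimport();
    }
}
```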
For security reasons, browsers don’t allow audio playback until an end user interacts with your application webpage via a mouse click, touch event or key press. Use a loading screen to allow the end user to interact with your application and start audio playback before your main content begins.
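One minimal way to implement this gate is to wait for the first click, touch, or key press before starting playback, as in this sketch. The `music` field is an assumption; assign an AudioSource in the Inspector:

```csharp
using UnityEngine;

// Sketch: starts audio only after the first user interaction,
// so playback begins after the browser unlocks it.
public class AudioUnlockGate : MonoBehaviour
{
    public AudioSource music; // assumed to be assigned in the Inspector
    bool unlocked;

    void Update()
    {
        if (unlocked)
            return;

        if (Input.anyKeyDown || Input.GetMouseButtonDown(0) ||
            Input.touchCount > 0)
        {
            unlocked = true;
            music.Play();
        }
    }
}
```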