In short: Audio (songs and podcasts) gets delivered by artists to Spotify and Apple Music at different volume levels, which is why some are quieter and some louder than others.
Some songs on Spotify and Apple Music are also louder or quieter than others because of differences in file quality, recording choices, and musical genre.
Unless you run a record label, you probably have no idea what it takes to get a music file from the production company to a streaming service like Apple Music or Spotify.
To understand why some songs are louder than others, it helps to know how audio reaches each streaming service and what happens to it along the way.
1. Difference in recording
Every artist records their tracks in a recording studio. It is a facility for recording and mixing instrumental or vocal musical performances, spoken words, and other sounds.
With the evolution of music technology and digital audio equipment, artists can record tracks in a small recording studio, also known as a home recording studio.
It’s obvious that each artist has different tastes and preferences when it comes to music. One artist might add more bass to a song, while another might prefer more treble.
The result is that some songs come out of the studio louder and some quieter. But it doesn’t end there, because each recording is sent to music streaming services as a file.
2. Difference in quality of the file
The mastered digital files are sent to third-party companies known as “aggregators”: companies that have relationships with both music streaming platforms and record labels.
What do they do?
The aggregator sends the files from the labels to the streaming companies (Apple Music and Spotify), then collects the money from sales and makes the necessary payments to the labels.
In exchange, the aggregators receive a fee based on a percentage of sales.
However, each streaming platform handles files in its own way, which is why a track downloaded in one service’s app can typically only be played inside that app.
3. Each platform adjusts the audio file
Once a music streaming service receives an audio file, it uses its own techniques to process that file and serve it to listeners.
This mainly means re-encoding (compressing) the file into the platform’s preferred format: Apple converts its files to AAC, while Spotify uses Ogg Vorbis.
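To get a rough sense of why services re-encode at all, here is a minimal sketch of the size difference. The function name and the 3-minute track are illustrative; the bitrates are the commonly cited figures for CD-quality audio, Apple Music’s AAC, and Spotify’s highest Ogg Vorbis setting:

```python
def encoded_size_mb(bitrate_kbps: float, seconds: float) -> float:
    """Approximate audio stream size: kilobits per second -> megabytes."""
    return bitrate_kbps * 1000 / 8 * seconds / 1_000_000

track_len = 180  # a 3-minute song

cd_pcm = encoded_size_mb(1411, track_len)  # uncompressed CD-quality audio, ~31.7 MB
aac    = encoded_size_mb(256, track_len)   # Apple Music's AAC tier, ~5.8 MB
vorbis = encoded_size_mb(320, track_len)   # Spotify's "Very high" Ogg Vorbis tier, ~7.2 MB
```

Either lossy format cuts the download to a fraction of the raw recording, which is why every service transcodes before streaming.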
Since audio gets delivered at different volume levels, each music streaming service aims to normalize each track so the volume level stays consistent.
For example, Spotify has a normalization feature, while Apple Music has Sound Check. Both adjust the average loudness of every track toward roughly the same level, so listeners don’t need to ride the volume control from song to song.
Every platform also applies a different amount of adjustment. For example, the same loud master might be turned down by about 5.3 dB on Spotify but only 3.9 dB on YouTube.
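The arithmetic behind that adjustment is simple: compare the track’s measured loudness (in LUFS) to the platform’s target, and apply the difference as gain. A minimal sketch, assuming Spotify’s documented default target of -14 LUFS (the function names and the example loudness value are illustrative):

```python
def normalization_gain_db(track_lufs: float, target_lufs: float = -14.0) -> float:
    """Gain in dB needed to bring a track to the platform's loudness target."""
    return target_lufs - track_lufs

def db_to_linear(gain_db: float) -> float:
    """Convert a dB gain into the linear factor applied to the audio samples."""
    return 10 ** (gain_db / 20)

gain = normalization_gain_db(-8.7)  # a loud master at -8.7 LUFS -> -5.3 dB of gain
factor = db_to_linear(gain)         # each sample is scaled by roughly 0.54
```

A track already sitting at the target gets 0 dB of gain, so normalization leaves it untouched; only tracks louder or quieter than the target are adjusted.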
The bottom line is this: every platform processes music files differently, which is part of why some songs end up louder than others, and why the same track can sound subtly different from one service to the next.
4. Some music genres are louder than others
All songs are different: loudness varies with style and genre. A pounding EDM track and an acoustic ballad will naturally sit at very different levels.
In some genres, pop for example, the vocals are often mixed slightly louder than the beat so they stand out. It all depends on how the artists produce their tracks.