Mastering

Mastering is the process of shaping a work in progress that is beyond the planning or recording stages into its finalized form.

It's obviously an important step. If you've got all the basic parts of a sound or track in place but it still doesn't "feel" complete, then you've already heard one reason for further mastering. If you've ever been happy with a sound you engineered using only your headphones and thought you were finished, only to realize that the sound loses certain qualities when moved to actual stereo monitors, you've heard another.

Order of approach

It should be noted that there is no one "true" order in which mastering techniques should be applied to a track or a sound; different producers will have different opinions. Ideally, the order in which you apply changes will not be based on which ones produce larger or smaller changes, but will take into consideration how each change will affect the changes still to be made.

For instance, adjusting the phase of a particular sound may or may not encroach upon another sound channel, but adjusting the volume definitely will. Likewise, shifting the LFO frequency of a particular channel will, of course, alter that channel's phase and encroach upon nearby frequencies.

It's accepted, even encouraged, to go back and tweak previously set settings anyway. Don't think most producers check items off a list and stop once they reach the end of it.

Let's look at a basic list to check off.

  1. Basic automation and effect tuning
  2. Volume Levels
  3. Panning
  4. Checking phase
  5. Filters (and "subtractive" equalizers)
  6. Compressing
  7. Shaping Equalizers
  8. Automation tuning

Listening Fatigue

Threshold of Pain: Depending on who you ask, this usually falls between 200 Hz and 10 kHz. It doesn't refer to an absolute effect, but to the frequencies that, presuming an even volume level, can most affect the inner ear and therefore contribute to perceived "listener's fatigue".

If you've ever listened to music long enough, especially while wearing headphones, you may have experienced obvious pain because of it. More likely, though, you've experienced a dulling of the senses that is most noticeable once you've stopped listening. It may come in the form of a ringing sound, a lowered perception of certain frequency ranges, or simply music sounding "different" immediately afterward.

These are all forms of listener's fatigue, and it's something you should always be wary of, not just for your own personal health but for the enjoyment of those listening to your audio.

While over-amplification of any frequency can cause it, it typically stems from a mix using too much of the middle frequencies.

This video by Jolene Berg demonstrates the effects of hearing loss.

As the video demonstrates, it's usually the higher frequencies that start to dull first, as the scarring of a damaged ear first prevents the minute vibrations of the inner-ear receptors. Eventually the loss works down through the mid ranges to the lower.

If you've ever listened to hobbyist electronica, house, or EDM producers and wondered about the static or other artifacts in the higher ranges: yes, it's because they're friggin' deaf.

The Great Loudness Wars

Auditory illusion: Sounds and sound qualities that are perceived through the brain's interpretation but are not measurable in the original stimulus. For instance, tones lower than 1 kHz may seem to drop in pitch as amplitude is increased, while tones higher than 2 kHz may seem to rise.

The "loudness wars" was the name given to the tendency of modern producers to take advantage of the auditory illusion of compression quality where compressing amplitudes to higher levels boosts the perceived clarity of the lower signals but results in a lower range of possible frequencies.

With each new song trying to overpower the last, that available dynamic range becomes thinner and overall quality decreases.
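
As a crude sketch of that trade-off (plain numpy, not any real mastering limiter): boost everything toward full scale, and whatever no longer fits simply clips.

  import numpy as np

  def loudness_boost(signal, gain_db):
      # Convert the dB figure to a linear gain factor.
      gain = 10 ** (gain_db / 20)
      # Anything pushed past full scale is flattened off: lost information.
      return np.clip(signal * gain, -1.0, 1.0)

  t = np.linspace(0, 1, 44100)
  quiet = 0.05 * np.sin(2 * np.pi * 440 * t)   # a soft passage
  loud = 0.5 * np.sin(2 * np.pi * 440 * t)     # a loud passage
  print(loudness_boost(quiet, 20).max())       # ~0.5: clearly audible now
  print(loudness_boost(loud, 20).max())        # 1.0: pinned against the ceiling

The soft passage does come up in clarity, but the loud passage flattens against the ceiling, and the distance between the two, the dynamic range, shrinks.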

Normalization

Normalization can refer to any of a few related operations, all of which rescale a recording's amplitude toward some target level.

You can bring the highest peak in your wave up to the highest possible signal strength, bring the average of all peaks to the midpoint of your possible signal strength, or use any combination of these approaches.

Normalization is probably going to be part of the mastering process, because you rarely get the chance to re-record the original; if you did, you would have the best possible recording to begin with. While the rule of thumb is that every step of modifying audio will degrade it, it is possible to adjust the normalization and compression of audio without too much degradation. The main thing to remember is that raising peaks above the maximum amplitude will result in lost information.

So how should it be used? Consider this situation: you've recorded vocals and perhaps a simple drum / guitar combo, and you want to add a synthesized backing. Normalizing the recorded parts first gives you a known, consistent level to build the new material against.
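
As a sketch of both ideas in plain numpy (the function names are my own, not any DAW's):

  import numpy as np

  def peak_normalize(signal, target=1.0):
      # Scale so the single loudest sample lands exactly at `target`.
      peak = np.max(np.abs(signal))
      return signal if peak == 0 else signal * (target / peak)

  def rms_normalize(signal, target_rms=0.2):
      # Scale so the average (RMS) level lands at `target_rms` instead.
      rms = np.sqrt(np.mean(signal ** 2))
      return signal if rms == 0 else signal * (target_rms / rms)

Note that normalizing by the average can push individual peaks past full scale, which is exactly the lost-information case described above.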

Stereo separation

Stereo separation refers to how distinctly the listener can make out what each ear is hearing when listening to a similar or identical sound. With most well-mixed stereo tracks you can have the same sound seem to flow from front to back as well as left to right. There are a few ways of getting this effect and a few things that may keep it from coming easily.

The thing about a lot of the default stereo separation plug-ins that come with various DAWs is that they work best when there are already clear differences between the left and right channels, since they rely on amplifying those differences (the basic FL Stereo Enhancer plug-in does this). This means you'll notice less of an effect when using them on a single mono source.
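
In mid/side terms, "amplifying the differences" might look something like this sketch (my own illustration, not FL's actual algorithm):

  import numpy as np

  def widen(left, right, width=1.5):
      # Split into what the channels share (mid) and how they differ (side).
      mid = (left + right) / 2
      side = (left - right) / 2
      # Exaggerate the differences, then rebuild the left and right channels.
      side = side * width
      return mid + side, mid - side

  # With a mono source left == right, so side == 0 and nothing changes:
  mono = np.sin(2 * np.pi * 440 * np.linspace(0, 1, 44100))
  l, r = widen(mono, mono)
  print(np.allclose(l, r))   # True: there were no differences to amplify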

The "haas effect" (or "Precedence effect") refers to a listener discerning the delay of a sound as part of the original sound if the delay is small enough.

One way around this is to add *very* minute differences to one side, for instance with a delay effect.

No matter what delay plug-in you are using, there are really only two important factors to ensure (see the sketch after this list):

  • That the "time" or "step" property is set to, or next to, zero.
  • That the effect is applied to one side (left or right) only.
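
Outside of any plug-in, the same trick takes only a few lines. A minimal sketch, assuming a mono numpy signal at 44.1 kHz (the helper name is hypothetical):

  import numpy as np

  def haas_widen(mono, delay_ms=15.0, rate=44100):
      # Delay one side by a few milliseconds; keep it under roughly 30 ms
      # or the ear starts hearing a distinct echo instead of one wide sound.
      offset = int(rate * delay_ms / 1000)
      left = mono
      right = np.concatenate([np.zeros(offset), mono])[:len(mono)]
      return left, right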

You may also have luck using a pitch shifter in place of a delay effect, though this may cause an audible "warping" depending on the plug-in.

Try different types of modifications and see which you like!

Frequency Separation

We previously mentioned the idea of "masking": the idea that one frequency can intrude upon another and even cancel it out, for instance a 4 kHz sound interfering with a 3.8 kHz sound. The wobbling effect mentioned in the discussion on Phase is essentially a form of masking.
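
The wobble is easy to show with two bare sine tones (a quick numpy illustration):

  import numpy as np

  t = np.linspace(0, 1, 44100, endpoint=False)
  mix = np.sin(2 * np.pi * 4000 * t) + np.sin(2 * np.pi * 3800 * t)

  # The sum is identical to a 3.9 kHz tone whose amplitude swells and
  # cancels under a 100 Hz cosine envelope: a 200-per-second wobble.
  carrier = 2 * np.sin(2 * np.pi * 3900 * t) * np.cos(2 * np.pi * 100 * t)
  print(np.allclose(mix, carrier))   # True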

In fact it's a good rule of thumb to keep most of your frequencies separated across the various tracks. But what exactly do we mean by that?