When you listen to a song that you haven’t heard before, what is it that draws your attention?  Is it the beat that gets you moving?  Is it the lyrics tugging at your heartstrings?  Is it the arrangement and interplay of the instruments?  A great song has a bit of all of these elements, but what can set a song apart from the rest often comes down to the hidden science and art of the audio engineer helping to put those pieces together.
With the diversity in music today, we can all find something that draws our attention.  What I want to look at is how the Musical Trinity of songwriting, arrangement, and engineering creates those songs that linger with us.

Engineering

Some people hear “engineering” as a dirty word when it comes to music; they start to think about the dreaded Auto-Tune and how someone has manipulated the recordings to make flawed artists sound like premier musicians.  Sadly, this has become more commonplace with the most famous artists, as they want absolute perfection in the recording.  But what about the up-and-comers?  They can’t all afford the studio time or the necessary equipment to put out that top-notch production.  This has created an interesting gap between self-produced albums and label productions.  Is one production better than the other?  Sure, you can hear better fidelity in a label production, but that doesn’t take anything away from the quality of self-produced albums.  The key thing to listen for in the production of an album is balance: having that discerning ear to notice where instruments are placed in space, to notice whether they all sound like they are playing in the same room, to be able to close your eyes and picture it as if you were sitting in front of the group watching them perform.

The idea behind a modern recording studio is to capture each sound as dry and close as possible, so that little of the surrounding environment makes it onto the recording.  During the production and mixing of the album, the engineer helps create the room those sounds should be placed in.  The engineer creates that width by panning, so we hear something from the far left to the distant right.
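To make that left-to-right placement concrete, here is a minimal sketch in Python (with NumPy) of a constant-power pan law, one common way a mono source gets positioned in the stereo field.  The function name and parameter choices are mine, for illustration, not any particular console’s or DAW’s implementation.

```python
import numpy as np

def pan_mono_to_stereo(mono: np.ndarray, position: float) -> np.ndarray:
    """Place a mono signal in the stereo field with a constant-power pan law.

    position runs from -1.0 (hard left) through 0.0 (center) to +1.0 (hard right).
    """
    angle = (position + 1.0) * np.pi / 4.0   # map [-1, 1] onto [0, pi/2]
    left = mono * np.cos(angle)              # full at hard left, zero at hard right
    right = mono * np.sin(angle)             # the mirror image
    return np.stack([left, right], axis=-1)  # shape (samples, 2)

# Usage: a one-second 440 Hz tone placed halfway to the right.
sample_rate = 44100
t = np.arange(sample_rate) / sample_rate
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
stereo = pan_mono_to_stereo(tone, position=0.5)
```

Because cos² + sin² = 1, the combined power stays constant as the source sweeps across the field, which is why this law is preferred over a naive linear crossfade that dips in loudness at the center.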

A lot of inexperienced engineers, though, forget that we don’t hear in just two dimensions; they tend to leave depth out of the mix.  When we listen to a concert, are all of the musicians right at the front of the stage, hitting us at the same volume?  No, of course they’re not.  The drummer is usually towards the back of the stage, the singer is front and center, and the guitars and bass sit between the two.  We hear with depth; we recognize that a sound is far away or right in front of us.  This is what we know as sound localization: the ability to recognize where a sound originates.
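Level is only one localization cue.  Tiny timing differences between our ears matter too: a sound arriving at one ear a fraction of a millisecond before the other pulls the perceived image toward the earlier side, with roughly 0.6 ms as the maximum difference a human head produces.  A rough sketch of that interaural time difference cue, again in NumPy, with a helper name of my own choosing:

```python
import numpy as np

def apply_itd(mono: np.ndarray, sample_rate: int, itd_ms: float) -> np.ndarray:
    """Shift the stereo image using only a tiny inter-channel delay.

    Positive itd_ms delays the left channel, so the sound arrives at the
    right ear first and the image pulls to the right.  Levels stay equal.
    """
    shift = int(sample_rate * abs(itd_ms) / 1000.0)
    delayed = np.concatenate([np.zeros(shift), mono[: len(mono) - shift]])
    if itd_ms >= 0:
        left, right = delayed, mono          # right ear hears it first
    else:
        left, right = mono, delayed          # left ear hears it first
    return np.stack([left, right], axis=-1)

# Usage: 0.4 ms of delay on the left channel nudges a click to the right.
sample_rate = 44100
click = np.zeros(sample_rate // 10)
click[0] = 1.0
nudged_right = apply_itd(click, sample_rate, itd_ms=0.4)
```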
It’s that principle of placing instruments in a three-dimensional field that is the foundation of balance in a recording.  I say the foundation because once you’ve figured out where in the environment you are placing the instruments, you have to look at the listening environment and ask yourself the following question: if a sound is directly in front of me or far off, how much ambient noise and how many reflections in the room will I hear?  The answer tells you how much reverb and delay to use so that all the instruments sound like they are playing together.
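As a rough sketch of that question in code: the snippet below attenuates the direct signal with an inverse-distance law and mixes in more of a crude one-tap “room” the farther back the source sits.  The delay time, feedback amount, and wet scaling are illustrative guesses on my part, not mixing presets, and a real mix would reach for a proper reverb rather than this single comb filter.

```python
import numpy as np

def simple_room(dry: np.ndarray, sample_rate: int,
                delay_ms: float = 80.0, feedback: float = 0.4) -> np.ndarray:
    """Crude stand-in for a reverb: a single feedback delay line (comb filter)."""
    delay = int(sample_rate * delay_ms / 1000.0)
    wet = dry.astype(float).copy()
    for i in range(delay, len(wet)):
        wet[i] += feedback * wet[i - delay]
    return wet

def place_at_distance(dry: np.ndarray, sample_rate: int,
                      distance_m: float) -> np.ndarray:
    """Mix direct and room sound according to a notional distance in meters."""
    direct_gain = 1.0 / max(distance_m, 1.0)   # inverse-distance attenuation
    room_amount = (1.0 - direct_gain) * 0.3    # farther away, more room sound
    return direct_gain * dry + room_amount * simple_room(dry, sample_rate)

# Usage: the same snare-like burst placed up close and far back.
sample_rate = 44100
t = np.arange(sample_rate // 2) / sample_rate
burst = np.random.randn(len(t)) * np.exp(-t * 30.0)
up_front = place_at_distance(burst, sample_rate, distance_m=1.0)
far_back = place_at_distance(burst, sample_rate, distance_m=6.0)
```

The design choice to tie the wet amount to distance mirrors what our ears expect: up close we hear mostly direct sound, while far away the room’s reflections dominate, which is exactly the depth cue the paragraph above describes.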
