Monday, February 21, 2011

Have we come full circle?

I have a theory.  Should I take a poll?  When was the last time you actually sat down and listened to a full-length recording from start to finish?  Unless you're an audiophile or an engineer, it's probably been a while.  I suspect it's been several years.  My theory is this: We've returned to being consumers of single songs, much like the early days of the recording industry.  Only this time, the music has become a soundtrack for our lives.

When Thomas Edison invented the phonograph cylinder, it became possible for us humans to listen to a previously performed song in the privacy of our own living rooms.  Before then, one had to witness the musical event in person.  Chief among the cylinder's limitations was its length.  Early cylinders held only about 2 minutes of audio; later versions stretched to 4 minutes.  Just about enough time for one song.

Then disc recordings came along.  At 78 RPM, disc phonographs played about the same length per side.  But because there were two sides, there was room for two songs.  If a record label wanted to release more than two songs from an artist, multiple records would be packaged together in an "album."  Due to the cost of producing records, careful attention was paid to crafting and picking the songs that would be included.  The result was a collection of good, well-written songs.

Technology changed again and we could now fit up to 25 minutes per side of a 33 1/3 RPM Long Playing (LP) record.  The term "album" stuck through the consolidation of music to a single disc. 

45 RPM records appeared around the same time in a smaller format that was cheaper to produce.  With their sonic superiority, they were intended to replace the 78 RPM discs.

45's were more commonly purchased by teenagers, while LP's were purchased primarily by adults who had more disposable income.  Initially, while LP's would sell, single 45's made the most money for record labels.  Often, labels would pressure artists for singles they could release and get radio play to generate money to produce full-length records.  It was common to hear a single on the radio and buy the 45 within days of its release.

LP's took more time and cost to produce.  As I said before, it was important to carefully consider the material that went onto the LP.  Those of us who remember (and still have) our vinyl records, distinctly remember songs that didn't make the radio.  There were many.  A great album was a collection of songs that could stand on their own as singles.

8-tracks enjoyed success from 1965 to the late 1970's.  They were the first "portable" music format.  You can't play records very well in your car.  Even if you could, it would be impractical.  Radio was fine in the car, but the station chose what to play.  With 8-tracks, the choice was ours.  8-tracks consisted of 4 stereo tracks on an endless-loop tape cassette.  Only one stereo track would play back at a time.  A full LP could fit onto an 8-track.  However, sometimes songs would need to be shuffled into a different order.  Often, a song might fade out at the end of a loop and then fade back in on the next track.  While 8-tracks died out with disco in the general population, they continued to serve radio broadcasters as "carts" for commercials and short material through the late 1990's.

Cassettes came shortly after the 8-track, offering a much more compact portable medium.  With blank cassettes, we could record our favorite LP's and take them with us in the car.  We could also make "mixtapes" and give them to anyone we wanted.  We could make a mixtape of love songs for our sweetheart or a workout mixtape to listen to while we exercised.

In 1979, Sony introduced the Walkman, the first personal portable cassette player.  The headphones were miniaturized as well.  We could now listen to our music while walking, riding, skating or any number of ambulatory methods that tickled our fancy.  All without bothering anyone else.

Compact Discs emerged in 1983.  Initially, CD's were expensive to manufacture.  Previously released albums were remastered and re-released on CD.  Playback systems were also expensive.  As costs lowered, it became a more popular format.  Eventually, CD's became portable too.

As CD's began to take over the LP market, record labels were pushing artists to release their contractually obligated albums.  Less attention was paid to writing quality songs.  Songwriters would write "filler" songs to fill out the rest of the album time.  It reached a point where there were only two or three songs on a CD that were worthy of airplay.  CD singles cost just as much to manufacture as full-length CD's, so record labels decided they weren't worth the effort.  Few artists were carefully crafting their projects.  Most of the popular stuff was generic.  But, people wanted the songs they heard on the radio and would cough up the extra money for the full-length disc.

Enter the consumer digital age and MP3's.  Did you know that MP3 is a format designed by video people?  MP3 is short for MPEG-1 or MPEG-2 Audio Layer-3.  MPEG stands for Moving Picture Experts Group.  Anyway, MP3's were small and could be shared via the internet.  If you wanted to share a song with someone, you could email it to them.  Services like Napster quickly rose to facilitate file sharing, while subscription services like Rhapsody tried a legal approach.  Labels cried "murder" (again) and started suing.  Laws were passed and children were fined for piracy.

Then came iTunes.  Steve Jobs essentially dictated terms to the record labels and slowly helped the record industry realize it could survive in the modern age.  Labels needed to adapt to the technology.  Why would someone spend $17 for a CD filled with fluff for two songs they really wanted, when they could now buy the songs they wanted for $2?  And they could do it legally.

The demographics haven't changed much.  Most music is still purchased by 13-year-old girls.  They hear a song on the radio and have to get it on their iPods right away.  The CD as a merchandising tool is on its way out.  Anyone can post music to iTunes and they do.  It's become more cost effective to produce and release your own music.  Artists are selling $5 "download cards" at their shows for an EP (usually 3-6 songs).

Speaking of iPods, music has become so portable now, a lot of us take it for granted.  It used to be, we would sit around and listen to music.  Now, it's on while we're doing dishes or dusting.  It's background noise while we work.  It's on in grocery stores while we shop.  It has permeated every facet of our existence.  So much so, it's become almost mundane.

There was a time when the "single" was king. I believe we've returned to that era - albeit with a twist.  As Dennis Miller used to say, "That's my opinion.  I could be wrong."

Rock.  Roll.  Repeat.

Tuesday, February 8, 2011

Oops! I'm only human.

Okay, we all watched or heard about the Super Bowl XLV halftime show.  I have to say, I was amazed at the level (or lack thereof) of quality in the audio production.  There were some obvious oversights and some not so obvious.  Whether or not you like the Black Eyed Peas, Slash or Usher, or any of the music performed, is not the point.  I've seen a live Black Eyed Peas concert from 10 feet off the stage.  I was really impressed then.  What I hope to accomplish in this post is to offer some plausible explanations for the less than stellar audio production of said halftime show.

Let's begin by describing what I found to be the most egregious mistakes.  When the group began singing, it was obvious that Fergie's microphone was not on.  will.i.am's microphone was too loud.  As the songs progressed, the mix did not improve much.  Furthermore, the music track was too low.

At least we know they weren't lip syncing!

Having said all that, let's take a look at some possible explanations for how a multimillion dollar production, viewed by hundreds of millions around the world, could allow such errors.

1.  Lack of rehearsal - Dallas was hit pretty hard by a snowstorm days before the Super Bowl.  Ice was sliding off Cowboys Stadium the day before the big game.  I know there were plenty of preproduction meetings.  But, there wasn't much time to rehearse and fine-tune everything.  However, the sound we heard broadcast could have been much better - even by amateur standards.  In addition, with today's technology, it is possible to take a "snapshot" of the mixing board during rehearsal.  That snapshot would contain levels, effects and any channels that needed to be turned on/off.  It sounded to me like someone was asleep at the wheel.

2.   Lack of redundancy - Equipment failure happens.  Those of us who have ever done any live event have our fair share of horror stories.  I have a few of my own centered around equipment failure.  It happens in the studio too.  Albeit, a lot less frequently.  More often than not, most of these mistakes are preventable through redundancy.

Avid's Venue console was designed by professionals, like Robert Scovill (Rush, Tom Petty, Sting), who've had more than their fair share of horror stories.  They intentionally designed redundancy into the boards.  Each section of Venue has two power supplies in case one fails.  There is redundant cabling.  If the computer crashes (which it rarely does), you can still run sound while it reboots.

With a production like the Super Bowl, I imagine they have failsafes in place.  At least that's what I would assume.

3.  An emergency - Whether medical, accidental, or restroom, emergencies happen.  Again, I refer to point #2.  If Mixing Engineer A is in the hospital from some bad sushi he ate the night before, there should be Mixing Engineer B standing by to fill in.

4.  Inferior equipment - Every audio geek knows that we make judgment calls based on what we hear.  If the monitors are bad and the room has acoustic deficiencies, we may think something sounds great when in fact, it sounds truly horrific.  Again, I'd like to point out that the Super Bowl is a big deal with big budgets.  I would assume they weren't mixing the live feed to the world on a pair of computer speakers in the backseat of a minivan.

On a tangent, if the production company was trying to justify upgrading some equipment to their superiors, they might have blown the mix on purpose and made the case that the upgrade would have made the mix perfect.  Knowing that my name would be associated with the production would prevent me from actually following through on this kind of conspiracy.  I doubt anyone would stoop to such a level.

5.  ESO/ID10T error - Simple human error.  Most likely the cause.  All it takes is one little oversight and panic sets in.  It's possible.  ESO/ID10T errors happen all the time.  At the level of the Super Bowl, ESO is less common.  ESO (Equipment Superior to Operator) usually happens with inexperienced people.  ID10T (a silly way of expressing the word "idiot") errors can happen to anyone.  We're only human after all.

There is no use in rehashing what happened during the halftime show, except to learn from it.  If any of you come across an article explaining what happened, please share.

None of us are perfect.  As I said before, we all make mistakes.  We're human.  We have to learn how to get over those mistakes, learn from them and move on.  We can't go back and fix it.

Rock.  Roll.  Repeat.

Tuesday, February 1, 2011

Where are we and how fast are we going?

Thank you, Justin, for this post's topic. He would like me to explain why time code is not dead. At Rocky Mountain Recorders, engineers like Justin breathe time code. It isn't just old-fashioned methodology. It's Alive! I imagine Justin suggested it because in today's computer-driven society, time code has been brushed aside as an afterthought. Most consumer or "pro-sumer" video editors don't even know what time code is, why it exists, and how unbelievably necessary it is. There are many DAW's out there that claim you don't need to use time code anymore in post-production. This simply is not true.

In the production of movies, television and music, time code is used to convey two vital pieces of information. Location and speed. In other words: Where are we and how fast are we going?

Since this is a fairly technical topic, I'll try to simplify it for those who aren't familiar with it. If I geek-out a bit, please forgive me. My brain is full of useless information. Actually, its uselessness is still to be determined.

First, let's define what time code is and how it came to be. Have you ever watched an early black and white film and it looks like everyone is running around in fast motion? It looks that way because early cameras were hand-cranked. That meant that the film speed was up to the camera operator's strength and endurance. In order for it to appear normal, the playback would have to precisely mimic the operator's speed and variances. A standard was needed to make it all look normal. So, it was decided that film should run at 24 frames per second. What that means is that 24 still images pass by each second. This is only the "how fast" part of the time code information.
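To make the "how fast" part concrete, here's a toy sketch of the frame/time relationship at 24 frames per second. This is my own illustration, not code from any film-industry tool:

```python
FPS = 24  # standard film rate: 24 still images pass by each second

def frames_to_seconds(frame_count, fps=FPS):
    """Elapsed time represented by a run of film frames."""
    return frame_count / fps

def seconds_to_frames(seconds, fps=FPS):
    """How many frames pass in a given number of seconds."""
    return round(seconds * fps)

# One minute of film at 24 fps is 1,440 individual frames.
print(seconds_to_frames(60))    # 1440
print(frames_to_seconds(1440))  # 60.0
```

The same arithmetic is why playback at the wrong rate makes everyone look like they're running around in fast motion: the frames were captured at one speed and shown at another.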

As films became longer and edits more complex, film editors would make an "edit decision list" or an EDL. In the EDL, they would notate the location of each cut using the reel's feet + frames position - the other part of the time code: where are we? You can still see the editor's mark on the film at each edit in today's releases. I won't ruin it for you if you don't know what it looks like.

Separate audio recorders were used to record "talkies." The audio was then married to the film before editing. We've all seen the "clapboards" when someone comes in front of the camera, announces the scene and take number, claps the board and runs out of the frame. This was done for two reasons. Firstly, to record on film and tape which take and scene to use. Secondly, the audible "clap" could be lined up with the precise frame of film for playback. Special machines were used to run at consistent speeds and the audio would match the film from that point forward.

This system wasn't perfect. Over the course of longer pieces of film, the audio would "drift." In other words, the audio would increasingly move out of sync with the film. Enter the Society of Motion Picture and Television Engineers (SMPTE).

SMPTE developed a method for matching the audio machine to the film regardless of starting point, continually checking and adjusting its speed. They developed an audio signal that could be recorded on both the film camera and the audio recorder that transmitted hours, minutes, seconds and frame information. They called it (insert trumpet fanfare here) "SMPTE Time Code." The audio playback machines had special decoders that could translate the signal, control the machine and keep it in sync. So now, it didn't matter where you started playback. Today, various forms of time code exist, but the most popular is SMPTE.
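As a rough illustration of how an hours:minutes:seconds:frames value maps to an absolute frame position, here's a small sketch of my own. It assumes a simple non-drop-frame count at 24 fps and is not a SMPTE reference implementation (real-world drop-frame video rates add complications I'm skipping):

```python
def frames_to_timecode(total_frames, fps=24):
    """Format an absolute frame count as HH:MM:SS:FF (non-drop-frame)."""
    frames = total_frames % fps
    total_seconds = total_frames // fps
    seconds = total_seconds % 60
    minutes = (total_seconds // 60) % 60
    hours = total_seconds // 3600
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

def timecode_to_frames(tc, fps=24):
    """Parse HH:MM:SS:FF back into an absolute frame count."""
    h, m, s, f = (int(part) for part in tc.split(":"))
    return (h * 3600 + m * 60 + s) * fps + f

# One hour of film at 24 fps is 86,400 frames.
print(frames_to_timecode(86400))  # 01:00:00:00
```

That single string answers both questions at once: "where are we" (the position) and, implicitly, "how fast are we going" (the frame rate it's counted against).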

In any synchronization scheme there are two parts: a master and a slave. In the previous scenario, the film machine is the master and the audio machine is the slave. The master dictates the location and speed via SMPTE. The slave constantly chases the master signal through its time code translator. The first time I saw it in action, it was like magic. One machine was moving on its own!
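The "chase" behavior can be sketched as a simple decision rule: compare the slave's position to the master's and nudge the slave's speed until the two agree. This is a deliberately crude illustration of the idea, not how a real synchronizer's servo loop actually works:

```python
def chase(master_frame, slave_frame, tolerance=1):
    """Crude chase-lock decision: compare the two positions and
    report how the slave should adjust its transport speed."""
    offset = master_frame - slave_frame
    if abs(offset) <= tolerance:
        return "locked"      # within tolerance: run at play speed
    if offset > 0:
        return "speed up"    # slave is behind the master
    return "slow down"       # slave is ahead of the master

print(chase(1000, 1000))  # locked
print(chase(1010, 1000))  # speed up
```

Because the check repeats continuously, the slave can't "drift" the way the old clapboard-only system did: any error is corrected as soon as it appears.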

There have been many technological advancements in film and music over the past 50 years. Film became video and television. Tape machines added more tracks. In the 60's Sir George Martin synched two four-tracks using a sine-wave signal and grease pencil marks on tapes to gain just two more tracks for a total of six. Twenty years later, 24-track tape machines began synching to each other, netting 46 recordable tracks for those guitarists who couldn't get enough. Even as Pro Tools and other DAW's started replacing tape machines, synchronization was still key.

Today, most video and audio editors don't have to think about time code as much. It's easy enough to import the footage of your family reunion, make a few edits and spit out a few DVD's. However, professional productions require proper use of time code. Time code is paramount. I'll give you a few real world examples of time code uses and misuses; music examples too.

An example of a disaster production that took way more time than it should have: A four-camera shoot of a live concert. I was in charge of recording the multi-track audio. I asked the video director for a time code feed so I could synchronize my audio to the cameras. He looked at me, dumbfounded, as if I had asked him to shred his shirt, smother it in ketchup and eat it. He couldn't fathom why I, an audio dood, wanted time code. I was successful in at least getting a clocking signal so my system would, at the very least, run at the same speed. It was later apparent this guy hadn't a clue about time code. Each of the four cameras was running at a different time code rate. So, when it came time to edit the video, it was a mess. The video editors spent months lining up video clips by eye. To further salt the wound, they were using my audio mixes as guide tracks. Had they used proper time code techniques (and the guy on the roving camera #4 wasn't fixated on the well-endowed blonde in the audience), the whole production could have been finished in weeks instead of months.

At post-production facilities around the world, time code is a way of life. Broadcast media require exacting standards. A simple commercial may seem trivial to the general populace. But, there are many man-hours spent on each one by various people. Furthermore, each commercial may have several different versions. There may be a 30-second, a 20-second, a 15 and a 10-second version. They're all very similar and often will be produced simultaneously. A video editor will make the rough audio cuts with the video and it's up to the audio editor to clean them up. All versions will likely appear in the same session and start at predetermined time code positions. These are predetermined so that everyone working on the project knows where to look. It's more efficient.

Moreover, quite a few facilities will transfer their productions between environments on tape. Yes, I said tape. Would it shock you further if I said that tape was Beta? Well, it is. DigiBeta. Each tape is formatted or "striped" with time code. When either "ingesting" the material from the tape or "laying back" to the tape, everything needs to be put in its proper place. This is made possible because of time code.

Have you ever had Pro Tools crash on you while recording before you could hit the Save button? And when you rebooted the machine and launched the session all the audio was missing? You look on the hard drive and the files are there. How do you get them back? The answer is easy. Time code. Each audio file, as it begins recording, will have a time code starting point embedded into it. In Pro Tools, an audio file can be "spotted" on a track using its original time stamp. Spot each audio file to its respective track at the original time stamp, and you're back in business.
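The recovery idea can be sketched like this. The file names, field names and frame positions below are all hypothetical, invented for illustration; this is not the Pro Tools file format or API, just the concept of putting each file back at its embedded time stamp:

```python
# Hypothetical recovered files, each carrying the time code position
# (here as an absolute frame number) embedded when recording started.
recovered = [
    {"file": "Bass_03.wav",  "origin_frame": 172800},  # 02:00:00:00 @ 24 fps
    {"file": "Vocal_01.wav", "origin_frame": 86400},   # 01:00:00:00 @ 24 fps
]

def respot(files):
    """Place every recovered file back on the timeline at its embedded
    time stamp, returned in timeline order."""
    return sorted(files, key=lambda f: f["origin_frame"])

timeline = respot(recovered)
print([f["file"] for f in timeline])  # ['Vocal_01.wav', 'Bass_03.wav']
```

No matter what order the files come back from the drive in, the embedded stamps are enough to rebuild the session layout exactly.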

It's easy to become complacent when dealing with time code. Most people don't ever pay attention to it. Even more don't know it exists. It's the hamster that makes the wheel go 'round. Without it, we'd be lost.

Rock. Roll. Repeat.