Where are we and how fast are we going?

Thank you, Justin, for this post's topic. He would like me to explain why time code is not dead. At Rocky Mountain Recorders, engineers like Justin breathe time code. It isn't just old-fashioned methodology. It's Alive! I imagine Justin suggested it because in today's computer-driven society, time code has been brushed aside as an afterthought. Most consumer or "pro-sumer" video editors don't even know what time code is, why it exists, or how unbelievably necessary it is. There are many DAWs out there that claim you don't need time code anymore in post-production. This simply is not true.

In the production of movies, television, and music, time code conveys two vital pieces of information: location and speed. In other words: Where are we and how fast are we going?

Since this is a fairly technical topic, I'll try to simplify it for those who aren't familiar with it. If I geek out a bit, please forgive me. My brain is full of useless information. Actually, its uselessness is still to be determined.

First, let's define what time code is and how it came to be. Have you ever watched an early black-and-white film where everyone seems to be running around in fast motion? It looks that way because early cameras were hand-cranked, which meant the film speed was up to the camera operator's strength and endurance. For the picture to appear normal, the playback would have to precisely mimic the operator's speed and variances, so a standard was needed. It was decided that film should run at 24 frames per second, meaning 24 still images pass by each second. This is only the "how fast" part of the time code information.
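If you like numbers, here's the whole idea in a few lines of Python (my own sketch, not anything from a film lab's toolbox): a frame rate is just a divisor that turns frame counts into screen time.

```python
# The "how fast" half of time code: a frame rate turns frame counts into time.
FPS = 24

def frames_to_seconds(frames, fps=FPS):
    """How long a run of frames lasts on screen at a given rate."""
    return frames / fps

print(frames_to_seconds(24))       # 1.0 -- one second of screen time
print(frames_to_seconds(24 * 60))  # 60.0 -- one minute

# A reel hand-cranked at roughly 18 fps but projected at 24 fps plays back
# about 24/18 = 1.33x too fast -- hence the comedic scurrying.
```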

As films became longer and edits more complex, film editors would make an "edit decision list," or EDL. In the EDL, they would note the location of each cut using the reel's feet + frames position. That's the other part of the time code: where are we? I can still see the editor's mark on the film at each edit in today's releases. I won't ruin it for you if you don't know what it looks like.
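To make feet + frames concrete: 35mm film runs 16 frames to the foot, so a reel position converts to an absolute frame count with simple arithmetic. A quick sketch (assuming 35mm; other gauges have different counts):

```python
# Feet + frames is an absolute position on the reel, like a page number.
FRAMES_PER_FOOT = 16  # 35mm film; other gauges differ

def feet_frames_to_frames(feet, frames):
    """Convert a reel position in feet+frames to an absolute frame count."""
    return feet * FRAMES_PER_FOOT + frames

# An EDL cut marked at 90 feet + 8 frames:
total = feet_frames_to_frames(90, 8)  # 1448 frames
print(total, "frames =", round(total / 24, 2), "seconds at 24 fps")
```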

Separate audio recorders were used to record "talkies." The audio was then married to the film before editing. We've all seen the "clapboards": someone comes in front of the camera, announces the scene and take number, claps the board, and runs out of the frame. This was done for two reasons. Firstly, to record on both film and tape which scene and take to use. Secondly, the audible "clap" could be lined up with the precise frame of film for playback. Special machines ran both at consistent speeds, and the audio would match the film from that point forward.

This system wasn't perfect. Over the course of longer pieces of film, the audio would "drift." In other words, the audio would increasingly move out of sync with the film. Enter the Society of Motion Picture and Television Engineers (SMPTE).

SMPTE developed a method for matching the audio machine to the film regardless of its starting point, continually checking and adjusting its speed. They developed an audio signal that could be recorded on both the film camera and the audio recorder, carrying hours, minutes, seconds, and frames. They called it (insert trumpet fanfare here) "SMPTE Time Code." The audio playback machines had special decoders that could translate the signal, control the machine, and keep it in sync. So now, it didn't matter where you started playback. Today, various forms of time code exist, but the most popular is SMPTE.
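The address itself is just four numbers: hours, minutes, seconds, frames. Here's a bare-bones sketch of that address in Python; the real signal is an audio-rate stream with extra fields like user bits and drop-frame flags, all of which I'm ignoring here:

```python
from dataclasses import dataclass

@dataclass
class Timecode:
    """The four fields a SMPTE time code address carries: HH:MM:SS:FF."""
    hours: int
    minutes: int
    seconds: int
    frames: int
    fps: int = 24  # a non-drop rate, for simplicity

    def to_frames(self):
        """Collapse the address into a single absolute frame count."""
        return ((self.hours * 60 + self.minutes) * 60
                + self.seconds) * self.fps + self.frames

    def __str__(self):
        return (f"{self.hours:02d}:{self.minutes:02d}:"
                f"{self.seconds:02d}:{self.frames:02d}")

tc = Timecode(1, 0, 0, 12)
print(tc, "=", tc.to_frames(), "frames from zero")  # 01:00:00:12 = 86412
```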

In any synchronization scheme there are two parts: a master and a slave. In the previous scenario, the film machine is the master and the audio machine is the slave. The master dictates the location and speed via SMPTE. The slave constantly chases the master signal through its time code translator. The first time I saw it in action, it was like magic. One machine was moving on its own!
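The chase itself boils down to a feedback loop: compare where the master is to where you are, then nudge your speed. A toy sketch of the idea (real synchronizers are far smarter about how they ramp and lock, and the gain value here is made up):

```python
def chase(master_frames, slave_frames, nominal_speed=1.0, gain=0.001):
    """One tick of a naive chase loop: nudge the slave's speed toward the master.

    Illustrative only -- the gain and correction law are invented for the demo.
    """
    error = master_frames - slave_frames   # positive = slave is behind
    return nominal_speed + gain * error    # run fast if behind, slow if ahead

# Slave is 48 frames (two seconds at 24 fps) behind the master:
print(chase(86400, 86352))  # 1.048 -> play slightly fast until caught up
```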

There have been many technological advancements in film and music over the past 50 years. Film became video and television. Tape machines added more tracks. In the '60s, Sir George Martin synced two four-track machines using a sine-wave signal and grease-pencil marks on the tapes to gain just two more tracks, for a total of six. Twenty years later, 24-track tape machines began syncing to each other, netting 46 recordable tracks (time code eats a track on each machine) for those guitarists who couldn't get enough. Even as Pro Tools and other DAWs started replacing tape machines, synchronization was still key.

Today, most video and audio editors don't have to think about time code as much. It's easy enough to import the footage of your family reunion, make a few edits, and spit out a few DVDs. However, professional productions require proper use of time code. Time code is paramount. I'll give you a few real-world examples of time code uses and misuses; music examples too.

An example of a disaster production that took way more time than it should have: a four-camera shoot of a live concert. I was in charge of recording the multi-track audio. I asked the video director for a time code feed so I could synchronize my audio to the cameras. He looked at me, dumbfounded, as if I had asked him to shred his shirt, smother it in ketchup, and eat it. He couldn't fathom why I, an audio dude, wanted time code. I did manage to get a clocking signal so my system would, at the very least, run at the same speed. It was later apparent this guy hadn't a clue about time code. Each of the four cameras was running at a different time code rate. So, when it came time to edit the video, it was a mess. The video editors spent months lining up video clips by eye. To rub salt in the wound, they were using my audio mixes as guide tracks. Had they used proper time code techniques (and had the guy on roving camera #4 not been fixated on the well-endowed blonde in the audience), the whole production could have been finished in weeks instead of months.
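The arithmetic behind the mess is simple and brutal. Count the same number of frames at two different rates and you land at two different wall-clock times:

```python
# Why mismatched rates hurt: the same frame count means different
# wall-clock times at different rates. Take an hour-long show...
frames_in_show = 30 * 60 * 60  # one hour counted at 30 fps = 108000 frames

seconds_at_30 = frames_in_show / 30       # 3600.0
seconds_at_2997 = frames_in_show / 29.97  # ~3603.6

print(f"drift after one hour: {seconds_at_2997 - seconds_at_30:.1f} seconds")
# ~3.6 seconds -- multiply that across four cameras at four rates
# and you get months of eyeball editing.
```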

At post-production facilities around the world, time code is a way of life. Broadcast media require exacting standards. A simple commercial may seem trivial to the general populace, but many man-hours are spent on each one by various people. Furthermore, each commercial may have several different versions: a 30-second, a 20-second, a 15-second, and a 10-second version. They're all very similar and are often produced simultaneously. A video editor will make the rough audio cuts with the video, and it's up to the audio editor to clean them up. All versions will likely appear in the same session and start at predetermined time code positions. These are predetermined so that everyone working on the project knows where to look. It's more efficient.
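As a purely hypothetical example of such a convention (the actual start addresses vary from facility to facility), the session layout might look like this:

```python
# Hypothetical house convention: every version of the spot lives in one
# session, parked at a known hour on the time code ruler.
version_starts = {
    ":30": "01:00:00:00",
    ":20": "02:00:00:00",
    ":15": "03:00:00:00",
    ":10": "04:00:00:00",
}

# Anyone opening the session knows the :15 starts at hour three -- no hunting.
print(version_starts[":15"])
```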

Moreover, quite a few facilities will transfer their productions between environments on tape. Yes, I said tape. Would it shock you further if I said that tape was Beta? Well, it is. Digibeta. Each tape is formatted, or "striped," with time code. Whether "ingesting" the material from the tape or "laying back" to the tape, everything needs to be put in its proper place. This is made possible by time code.

Have you ever had Pro Tools crash on you while recording, before you could hit the Save button? And when you rebooted the machine and launched the session, all the audio was missing? You look on the hard drive and the files are there. How do you get them back? The answer is easy: time code. Each audio file, as it begins recording, has a time code starting point embedded into it. In Pro Tools, an audio file can be "spotted" on a track using its original time stamp. Spot each audio file to its respective track at its original time stamp, and you're back in business.
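For the curious: on Broadcast Wave files, that original time stamp lives in the file's "bext" chunk as a 64-bit sample count since midnight. Here's a minimal Python sketch that digs it out; a real tool would also read the sample rate from the fmt chunk to turn samples into a time code address:

```python
import struct

def bwf_time_reference(path):
    """Return the BWF 'bext' TimeReference (samples since midnight), or None."""
    with open(path, "rb") as f:
        riff, _size, wave = struct.unpack("<4sI4s", f.read(12))
        if riff != b"RIFF" or wave != b"WAVE":
            raise ValueError("not a RIFF/WAVE file")
        while True:
            header = f.read(8)
            if len(header) < 8:
                return None  # hit end of file without finding a bext chunk
            chunk_id, chunk_size = struct.unpack("<4sI", header)
            if chunk_id == b"bext":
                data = f.read(chunk_size)
                # TimeReference sits at byte offset 338, after the Description,
                # Originator, OriginatorReference, OriginationDate, and
                # OriginationTime fields (256+32+32+10+8 bytes).
                (time_ref,) = struct.unpack_from("<Q", data, 338)
                return time_ref
            f.seek(chunk_size + (chunk_size & 1), 1)  # chunks are word-aligned

# samples / sample_rate = seconds since midnight; e.g. 158760000 samples
# at 44100 Hz lands exactly one hour in.
```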

It's easy to become complacent when dealing with time code. Most people never pay attention to it; many don't even know it exists. It's the hamster that makes the wheel go 'round. Without it, we'd be lost.

Rock. Roll. Repeat.
