How Rock Concerts Work by Mark Sullivan @thesullivan

Chances are, you’ll find yourself at some kind of big-venue music show this summer, whether it’s Radiohead at an outdoor festival or Fiona Apple at an intimate, medium-sized music hall. We’ve all been there. We expect to be dazzled (or dizzied, or blinded) by the light show, and rumbled deep in our guts by the subwoofers.

The people who produce rock concerts are always looking for new ways to thrill and surprise us, and they’re increasingly turning to technology to do it. Here’s how some of the major components of the “big show” work.

The Mixing Engineer: The Real Star

To you and me, the star of the show is usually the singer onstage, but the most important person at a rock show is really the sound engineer who mixes the live audio. As Jon Graves, a former front-of-house mixing engineer for big concerts, tells me, the mixing engineer makes very good money, gets some star treatment on tour, often has a bit of an ego, and tends to bark orders at the other engineers and techs around him.

Graves, who now works as a concert applications specialist for the PA speaker company QSC, held that role for years. He toured with Metallica and Guns N’ Roses (among many other major acts), and mixed the sound at the US Festival in the ’80s. Graves says that if the mixing engineer has a bad night, “everybody has a bad night.”

The mixing engineer is the guy who sits at a small booth in the “front of house” position (on the floor near the center of the venue, facing the middle of the stage) in front of the huge mixing board. He controls and mixes all of the sounds coming from the stage—every piece of sound-creating equipment onstage and in the production booth runs through the main board, which is often big enough to handle more than 100 tracks.

At the mixing board, the engineer has all the instruments laid out on separate tracks. The 12 microphones placed around the drum kit, for instance, might use tracks 1 through 12 on the mixing board. Once all the instruments are mapped to a track, the engineer can then meticulously mix a live sound that is perfect for the venue.
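In software terms, that channel mapping and mixing step is just a lookup table plus a weighted sum. Here's a minimal sketch of the idea; the track numbers, gain values, and sample values are all invented for illustration:

```python
# Hypothetical sketch of a mixing board's channel map and house mix.
# Track assignments and gains are invented, not from any real console.

# Map each stage input to a board track (drum mics on tracks 1-12, etc.)
channel_map = {f"drum_mic_{i}": i for i in range(1, 13)}
channel_map.update({"bass_di": 13, "guitar_amp": 14, "lead_vocal": 15})

# Per-track gain set by the engineer (linear multipliers, not dB)
gains = {track: 1.0 for track in channel_map.values()}
gains[channel_map["lead_vocal"]] = 1.4   # push the vocal above the band

def mix(frame, gains):
    """Sum one audio sample per track into a single mono house feed."""
    return sum(sample * gains.get(track, 1.0) for track, sample in frame.items())

# One imaginary instant in time: track number -> sample value
frame = {15: 0.5, 13: 0.2, 14: 0.1}
house_sample = mix(frame, gains)   # 0.5*1.4 + 0.2 + 0.1 = 1.0
```

A real console does this per sample, per output bus, with EQ and effects in the signal path, but the mapping-then-summing structure is the same.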

The Monitor Mix

But that’s just the mix that the audience hears. The musicians on the stage need a completely different mix so that they can hear their own sounds in the context of the sounds that their fellow musicians are making. Creating such a mix is a big job, so a separate engineer with a dedicated (monitor) mixing board is installed at the side of the stage.

A pair of Shure in-ear monitors.

Graves tells me that the onstage monitor mix used to issue from a set of wedge-shaped speakers on the stage, pointed up at the musicians. But these days, many performers wear wireless in-ear monitors, which present their own challenges for the engineer. The monitor engineer, Graves explains, can be required to create as many as 15 different stereo mixes—complete with effects—to match the monitoring taste of each individual musician.

The usual scenario is that some of the musicians will prefer a stage monitor mix, while others will prefer an in-ear headphone mix, and still others will prefer a mix combining both.
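Conceptually, each of those monitor mixes is a separate gain-and-pan recipe applied to the same set of source channels. A rough sketch, with invented source levels and recipes:

```python
# Hypothetical sketch: the monitor engineer keeps one recipe per
# performer and renders each stereo feed from the same sources.
# All names, levels, gains, and pans below are invented.
sources = {"kick": 0.8, "vocal": 0.6, "guitar": 0.4}  # imaginary levels

# recipe: source name -> (gain, pan), where pan 0.0 = left, 1.0 = right
monitor_recipes = {
    "drummer":  {"kick": (1.0, 0.5), "vocal": (0.5, 0.5), "guitar": (0.3, 0.5)},
    "vocalist": {"kick": (0.3, 0.5), "vocal": (1.2, 0.5), "guitar": (0.4, 0.8)},
}

def render_monitor(sources, recipe):
    """Return a (left, right) stereo feed for one performer."""
    left = right = 0.0
    for name, level in sources.items():
        gain, pan = recipe.get(name, (0.0, 0.5))
        left += level * gain * (1.0 - pan)
        right += level * gain * pan
    return left, right

drummer_feed = render_monitor(sources, monitor_recipes["drummer"])
```

Fifteen stereo mixes just means fifteen recipes like these running in parallel, which is why it takes a dedicated engineer and console.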

Backing Tracks

Take the case of Sleigh Bells, the Brooklyn-based two-person act that appeared on Saturday Night Live recently. Sleigh Bells has no drummer or bass player—just a female vocalist (Alexis Krauss) and a single guitarist (Derek Miller). The backing vocal parts, beats, bass, synth patches, and samples in the live set are all preprogrammed in a digital audio workstation (DAW) software product called Ableton Live.

An arrangement in Ableton Live.

Miller writes and records many of the beats and guitar loops on a laptop running Ableton Live, and Krauss performs the backing tracks and vocal loops that crop up in many Sleigh Bells songs. For live shows, an offstage engineer plays the backing tracks on a computer running Ableton Live. Krauss and Miller prefer not to wear in-ear monitors onstage, relying on (very loud) onstage monitors to hear the mix.

Many musicians, especially drummers, must wear in-ear monitors that play the backing tracks so that they can stay locked in time with them. The drummer, in fact, is sometimes the person who decides when the next backing-tracks “song” will start. At Sleigh Bells shows, an engineer at the side of the stage watches the show closely and starts the song in Ableton Live at the correct moment. But some other musicians can start the tracks from the stage using a foot pedal that triggers the audio software.
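Whether the start signal comes from an offstage engineer or a foot pedal, both paths reduce to the same logic: an event fires, and the next song's clip starts. A minimal sketch of that trigger flow (the setlist and class here are invented, not how Ableton Live is actually scripted):

```python
# Hypothetical sketch of backing-track triggering: a single trigger()
# call, from any source, starts the next song's tracks.
class BackingTrackPlayer:
    def __init__(self, setlist):
        self.setlist = setlist
        self.position = 0       # index of the next song to start
        self.playing = None     # song whose tracks are currently running

    def trigger(self):
        """Called by a foot-pedal event or the offstage engineer."""
        if self.position < len(self.setlist):
            self.playing = self.setlist[self.position]
            self.position += 1
        return self.playing

player = BackingTrackPlayer(["song_1", "song_2", "song_3"])
player.trigger()   # first song's backing tracks start
player.trigger()   # second song's backing tracks start
```

The design point is that the player doesn't care who pulled the trigger, which is exactly what lets bands swap an engineer's mouse click for a pedal press.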

Laptops on the Stage

In Sleigh Bells’ case, the backing tracks are run from offstage, but more and more musicians are hauling their laptops onstage to do all kinds of things. Many keyboardists run Logic or Ableton on their laptops, and trigger patches from the software libraries using a MIDI controller (a keyboard that generates no sound of its own but instead triggers sounds in other devices).
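Under the hood, what that MIDI controller sends is tiny: a note-on message is three bytes (status, note number, velocity), and the laptop maps the note number to a sound in its library. A sketch of that decoding step, with invented patch names:

```python
# Sketch of MIDI note-on decoding. The three-byte message format is
# standard MIDI; the patch library below is invented for illustration.
def parse_note_on(msg):
    """Decode a 3-byte MIDI message; return None if it isn't a note-on."""
    status, note, velocity = msg
    # Note-on messages have 0x9 in the high nibble; channel in the low nibble.
    if status & 0xF0 == 0x90 and velocity > 0:
        return {"channel": status & 0x0F, "note": note, "velocity": velocity}
    return None

patch_library = {60: "grand_piano", 61: "analog_strings"}  # invented names

event = parse_note_on(bytes([0x90, 60, 100]))  # middle C on channel 1
patch = patch_library[event["note"]]           # the sound the DAW fires
```

Because the controller makes no sound itself, swapping the entire sound set is just a matter of pointing the note numbers at different entries in the software library.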

Other musicians hate using laptops onstage (I’m one of them), preferring to trigger sounds from physical MIDI keyboard modules. The advantage of using DAW samples is that the library of sounds is much more easily expanded.

Public Address

The speakers used in rock shows have changed significantly over the years, and concert sound has gotten a lot better. Concert engineers used to stack loads of large speakers on top of one another to get the volume they needed. But the sound that all those speakers created was not unified, so the music could turn out radically different in various spots around the hall.

Without a sound system, you don’t have much of a rock concert.

At most medium to large shows you attend these days, you’re likely to see a big vertical bank of speakers hanging high up on each side of the stage. This is called a line array. Each of the identical PA speakers in the line array contains high-, medium- and low-frequency drivers (speakers). When the speakers are stacked close together and exactly on top of one another, all the low-frequency speakers in the array line up; the same thing goes for the mid- and high-frequency speakers.

This arrangement allows the speakers in the line to work together to make a single, unified sound. When the lines of low-, medium- and high-frequency speakers are working together, they collectively fill out the entire frequency range of the music being created on the stage, and deliver it to large areas of the room.
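The payoff of that alignment can be put in numbers. When identical sources are time-aligned (coherent), their pressures add, so N boxes gain 20·log10(N) dB over one box; misaligned (incoherent) sources add only in power, for 10·log10(N) dB. These are textbook acoustics figures, sketched here:

```python
import math

# Why coherent stacking matters: dB gain of N identical sources
# over a single source, for aligned vs. misaligned arrays.
def coherent_gain_db(n):
    """Time-aligned sources: pressure sums, gain = 20*log10(n)."""
    return 20 * math.log10(n)

def incoherent_gain_db(n):
    """Misaligned sources: power sums, gain = 10*log10(n)."""
    return 10 * math.log10(n)

coherent_gain_db(8)     # eight aligned boxes: ~18 dB over one box
incoherent_gain_db(8)   # eight misaligned boxes: only ~9 dB
```

That roughly 9 dB difference for an eight-box hang is a big part of why line arrays replaced the old piles of stacked cabinets.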

The number and placement of the speakers in the array depend on the unique design of the venue. A smaller array of PA speakers might be appropriate for a medium-size hall, while a much larger array might be suitable for a big outdoor festival.

Video Production

Video has become a huge part of the concert experience, often making the onstage performance seem like just one component of a music video being presented live. In fact, the video system usually consumes more electricity than the light show or the sound system does. Editors piece together video footage of all kinds (pretty much anything you can imagine) to correspond to the assorted songs, sections, moods, or moments in the band’s set. The video you see above the stage can sync up to the backing tracks and the lighting system via MIDI or SMPTE timecode.
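SMPTE timecode is what lets the audio, video, and lighting rigs agree on "where we are" in the show: it stamps every frame with hours, minutes, seconds, and a frame number. The arithmetic behind it is simple; this sketch assumes a non-drop-frame rate of 30 fps for simplicity:

```python
# Sketch of SMPTE timecode arithmetic at a fixed, non-drop-frame rate.
FPS = 30

def timecode_to_frames(tc, fps=FPS):
    """'HH:MM:SS:FF' -> absolute frame count from the start of the show."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames, fps=FPS):
    """Absolute frame count -> 'HH:MM:SS:FF'."""
    ff = frames % fps
    total_seconds = frames // fps
    return (f"{total_seconds // 3600:02d}:"
            f"{total_seconds % 3600 // 60:02d}:"
            f"{total_seconds % 60:02d}:{ff:02d}")

timecode_to_frames("00:01:00:00")   # one minute in = frame 1800 at 30 fps
```

Real productions also deal with drop-frame rates like 29.97 fps, but the shared frame clock is the core idea: every system counts the same frames, so every system lands on the same moment.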

Celt-rockers U2 have pushed the live rock video concept further than any other major act. Instead of preparing all the video beforehand, U2’s video team shoots live video at the concert itself. During every show of the band’s 2009 “360” tour, the U2 video design group created original visual graphics using live video footage shot on 15 cameras positioned throughout the stadium. They also brought 120GB of prerendered video to intersperse with the live video during each show.

But U2 wanted more. Dell provided some powerful laptops that the U2 video team used to create original video on the fly, on the road, much of it related to current news events, or to local news in the city where they were playing.

Dell’s Chris Ratcliffe, director of solutions and services marketing, accompanied U2 for many dates in the “360” tour, and worked closely with the tour’s video director, a man called “Smasher.” Ratcliffe hooked Smasher up with an M6400 (later an M6500), on which he created original video using Adobe Creative Suite, Autodesk Maya, and other programs.

Ratcliffe says Smasher was able to create new video while sitting at a coffee shop in the morning, and then dump it onto a thumb drive and transfer it to the show’s video servers for presentation on the big screen that night. Those video servers—a set of three Dell Precision R5400 rack-mounted workstations—were also provided by Dell.

During the “360” tour, the band played underneath a huge circular video screen designed by British engineering firm Buro Happold. The screen weighed 54 tons, measured 4300 square feet when closed, and expanded to over 14,000 square feet and 7 stories tall when opened.

The Light Show

In modern rock shows, motorized moving-head fixtures sweep the stage and point at different places on it; older fixed PAR64 “light cans” are sometimes still used alongside these moving lights. An engineer controls the movement of the lights, as well as the light colors and levels, from a lighting board that is usually located near the main mixing board out in front of the stage.

Older light boards were little more than a box with a bunch of dimmer switches united in one place. But modern lighting consoles (such as Jands Vista models) are powered by computer chips and are fully programmable and automated. They can control the projection of LED light images, all stage lighting, video, and even pyrotechnic effects. Many boards have faders, as well, so that the operator can control all lighting effects in real time.

More often, however, the lighting for a concert is all preprogrammed in the light board by the light-board operator and the lighting designer. The cues are laid out on a timeline that displays on the lighting console or on an external monitor.

The timeline on the light console adheres to the same MIDI Show Control (MSC) or timecode (SMPTE timecode) that the soundboard runs, so the two systems can easily link together and synchronize, along with the digital audio workstation (usually Pro Tools) that might be playing the backing tracks. For instance, a sudden flash of brilliant light can be programmed to coincide perfectly with a sudden, dramatic crescendo in the music, creating an exciting sensory experience for the audience.
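At its core, that timeline is a cue list keyed to the shared timecode: at every moment, the console fires the most recent cue at or before "now." A sketch of that lookup (the cue names and frame numbers are invented):

```python
import bisect

# Hypothetical lighting cue list keyed to absolute timecode frames.
# Cues are kept sorted by frame so the lookup is a binary search.
cues = [
    (0,    "house_down"),
    (900,  "blackout"),
    (905,  "strobe_hit"),      # the flash timed to the crescendo
    (1800, "full_wash_blue"),
]
cue_frames = [frame for frame, _ in cues]

def active_cue(frame):
    """Return the cue that should be live at this timecode frame."""
    i = bisect.bisect_right(cue_frames, frame) - 1
    return cues[i][1] if i >= 0 else None

active_cue(905)    # the strobe fires exactly on its programmed frame
```

Because the soundboard, the DAW playing the backing tracks, and this cue list all count the same frames, the flash-on-crescendo moment is just two systems reaching frame 905 at the same time.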

Of course, the mixing engineer, the monitor engineer, the video director, and the lighting engineer must all communicate with one another before, during, and after the show. Usually they accomplish this through walkie talkies, via a hardwired intercom system, or by way of a mini wireless network custom-designed by one of the wireless carriers.
