Using Playback Live – Some Thoughts

If you are a musical director, session musician, playback tech or band member who regularly uses backing tracks as part of their live show, then this post is for you. Whether you are using Ableton Live, Digital Performer, Reaper or Pro Tools, I hope there are some ideas here that you find useful or informative. As always, if you have any feedback or suggestions, or want to discuss anything in this post, please do get in touch with me.

1. If it can be played live, play it live.

This sounds obvious, but it’s definitely something to bear in mind when putting together your playback tracks. If there are musicians on stage, then they must be there for a reason. Let the playback enhance what is happening live.

2. If it sounds like it should be played live, but it needs to be on track, change it.

I usually work from the premise that the audience understand the difference between “acoustic” and “electronic” sounds. For example, a snare drum played live has its own sound on stage, but a recorded snare drum coming from playback will sound different to the audience, and I believe this causes a disconnect in the audience’s engagement with the live mix. An obviously electronic snare sound can stay on track, and remain believable to the listener, because modern audiences understand the difference between a live drummer playing real drums, and a programmed drum machine. In the same way that Mozart’s audiences were sensitive to unusual key changes, modern audiences are sonically literate in a way that needs to be accounted for when putting together your onstage sound.

If the drummer wants control over the timing and arrangement of the drum part, then drum samples can easily be lifted from the studio tracks and loaded into a sampler, to be triggered by the drummer using pads such as ddrum triggers. This way, the aesthetic of the studio mix is preserved, but the control returns to the performer. The decision has to be based on the overall aesthetic and intention of the artist. Spend some time working out what your intention is and what the priorities in the sound are, then make decisions from there.

There are no golden rules. Just use your ears and your taste and go from there.

3. Impossible performances / arpeggiators (delay).

Sometimes it can be fun to play arp parts in live, usually when the chords are changing often enough to allow some room for imperfect timing (who, me?!). If I am playing arps in live, I will always program a delay into the synth patch, synced to the tempo of the song, which helps bed the arp into the mix. If the arp part on the original track has a really strong identity, it usually makes sense to leave it on the playback track. There are still options to keep things interesting by manipulating filters and delays on the arp track, so there is a “live” element to that part of the playback. Again, it all comes down to what fits the overall aesthetic best. Ultimately, the audience want it to sound GOOD. If you or your superstar keyboard player have set up a sweet MIDI chain with tempo sync and filter sweeps using a D-Beam, but it doesn’t sound any better than the original, the audience really won’t care.
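As an aside, working out tempo-synced delay times by hand is simple arithmetic: a quarter note lasts 60000 / BPM milliseconds, and other subdivisions scale from there. Most modern delays will sync to tempo for you, but if you ever need to type values into an older unit or plugin, a quick sketch (in Python, purely as an illustration) looks like this:

```python
# Tempo-synced delay times: a quarter note at a given BPM lasts
# 60000 / BPM milliseconds; other subdivisions scale from there.

def delay_times_ms(bpm: float) -> dict:
    """Return common delay subdivisions in milliseconds for a tempo."""
    quarter = 60000.0 / bpm  # one quarter note, in ms
    return {
        "1/4": round(quarter, 1),
        "1/8": round(quarter / 2, 1),
        "1/8 dotted": round(quarter * 0.75, 1),
        "1/16": round(quarter / 4, 1),
        "1/4 triplet": round(quarter * 2 / 3, 1),
    }

# Example: at 120 BPM a quarter-note delay is 500 ms,
# an eighth is 250 ms, and a dotted eighth is 375 ms.
print(delay_times_ms(120))
```

Dotted-eighth delays in particular are a classic way to make a played-in arp feel busier than it really is, which buys you some forgiveness on timing.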

4. Backing vocals on track. EQ / fake reverb.

Backing vocals, like lead vocals, should – in my humble opinion – be live. However, if you want to thicken the sound, or add support for, ahem, weak vocalists (such as myself), then backing vocals on the playback track can be a really useful tool. The first thing to do is discuss with your FOH engineer what would work best, and what is missing from the sound. You might find that they are happy with just the live vocals, apart from maybe one or two songs where they feel the singers need an extra bit of support. How you feel about this kind of approach will dictate how you use vocals on track. If you are working for an artist as musical director, then it’s important you understand how the artist feels about using vocals on track, and what the opinion of management is. If you are programming playback for your own project, then it’s up to you to make sure you have a clear idea of what the boundaries are going to be for this kind of reinforcement.

Personally, I try to avoid lead vocals on track in all circumstances. I think it’s dangerous territory, and once you cross that line it’s easy for things to become a bit murky, in terms of “integrity” (however you define that). One thing I have done a lot, however, is use the FX sends from the studio vocals and feed them to FOH for specific moments. A good example of this is when there is a reverse delay that happens before the vocal entry, which obviously can’t be done in real time (or can it??). Certain types of delay or reverb from the studio tracks might be great to use live, and what I’ve found is that after a few rehearsals or even a few shows into a tour, the FOH will let me know that they’ve worked out how to recreate that sound at their end, so the vocal FX track can be muted on playback. Once again, it’s all down to what the overall aesthetic of the project is. How do you best deliver the artist’s sound live?

5. What does the FOH want on track?

As I’ve already mentioned a couple of times, it’s really important to keep checking in with the FOH engineer about what’s working and what needs to change. I’m usually writing from the perspective of a Musical Director working with a signed artist, but this principle applies in all scenarios. If you are playing with your band in a local venue and they have an in-house sound engineer, make sure you have this conversation with them. It’s especially important if you are delivering multiple channels of playback to FOH on top of the usual live drum, bass, guitar, keys and vocal channels. You might find that it’s just too much to make sense of with limited soundcheck time and two or more bands to mix. If you are on tour and using local FOH engineers, make sure you have a Plan A / Plan B setup. Talk it through ahead of time, and ask the opinion of the local crew. Being flexible is going to win you a lot of friends.

If you have your own dedicated FOH engineer on tour with you, then you can have a conversation every day about what’s working and what needs to change. In terms of what’s on playback, you need to keep your setup stable for live use, but open-ended enough that you can tweak things like level and EQ on a daily basis in collaboration with your sound engineers. And remember the golden rule (I know, there are no golden rules) – once you have soundchecked, DON’T CHANGE ANYTHING. If there are things you want to change, make sure you get in before soundcheck, do the tweaks with your FOH and playback tech, save them, THEN soundcheck them with the band. Don’t go onstage with new changes to your setup and hope for the best. Trust me, I’ve done this and it really sucks.

6. Playback doesn’t all have to go to FOH. Tuning guides, clicks and cues.

Remember that if you find some elements of the studio tracks useful as tuning guides or similar references, they don’t all have to go out to FOH. You can always route things to the monitor mix (presuming you are working with in-ear monitors) for your reference only. I’ve done a YouTube video talking about clicks and cues – check it out here: https://bit.ly/2BdsG2x

San Francisco Sep 2017


The last three shows of the US leg of this tour will be San Francisco, San Diego and LA. Every show has been incredible so far. Amazing audiences. Looking forward to getting to Australia and New Zealand – they have a lot to live up to now!

We will be saying goodbye to our US lighting tech Eric Morriss, who has been doing a stellar job. Will be sad to leave him behind. Onwards and upwards… 

Washington, Philadelphia, Boston 

Two shows into the Alison Moyet US Tour and everything is sounding great. Very enthusiastic audiences. Having a lot of fun playing new songs and new arrangements of familiar songs. This is tonight’s venue in Boston. Day off tomorrow and then we play New York.