Volley Up

by Bryan Reesman

Paul Charlier serves up his sound artistry in the Broadway drama Deuce.

Veteran sound designer Paul Charlier is nothing if not devoted. When he speaks to Stage Directions, it is 1 a.m. where he is in Australia, and he has just finished a crazed day at work. But he is more than willing to discuss his work on the Tony-nominated Broadway drama Deuce, which stars Angela Lansbury and Marian Seldes as retired women’s tennis pros who made a stellar doubles team back in the day, and who are now guests of honor at a modern match between two new stars. As their verbal interplay proves, however, the game has become much more about achieving stardom and nabbing endorsements than the love of the sport. The 90-minute drama grips audiences because of its luminous leading ladies, engrossing story and Paul Charlier’s dynamic sound design, which creates the illusion that a live tennis match is going on just in front of the cast. It is complemented by Sven Ortel’s clever video design, which includes large projections of digitized audience members that help create the sense of a live stadium crowd.


Charlier is a veteran of film, television, radio, dance and theatre whose credits include Democracy, Copenhagen and the Heath Ledger/Geoffrey Rush film Candy — not to mention being involved in the early 1980s with pioneering Aussie industrial group SPK (which also featured Hollywood composer Graeme Revell). He knows his stuff, and he loves chatting about the artistry of sound. In this case, Deuce provided Charlier with an exciting opportunity to truly create a world with which we are somewhat familiar — that of a live tennis match — by using his sound techniques to their full potential.

Stage Directions: How many years have you worked in sound design?
Paul Charlier: About 25. A long time ago, I used to joke that it was a little bit like seasonal fruit picking, because I had to go where the harvest was. I worked as a radio producer at ABC Radio for a while, making programs there and doing installations elsewhere. I’m more interested in crossing over between media and bringing techniques from one to another. Doing music and sound design are part of that anyway, because I don’t usually distinguish between the music and the sound design that much. With a show like Deuce, I think of the effects in the same way as I think of music cues.

The volleys back-and-forth definitely have a certain rhythm.
That was one of the good things about working with the actors in rehearsal. I worked a lot of it out in rehearsal because the sound is like a character. There are also musical things that happen. For example, the tempos of the games vary throughout the show in the same way that music does. Then there are some other music production techniques, because trying to get the sound of the tennis hits turned out to be quite a huge task. It was a little bit like trying to perfect a snare drum sound. It’s a sound that lasts for a fraction of a second and has a certain impact, and people bring different expectations to it. It was actually a little bit harder with this one than, say, doing a film, because you don’t have any visual reference to help the audience hear what you’re doing. Everything has to be in the sound.

You help to create the illusion. Obviously the video projections of audience members behind the two leads help that, but you do need sound. It’s very cinematic, and I assume your experience with film probably helped with that.
The odd thing for me, which is different from a cinematic experience, is that you don’t have that visual connection. Walter Murch said that 90 percent of sound is what people hear in their head, and often in film that’s triggered by what they’re seeing. When you don’t have that actual physicality of seeing the effort that the tennis player puts into hitting the ball and what that generates in you, you have to imbue this very short sound with all that energy that you can’t see. Obviously, having the spectator reactions is part of doing that, but also manipulating the timing. There was a lot of detail work in trying to build up the tension with bouncing the ball before the serve, the delay before the serve, and then getting the shock from the hit.

How did you create those sound effects? Did you tape people playing?
I do a lot of field recordings myself. My initial assumption was that I would get someone and record the hits. Then the Australian Open was on while I was doing prep in Australia, and it suddenly occurred to me that there are only about 100 people in the world who can actually hit the ball like that; I wasn’t going to get any access to them because they’re all professional tennis players. There wasn’t any point in getting a good player out somewhere, because there’s the racket, the ball, the tennis surface, the acoustics of the stadium and just that energy that it takes to hit a ball over 100 kilometers per hour.

So I took the path that was closer to recording drum sounds. I recorded a lot of the Australian Open, and I was listening back to the sound, which isn’t a strong indication of what it sounds like in the stadium, but it is the sound that people identify as it. Most of those recordings were off-miked because it’s all shotguns on the edge of the court. So I actually turned the sound off and started watching the tennis without the sound, to hear what I heard in my head when they hit; then I set about creating that sound. I took elements of tennis hits that I could find and other percussive elements to build the sound from scratch, just to get something that had that sense of the ball traveling 100 kilometers per hour.

There’s a sort of development in the piece, too, that wasn’t an original concept of the sound design but grew out of a need in the rehearsal room. Over the period of play, it became like a history of women’s tennis. The first games are slower and more polite. As it goes on, they talk about modern tennis and how it has changed, and the games get a bit faster and the hits get a bit harder; then they start grunting and the tempo continues to increase. They start swearing. By the end, the last two games that you hear are much more energetic than the ones you heard at the beginning of the show. So there was some sense of that shift in tennis.

Then there are the other little details. The foot sound is an important part of recording, but early on we decided not to incorporate the sound of them running to the ball because it just got too distracting. It was one of those things you can’t use for a while, drop, and then bring back again. So the only foot sounds we ended up using were with some of the serves, as it built up to make the serves bigger. My family has a tennis background, so I sort of grew up with all of that.

The performances are very closely linked to your sound design. How closely did you work with the actors?
Director Michael Blakemore is important because he’s really interested in sound design. The three shows I’ve done with him have had no music. It’s all been structured around sound design. He also likes to bring the sound into rehearsal very early, both for the performers’ sake and also because he says it’s the only thing he can tech before getting to the theatre. So I went into rehearsal with pieces that I started putting together. We could change around the games and change the nature of them.

When the performers began to realize that I could do it on the spot, they saw that it was a flexible thing and began asking questions and making suggestions. The way I work is I basically have a mini studio in the rehearsal room with Logic, so I can put the thing together, take it apart, put it back together again and change the timing. We couldn’t expect dialogue to fit within firm, timed pieces, so a lot of the elements, like the bouncing of the ball in the game itself, were all designed to go as long as they needed to go and are all separate elements within the playback. Some of them were then linked later on. Just about everything you hear is a separate element that could be fired on its own.
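The separate, independently fireable elements Charlier describes can be sketched in miniature. The class and element names below are hypothetical illustrations, not part of Cricket or Logic; they just show how open-ended elements (like the ball bounce) can run as long as the dialogue needs while one-shot elements fire on their own:

```python
# Minimal sketch of independently fireable playback elements, loosely
# modeled on the approach described above. All names are hypothetical;
# Cricket itself is custom software and is not shown here.

class Element:
    def __init__(self, name, loops=False):
        self.name = name      # e.g. "ball_bounce", "serve_hit"
        self.loops = loops    # open-ended elements run until stopped
        self.active = False

    def fire(self):
        self.active = True    # start playback; loops keep running

    def stop(self):
        self.active = False   # end playback explicitly

# A "game" is just a collection of separate elements that can be fired
# in any order, so the timing stays flexible under live dialogue.
game = {
    "ball_bounce": Element("ball_bounce", loops=True),
    "serve_hit": Element("serve_hit"),
    "crowd_ooh": Element("crowd_ooh"),
}

game["ball_bounce"].fire()   # bounces continue under the dialogue
game["serve_hit"].fire()     # serve fires whenever the cue comes
game["ball_bounce"].stop()   # bounce ends on the serve
active = [name for name, e in game.items() if e.active]
```

The point of the structure is that no element owns a fixed timeline; each one starts and stops on its own, which is what let the games stretch or compress around the performers.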

A big part of getting that to work was that in the theatre, the design was about localizing the game. Especially with musicals, the idea is to give an evenly dispersed sound throughout the theatre so everyone hears the same thing, but we wanted the game to be quite localized down in front of the performers, as well as in the speakers on either side. We hung the balcony speakers quite low so that from the balcony you heard the sound from below. It was consistent with the performers’ eyeline. We put speakers in the platform underneath the performers so that the net was located in the center. Then, all of the announcements came from the cluster, and the crowd was spread around through the surround in the front of house as well.

You used Meyer speakers on this production. Which ones did you employ, and how did you place them?
It was a combination of UPAs and UPJs and some UPMs for fills. We used UPAs on the side for the grunts and the hits. The UPJs were in the center for the hits on the net. We ended up using that to localize the radio mics, which was fantastic. It’s so rare that you can put a speaker underneath the performers so that the sound is reinforced and totally localized. There were also UPJs up for the TV commentators because we wanted to localize their sound, too. We didn’t want them being in the whole system because they would have ended up being the “voice of God,” which would have been a bit out of proportion with everything else. So they were localized through a UPJ that was underneath them, with delayed reinforcement through the system.

So how far back in the theatre does the sound reach?
There were rear delays underneath the balcony and above the front of the balcony itself. I have this rule of thumb that if an unamplified actor is standing on stage, and what you hear is them in the space, then the bottom line is the speaker needs to do that as well. Here we were lucky in that we didn’t want the sense that everyone was on the line of the court, so if you were down in front, it felt like you were closer to the game, and there was a natural acoustic roll-off from the stage. It keeps your perspective on the game the same as your perspective on the voices of the actors on stage.
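Charlier’s rule of thumb, that a delay speaker should sound like the unamplified actor in the space, is commonly implemented by time-aligning the delay feed to the acoustic arrival from the stage. As a rough sketch (the distances and offset here are hypothetical examples, not measurements from this production), the delay time is the travel time of sound from the stage plus a small precedence offset so the stage still reads as the source:

```python
# Rule-of-thumb alignment for under-balcony delay speakers: delay the
# electronic feed by the acoustic travel time from the stage, plus a
# small offset so the precedence (Haas) effect keeps the image on stage.
# Figures below are illustrative, not from the Deuce rig.

SPEED_OF_SOUND = 343.0  # meters per second, room temperature

def delay_ms(distance_from_stage_m, precedence_offset_ms=10.0):
    travel_ms = distance_from_stage_m / SPEED_OF_SOUND * 1000.0
    return travel_ms + precedence_offset_ms

# A delay speaker hung 14 m from the stage edge:
print(round(delay_ms(14.0), 1))
```

With the feed delayed this way, the direct sound from the stage arrives first and the speaker only reinforces it, which is why the localization stays with the performers rather than with the box overhead.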

The hardest thing about it was actually getting the 40,000-spectator sound when you can’t actually put a huge P.A. behind the audience. Getting the sound of 40,000 spectators is huge as it is, especially in tennis, because tennis crowds are unlike any other sporting crowd as they’re traditionally quite polite; they don’t tend to react the way baseball or soccer crowds do. Unlike soccer or football, they’re not cheering constantly through it — they’re very quiet in between points. Probably the hardest thing was pulling together enough crowd sounds. I joked with Michael that the acting range of the crowd was the biggest problem I had with the sound design: just being able to get enough of the various reactions, those “oohs” and “aahs” when a ball’s almost hit or when someone does something good in the middle of a rally, and then trying to create the sense that there really are that many people in the stadium. We didn’t try to make it sound like there are 40,000 people with you in the theatre, because it just didn’t make acoustic sense. There is also this false perspective that you’re hearing these two people talking on the other side of the court at the same volume as you’re watching the game, so you’re taking a lot of liberties with the realism.

What kind of console were you using?
We had a Yamaha DM2000 as the desk. As far as the sound design goes, I end up using the mixer as an extensive matrix, because all of the localizing is done in Cricket. It comes out of the computer and is routed directly one-to-one through the inputs/outputs of the mixer, and I think we had 10 outputs coming out of the MOTU into the mixer. I could start programming Cricket in the rehearsal, then just take that into the theatre, open up the outputs and keep working on it, so I didn’t have to start all over again. By the last week of rehearsal, we had 90 percent of the show with the playback as it would be in the theatre. It’s fantastic for being able to break everything into small elements and roll them over.

Then Jake Rodriguez, who wrote Cricket and some modules specifically for the show, put a MIDI show control module in it, because sound triggered half the video, which is running off the Hippo system. In order to keep the projected spectators’ heads in sync with the game, Cricket triggered the video when it needed to be in sync with the sound. Cricket also triggered cue lights to help the actors know where they had to look, because often from their point of view, it was not easy to tell which way they were supposed to be looking. We had two lights at the back of the theatre that Cricket also triggered. Cricket became like a pacemaker that was running the show. It’s probably the biggest show I’ve done on Cricket, because there were about 2,000 modules in the sound design, and we got up to about 400 cues at one point. I think that came down to about 320 by the time it opened, and they were linked within 80 cold cues. It’s possibly much more complicated than it sounds, but it’s a tribute to the system in a way. There are a lot of fail-safes built in to make sure those games run in open time.
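MIDI Show Control, the protocol a module like the one described would use to fire the video system, has a simple, fixed SysEx shape defined by the published spec. The sketch below hand-builds an MSC “GO” message; the device ID and cue number are hypothetical examples, and this illustrates the message format only, not Cricket’s actual code:

```python
# Hand-built MIDI Show Control "GO" message (System Exclusive), per the
# MSC specification: F0 7F <device> 02 <command format> <command> <cue> F7.
# Device ID and cue number below are hypothetical examples.

MSC_VIDEO_GENERAL = 0x30  # command format byte: Video (General)
MSC_GO = 0x01             # command byte: GO

def msc_go(cue_number, device_id=0x00, command_format=MSC_VIDEO_GENERAL):
    cue = cue_number.encode("ascii")  # cue numbers travel as ASCII text
    header = bytes([0xF0, 0x7F, device_id, 0x02, command_format, MSC_GO])
    return header + cue + bytes([0xF7])

msg = msc_go("47")  # fire hypothetical video cue 47 on device 0
```

Sent over a MIDI port, a message like this is what lets a sound playback system act as the “pacemaker,” firing a video or lighting controller at the exact moment a sound cue lands.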


Bryan Reesman is a New York-based writer who has been published in the New York Times, Playboy, Billboard and Moviemaker.