IATSE Local 695

Production Sound, Video Engineers & Studio Projectionists


Features

Mission: Impossible – Dead Reckoning Part One

Tom Cruise in Mission: Impossible – Dead Reckoning Part One from Paramount Pictures and Skydance
© 2023 Paramount Pictures

I was no stranger to the Mission: Impossible series, having worked on several films with Tom Cruise. Dead Reckoning Part One was my third outing in the Mission: Impossible franchise, following Rogue Nation and Fallout, all directed by Christopher McQuarrie and, of course, starring Tom Cruise (TC). I had also worked with Chris McQuarrie (McQ) when he was the writer on an earlier TC project, Valkyrie.

One of the advantages of having the writer as director is that the script can keep evolving; on the previous two films, it did so throughout production and even into post production. It is not unusual to complete principal photography and then return for several weeks of additional shooting to hone the plot and story arc. This, my third film in the series (not counting some work that I did on the first Mission: Impossible), was to be different. We didn’t have a script at all! That is to say, there was no script in pre-production, and we would often only be given scenes on the day, or a few days before shooting a sequence. During pre-production, McQ would call all the HoDs to a meeting and tell us the plot, how he saw each scene developing, what stunts were being considered and, hopefully, what requirements could be expected of us. Of course, this meant there was also no traditional schedule; we mostly just worked from a block calendar. Good communication with the director and production would be key to making this work.

My first contribution to this latest episode started in October 2019, when I was already working on Black Widow. I had a message from TC’s team to say that he was in training for a speed-flying sequence that would be featured in the film, but that it was not due to start shooting until February 2020. Fortunately, I was about to travel to Atlanta without my UK team, so I was able to send my 1st AS, Lloyd Dudley, to look after this aspect of pre-production. As always, TC needed communications for the stunt sequence and, as usual, there would be some dialog. Though it is well known that Tom always performs his own stunts, long ago I suggested that giving him dialog during a stunt sequence would also confirm to the audience that this is him and not a stunt double. I have sometimes come to regret that suggestion.

Esai Morales and Tom Cruise in Mission: Impossible – Dead Reckoning Part One from Paramount Pictures and Skydance. Photo by Christian Black

On every Mission: Impossible, the stunts get bigger and seemingly more dangerous, though I should say that for Stunt Co-ordinator Wade Eastwood and his team, safety is paramount. Because good communications with TC and the stunt team are part of the safety requirement, I decided to utilise the bone conduction technology that I had developed for the last Mission: Impossible (Fallout). The system uses custom ear moulds that are both microphones and earpieces. A small conductor is designed to sit on a specific part of the ear to pick up speech with minimal background noise. Though you might imagine speed flying to be near silent, as opposed to the helicopter sequences I had previously used bone conduction for, the wind noise could be quite high and difficult to keep off a conventional microphone. There were also safety considerations: we did not want a lot of loose cables that could interfere with the parachute when it deployed. We were now using Audio Ltd. (now Sound Devices) A10 radios in simultaneous transmit and record modes, as we are able to do outside of the USA. Communications used digital Motorola radios that gave good long-distance coverage to a base station setup. In conjunction with my long-term technical collaborator, Jim McBride, and his son Mark, custom interfaces were made to integrate the recording and comms systems. Lloyd and Mark spent most of the rest of 2019 with TC on various locations, including the Lake District in the UK and South Africa.

Tom Cruise and Hayley Atwell in Mission: Impossible – Dead Reckoning Part One from Paramount Pictures and Skydance. © 2023 Paramount Pictures

Having finished on Black Widow, I started my prep early in 2020, tech scouting various locations, including Venice, where we were due to start shooting. It was on that tech scout, standing in line for the buffet breakfast with McQ behind a large party of Chinese tourists who were coughing and sneezing, that he remarked, “I’m not going to stay here when we shoot, you don’t know what you’ll catch.” Though it was said as a joke, we both requested different hotels. For me, it was a hotel I had previously stayed in when shooting Spider-Man: Far From Home, because it had very good space for equipment storage and better boat access than the typical tourist hotels. It was also in a part of Venice away from the tourist areas and a better place to spend several weeks.

Tom Cruise in Mission: Impossible – Dead Reckoning Part One from Paramount Pictures and Skydance.

We were a couple of days away from our first shoot day and testing the record function of the A10 radios for a boat sequence. Here, I planned to record on the body-worn A10s and on some well-placed A10s for sound FX. I had recently been working with Paul Isaacs of Sound Devices to develop conform software, so that all the recordings could be conformed to what is shot on camera and made available for dailies as a poly WAV file. We were testing with stand-ins on the canals when the news began reporting that COVID-19 was causing concern in areas of Northern Italy not far from Venice. Within a matter of days, and the day before we were due to start shooting, we were shut down and evacuated back to quarantine in the UK. A small number of the crew had symptoms but were unsure whether it was COVID or the flu; fortunately, no one suffered any serious effects.

We spent the rest of the year, up until September, expecting to restart as the epidemic grew bigger and more serious. The rest is history for all of us. McQ and TC were very keen to restart production as soon as possible, but it was to be six months of downtime.

Before we eventually started back to work in September 2020, I spent some of the downtime looking into communication systems that could allow us to work efficiently whilst maintaining social distancing, and that could also allow remote working in the event that the director, DP, or any of the key crew were forced to isolate. I discovered the BOLERO system from a company called Riedel (https://www.riedel.net/en/products-solutions/intercom/bolero-wireless-intercom/).

The system was mainly used in F1 motorsport and at big sporting events, and allowed full duplex communications over large areas. I made contact with Paul Rivens, who ran the Riedel UK operation. Good news for us: due to the postponement of the Tokyo Olympics, there were lots of units available. Paul kindly set me up on a training course at Riedel in the UK and then set up a demo for production. The producers asked me to specify a system, which they then bought. I passed on most of this information to the British Film Institute, who were also investigating ways to get us all working again, and many other productions in the UK adopted the BOLERO system. We bought twenty-five BOLERO handsets for key HoDs, then integrated them with our Motorola walkie-talkies so that everyone could communicate effectively from a safe distance. I also integrated my director’s talk-back system and the video assist system, and sent production audio to the BOLERO handsets so that there was no need to wear an IEM or carry a walkie-talkie. A BOLERO antenna typically covers an area the size of a soccer pitch, and multiple antennae connected by Ethernet cable can be used to cover vast areas. I was even able to integrate cellphones when much larger distances needed to be covered, for example, when shooting the car-chase sequences in central Rome.

Tom Cruise and Hayley Atwell in Mission: Impossible – Dead Reckoning Part One from Paramount Pictures and Skydance. © 2023 Paramount Pictures

We restarted production in Norway. The first sequence was the big motorcycle-off-the-mountain stunt seen in most of the press and teaser trailers long before the film’s eventual release. Once again, we used the bone conduction tech to communicate with TC and to record any dialog. We recorded this to a body-worn A10 and to my Scorpio and CL16 main unit cart setup.

COVID was still a big problem, so to create a bubble for the crew and actors, we all stayed on a Hurtigruten cruise ship. This was a brand-new ship that had only just come into service and was laid up because no cruises were operating. The beauty of being on the ship, in addition to the comfort and great food, was that when we needed to move to another location, we did so overnight, waking up to the sound of multiple helicopters ready to ferry us to set.

Tom Cruise, Rebecca Ferguson, Simon Pegg and Ving Rhames in Mission: Impossible – Dead Reckoning Part One from Paramount Pictures and Skydance. Photo by Christian Black

We then stayed in Norway to shoot much of a sequence on trains that involved cast members on the roof of the train carriages and sequences inside the train. The majority of interiors were completed much later in UK in rail carriage sets built on the backlot at Longcross Studios.

Also in Norway, we shot some of the speed-flying and parachute sequences, with TC often jumping from a helicopter. The bone conduction tech once again allowed dialog to be recorded, as well as enabling communications with the helicopters and cameras.

After Norway, we moved to Italy to shoot in Venice and Rome. We shot fairly conventionally in Venice, using our BOLERO system for communications, which McQ, our director, had fallen in love with, as it allowed him to speak directly to his DP, ADs, Gaffer, Key Grip, operators, and to the Voice of God system if necessary. This included a big party sequence in the Palazzo Ducale. Boat sequences and chase sequences through the streets and alleys of Venice were shot using body-worn A10 transmitters, with the tracks conformed to a poly WAV, because it was neither possible for us all to be in the boats nor for us all to cram into a following boat.

The Rome car-chase sequences presented similar issues and employed similar solutions. The sequence has a lot of cars and a lot of dialog. We rigged the cars with A10s and fitted them to the actors, putting everything into record. There were no follow vehicles except when using a Russian Arm or similar vehicle-mounted cranes or tracking vehicles. Mostly, the cameras were mounted on the action vehicles, with the director watching via high-powered video transmitters, which were not always in range. I sent guide production audio to the video transmitters and also used cellphones connected to the cars’ hands-free systems to communicate back to the BOLERO system, keeping the cars in communication at all times. When we broke for dailies at lunchtime (we did not actually break for lunch, as we always work continuous days), I would conform the individual files recorded on the A10 and A20 transmitters into a single poly file for the editors. I did the same at the end of the shooting day. The process is relatively simple. The best approach is to leave all of the transmitters in record for as long as possible and not stop them between takes. The metadata for slate and take information is entered on the main recorder, in this case, a Scorpio with a CL16.

Even if there is no useable audio, the Scorpio is put into record just to capture the metadata. Making a sound report in CSV (comma-separated values) format creates an EDL (edit decision list) that SD-Utility (https://www.sounddevices.com/product/sd-utility/) uses to conform all of the individual A10/A20 recordings into poly files for each take, ignoring anything recorded between takes.
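Conceptually, the conform step works like cutting each transmitter’s continuous recording down to the take boundaries logged on the main recorder. Here is a minimal sketch in Python of just that timecode arithmetic; the frame rate, take names, and the `conform` helper are illustrative assumptions, not SD-Utility’s actual interface:

```python
# Sketch of the conform step: compute which slice of a transmitter's
# continuous recording belongs to each take, using the take start/end
# timecodes from the main recorder's sound report. Illustrative only.

FPS = 24  # assumed project frame rate

def tc_to_seconds(tc: str) -> float:
    """Convert HH:MM:SS:FF timecode to seconds."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return hh * 3600 + mm * 60 + ss + ff / FPS

def conform(takes, rec_start_tc):
    """For each take, return the (offset, duration) slice to pull from
    a continuous recording that started at rec_start_tc."""
    rec_start = tc_to_seconds(rec_start_tc)
    segments = []
    for take in takes:
        offset = tc_to_seconds(take["start"]) - rec_start
        duration = tc_to_seconds(take["end"]) - tc_to_seconds(take["start"])
        segments.append({"take": take["name"], "offset": offset, "duration": duration})
    return segments

# Example: two takes logged on the main recorder, against a transmitter
# left in record since 10:00:00:00 (hypothetical sample data).
takes = [
    {"name": "42A-1", "start": "10:03:10:00", "end": "10:04:40:00"},
    {"name": "42A-2", "start": "10:07:00:12", "end": "10:08:15:12"},
]
for seg in conform(takes, "10:00:00:00"):
    print(seg["take"], seg["offset"], seg["duration"])
```

The per-take slices from every transmitter, cut this way, are what get bundled into a single multichannel poly file per take; anything between the logged takes simply falls outside every slice and is ignored.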

Tom Cruise and director Christopher McQuarrie on the set of Mission: Impossible – Dead Reckoning Part One from Paramount Pictures and Skydance. Photo by Giles Keyte

This means that, from an editorial perspective, they receive a single poly file for each take, similar to what they would expect from conventional shooting. I became very fast at conforming, so this did not take up too much time at the end of the day. It did mean that my team had to be particularly vigilant in timecode-synching all of the A10 and A20 radios. It was not until toward the end of production that the Nexus became available, which made the whole process much easier. With the Nexus, all transmitters are automatically timecoded, and it is possible to remotely stop and start recording on each device, as well as make other changes such as file naming, sleeping/unsleeping, changing frequencies, and adjusting power levels. This has been quite a game changer for me: we now set up a network so that any of my team can access the transmitters from a phone or iPad.

We had several shutdowns due to COVID but on the whole, probably managed to keep going more than most productions.

After Italy, our next location was Abu Dhabi. The good thing here was that there was already a vaccination program in place for any of the local crew and hotel staff we were likely to come into contact with. However, we were all still unvaccinated, as no vaccines had yet been approved in the UK or Europe. We shot primarily in an airport that was under construction and yet to open and, for the first time, shot in a fairly conventional manner. We also shot the desert sequences, where it was tricky to protect so much equipment in what was meant to be a sandstorm. We worked mostly from Land Cruisers that we had fitted out. However, COVID was becoming a real problem in the UK, with yet another wave and new strains of the virus being discovered. We had to leave Abu Dhabi before we had shot everything needed because the UK government was about to introduce an isolation program that would force us, on our return, to isolate in designated hotels for two weeks. We made it back just in time and eventually rebuilt parts of the airport set in a shopping centre in Birmingham, and the desert sets in a quarry, to complete the sequences.

Most of the remainder of the film was shot at Longcross Studios, either on one of the two stages specifically built for us during the COVID shutdown, on the backlot, or on UK locations.

It is often said that “necessity is the mother of invention,” and in this case, the necessity of complying with COVID restrictions forced us to investigate new ways of working. Chief among them was a complete change in how we shoot and record car and chase sequences, using the recording capabilities of the radio mics and the ability to conform to a single poly file to avoid jamming the crew into follow vehicles, along with the use of duplex communications so that we could avoid working too closely together on set while still communicating efficiently.

Certainly, as far as the Mission: Impossible series is concerned, I do not think there will be any going back on the changes we made during the COVID pandemic. We have proved that recording on radio mics, particularly now that the A20 records 32-bit float and SD-Utility can conform the files for the editors, is the way to go. I’m not certain that other productions will continue with the BOLERO communication system, but our director, Christopher McQuarrie, has requested that the system be expanded for Dead Reckoning Part Two, which is already in production, though currently on hiatus due to the industrial action taken by writers and actors.

We are already planning how we will record dialog and facilitate communications for some even more daring stunts, including using some new technology to record underwater. Currently, I am not permitted to write about this or the even more amazing stuff that we are doing but look forward to telling you all about it in 2024.

Due to the extended schedule caused by COVID, we had a number of different team members over the shoot: Lloyd Dudley, 1st AS/Additional Production Mixer; Tom Harrison, 1st AS/Boom Operator; Luigi Pini, 1st AS/Boom Operator (Italy); Freya Clarke, 1st AS/Boom Operator; Hosea Ntaborwa, 1st AS/Boom Operator; Ayesha Breithaupt, 2nd AS.

Post-Strike Post Game

by James Delhauer

It very much feels as though Hollywood is at a crossroads. The #MeToo movement began a cascade of high-profile scandals that continues to this day. The COVID-19 pandemic shifted the industry from a “Cinema First” to a “Streaming First” model that has proven financially disastrous. The 2021 IATSE Basic Agreement and Area Standards Agreement negotiations unveiled a longstanding pattern of worker abuse through platforms like the IAStories Instagram account. Throughout 2022, Variety and Deadline reported on deteriorating working conditions at visual effects companies, where the oversaturation of CGI-heavy blockbusters had created a crunch culture of unpaid overtime and exhaustion. Then, on May 2, 11,500 members of the WGA put down their pens and walked off the job in protest of the poor wages, conditions, and job security protections offered by the Alliance of Motion Picture and Television Producers. On July 14, they were joined by more than 160,000 members of the Screen Actors Guild-American Federation of Television and Radio Artists. For a combined 191 days, film and television productions across the country shut down and the industry nearly went dormant. Now the strikes have come to an end and production is beginning to resume. According to the Milken Institute, the strikes cost the American economy approximately $6 billion. So as the dust settles, let’s take stock of what just happened.

Local 695 member Omar Cruz Rodriguez and his son Lorenzo on the strike line
Local 695 Trustee Jennifer Winslow on the strike line

Throughout early 2023, there was a growing sense that production was slowing in anticipation of the strikes. The Motion Picture Industry Health & Pension Plans reported a steep decline in working hours as early as February compared to 2021 and 2022. By April, the plans had seen almost a 20% reduction in working hours. Perhaps this was merely a result of the studios’ realization that they had overcommitted to the development of streaming content during the pandemic, but that’s a hard narrative to accept. Early on during the strikes, Deadline and Variety reported that studio execs were going to “bleed out” the unions, that “The endgame [was] to allow things to drag on until union members start losing their apartments and losing their houses,” and that the studios considered this a “cruel, but necessary evil.” Such public statements make it clear that the goal was to break us: not just the WGA and SAG-AFTRA, but all of labor.

This adversarial relationship that has cropped up between business and labor is to nobody’s benefit. The studios did not save themselves any money by allowing a strike to go on for 191 days. The concessions that were made in September and November could have been made in May and July, and we’d all be $6 billion better off for it. Nearly 40% of the Los Angeles economy is tied to the motion picture industry, which translated to a very real cost for almost 4 million of the 9.83 million Los Angeles County residents. That is to say nothing of entertainment workers throughout the rest of the country and world who were impacted. Now both business and labor are trying to recover from wounds inflicted upon one another as we stare down the IATSE and Teamster contract negotiations set to take place next year.

A more productive relationship between the two sides going forward is essential. In my experience, filmmakers don’t want to go on strike. Getting into this business is a dream that many have, and the select few of us who manage to get a foot in the door want to keep chasing that dream as far as we can. We want to make movies and shows that we can share with millions of people around the world. We want to explore our crafts and become better, more skilled craftspeople than we were yesterday. We’re proud people who are proud of the art we create. But before all of that, we’re human beings. We have families. We have needs. We want to be our employers’ proud collaborators.

But this isn’t a collaboration. Setting out to bleed your partners dry and watch them lose their homes is not partnership at all. Worst of all, it didn’t even work. Labor banded together. The WGA and SAG-AFTRA were not alone on the strike lines. They were joined by the IATSE, the International Brotherhood of Teamsters, the Basic Crafts, and the labor community as a whole. Workers from labor unions that have nothing to do with entertainment joined us in solidarity on the picket lines. What’s more, we worked hard to take care of one another.

During the strike, our partners at the Motion Picture & Television Fund (MPTF) and the Entertainment Community Fund moved mountains to support out-of-work casts and crews, offering a combined total of more than $15 million in grants to workers whose jobs were impacted or suspended by the strikes. Both the IATSE and the Teamsters set up more than $4 million each in relief funds for their respective memberships, creating a safety net for those who were out of work due to the strikes. Local 695 contributed another $250,000 specifically to support our members, and various other locals across the country set up similar support systems for theirs.

Numerous food and grocery drives were set up, helping hundreds of workers put food on the table for their families. At Thanksgiving, the IATSE, the Brotherhood of Teamsters, and the Basic Crafts joined with Labor Community Services to provide up to 2,500 families with Thanksgiving meals. Those on the strike lines created a sense of camaraderie with music and performances for one another. This is the strength of the labor movement: unity. It feels almost redundant to say, but the power of unions is that we are unified. Our core principle comes down to the idea that an injury to one is an injury to all. So when our brothers, sisters, and kin took to the strike lines, the whole of the labor community flocked to support them.

Local 695 member Gussie Miller on the strike line
Local 695 member Stephen Harrod and his guitar on the strike line
Local 695 Asst. Business Agent Heidi Nakamura and Local 705 Business Rep Adam West

As of November 14, more than 366 labor actions have occurred in the United States in 2023. We are seeing a swell within the labor movement as strike after strike results in new and fairer deals between labor and business. Gallup reports that 71% of Americans support unions, the highest level in almost sixty years. The momentum that we’ve built will continue into next year as the IATSE and the International Brotherhood of Teamsters go into their negotiations with the AMPTP. Labor is united. The employers should remember that.

HOT IN THE CITY (and everywhere else)

by Bryan Cahill

Climate change means more extreme weather events like the record cold and rain we experienced last winter here in sunny California. No matter the cause, just about everyone agrees that the earth is getting warmer. No matter where we live, we will all experience more frequent and more extreme heat events leading to more heat-related illnesses.

Over the past thirty-five years, heat has claimed more lives per year on average than flooding and hurricanes combined. Heat is the leading weather-related killer in the United States. Even so, many heat-related deaths go misdiagnosed or unrecognized because heat exposure often exacerbates underlying medical conditions such as diabetes and heart disease.

“As we continue to see temperatures rise and records broken, our changing climate affects millions of America’s workers who are exposed to tough and potentially dangerous heat,” said U.S. Department of Labor Secretary Marty Walsh. “We must act now to address the impacts of extreme heat and to prevent workers from suffering the agony of heat illness or death.”

In 2021, LA Mayor Garcetti named Marta Segura as the city’s first Chief Heat Officer. On August 31, 2022, Burbank and Woodland Hills reached record highs of 112 degrees. Temperatures continue to rise, as evidenced by July of this year, which is believed to be the hottest July the planet has experienced in the one hundred seventy-five years of recorded global temperatures.

On the upside, mitigating the risks is actually good for a business’s bottom line! I’ll get back to that point in a minute. First, what exactly are heat-related illnesses and what are the causes?

According to the EPA, heat-related illnesses can occur when a person is exposed to high temperatures, such that their body cannot cool itself sufficiently through sweating. Symptoms range from mild swelling, rashes, or cramps to potentially deadly heat exhaustion and heat stroke.

On Thursday, February 9, of this year, California Attorney General Rob Bonta joined a multistate coalition of attorneys general in a petition urging the U.S. Occupational Safety and Health Administration (OSHA) to take emergency regulatory action to protect workers against extreme heat. Bonta stated, “As climate change results in longer, more intense, and more frequent heatwaves, workers in California and across the country are increasingly and unnecessarily exposed to dangerous conditions on the job. We have the tools to address this challenge and we must use them.”

“Requiring cooling tents, extra workers, or other solutions should be proposed by the IA as we ramp up for negotiations on a new agreement with the AMPTP. Like the Teamsters, it is time for us to demand action on this issue at the bargaining table.”

–Teamsters General President Sean O’Brien

Heat illness isn’t just something that occurs outdoors; it can happen indoors too, or on days with multiple locations both indoors and out. Thus, California is considering new indoor regulations. So there goes California again, creating new regulations that are bad for business, right? Not this time!

In considering the new regulations, California commissioned a cost analysis from the prestigious think tank RAND Corporation. When RAND ran the numbers, they found that the heat mitigation measures proposed by California would prevent approximately two hundred injuries and one to two deaths per year on average in indoor work environments. The study estimates that the reduction in heat injuries and deaths would add $200 million a year in benefits for California businesses! Truly a win-win situation!

On union productions, we already have one tool in place: Safety Bulletin #35, “Safety Considerations for the Prevention of Heat Illness.” The summary states, “Heat illness is preventable. Know your limits and take time to adjust to the heat. Above all, drink plenty of water and immediately report any signs of heat illness in yourself or others.”

But is that really enough? One thing that irks me about this safety bulletin summary, as well as the language in many safety bulletins, is that they place the burden of staying safe squarely on the workers. How many times have you heard someone in authority on a set say, “We are providing everyone some time to adjust to the heat”? What is the AMPTP’s responsibility in protecting us from heat illness?

Employers should have a plan in place in case of a heat illness emergency. This plan should include procedures for identifying and treating heat illness, as well as procedures for protecting workers from the heat. Employers should also monitor the weather forecast and take steps to protect workers when temperatures or humidity are expected to be high. We do it for thunderstorms, and as I have already pointed out, heat is a deadlier phenomenon.

According to the CDC, “The best way to acclimate yourself to the heat is to increase the workload performed in a hot setting gradually over a period of 1–2 weeks.” Perhaps productions should be required to bring in extra workers or limit outdoor work during the acclimatization period.

Heat illness is a concern for all unions. On June 15, 2023, the Teamsters announced that UPS had agreed to provide air-conditioning in all new vans. “Air-conditioning is coming to UPS, and Teamster members in these vehicles will get the relief and protection they’ve been fighting for,” Teamsters General President Sean O’Brien said. Requiring cooling tents, extra workers, or other solutions should be proposed by the IA as we ramp up for negotiations on a new agreement with the AMPTP. Like the Teamsters, it is time for us to demand action on this issue at the bargaining table.

Even under the best protections, we will still need to look out for each other, both above and below the line. One symptom of heat illness is confusion, and a confused person can’t be trusted to make the best decisions for themselves. With Los Angeles entering what is traditionally our hottest time of the year, it is up to all of us to learn to recognize the signs of heat illness.

Video Assist: The Ace Up Your Sleeve

by Amber Michaëlle Maher

The back set of Daystrom Station in Star Trek: Picard. Video village and QTAKE system pictured background left.

Before electric turns the lights on and while the stages are still cold, you open the large elephant doors, do your morning walkthrough, strategize, and get your carts out of your truck and set up for the day.

The Video Assist Operator’s job involves setting up video villages, routing signals to other departments, recording camera and sound signals for playback and, more recently, streaming video and sound in sync to every director, producer, and crew member who needs to see what is being shot. Most of the time, you’re a one-person department on set, so there is no time to waste. You have to plan where you’re going to land and hope you’ve picked the perfect spot for your director and producers to work. Hook it all up and go! Setting all this up typically takes the first ten to fifteen minutes of the day.

Doing all of this work typically requires three carts, each weighing approximately 200–500 lbs. First, the main cart is compact. It’s equipped with a fully engineered computer system to run QTAKE, AJA routers to support four or more camera/video and sound inputs, A–D camera outputs, secondary village outputs, and all the cables, snakes, and other miscellaneous items necessary to get through the day. These setups are typically designed by the most skilled and OCD engineers in the business, built to make use of every nook and cranny of space. Often you feel like a pit crew at the Indy 500 or a Grand Prix, because you need to be able to plug in and reset at a moment’s notice. Five minutes to move all three of your carts or bang! You’re dead!

Amber Maher running two QTAKE systems, totaling eight camera inputs, on Star Trek: Picard.
Video assist carts on the set of the Star Trek: Picard USS Titan bridge, Seasons 2 and 3.

The second cart is the director’s cart, better known as a “Village,” and it is outfitted with three or four very expensive OLED monitors, an extra stand, and maybe even a robocup—if you’re fancy. This is all in order to provide the director with the tools they need to see and hear what’s happening on the set. Lastly, your third cart is for the producers so that they have a “Producers’ Village” of their own. This allows their creative visions to live in a separate world from what’s going on in the director’s world. The two should not mix. Therefore, hooking up and servicing all three of your carts is a little like being a video and sound octopus, spreading out in many directions at a time. And that’s just with three carts; sometimes larger productions require additional carts. It can be physically overwhelming at times, especially when off-site, on location, or off road somewhere.

Prior to the pandemic, video assist work was fairly standardized. The job revolved around taking in audio/video signals for the purpose of servicing the rest of the set. You’d play back footage as needed, mostly as reference material for whomever needed it. While it could be a large workload, most of the time you knew what you were signing up for. COVID-19 completely changed our jobs as we knew them. The need for social distancing and new safety measures caused an acceleration of remote technology that has had a massive impact on Video Assist Operators.

Due to distancing restrictions imposed by COVID, the remainder of the crew can’t hover around video village like the old days. Instead, everyone else who needs a video feed gets an iPad setup. People working remotely and talent who need video in their trailers are usually given Apple TVs. New people come and go all the time and they need to be streaming right along with everyone else. There are a lot more plates spinning on any given day. The rest of the day is spent alternating between recordist, customer service, playback, emotional support person, creative consultant, and Apple Genius/IT specialist. Everyone on your set is watching and listening to the QTAKE feed, ready to get the day started and ended, and create Hollywood’s movie magic in-between.

All of this requires that the Video Assist Operator be a lot more involved in the minute-to-minute decision making than in the days before COVID. I’ve found myself on a first-name basis with directors, showrunners, and executive producers a lot more than I was before the pandemic. By coordinating playback and communications among the top brass, I have had the wild privilege of being a fly on the wall and being involved in problem-solving with these giants. We look over shots, sequences, and script changes for some of the best TV series and films in the business. Often the director or VFX team needs a quick mock-up and rendering to make sure that what we’ve been shooting is working. This means that the most effective operators require multiple skill sets. Can you cut a few clips together? Can you do some on-the-spot VFX in a pinch? Can you figure out why one producer’s iPad just won’t seem to work? Great! Now can you do all of that while you’re also doing your regular recording, cataloging, playback, and streaming?

In a worst-case scenario, the cameras and crew may be on the move while you are required to stay behind and review footage with the director/producer/showrunner/etc. Once you are no longer needed, there’s a mad dash to get your three carts moved to the next location so production isn’t waiting on you! This is one example of why having a Y-7 Video Assist Utility working with you is crucial; they can help move the carts and set up while you are working on other tasks with above-the-line staff. Utilities are integral, running and patching cable and troubleshooting technical issues while the operator is otherwise occupied.

Amber Maher and her Video Assist Trainee Vadym Medvediuk on the USS Titan’s bridge, Star Trek: Picard, Seasons 2 and 3
Some of the cast & crew on Space Jam: A New Legacy basketball court on wrap day at Warner Bros. Studio Lot, Burbank, CA.
Live compositing of VFX in QTake for Star Trek: Picard.

For example, I was once crawling underneath the stage because I had lost my internet connection and had to re-run the cable right quick. It was early in the morning before anything had really started. I was solo on this show and didn’t have a Y-7 with me, but I thought I had a brief second to investigate it. So I’m under the stage when suddenly, I heard, “Where is the Video Assist? Where is Amber?” over the walkies. I popped up out of the middle of a starship stage like a hamster out of a hidey-hole, shouting, “I’m here!” I was covered in dust while the whole cast and crew were staring at me. The Executive Producer needed me to return immediately to do a comp, so I ran right up to do it, but I knew I still had to fix the cable, too.

Working our jobs means being constantly on call while on set. It’s very difficult to leave your cart, even to do another part of your job. Time management is key, managing expectations is key, and under-promising and over-delivering helps. Video assist responsibilities on set have grown beyond what is reasonable to ask of a single person, and the amount of work is only increasing as technology develops and is incorporated into our workflows. We need a bigger team. Time is of the essence and a precious commodity on any large production. There’s too much at stake, and this department is one of the worst places to cut corners. You are, in essence, cutting your directors and producers off from quality checking your film. Mistakes that are missed on set can cost a fortune to fix in post or reshoot later.

On Star Trek: Picard, there was a previous shot that we had recorded that was “obviously” not working with our new crane footage. If not fixed, talent would have to be flown back into the country and crew would have to go into a sixth day. Due to my previous VFX know-how, I was asked to perform a miracle and composite the two shots. I was able to mask, animate, render, and comp the previous shot together with the new one to work seamlessly. The whole crew held their breath while I worked. And when our showrunner gave it his nod of approval, everyone cheered. Our VFX Director patted me on the back then told me, “Do you know you just saved them about one year of your salary in what it would have taken to fix that shot?” This was only possible because I had the necessary support to do my job well on that show, but if I hadn’t been a part of the conversation because I’d been off fixing a broken cable or rebooting a router because there was no one to help me, the show would have paid through the nose for it.

And this isn’t an isolated incident. When I recently worked on Beverly Hills Cop 4, I was the Video Assist Utility and was able to make quick-reference video edits on the fly on my laptop, which were used by the Director to save time and move on to the next shot quickly. It was very helpful to all involved. On Aquaman and the Lost Kingdom, the advanced video engineering technology at our fingertips was mind-blowing. Video assist worked together with the video wall engineers and camera technicians to “scan” actors in these LED video wall stages in order to “paste” them into the films. This is done live, as we’re shooting, while we comp or overlay it all together. Then I had to use a secure double-encrypted link from QTAKE to stream it all to the above-the-line team in Japan and across the world. When I worked on Space Jam: A New Legacy, we were getting the motion capture composites and overlaying them live over the actors. This allowed the director and actors to see the 3D cartoons playing basketball together in real time while filming. Similarly, while working on Star Wars: Skeleton Crew, we used entire rooms of 180° video walls where the set is placed inside the walls, and the walls change the imagery while rolling. This means a green screen and huge sets are no longer needed. The backgrounds are baked into the picture and look seamless on camera. Video assist has revolutionized the entire creative process!

What we can do now is amazing, but it is additional work. With the new demands of technology and the current shooting culture of wanting to see a quasi-final concept of the finished product while shooting it, extra demands are being made in the video assist world. After recently being part of Local 695’s LED wall training at Stargate Studios and going through ROE Academy’s LED wall certification program, I realized that modern Video Engineers and Video Assist Operators need a skill set spanning many visual disciplines. You need to understand camera processes, video editing, VFX workflows, color timing, and video engineering to get these walls working in the camera frame. The technology available continues to get more complex and our skill sets as Video Engineers have to grow along with it. Yet getting additional staffing on any given production can be like pulling teeth!

Amber photographing on location with her company Silver Pix Studios in Block Island, Rhode Island. Photo by Melissa Arkin

On a conceptual level, all of this isn’t that different from what Video Assist Operators have always traditionally done. We’ve always been responsible for recording all the video and reference audio signals, cataloging that media for later reference, and playing it back for all parties. It’s the scope of what this work entails that has grown enormously.

Video engineering is entering a new renaissance right now. It is a very complex visual field, one in which you collaborate with directors and producers continually. Your computer is your palette, and you assist the directors to create these new worlds instantly while recording. You will need to be ingenious to succeed. All of this happens from your three little carts, on a film set wherever in the world you are, with your little wheels pushing around your long octopus cables, running like the wind to get it all plugged in, streamed, and working before the camera sets up. It’s a whole new exciting game out there now. This is the next level of what’s happening, where video engineering is headed, and we are all very much needed on set to perform these duties.

I’m certain the role of video assist will continue to change in the coming years. Evolving technology and innovative filmmaking practices will drive the need for more Video Engineers working on set. I have no doubt that we’ll see more responsibility in the future and, as someone who loves her job, I’m looking forward to all the new ways that I’ll be able to contribute to the filmmaking process. Expanding the video department and recognizing the Video Engineers working in this particular specialist field as creative collaborators is, in my opinion, necessary to set a precedent for us to follow. If you are working alongside directors, conceptualizing shots with showrunners, and doing on-the-spot edits for post production to follow the director’s instruction, you are in essence an integral part of the creative team. There are lots of new lines of creative work being done here. The video engineering and technical challenges that we experience on set are intellectually and creatively demanding. You are often the right hand of any director. We are heavily relied on and support the entire crew with our streams now. Defining these new technology roles will potentially create more Local 695 positions and add a layer of quality to any project. There are truly no downsides to expanding the video assist engineering team. We are sometimes the video wizards who create a magic miracle from our carts. We are the troubleshooters, the fixers, and the quality checkers needed while in production. We are the one-stop shop and backbone for production. Therefore, our value needs to be known. Long gone are the days of just pressing the record and playback buttons. Our technology is ever-expanding, and staffing up our departments is the most financially responsible decision that a production can make!
On any given day, we can catch errors that could cost production hundreds of thousands of dollars and ample headaches.

Otherwise, how much time and money is one willing to lose by not having a proper video assist team servicing the cast and crew and streamlining the creative process for all? Not delegating enough manpower and funding for this department to function fluidly, well … that, my friend, is a double-edged sword.

Building Solidarity: Discussing Artificial Intelligence

by James Delhauer

I still remember the first time my brother showed me The Terminator. Arnold Schwarzenegger lumbered through that film, a soulless and unstoppable machine hellbent on one thing: killing Sarah Connor so that the Skynet artificial intelligence (AI) could take over the world. This was my first exposure to the concept of artificial intelligence and because of it, I spent years waiting for our family computer to go postal in the middle of the night. Fortunately, computers don’t seem all that interested in genocide just yet. Instead, the AI applications that have come to dominate the news cycle in the last few years are focused on efficiency, productivity, and creativity. Services like ChatGPT, Google Bard, Midjourney, Stable Diffusion, and more are being used to complete tasks and churn out content faster than ever before—all of which have the potential to severely disrupt the work and livelihoods of people worldwide. That is why the IATSE has begun a series of initiatives to address the troubling concerns raised by the proliferation of machine-learning technology.
Back in May, IATSE International President Matthew D. Loeb announced the creation of the IATSE Commission on Artificial Intelligence. The goal of the commission is to bring IATSE members and representatives together with external experts to help shape the union’s approach to handling the challenges and opportunities presented by AI.

“As AI continues to evolve and proliferate, it is critical that our union is at the forefront of understanding its impact on our members and industry,” said President Loeb. “Just as when silent films became talkies and as the big screen went from black-and-white to full color, the IATSE Commission on Artificial Intelligence is part of our commitment to embracing new technologies. We will work to equip our members with the skills to navigate this technological advancement, and to ensure that the transition into this new era prioritizes the interests and well-being of our members and all entertainment workers.”

As part of the union’s efforts to embrace this new technology, the IATSE Education and Training Department has released a LinkedIn Learning Path (a compiled playlist of video courses on a particular subject). This is the first of many educational initiatives, both on the international and local levels, aimed at equipping IATSE members with comprehensive knowledge about the core aspects of contemporary artificial intelligence technologies so that workers are prepared to take advantage of the opportunities that new technologies bring. Additionally, the IATSE Training Trust Fund provides all members with a complimentary LinkedIn Learning Account, meaning that current members can take advantage of this learning path today.

However, embracing these new technologies isn’t the same as allowing them to run rampant. To that end, the union’s Political and Legislative Department has begun talks with government officials and fellow labor leaders to discuss the implications of AI for workers and the economy. Political and Legislative Director Tyler McIntosh met with representatives from the Biden/Harris administration to discuss IATSE’s concerns about how our members might be affected or displaced by machine learning. The meeting, which was attended by labor leaders, officials from the White House National Economic Council, the Office of Science and Technology Policy, and the Office of the Vice President, demonstrated both the potential benefits and harm of using artificial intelligence in the workplace. On the one hand, there is no denying that many tasks can be completed in a fraction of the time they used to require and that these tools have the potential to improve safety and efficiency in the workplace. On the other hand, there are already real-world examples of the cons as well. Employers using AI to track employee performance metrics have been shown to inaccurately report or flag performance problems where none exist, leading to increased stress and mental anguish at work. It also opens up concerns about workers’ right to privacy, civil rights, and autonomy from employers. Leaders within the union also concurred that the integration of artificial intelligence poses a threat to the rights of creators, including their ownership of voices, likenesses, and the ability to derive fair benefits from their intellectual property contributions. Furthermore, the utilization of AI tools by employers has introduced the potential for job cuts and increased work schedule uncertainty. 
In light of these concerns, participants emphasized the importance of employers and the administration ensuring that workers continue to have access to high-quality employment opportunities that prioritize their well-being and health. They further emphasized the need for workers to have a voice in determining how AI is implemented in the workplace.

These talks and concerns have resulted in the creation of the IATSE Core Principles for Applications of Artificial Intelligence and Machine Learning Technology. These eight principles represent the values our union holds when it comes to these new tools and our approach for how we will handle their continued development going forward.

1. A Comprehensive Approach

With stakes as high as the livelihoods of IATSE members in all crafts, the International is committed to addressing artificial intelligence in a comprehensive manner. Therefore, the union’s approach will encompass research, collaboration, education, political and legislative advocacy, organizing efforts, and collective bargaining.

2. Research

AI is evolving at an exponential pace. From the time the first machine-learning tools hit the market a few years ago until now, the progress these applications have made is staggering. This is because machine-learning tools beget new tools that accelerate the pace of machine learning. This rapid pace requires constant vigilance and diligence when it comes to staying informed of current and developing trends. Therefore, the IATSE is committed to studying these technologies with a focus on how they might reshape the entertainment landscape, and to working with experts to develop contract provisions, legislation, and training programs to ensure that these tools are used in an equitable way.

3. Collaboration With Partners and Stakeholders

IATSE leadership will collaborate with allied groups and organizations to build solidarity amongst labor advocates when it comes to AI. This partnership includes the AFL-CIO Technology Institute, the Human Artistry Campaign, the Copyright Alliance, and the Department of Professional Employees. Considering the decentralized nature of this technology and the practicality of engaging with multinational corporations, IATSE recognizes the importance of ongoing collaboration with external allies beyond the United States and Canada. This includes fostering partnerships with organizations such as UNI MEI and BECTU. By working collectively with these international counterparts, IATSE aims to address the challenges posed by the global reach of AI and strengthen the labor movement on a global scale.

4. Education

IATSE members perform some of the most specialized jobs in the world. Many of us have had a direct hand in developing the tools and technologies we use at work. Given that these new tools have the power to reshape many of our crafts as they currently exist, the IATSE is committed to ensuring that members have the right to receive necessary training and retraining opportunities so that their livelihoods may be protected in the face of technological advances. This will be facilitated through the union’s Education and Training Department, the IATSE Training Trust Fund, and Local-sponsored training through the Contract Services Skills Training Program.

5. Organizing

Although AI and machine learning have the potential to disrupt jobs and displace workers, they undeniably also have the potential to create new jobs, new fields of industry, and new avenues of entertainment. The applications for AI in motion picture and television are obvious, but the possibilities for virtual reality, augmented reality, and yet unimagined artforms are endless. As new fields of work emerge, the IATSE is committed to organizing workers under union contracts to ensure that technology does not replace human interests as the priority in our industry.

6. Maintain Workers’ Rights, Members’ Job Security, and Union Jurisdiction

Employees who utilize AI tools deserve the same rights and protections as those who do not. It is crucial that the introduction of new technology does not serve as a pretext for undermining the hard-won advancements in working conditions that unions have tirelessly fought for over the course of many decades. Nor should it serve as a means to bypass the role of unions altogether. The union remains steadfast in its commitment to advocating for the job security of its members in the face of artificial intelligence.

7. Political and Legislative Advocacy

The union will continue to pursue its Federal Issue Agenda, focusing on strong copyright and intellectual property laws and labor protections. This will include lobbying efforts to ensure workers making use of AI are appropriately compensated for their work, that people are prioritized over machines in the creative process, that intellectual property owners are protected from theft, and to prevent legal loopholes from being used to exploit individuals, companies, and organizations within the IATSE’s scope of influence.

8. Collective Bargaining

Is this one really a surprise? It’s basically priority one for every union in the world. IATSE is committed to negotiating with employers and fighting for provisions to address the negative aspects of AI in all future contracts. IATSE is committed to demanding transparency from employers with regard to their use of AI, even if government policy does not yet reflect this demand. Lastly, the IATSE is committed to protecting privacy rights and ensuring that AI applications are held to the highest ethical standards, especially regarding issues of discrimination and fairness.

AI scares me. Of all the generations in history, ours will likely be the first to see two industrial revolutions in a single lifetime. The rise of the internet changed the face of the world. Jobs were created. Jobs were lost. Knowledge was shared like never before. People who had never had access to education suddenly did. People who couldn’t communicate due to language or distance suddenly could. But we’ve all seen the dark side of it. Intellectual property theft exploded like never before. Algorithms that prioritize profits over the well-being and mental health of users are everywhere. Media echo chambers that drown out real conversation have led to civil unrest. For every good that the internet has achieved in the last thirty years, there is a corresponding negative to go with it. Artificial intelligence has the potential to do the same thing, but bigger. This is why I applaud the IATSE for stepping up to address these concerns in such a committed manner. The genie of AI might already be out of the bottle, but there is still time to make sure that’s a good thing.

Sports Broadcasting Audio Mixer: Antony Hurd

Few things have the power to unite people quite like organized sports. From the tense moments before the game starts to the final buzzer, the electrifying atmosphere of it captivates fans around the world. There is a palpable tension that hangs in the air as the ball sails toward the goal, only to shatter as the roaring cheer of the crowd erupts when it scores. Everything from the players to the music to poorly made hot dogs creates an energetic camaraderie felt amongst strangers, bonded by their shared passion for the game. The experience of attending a game transcends mere entertainment. However, for those who cannot attend in person, there is the art of the sports television broadcast.

I had the pleasure of sitting down with Local 695 member Antony Hurd, a Sports Broadcast Audio Mixer with a career spanning three decades.

Q: Let’s start with learning a bit about you and how your career began.

It’s all who you know. My dad was a career man at CBS, and in the early ’80s said he could get me work in Rams pre-season football. It just wouldn’t be paid work. Immediately, I was on my way to Anaheim Stadium. I started out with the intention of being a camera operator, and it was a fluke that I was assigned to the sound department. Steve Kibbons was mixing, and Bob Tully was the A2.

A few weeks after I started, Bob told me they needed an A2 at Video Tape Enterprises (VTE). He pitched me to them with “I know this kid who doesn’t know anything, but he’ll work hard.” They took a chance on me, and I was hired to work a Dodgers game with Carolyn Bowden, who was mixing. She told me to go set up the booth, but I was so green I didn’t know what she was talking about. Carolyn took me under her wing and walked me over and showed me how to set up a booth and taught me how to be an A2.

Working with them and building connections through my start with them has kept me working ever since.

Q: Are you a member of any other unions?

In addition to being a Local 695 member, I am also a longtime card-carrying member of IBEW and NABET. The network the game airs on dictates which union the work falls under. If given a choice, I choose to work under 695 as our Local provides the best benefits of the three.

Q: What has been the bulk of your work?

When I was 19, I mixed my first show. I am now 59, so it’s been a long career. About 99% of my career in that time has been in live sports. The other 1% consisted of E! red carpet events and shows.

I’ve been working Lakers games for 40 years, starting as an A2 for the first few years, and then moving to mixing. The home games were on FSN, and the away games were on KCAL. I traveled with Sue Stratton and KCAL for 17 years. When KCAL Sports was discontinued, I was no longer on road games, but still did the home games. When Spectrum acquired the entire Lakers package of home and road games, I began doing all non-network exclusive Lakers games, both home and road. I believe this is my 25th NBA Finals. I also work on about one hundred baseball games a year. I have worked on eight Olympics, some World Series, Stanley Cup, etc.

A few years back, I counted how many days I worked in the year, and it was three hundred four. Now I work less and have slowed down.

Q: Since you have also worked on the E! red carpet, how do sports and red carpet broadcasts differ?

My E! experience was not in the studios; it was strictly on the red carpet. I think my sports background helped me in that environment. For example, for the Oscars, they would do a live-from-the-red-carpet show and a countdown show, which began airing approximately four hours before the Oscars started. Once the celebrities start arriving, it’s very “fly by the seat of your pants.” You may end up switching focus abruptly if someone more interesting shows up. My experience with switching quickly in sports helped my ability to do the E! red carpet. I have also worked on the SAG Awards, Grammys, Emmys, and Golden Globes.

Q: Gotcha. So what does a typical day look like for you?

I’m a sports fan, so I really enjoy my job.

We arrive about six hours before game time for regional sports. We don’t strike the truck every game. At Dodger Stadium, everything stays plugged in and laid over for ten days. Now that they have the pitch clock, it’s much nicer because we have a better idea of how long we’ll be working. I’m told that on average across the league, the pitch clock has cut thirty minutes from the game.

The A2s will set up the microphones, run the cables, and set up the booth. When I get to the baseball game I’m doing tomorrow, I will plug in whatever cables I need to and recall my setup at my console. On the first day of the season, I build my console from scratch. One of the biggest difficulties about working in sports now is all of the commercial signage. All the processing is slow, so we have to delay the audio to match the video. For example, when we switch from camera four (pitcher/batter from centerfield) to camera two, the delays have to change. The guys at Sunday Night Baseball told me they have eight different delays depending on which camera is online. It’s a nightmare. If the system malfunctions, the bat crack sounds like a machine gun.

Q: Do those delays stay consistent, or does it vary by game?

It stays fairly consistent within each stadium, but it does vary by stadium. For example, the pitch cast at Dodger Stadium is six frames, and at Anaheim Stadium it’s seven frames.
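As a rough illustration of the arithmetic behind those numbers (the frame rate and per-camera values here are assumptions for the sketch, not figures from any particular truck): at a typical U.S. broadcast rate of 29.97 frames per second, each frame of video processing delay is about 33 ms, so the mixer’s audio delay per camera works out like this.

```python
# Sketch: converting a per-camera video processing delay (in frames)
# into the matching audio delay. Values are illustrative assumptions.

FPS = 29.97  # typical NTSC broadcast frame rate

def audio_delay_ms(video_delay_frames: float, fps: float = FPS) -> float:
    """Audio must be delayed by the same amount the video path lags."""
    return video_delay_frames / fps * 1000.0

# Hypothetical per-camera video delays, in frames
camera_delays = {
    "camera 4 (centerfield)": 6,
    "camera 2": 7,
}

for name, frames in camera_delays.items():
    print(f"{name}: delay audio by {audio_delay_ms(frames):.0f} ms")
```

So a six-frame shot needs roughly 200 ms of audio delay and a seven-frame shot roughly 234 ms; switching cameras without switching delays is why a mismatched system makes the bat crack stutter.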

Q: Are there issues or challenges that arise in the sports area that don’t pop up in other areas of Local 695 work?

We have 20,000–30,000 screaming fans that we have to consider. We often choose consistency over quality. If you hear the bat crack or the ball swish through the net or the announcers, we’ve done our jobs. We mount Sony ECM-77s on a pad under the rim of the basket. We also have directional microphones, usually Sennheiser 416s. Those point at the keys. We have seven stationary mics in basketball.

Q: Typically, how many mics or audio feeds are you managing at any given time?

It’s generally about twenty mics for each sporting event. Golf has more, at about 6-7 mics per hole. For NBA Finals, we have extra effects mics. The amount of mics can vary a lot depending on the sport and venue. Tennis is probably my favorite sport to mix because the crowd is quiet. When COVID started, they pumped in fake noise to make up for the lack of crowds, which I thought was ridiculous. It didn’t make sense to me; I thought it was great that we could hear everything. Then we were on a conference call with Major League Baseball, and they pointed out that with no crowd noise, you can now hear the catcher shift, something the batter and pitcher wouldn’t have been able to hear before. So there’s a lot to consider about the sound.

For baseball, we’re now in stereo and we have about fifteen effects mics on the field for a regional game. I have three talent mics and I manage those and the fifteen effects mics the whole game. I have a 64-input board that is always being used.

Q: How has sports broadcast work changed over the course of your career?

The analog-to-digital shift was huge. Also, audio has become more important over the years. When I started forty years ago, we only had one Bat Crack mic and one crowd mic in baseball, and now we have twenty mics on the field.

If you watch USFL or XFL, you will see everyone is wearing a microphone now. I haven’t worked on those, but it seems like a lot is happening at once. Sound being more important has definitely improved the end product.

Sports remote work isn’t always as glamorous as it seems

Q: How was the transition from analog to digital for you?

I’m not great with technology, so for me personally, it was a difficult transition. I was lucky because Calrec, which is the main company that provides the consoles for sports, made its first digital desk look like its analog desk. When Yamaha went from analog to digital, it was totally different, and that was a difficult learning curve.

Q: What teams are you most proud to have worked with?

The Lakers, by far. Traveling with the team on the team charter one hundred thousand miles a year is the only way to travel. I’m in a hotel about 100-120 days a year for my road games across the various sports. Traveling with the Lakers lets me stay in some pretty nice hotels.

Q: That’s fun. Any interesting travel stories?

Years and years ago, I was on a plane when we hit a sudden air pocket. I was asleep and woke up held down by my seat belt, but with my arms and legs in the air. Chick Hearn announced that we’d dropped ten thousand feet.

When I worked on the Olympics in France, I had several days off that I was paid for, and I was just able to spend the time skiing.

Q: Have you formed friendships with any of the players?

I don’t know if I would say friendships, but I do have a picture of myself and Kobe Bryant playing Ping-Pong on Thanksgiving in Detroit. Any time we’re on the road for a holiday, the team usually does something. This time, they weren’t going to do anything, so Kobe said he would. He rented out the banquet hall at the Townsend Hotel in Birmingham, Michigan. He got a bunch of TV’s and the Ping-Pong table, and catered the whole thing. Kobe was very approachable. I also have a picture of him with my nephew’s Flat Stanley.

Q: How do the Olympics differ from other sporting events?

Hauling cable up a snowy mountain is hard work! Overall, everything is just a much larger scale. It’s about two weeks of setup, and two weeks of the Olympics, so it’s approximately a month-long gig. When I worked for CBS mixing the freestyle skiing, the downhill kept being delayed due to the snow, so our team had to scramble to adjust our schedule and workflow so the network had enough content to make the day.

Q: Any career advice for someone who wants to enter the sports world?

Be computer-savvy. Everything is moving toward Dante at this point, so that’s an important skill to have. It’s almost more important than being able to mix the game. If you want to be a sports mixer, picture yourself as a fan and imagine what you want to hear watching the game. Though my best advice is: run for your life. Only a crazy person does this for a living.

I’d like to thank Antony for his time and for sharing the benefit of his experience with us for this publication. His work and the work of so many members like him help to connect sports fans across the globe with their games, their teams, and fellow fans. Learning a little about what goes on behind the scenes will definitely change how I look at a game the next time I have one on, and I hope the same is true for anyone reading.

2023 PRIMETIME EMMY NOMINATIONS

Nominations for Outstanding Sound Mixing 75th Primetime Creative Arts EMMY Awards

Outstanding Sound Mixing for a Comedy or Drama Series (One Hour)

Better Call Saul
“Saul Gone”


Larry Benjamin CAS, Rerecording Mixer
Kevin Valentine, Rerecording Mixer
Philip W. Palmer CAS, Production Mixer
Production Sound Team: Mitchell Gebhard, Boom Operator;
Andrew Chavez, Utility Sound Technician

The Last of Us
“When You’re Lost in the Darkness”


Marc Fishman CAS, Rerecording Mixer
Kevin Roache CAS, Rerecording Mixer
Michael Playfair CAS, Production Mixer

The Marvelous Mrs. Maisel
“The Testi-Roastial”


Ron Bochar, Rerecording Mixer
Mathew Price CAS, Production Mixer
Stewart Lerman, Scoring Mixer
George A. Lara, Foley Mixer
Production Sound Team: Carmine Picarello,
Spyros Poulos, Egor Panchenko

Stranger Things
“Chapter Nine: The Piggyback”


Craig Henighan CAS, Rerecording Mixer
William Files CAS, Rerecording Mixer
Mark Paterson, Rerecording Mixer
Michael P. Clark CAS, Production Mixer
Production Sound Team: Brenton Stumpf, Boom Operator; Stokes Turner, UST;
Andrejs Prokopenko, Production Sound Mixer (New Mexico Unit); Vytautas Kizala,
Production Sound Mixer (Lithuania Unit)

Succession
“Connor’s Wedding”


Andy Kris, Rerecording Mixer
Nicholas Renbeck, Rerecording Mixer
Ken Ishii, Production Mixer
Tommy Vicari, Scoring Mixer
Production Sound Team: Peter Deutscher, Michael McFadden, Luigi Pini

The White Lotus
“Arrivederci”

Theo James as Cameron Sullivan, Meghann Fahy as Daphne Sullivan, Will Sharpe as Ethan Spiller, and Aubrey Plaza as Harper Spiller hang out on the beach in Taormina in Season 2, Episode 1 of HBO’s “The White Lotus.” Photo: Fabio Lovino/HBO


Christian Minkler, Rerecording Mixer
Ryan Collins, Rerecording Mixer
Vincenzo Urselli, Production Mixer
Production Sound Team: David D’Onofrio, Boom Operator; Curzio Aloisi, Sound Utility


Outstanding Sound Mixing for a Limited or Anthology Series or Movie

Beef
“The Great Fabricator”

Penny Harold, Rerecording Mixer
Andrew Garrett Lange, Rerecording Mixer
Sean O’Malley, Production Mixer
Production Sound Team:
Chris Thueson, Boom Operator;
Kendra Bates, Sound Utility;
Jeffrey Zimmerman, Music Playback; Byron Echeverria, Video Assist
Additional Mixers: Jeremy Brill,
Mark Stockwell

Dahmer – Monster: The Jeffrey Dahmer Story
“Lionel”

Laura Wiest, Rerecording Mixer
Jamie Hardt, Rerecording Mixer
Joe Barnett, Rerecording Mixer
Amanda Beggs, Production Mixer
Production Sound Team: Zach Wrobel, Boom Operator; Saif Parkar, Utility Sound Technician; Britney Darrett, Leslie Metts, Brandyn Johnson, Sound Trainees

Daisy Jones & The Six
“Track 10: Rock ’n’ Roll Suicide”

Lindsay Alvarez CAS, Rerecording Mixer
Mathew Waters CAS, Rerecording Mixer
Chris Welcker, Production Mixer
Mike Poole, Music Mixer
Production Sound Team: Ryan Farris,
Boom Operator/Music Playback Operator; Emily Poulliard, Utility Sound Technician
Additional Crew: Donovan Thibodeaux &
Jared Lawrie, Boom Operators;
Oliver Bonie, Utility Sound

Obi-Wan Kenobi
“Part VI”

Danielle Dupre, Rerecording Mixer
Scott Lewis, Rerecording Mixer
Bonnie Wild, Rerecording Mixer
Julian Howarth CAS, Production Mixer
Production Sound Team:
Ben Greaves, Boom Operator/2nd Unit Sound Mixer; Eric Altstadt, Boom Operator; Yohannes Skoda, Sound Utility;
Chris Burr & Yisel Pupo Calles, Sound Trainees; Scott Solan, Boom Operator;
Cole Chamberlain, Boom Operator

Weird: The Al Yankovic Story
“The Roku Channel”


Tony Solis, Rerecording Mixer
Richard Bullock, Production Mixer
Brian Magrum, ADR Mixer
Phil McGowan, Score Mixer
Production Sound Team: Tanya Peel, Boom Operator; Kelly Lewis, Sound Utility


Outstanding Sound Mixing for a Comedy or Drama Series (Half-Hour) and Animation

Barry
“wow”


Elmo Ponsdomenech CAS,
Rerecording Mixer
Teddy Salas, Rerecording Mixer
Scott Harber CAS, Production Mixer
Production Sound Team: Erik Altstadt, Charles Stroh, Evan Scheckwitz

The Bear
“Review”

Steve “Major” Giammaria,
Rerecording Mixer
Scott D. Smith CAS, Production Mixer
Production Sound Team: Joe Campbell, Boom Operator; Nicky Ray Harris, Boom Operator; Nicholas Price, Sound Utility

The Mandalorian
“Chapter 24: The Return”

Scott R. Lewis, Rerecording Mixer
Tony Villaflor, Rerecording Mixer
Shawn Holden CAS, Production Mixer
Chris Fogel, Scoring Mixer
Production Sound Team:
Patrick H. Martens, Boom Operator;
Yvette Marxer, Sound Utility;
Moe Chamberlain, Tandem Unit Production Mixer; Kraig Kishi, Boom Operator; Cole Chamberlain, Sound Utility; David Hernandez, Sound Trainee

Only Murders in the Building
“The Tell”

Penny Harold, Rerecording Mixer
Andrew Lange, Rerecording Mixer
Joseph White Jr. CAS, Production Mixer
Alan DeMoss, Scoring Mixer
Production Sound Team:
Jason Benjamin, Timothy R. Boyce Jr.


Outstanding Sound Mixing for a Variety Series or Special

Bono & the Edge: A Sort of Homecoming With Dave Letterman

Phil DeTolve, Rerecording Mixer
Brian Riordan, Rerecording Mixer
Alastair McMillan, Music Mixer

Elton John Live: Farewell From Dodger Stadium

Michael Abbott, Broadcast Production Mixer
Eric Schilling, Music Mixer
Matt Herr, FOH Mixer
Alan Richardson, Monitor Mixer

The 65th Annual Grammy Awards

Thomas Holmes, Production Mixer
John Harris, Music Mixer
Eric Schilling, Music Mixer
Jeffery Peterson, FOH Production Mixer
Ron Reaves, FOH Music Mixer
Mike Parker, FOH Music Mixer
Andres Arango, Monitor Mixer
Eric Johnston, Supplemental Mixer
Christian Schrader, Supplemental Mixer
Kristian Pedregon, Rerecording Mixer
Juan Pablo Velasco, Playback Mixer
Aaron Wall, Playback Mixer

Saturday Night Live
“Co-Hosts: Steve Martin & Martin Short”

Robert Palladino, Production Mixer
Ezra Matychak, Production Mixer
Frank Duca Jr., FOH Production Mixer
Caroline Sanchez, FOH Music Mixer
Josiah Gluck, Broadcast Music Mixer
Jay Vicari, Broadcast Music Mixer
Tyler McDiarmid, Playback Mixer
Christopher Costello, Monitor Mixer
Teng Chen, Supplemental Mixer
William Taylor, Supplemental Mixer
Geoff Countryman, Supplemental Mixer
Devin Emke, Post Audio Mixer


Outstanding Sound Mixing for a Nonfiction Program (Single or Multi-Camera)

Moonage Daydream

Paul Massey, Rerecording Mixer
David Giammarco, Rerecording Mixer

100 Foot Wave
“Chapter V: Lost at Sea”

Keith Hodne, Rerecording Mixer

The Sound of 007

Richard Davey, Rerecording Mixer
Jonny Horne, Production Mixer
Simon Norman, Production Mixer
Francesco Corazzi, Production Mixer

Stanley Tucci: Searching for Italy
“Calabria”

Matt Skilton, Rerecording Mixer
Christopher Syner, Production Mixer

Still: A Michael J. Fox Movie

Skip Lievsay, Rerecording Mixer
Benjamin Berger, Production Mixer
Martin Kittappa, Production Mixer
Lily van Leeuwen, Production Mixer


Outstanding Sound Mixing for a Reality Program (Single or Multi-Camera)

The Amazing Race
“The Only Leg That Matters”

Jim Ursulak, Lead Production Mixer
Troy Smith, Rerecording Mixer

Deadliest Catch
“Call of a New Generation”

Jared Robbins, Rerecording Mixer

RuPaul’s Drag Race
“Wigloose: The Rusical!”

Erik Valenzuela, Rerecording Mixer
Sal Ojeda, Rerecording Mixer
David Nolte, Production Mixer
Gabe Lopez, Music Mixer

The Voice
“Live Top 10”

Michael Abbott, Production Mixer
Randy Faustino, Broadcast Music Mixer
Tim Hatayama, Rerecording Mixer

Welcome to Wrexham
“Do or Die”

Mark Jensen CAS, Rerecording Mixer


Names in bold are Local 695 members


Disclaimer: The Academy of Television Arts & Sciences (ATAS) does not award Emmy statuettes or nomination certificates to those listed under “Production Sound Team”

Ric Rambles

by Ric Teller

Pre-ramble: When I asked Pete Korpela, one of two percussionists playing in the orchestra at the Academy Awards this year, how he was doing, he answered, “Living someone else’s dream.” It was a great reminder not to take any of my experiences for granted. Sure, the dream of being on the crew of a major motion picture, a big live-television show, or just being able to make a living doing what we do must seem to many like a reverie, a fanciful and impractical idea. I’m sure at one time, it seemed that way to me.

Pink tape in the patch room at the Oscars

For those of you who follow these rambles, first of all, I thank you and appreciate every bit of positive feedback. And by feedback, I mean nice comments, not the ringing, squealing, or screeching sound that makes mixers wish they could defy the laws of physics. On that subject, is anyone else bothered by the fact that nearly every time a person in a movie or on a scripted television program steps up to a microphone, it feeds back? The truth is that it rarely happens in real life, even on live shows. Do you suppose some long-forgotten director stepped up to the mic in a very important pat-yourself-on-the-back moment and his comments were masked by the accidental acoustic meeting of an input and an output? From then on, as payback, he was determined to make us cringe each and every time a mic appears on camera by adding that undesirable sound.

My favorite feedback, in case you wanted to know, can be heard at the beginning of “I Feel Fine” by the Beatles. John Lennon’s guitar, “Nnnnnnwahhhhh!” according to Paul, Geoff Emerick, and my friend, Robyn.

Oscar’s audio patch and Kit Donovan’s wall of fiber

And now, the pink tape story. If you have worked on award shows or specials in the last twenty years, chances are you might have run into something labeled in pink gaff tape. It is my labeling tape of choice. Others use gray, green, purple, and even white. I prefer pink. You may ask, how did this come about? I’ll tell you. In 2003, I was invited to be a band A2 for the 45th Grammy Awards at Madison Square Garden in New York. At the time, Local One wasn’t very interested in letting visitors work freely on stage. I was told that I could not be the band patch master. Hmmm. That was my job. They assigned a Broadway mixer to work with me. Although he was an experienced sound engineer, patching a three-hour live show with multiple bands was not in his comfort zone. I knew better than to argue the decision and proceeded to label everything in bright pink gaff tape. Together, my new friend and I patched the heck out of a very difficult show with many bands and the New York Philharmonic Orchestra. Pink tape became my go-to and is still used today.

The story among friends is that when I run out of pink tape, I get to retire.

A2’s Craig Rovello, Kim Petty, Bruce Arledge, Jr., Steve Anderson, and Robyn Gerry-Rose (Damon and Eddie were working).

Walking out of a very busy, tiring 2022 Grammys at the MGM Grand in Las Vegas, I complained to Craig that I was worn out and my bag felt very heavy. Unbeknownst to me, several coworkers had gifted me with new rolls of two-inch pink tape. Enough to last many more shows.

Not long ago, I was talking to some people not familiar with our business. I mentioned that I am on the crew of Jeopardy! and one of them asked when we film. 1987 was my smarty-pants answer, a product of early onset Weisenheimer’s. That year, I received a phone call from mixer Russ Gary, asking if I knew anything about the film world because we were going to be the sound crew on a new sitcom shot on 35mm called Take Five. Haven’t heard of it? I’m not surprised; we completed six episodes, but only two aired. The star, George Segal, played banjo in a band. He could really play, but the other cast members in the band were not musicians. George wanted the music to be live, so he played on camera while the other cast members finger-synched to a live band playing just off-camera.

We recorded audio on two Otari reel-to-reel machines that were about the size of our Maytag washer and dryer. Although it has been a long time, Bruce Arledge, Jr., Rick Luckey, and I had the same recollection about channel assignments. The ½” 4-track captured dialog, music/sound FX, audience response, and 60 Hz tone. We had no timecode. The ¼” 2-track only recorded dialog and music/sound FX. The show was mono, and we had no iso tracks, no prefade, nothing else.

Comm was primitive. AD’s used walkies, boom operators had basic two-channel RTS (audio PL and program), and the camera operators and dolly grips communicated with the camera coordinator using a half-duplex system of Maxon Radios that may or may not have been the prize from boxes of Cracker Jack. The other thing we didn’t have was video assist. It existed, but not on our show. The director just asked the camera operators if they got the shots. Imagine, no video village. Four or five years later, on another sitcom called Family Matters, we finally had video assist for the director (and reluctantly for the producers). The three cameras (yes, three) were even switchable for the audience.
A dozen years after Take Five, the last film sitcom I did had digital multitrack audio recorders, timecode slates, full duplex RF PL’s, and wireless, switchable, color video assist. The technology helped, but it wasn’t long before film sitcoms were a thing of the past.

One more Oscar note: David Byrne loved the Shure RF transmitters with googly eyes
With Robyn on wrap day, 95th Academy Awards

After a couple of Oscars that are memorable for the train station location and the slap, we did the 95th at The Dolby Theatre in early March. I believe my first was number sixty-four at The Dorothy Chandler Pavilion and I missed one year along the way, which equals … a bunch. Sometimes math is elusive. One thing is for sure, the A2’s got in their steps and flights of stairs on this one. The orchestra, made up of so many great musicians, sounded terrific. Cip, you were truly missed. In one of my favorite moments, I had a chat with David Byrne while he put on his hot dog fingers for the dress rehearsal performance of “This Is a Life.” Those were just the rehearsal hot dog fingers; he had better ones for the show. We do not have a particularly large A2 crew for the size and scope of the Oscars. Before we started, Steve Anderson, lead A2, put together a plan for cable runs, patching, microphone assignments, and workflow. When we got on-site, we installed cables, patched, and tested the connectivity. In 1979, when I walked into the maintenance shop at KTLA, there were signs posted around the work area. One said, “NOT DONE ’TIL TESTED.” A simple and very important reminder.

By midweek, we gravitated to more specific Oscar duties, all the while helping each other with projects as needed. Steve worked with production to mark a rundown with microphone assignments and kept us on track with tasks. Bruce built the RF Schoeps mic tubes and managed the sets and strikes as needed. Kim took care of the host mic needs using two Q5X transmitters with Shure TwinPlex lavs for Jimmy Kimmel. Craig, with some guidance from Denali Audio Engineer Hugh Healy, patched and set up the many complicated production needs in the Orange Court parking lot, and then put lavs on presenters. Eddie and Damon set up the eighty-two-input Oscar Orchestra, later joined by Dan Vicari. Then Damon took care of the guest bands, which came up hot on the stage elevator from the pit, while Eddie managed the performance RF mics. As you might imagine for a live show like this one, we practice as many elements as possible, including all music performances and award presentations. It is one of the few events that encourage all the presenters to come and rehearse, sometimes giving pop-up microphone operator Tom Streible an opportunity to note the height for each item, which is subsequently adjusted on the fly for the winners.

If you have watched or attended the Academy Awards, you probably realize that gowns are a very important part of the proceedings. Women do not wear their show attire to rehearsal, and they don’t come to the two full run-throughs, one on Saturday night and one Sunday morning; a very talented group of stand-ins performs those duties. The first time Robyn sees the dresses is when she mics them in the live show. Many of you have put lavs on talent. Think about doing it on very expensive gowns that you have never seen, often hiding the mics, just before presenters walk out in front of their peers and are broadcast to millions of people. No pressure, right? Great job, everyone!
I am honored to work with all these terrific A2’s and the rest of the very talented audio crew.

Rebecca Kobik visiting America’s favorite quiz show

I suppose, in some ways, I too have been living someone else’s dream. The truth is that I have been places and done things that a kid from a small town in Nebraska could not have imagined. Recently, I spent some time with dreamer, podcaster, and future Y-something, Rebecca Kobik. We talked about skill sets, work ethic, setting goals, asking questions, continuing education, and even dumb luck. All things that have contributed to my career. I hope some combination of those topics will assist Rebecca in living her dreams.

The Disruption of Technology at NAB

by James Delhauer

The first National Association of Broadcasters (NAB) trade show opened its doors one hundred years ago in New York City. In the century that followed, NAB grew to become one of the most important annual events for those working in the worlds of film and television. It’s where all the latest gizmos and gadgets from the most serious vendors are debuted. Reps from the various companies are there to interface with potential customers, marketing their wares and soliciting feedback from end users. It’s like Disneyland for production nerds and, after last year’s small return to the Las Vegas Convention Center following the COVID-19 pandemic, this year’s centennial celebration was a return to form for this grandiose event.

The technology in our industry is changing at an accelerated rate. While digital workflows were only making their way into the mainstream fifteen years ago, this year’s NAB show was dominated by LED video wall systems, cloud-based solutions, and artificial intelligence systems. Digital technologies have never been more integrated into the art of filmmaking than they are today. Heck, the term filmmaking is, in and of itself, a misnomer at this point, as I saw exactly one exhibit pertaining to actual celluloid film during the run of the convention.

In the last five years, we’ve seen the sudden rise of “virtual production,” largely brought about by the development of interlocking LED panels to create larger-than-life “Video Walls.” This technology goes by many names. “The Volume,” “Infinity Stage,” and “Virtual Stages” are just a few that spring to mind. The terminology is not standardized, but the underlying technology is the same: arrays of LED screens that play back content.

This technology is versatile in its use. Narrative-driven productions can create immersive digital sets on stage, eliminating the need for time-consuming post-production visual effects work. Talent can visualize the environment that they’re in, making for more naturalistic and believable performances. This can be useful for things as mundane as simulating a car ride or as extravagant as setting foot on an alien planet. In broadcast environments like game shows, concerts, and award shows, LED panel systems can create infinitely unique experiences. Photorealistic or abstract, it doesn’t matter. Fireworks can be in the room when someone wins an award or the hosts can float through space. Much of the visual spectacle once exclusive to hundred-million-dollar blockbusters can now be achieved and broadcast in real time thanks to the proliferation of this technology. And if this year’s NAB show was any indication, these systems are here to stay.

Unfortunately, many productions have taken to referring to virtual production as an “on-set visual effect” in order to circumvent the need to hire IATSE members. Visual Effects Artists do not currently work under a union contract, meaning employers can bypass the need to contribute to benefit funds that provide artists with a pension or healthcare. This situation leaves these artists vulnerable to exploitation and poor working conditions, with many news stories recently discussing long working hours and inhumane conditions. Despite the value they bring to production, Visual Effects Artists and companies are not compensated appropriately, with even Academy Award-winning VFX houses being forced to shut their doors or file for bankruptcy due to an inability to make ends meet (such was the case for Life of Pi VFX house, Rhythm & Hues, which was forced to file for Chapter 11 Bankruptcy just two weeks before accepting the Academy Award for Best Visual Effects in 2013). By choosing to assign this additional labor of “virtual production” to VFX houses instead of hiring IATSE workers under their contract rates and benefits, studios are perpetuating an unethical and unsustainable system—all while violating the contracts that the employers sign with the IATSE.

NAB President and CEO Curtis LeGeyt speaking on the subject of Artificial Intelligence

At its core, virtual production technology is an evolution of the work that Local 695 engineers have been doing for decades. It is derived from the Rear Screen Projector/Camera Interlock Process shot, a system originally conceived and developed by IATSE member Henry V. Miller in 1930 (a story we covered in the Fall 2022 edition of this publication). Though LED panels, networking systems, and video throughput workflows have been improved upon with almost a century’s worth of innovation, the primary function of the job has not changed. Video Walls, Volumes, Virtual Production Stages, or whatever you want to call them are a means of playing back an image for the purpose of being photographed by the Camera Department. That work is the jurisdiction of Local 695.

Virtual production is not the only disruptive technology that is gaining traction in the entertainment industry. At this year’s NAB convention, a wide range of new artificial intelligence (AI) and machine-learning applications were unveiled, such as Adobe Firefly (an art generator), Move AI’s Invisible (a real-time marker-less motion-capture tool), and Whisper by OpenAI (an automatic speech recognition tool). While these applications are still in their infancy, there is no denying their potential for development and continued growth. Artificial intelligence and machine learning are poised to revolutionize the industry in the same way as past technological innovations like the internet and smartphones. With AI and machine learning, the industry is expected to become more efficient, streamlined, and cost-effective. However, these new technologies also raise concerns about the displacement of human workers, job security, and ethical considerations.

In fact, during this year’s trade show, NAB CEO Curtis LeGeyt took to the stage to give his thoughts on this emerging market, stating that “This is an area where NAB will absolutely be active… It is just amazing how quickly the relevance of AI to our entire economy, but specifically, since we’re in this room, the broadcast industry has gone from [an] amorphous concept to real.” His presentation echoed many of the concerns that members throughout our industry have expressed since tools like ChatGPT and Stable Diffusion have come into play. “We have been fighting for legislation to put some guardrails on it,” he said. “We need to ensure that our stations, our content creators are going to be fairly compensated.”

This warning already rings true for many, as we’ve seen companies like Microsoft, Disney, Google, and Amazon announce mass layoffs even as they invest billions into research and development for artificial intelligence tools. That is not to say artificial intelligence doesn’t also create opportunity. As the entertainment industry continues to embrace new technologies, the most desirable workers will be those who make themselves familiar with the latest tools and software on the market. With the rise of machine learning, the rate at which these tools come to market is expected to accelerate rapidly, so staying current will require due diligence and a willingness to constantly learn and adapt. Those who are able to do so will be better equipped to compete in a rapidly changing job market, while those who are unable or unwilling to keep up may find themselves left behind.

With the rapid pace of development in machine learning, it is likely that these technologies will dominate the show floor by next year. In fact, in the coming years, I suspect it will become increasingly difficult to find an audio or video system that does not incorporate some form of machine learning or AI. As these tools reshape our industry, those who are able to harness their power effectively will be at the forefront of this revolution.

And the last area of disruption we’ll discuss today is the advent of Camera-to-Cloud recording. This is a workflow that allows filmmakers to record digital files directly to online servers and into a post-production environment in real time or near-real time, enabling near-instant feedback from production stakeholders and accelerating the post-production process. Though such workflows have been theoretically possible for the last few years, the acceleration of remote work during the COVID-19 pandemic drove advancements in internet connectivity. Consumer, prosumer, and business pipelines are now reaching the point where uploading material directly from the camera or a recording device is practical. However, this creates some jurisdictional issues on the set. Off-camera recording, whether through a media server, record deck, or hardline ingest system, is the jurisdiction of Local 695 recordists. Camera-to-Cloud is, by definition, an off-camera recording and transmission system, both of which are codified in the Local 695 Collective Bargaining Agreement. This does not mean that productions cannot take advantage of these innovations, but it must be done with a Local 695 Video Engineer on set to oversee such operations. Even when transmission systems are built directly into the camera (as is the case with Red’s V-Raptor system) or portable camera monitors (as is the case with Atomos’s Ninja V+ video monitor), this is Local 695 work. Before a production can assign this work, a conversation must be had between the involved Local unions, the on-set department heads, and the producers to sort out who will be responsible for what work in a manner that is consistent with the contracts we’ve all negotiated with one another.

NAB 2023 was a wonderful return to form, showcasing countless new devices and tools that continue to push the limits of what is possible in Hollywood. From virtual production to AI and machine learning, the entertainment industry is rapidly evolving and embracing new technologies that promise to revolutionize the way we create and consume content. However, as we embrace these innovations, it is important to be cautious and ensure that they are being used in a responsible and ethical manner. We must also be mindful of the potential consequences, such as job displacement and the erosion of privacy, that come with these new technologies. Ultimately, while the future of the entertainment industry is exciting and full of potential, we must approach it with care and responsibility to ensure that we are building a sustainable and equitable industry that benefits everyone involved.

The Way of the Day Player

by Brandon Loulias

I’ve spent most of my adult life working on film sets, from the Wild West of nonunion indie movies to long-form narrative films and TV. It was a great way to collect experience. I also observed a common reality for most of us on regular shows: long hours, unpredictable wrap times, exhaustion, etc. The pandemic gave us an opportunity to reframe our lives and what we feel is important. I learned there is more to life than living on a film set. This was a major shift for me, as I had never put a priority on my personal life. I took that time to reconfigure. The variety of work I’d get called for expanded vastly, from primarily narrative work to about ten different styles of sound work on large-scale productions with varying complexities. My post-sound career rekindled as well; post has always been a part of my life, and I’ve kept a mixing room wherever I’ve been for the past twenty years. These days, many of us don’t get the chance to choose the type of work we do. We either take the job, or someone else will, and who knows when the next one will come. Regardless of the job requirements, I show up and solve problems, like any of us. I enjoy working in all the different disciplines under and around our little umbrella, exhausting though it sometimes is.

Most of my career has been by the seat of my pants, with gear manuals and internet access keeping me honest and employed since I was a kid. I’ll be the first to admit that I don’t know everything and never will. That’s the joy of this line of work for me. I’m a lifelong student of the crafts and the people around me as long as there’s something new to learn. Our employment depends on relationships, and good gear allows you to be with people instead of all the toys. The point of our equipment is to get it out of the way and be present for the process.

Once upon a time, a sound person could choose a particular discipline and stay within that classification for an entire career if they chose to do so. Things have changed a bit, and now many of our members will take what they can get in any classification. When I got into the union, my intention was to keep mixing. Alas, I was 6’7” and in my twenties, so they gave me a boom pole instead. Booming taught me to navigate and collaborate with other departments, which is vital to achieving a great soundtrack and a positive work environment. Ironically, I had to give up booming for health reasons, which has led me to diversify my skill sets to stay busy.

One thing I love about day playing is the ability to bounce between different workflows and avoid complacency with a particular job. The downside is having to pivot on a dime, sometimes wearing a different hat of varying complexity for each day of the week. There’s also the turnaround issue, like a split-call mixing job that goes late one day followed by an early call on a playback job with a completely different gear package the next morning. Most of the time, we just have to tough it out and be exhausted, as we may not have another job for a long time.

A double-edged sword of working in many disciplines is the need for multiple rigs that do multiple things. I believe in being prepared but will only purchase things out of necessity. Otherwise, we’ll just get into a loop of buying gear to make money to buy gear, which is a dangerous game I’ve been playing for most of my life.

I look at every job like a math equation: the problem is what the job requires, and the solution is something we provide through technology and skillful planning. Here are a few examples of problems I’ve encountered in the past few years and the solutions I devised. A huge boon for audio solutions these days is how far the technology has evolved; I am ever so grateful for the many audio-over-IP options I can rely on in mission-critical applications. And then, of course, there are the most important parts: batteries and wheels.

Music Recording and Pro Tools Playback on Movies and TV

There’s been an increasing demand for recording live music on movies and TV, and I happen to get a lot of those calls lately. Here’s how I handle them:

I ask for a tech rider, which gives me all the info: track lists, instruments, IEM requirements, mic choices, etc. I contact props and set dec, who are most likely already in touch with the stage and equipment companies, to determine who’s supplying the backline, stage tech, and so on. A lot of the time, those companies will also provide an additional workforce such as FOH mixers, A2s, and system techs. I always advocate getting additional A2 labor through Local 695 if we’re short on hands. I also like to have Pro Tools playback and band recording handled by two separate people when possible and the situation calls for it, although sometimes it has to be one. I have modular rigs that can accommodate both workflows via AES50 and Dante interchangeably. The flexibility of those interfaces gives me lots of options in various situations, and the Midas ecosystem is excellent for accommodating demanding and ever-changing workflows. I still use the good old Sound Devices 970 for large track counts, the Midas M32R & M32C mixers, and DL16 & DL32 stage boxes. Moving toward the shoot, I always ask for a rehearsal/pre-record day if we can get one. It’s really nice when we can focus on recording with the artist on a click track, dial in IEMs, and get a few clean takes without the hassle of “getting this next shot before lunch.”

I worked on Yellowstone S5 for a minute, where there was a frequent need for live-music recordings throughout the season. I put the bat signal out to production about needing to connect with the backline company, and they hooked me up with Alex Bruce from Montana Pro Audio, who was an absolute champion about it. He and his team built the stage and got the instruments, and we combined our mic collections to facilitate the needs of recording for all the bands.

I had to revise my system after the first round, as we were spread out on a ranch with a thumper for four hundred cowboys, and the band was in a tent with about five cameras floating around the premises. I returned with a smaller rig to do it all quickly and lightly. The coolest part is the mixer being able to jump between Midas and Pro Tools control at the push of a button, which proved extremely handy when recording thirty-two isos to play back on a dime against a thump with immediate turnover. I had a previous commitment, so my friend Nick Ronzio stepped in and finished the season on my rig. All went smoothly. A perfect reminder that simplicity, even with great technology, is sometimes the best choice.

Live Broadcast Mixing for Justin Bieber’s Virtual Concert with Unreal Engine for WAVE

I was hired to create a sound ecosystem for live broadcast motion-capture virtual concerts using Unreal Engine and various DAWs. This is how we did it:

The problem we faced was a live concert in real-time with Justin Bieber in a motion- and face-capture suit driving an avatar in Unreal Engine. This was broadcast to millions of viewers. The solution was a lot of rehearsals, MIDI cue points, math equations, and headaches. I built a system that can drive Unreal Engine from Pro Tools through MIDI, LTC timecode, and GPIO. This information was generated by Pro Tools with stems for monitoring, as well as crowd interaction and FX triggering via Ableton Live.

It all interfaced with Unreal and the BTS camera for live picture-in-picture, and everything was in tri-level sync feeding a TriCaster.
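
The heart of the idea is a timeline driving cues. Here is a toy sketch of that concept only; the cue names, the 30 fps rate, and the plain-Python dispatch are invented, while the real rig fired MIDI and GPIO events from Pro Tools into Unreal Engine:

```python
# Toy sketch of firing cues off a running LTC position. The cue names,
# the 30 fps rate, and the dispatch mechanism are hypothetical; the
# actual system sent MIDI/GPIO events from Pro Tools into Unreal Engine.
FPS = 30

def tc_to_frames(tc, fps=FPS):
    """Convert 'HH:MM:SS:FF' timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Cue list keyed by timecode (invented cue points for illustration).
cues = sorted(
    [(tc_to_frames("01:00:00:00"), "scene_start"),
     (tc_to_frames("01:00:12:15"), "pyro_fx"),
     (tc_to_frames("01:02:30:00"), "crowd_trigger")]
)

def cues_due(current_tc, last_tc):
    """Return cues whose frame falls in the window (last_tc, current_tc]."""
    lo, hi = tc_to_frames(last_tc), tc_to_frames(current_tc)
    return [name for frame, name in cues if lo < frame <= hi]
```

Checking a window rather than an exact frame means a cue still fires even if the timecode reader skips a frame or two.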

Some of the issues we faced in this process were calibrating the synchronization between various peripherals, even down to house sync and black burst. The face motion capture was about 23ms offset from body capture, which had to be ironed out by employing certain system delays. This also meant we needed to delay the audio by 384ms while still having the music in real-time for Bieber to perform. It ended up requiring a lot of different bussing and variable delays, including audio reactivity, which altered the lighting and graphics in the Unreal universe according to music intensity.
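
The delay bookkeeping described above boils down to padding every signal path up to the slowest one. Here is a minimal sketch of that arithmetic; the 23ms face-vs-body offset and the 384ms audio delay are the figures quoted here, while the intermediate render-path number is a placeholder:

```python
# Inherent latency of each signal path, in milliseconds. The 23 ms
# face-vs-body offset and the 384 ms audio figure come from this story;
# the 361 ms render-path number is a placeholder chosen for illustration.
inherent_latency_ms = {
    "body_capture": 361,        # hypothetical render-path latency
    "face_capture": 361 + 23,   # face capture lagged body capture by ~23 ms
    "audio": 0,                 # live audio arrives first
}

def compensation(latencies):
    """Pad every path up to the slowest one so all outputs align."""
    slowest = max(latencies.values())
    return {name: slowest - ms for name, ms in latencies.items()}

# The performer's IEM feed would bypass this entirely so the music
# stays in real time for the performance.
delays = compensation(inherent_latency_ms)
```

With these numbers, `compensation` delays the audio 384ms and the body capture 23ms, while the face-capture path needs no added delay.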

We developed a method for global sync calibration, where a technician wore the suit while moving to the beat and counting out loud so we could line up the body, face, and audio for all moving parts. We ran it for the duration of the concert to verify any detectable drift and cross-checked it with a 2-pop at the beginning and end of the show. All interconnects for our department were done using Dante and NDI on four computers. We had two identical Pro Tools HDX systems—one for playback and one for mixing. The other two computers were for systems management and crowd/FX triggers via Ableton. We controlled everything with two Avid S3s, interfaced with Focusrite Red 8Pres, and recorded the whole thing on the good old Sound Devices 970. The result was music playback driving all visual cues and scenes within Unreal Engine while Bieber wore the suit to drive his avatar in that world. I got to work with one of my childhood best friends on this one, Will Thoren. This job was technologically ambitious and great to share together. It was also a good lesson in staying organized when it’s necessary to go big with our rigs.

Dolby Atmos Comedy Show in Dane Cook’s Backyard

Lately, I’ve been working in the niche circuit of live comedy, and we have a system for turning over Atmos-deliverable recordings. Here’s how:

My friend Thomas Cassetta does quite a lot of this work, and we usually do a lot of the production and post-sound on these together. On this one, I was the Production Mixer and Supervising Sound Editor, and he was the Re-recording Mixer. I’ve always found it to be an informative process to do both. Tom is a stellar colleague, and a pleasure to work with.

On live comedy, we like to use the Midas M32/970 workflow with Boom Recorder as backup. I’ve had a lot of success with those machines for higher track counts where you can’t have anything fail and millions of people are watching. Live comedy is a similar idea. My rig for this is fairly basic, and the workload generally goes into hanging mics. On the Dane Cook special, we had to pull out all the stops, as he wanted to do the special in an unusual place—his backyard. His house was in the Hollywood Hills, so load-in and logistics were an absolute nightmare for all involved. Luckily, he owned the house across the street as well, so we ran our base of operations from there over fiber on a crossover.

My main challenge was getting a full-sized LCR line array PA system to Dane’s top balcony while I was three stories down in his garage recording and mixing. Another thing was wind protection for twenty-six crowd mics. We rigged a combination of wired and wireless mics, hanging in spaced pairs along the perimeter, and an Ambisonic mic from above. We had the post contract on this, so any mess we made was ours to clean up. The game is to get full-bodied crowd sounds with all mics in sync and to run the PA quietly to give you enough separation in post for control. We had a Shure Axient System for his handhelds and Midas DL32 on the side of the stage, feeding into a Midas M32 at the FOH mix position via AES50. We daisy-chained the FOH to me downstairs in the garage on my Midas M32R into three backup recorders via Dante. This was a cool configuration as I could set the master trim and preamp controls from my mixer, and my FOH guy could do trims and such upstairs.
One thing I love about comedy stuff is prep days; you’d never see that on a film set. It’s also been a real pleasure meeting the other factions of A2s and comms people who are 695 members—usually, we’d never cross paths. They are highly intelligent, great to work with, and quite skilled. This gig showed me the importance of offline technical prep so that life can be easier on the day.

Pro Tools Playback Motion Control Workflow: The Flight Attendant S2

I first got the call for The Flight Attendant to solve a tech issue. They wanted to “parent-trap,” which is where you have one actor play many characters in the same shot. Here’s how:

We stacked Kaley Cuoco’s performances in QTAKE, and by the time we wrapped the scene, we had a complete sequence. They were all layered into a VFX comp, with only her dialog heard from each performance. This was important for VFX, as some of the scenes had to have perfect eyelines, thrown stacks of paper, high fives, etc. The dialog timing had to be on point, as it would motivate the moves, and there needed to be enough of a gap for each line; otherwise, it would throw her off.

This required a repeatable scene. My system would drive six or more peripherals that employed motion control for the camera, QTAKE auto-record and layering, DMX lighting cues, and sound FX on the Pro Tools timeline. I decided to run it with record-run timecode via an Avid Sync HD, and I’d record the isos over Dante into Pro Tools HDX and on a 970. I chose LTC over GPIO to drive this, given the linear time frame of each scene—hitting cues at certain intervals for FX, actions, and lighting, not to mention the flow of dialog. GPIO offered only basic transport functions, while LTC allowed me to adjust things in my repeatable timeline. Each segment would start at 00:59:56:00 so as to give everything enough time to catch sync.
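
That 00:59:56:00 start is simple timecode arithmetic: four seconds of pre-roll before the 01:00:00:00 mark. Here is a sketch of the math, assuming a 24 fps timecode rate (the project’s actual rate isn’t stated here):

```python
# Pre-roll arithmetic behind starting each segment at 00:59:56:00.
# A 24 fps timecode rate is assumed for illustration.
FPS = 24

def tc_to_frames(tc, fps=FPS):
    """Convert 'HH:MM:SS:FF' timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

start = tc_to_frames("00:59:56:00")      # segment start
bloop = tc_to_frames("01:00:00:00")      # bloop light / action mark
preroll_seconds = (bloop - start) / FPS  # time for everything to catch sync
```

Every device chasing the LTC stripe gets those four seconds to lock before the bloop light fires.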

We wired everyone, including the photo doubles, who ran lines to hold space for when Cuoco was to do those characters’ performances. I ran a Dante feed from the production mixer’s rig so I could get anything I wanted over that line, including the mix and all isos. I would roll when they called action, kicking the LTC off, which would trigger QTAKE to roll four cameras; the bloop light for Moco would hit at 01:00:00:00, our AD would call out “three … two … one … action,” then they’d do the scene. We wrote motion control, focus racks, DMX lighting cues, and any other chronological scene information against my LTC stripe on all moving parts, recording all data. If we wanted another one, we’d have to write it all again based on the linear requirements of the scenes. After we got a keeper-take, we would then call that the base on which we would build. I would mark that in my session, then figure out which character would go next. I would clean up the dialog on the fly with iZotope and cut out the photo double’s lines for the character she chose to play next, so she could perform it. Sometimes we’d rehearse while she was getting wired, and she would read her next character’s lines against the other lines to practice timing. I’d also retain the verbal “three … two … one” cue, archaic as it was, since it served to cue everyone to the exact same start mark each time.

We had to drive QTAKE’s initial “keeper-take” to play back in sync with the new one and print both into a new VFX comp track. We were effectively “stacking” performances, similar to a sequencer or multitracking in a DAW. I also would run my updated dialog edit into the mixer’s board, so it would always represent the most recent dialog comp for VTR overlays. Sometimes, I’d cue three beeps to help give her a start mark if her character didn’t start the scene or if she needed a cue for action. This was a constant process throughout the season, requiring all departments’ collaboration. It was great practice in how to translate weirdly complex situations to others in a palatable fashion.

To Leslie: Good Old-Fashioned Filmmaking

To Leslie was a special experience for me and a total relief from all the other deeply technological jobs I had gotten myself into. Shot single-camera on film, this job was fun because it wasn’t about big, tough problem-solving or fancy new wireless gear; it was about the relationships and organic processes of capturing great performances. It was classical filmmaking. Our Director, Michael Morris, was an absolute joy to collaborate with. From the beginning, we discussed the music, as it was important to this film. He left me in charge of the jukebox, and I was always ready whenever he wanted something for a needle-drop or otherwise. It was very enjoyable to delve into the “outlaw country” discography and learn about a style of music I hadn’t been entirely familiar with. Some songs helped convey feelings and underlying subtexts in particular circumstances; other times, it was about creating the right vibe.

Then there was Andrea Riseborough, who immediately caught me off guard and blew me away with her performance from day one. It was amazing to witness this powerhouse of a character, which shook most of us to the core. She had incredible dynamics, and keeping up with that was quite a task at times. Luckily, I had a killer crew of Johnny Kubelka as Boom Operator and Dan Kelly as Utility. It was a real treat to have them, both Swiss Army knife sound guys who have worked in many disciplines as well. We shot all over L.A. and out in the desert, all on location. It truly reminded me of the good old independent film days that felt like summer camp. Just filmmakers having fun.

No matter the application, it’s safe to say that our job requirements are evolving, and so are we. The amount of gear we have to keep is staggeringly more than ever, and we have to support each other to keep our rates up to accommodate that. The gear is there to supplement our solutions, not to define us or our workflow. It doesn’t matter if you use Shure or Lectro, Zaxcom, Sound Devices, Cantar, Sonosax, or Zoom—what matters is that we know how to use our equipment and adapt to the ever-changing landscape. A mixer’s gear loadout reflects their mind and what makes sense to them. We mustn’t forget that the purpose of the gear is to get it out of the way; don’t forget to step away and live life, either.

It Takes a Union

Behind the Video Avatar: The Way of Water

by Dan Moore

Being a Video Engineer can be a lonely position on set. On occasion, like a Script Supervisor, you can convince production that additional help is needed because of added cameras, multiple location moves, or cameras that will leapfrog from set to set. Having more support is often the exception and not the rule. However, on Avatar: The Way of Water, with four and sometimes five stages in operation at Manhattan Beach Studios (MBS), quite often three to five operators were employed with some days reaching as many as eleven. With complicated setups and prepping upcoming scenes, planning and coordinating personnel was essential in making this project a success. This started in 2005, when the technical planning and pipeline development began for the first Avatar film.

The pipeline is the process by which all departments contribute their specific function to a main storage center, which then organizes and disseminates large volumes of data. These digital files are cleaned up and sent to the visual effects house and other departments to use. In 2005, many were responsible for the development of the pipeline, including members of Local 695. Glenn Derry was a principal developer, supported by Gary Martinez and Mike Davis. Both had the creative talent, engineering expertise, and field experience to build a pipeline that worked. Later, in 2012, Ryan Champney, the Virtual Production Supervisor at Lightstorm Entertainment, continued to improve and streamline this pipeline. He was able to make rapid changes to the overall system and personalize it for different departments, for example by writing custom software for the automated publishing and metadata tagging of the audio, video, facial, and performance-capture datasets. This overwhelming task resided with these principal technicians, whose degrees in computer science and electrical and mechanical engineering supported the groundbreaking work. I enjoyed observing and learning from skilled engineers overcoming complex challenges for this project.

Walking onto the set of Avatar could be deceiving and underwhelming because of its warehouse-like appearance. This is not a traditional film set like those constructed by carpenters, painters, and set designers, to be populated later with stylized lighting and filmed with traditional cameras.

On one side of the stage are two raised platforms, which look like old-style TV phone banks. In this case, the phones and staff are replaced with computer workstations and technicians. Each station is responsible for different aspects of the show. These raised platforms, called the Brain Bar, are workstations for head rigs, QTAKE, RealTime, take assets, script, stunts, and the Editorial Department.

James Cameron reviews work at the
Brain Bar. Photo by Mark Fellman –
20th Century Studios
Roly Arenas and Peter Joyce pose in front of the soon-to-be assembled carts to be used in New Zealand and Los Angeles

On Stage 27, in front of the Brain Bar, was the Volume, which measured 120’ x 75’. The Volume is where the action sequences were recorded. It was surrounded by one hundred eighty infrared cameras suspended from the ceiling, casting IR light over the entire defined space. These cameras, which were calibrated for accuracy every morning, tracked the actors’ movements within the Volume, as well as a ‘virtual camera’ and the props for those action sequences. The actors wore black suits with reflective spherical markers that the IR cameras followed. When looking at a RealTime monitor of the Volume, the performers and the characters they were playing could be seen walking around in a 3-D field. The virtual camera, which put a frame and lens around the image, provided the visual background, creating the location and characters for the scene. On each stage, four 65” OLED monitors were placed around the Volume, with thirty smaller monitors at the workstations, showing the Avatar facial and body movements, as well as the scenery of this imaginative world. Instantaneously, the scene became a close approximation of what it would look like in the theater. Not photo-realistic, but close.

Equipment system used on Avatar: The Way of Water

Additionally, within this crowded Volume, up to sixteen live Sony cameras were positioned to capture close-ups of the actors’ facial expressions and movements. This allowed the director to see the detailed performances of the actors and know which takes ‘to circle.’ These video notes on the articulation of the actors’ expressions served as references for the final photo-realistic digital images created later. These movements, along with sound and video, were all digitally recorded in a massive, air-conditioned server room that accommodated several petabytes of storage.

The crew looks at the environment of the set about to be filmed. Mike Pickel, who passed away during the filming of Avatar: The Way of Water, is at the left of the frame. His presence, humor, and talents were greatly missed. Photo by Mark Fellman – 20th Century Studios.

Where does the Video Assist Operator fit into a project like Avatar? My primary function was to record the virtual image along with a 16-camera matrix that would be displayed on the monitors. The recording stations became stationary after hundreds of miles of Canare and Belden BNC and fiber-optic cable were laid to create the pathways. Hardware from Evertz, Blackmagic Design, Decimator Design, Panasonic, and Sony was used in the build-out. Much of the equipment was sourced from B&H Photo and Adorama Camera, and production also sourced equipment locally through Band Pro Film & Digital in Burbank. Air conditioning, backup power supplies, and racks of digital storage units were in place before the first day of filming. In addition to the build-out, solar panels were placed on top of all the stages to cut down on the electric bill, and all the water used in the film tanks was delivered to the golf course right next to the MBS stages.

On the first Avatar film in 2005, the production used Playback Technology video assist devices to record the virtual camera and three-quarter-inch tape decks to record the live-action cameras. In 2015, the Avatar: The Way of Water production used the QTAKE system to record the virtual camera and a matrix of the sixteen live cameras. In addition, the individual live cameras were recorded on AJA recorders and uploaded onto the digital server after each take. The matrix image for the video assist came from a Blackmagic MultiView 16. As a criterion, all devices had to be remotely software-controllable so that the media was properly named and published on roll and cut.

The Qtake team: Dan Moore, Vlado Struhar – President, IN2CORE, Martin Karsay – Hardware IN2CORE, Jeb Johenning – Qtake Distributor, Andrew Borsuk – Software IN2CORE, Michael Tomlein – Software IN2CORE

The Video Assist Operator was also responsible for troubleshooting and the overall reliability of the pipeline, making sure that the sixteen live-action cameras, head rigs, and other departments had proper timecode, as well as tri-level sync and other reference outputs. All the workstations, about twenty in total, each had a monitor for live and playback images. The Video Assist Operator was also responsible for transmitting the images to the QTAKE workstations and to the Camera Department monitors for focus.

With hundreds of cables used to connect reference cameras and other pieces of equipment, a color-coding system was established so that a cable’s length could be identified at a glance: an orderly way to pick the proper length out of a pile of otherwise identical black cables. Over the course of filming a feature, this saves time and a substantial amount of money for production.
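
The legend itself can be as simple as a lookup table. The sketch below illustrates the idea only; the specific color-to-length assignments are invented, not the production’s actual scheme:

```python
# Hypothetical color-to-length legend of the kind described above: a
# band of colored tape near each connector marks the cable's length, so
# the right black cable can be picked out at a glance. These specific
# color assignments are invented for illustration.
CABLE_LENGTHS_FT = {
    "red": 10,
    "yellow": 25,
    "green": 50,
    "blue": 100,
}

def length_of(tape_color):
    """Look up a cable's length in feet from its tape color."""
    return CABLE_LENGTHS_FT[tape_color.lower()]
```

The point of the convention is that the lookup happens in a technician’s head, instantly, instead of by unspooling cable to measure it.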

In June 2016, I was employed once again on Avatar to head and coordinate the Video Department. The build-out of the pipeline was almost complete. I assisted both Ryan and Gary to help finish it and get ready to film Avatar 2 and 3. At this point, I needed to assemble a group of Video Engineers who would be able to work on this project at various times. I scheduled personnel who were able to commit to the show and maintain their own client base. I knew a few operators but needed to find more. Within a few weeks, I was able to assemble twenty-five video operators whom I could schedule at any given time. As the show progressed, the Director and Virtual Camera Operators would sometimes have a preference and would want to be paired with Video Assist Operators who suited them. The operators came from all different backgrounds, including features, commercials, and live TV. Their technical ability varied as well; from basic QTAKE functions to a variety of problem-solving issues, every day had different challenges. Some operators felt that the pressure of working with the director would be too intense. In this case, the Director was James Cameron. For Mr. Cameron, you need to be always ‘on,’ anticipate what he might need, and be able to technically answer his questions. Like Mr. Spielberg, he demands an A team of professionals. Surprisingly, Video Operators who came from live TV were able to handle the most stress in terms of personalities and technical issues. In the end, our Local was able to provide the most experienced and proficient technicians.

Shahrouz (Shawn) Nooshinfar and Storm Flejter troubleshoot issues on Avatar: The Way of Water
Roly Arenas and Eduardo Eguia on the carts for New Zealand
Dan Moore – Video Assist Operator & Ryan Champney – Virtual Production Supervisor at Lightstorm Entertainment

All the Video Assist Operators were able to bring their talents to this otherworldly film. I would like to congratulate the following operators and acknowledge their contributions to the success of Avatar: The Way of Water.

Shahrouz (Shawn) Nooshinfar: From Tehran, Iran, Shawn had been employed by several Persian and Armenian broadcast channels serving as an Uplink Engineer and Technical Director. He is also fluent in Farsi, German, and English, making him in demand at other international broadcast stations. He was brought on Anchorman 2 as a Technical Advisor and Engineer in 2011 and joined the union in 2012. He has worked as the lead media server and LED Engineer on Dr. Phil and other live TV events. On Avatar, he handled the biggest technical challenges with ease and confidence. He is one of the principal owners of Lightning LED, which specializes in video walls, media servers, and video assist. His company is a QTAKE distributor and technical support center.

Jeb Johenning: From Lexington, VA, Jeb was employed by Strata Flotation in 1988 and was responsible for the in-house video productions of their product, which was being distributed nationally. He also served as an Industrial Designer and had twelve US patents on different products he designed. Jeb has been a Local 695 member since 1994 and established his company, Ocean Video, in 1993. It is one of the few worldwide premier agents/dealers for the QTAKE system, building carts and providing technical support. His role on Avatar was more behind the scenes, providing support and perfecting the build-out of QTAKE streaming. This allowed all five stages to be interconnected so that takes filmed on different stages could be viewed anywhere, a secure and quick way for the director to comment on the work of others.

Dan Moore: From Chicago, IL, Dan graduated from college in 1983 and trained as a Video Assist Operator with Cogswell Video Services in 1984, one of the first video assist companies. Steve Cogswell trained many Video Assist Operators and made an impact on how video assist is used on set today, including a sense of organization and the consistent use of quality equipment. Besides managing the operations on the set of Avatar, Dan worked with Ryan Champney in setting up and dismantling the video infrastructure for all the performance-capture volumes. All cables used for video, timecode, tri-level sync, word clock, and data had to be color-coded by length and incorporated onto the stages. With thousands of connections going in so many directions, this made problem-solving much easier. Dan is currently the owner of Video Hawks LLC.

Eduardo Eguia: From San Luis Potosi, Mexico, Eduardo moved to Mexico City in 1995 to work at the Broadcast Televisa Studios as an Editor and Post-production Engineer. He moved to the US in 1998 and joined the union in 2010. Working as a Video Operator on Avatar, Eduardo was also responsible for building the QTAKE systems and other recording and editorial carts for the three-story tank on Stage 18 at Manhattan Beach Studios, as well as all the carts for New Zealand. This turned into a blessing for production: once the carts were completed, COVID shut down the Los Angeles operations, which later resumed in New Zealand with the carts that Eduardo built. With the help of Roly Arenas, Storm Flejter, Ernesto Joven, and Peter Joyce, a total of twenty carts were assembled and used in production.

Eduardo Eguia assembles a video cart for New Zealand
Dan Moore works on the QTAKE system on the water tank stage
Director James Cameron and crew behind the scenes of 20th Century Studios’ AVATAR: THE WAY OF WATER. Photo by Mark Fellman. © 2022 20th Century Studios. All Rights Reserved.
Director James Cameron on set of 20th Century Studios’ AVATAR 2. Photo by Mark Fellman. © 2021 20th Century Studios. All Rights Reserved.

Roly Arenas: From Havana, Cuba, Roly graduated from the University of Computer Science Havana as a Software Engineer. He worked in Havana as a Graphic Artist and Video Engineer at Canal Havana Broadcast Studios and moved to the United States in 2010. He was hired as an Editor for the Caribbean Broadcasting Company in 2016 and joined the union in 2018. On Avatar, he worked building carts and as a Video Assist Operator.

Mike Pickel: From Dallas, Texas, Mike graduated from the University of Texas at Austin with a degree in film. The same year, he moved to Los Angeles to work at Paramount Pictures as a Production Assistant and then transferred to Propaganda Films, where he worked as a PA and later as a Video Assist Operator on commercial projects. He became a union member in 1995. Sadly, Mike passed away from cancer in 2018 during the filming. He was one of the first Video Assist Operators to work on Avatar: The Way of Water when production commenced. His presence, humor, and talents were greatly missed.

So many other Video Engineers from our Local were involved and instrumental in making Avatar: The Way of Water a success. They include Andrew Rozendal, Alex Sethian, R. Scott Lawrence, Joe Kroll, Justin Geoffroy, Ben Betts, Peter Joyce, David Santos, Storm Flejter, Ernesto Joven, and several others. Our Local came through with skilled Video Assist Operators who worked together, challenging that singular, often lonely position we have all been accustomed to performing, and merging our creativity for a once-in-a-lifetime experience.

This Is Not Your Father’s Jury Duty

by Blas Kisic

I remember the phone call very well—a producer I’ve known for years, Matthew McIntyre, was on the other end. He got straight to the point, “Blas, have you ever done a hidden-camera show?”

Indeed, I’d worked on a few. They were typically in a contained space, and lasted a few hours at the most. I came to realize a few weeks later, when it was too late to back out of the job, that my notion of a “hidden-camera show” and the project Matt was asking me about were two very different things.

The show in question was Jury Duty, starring James Marsden, and it premiered on Amazon Freevee in April. It follows a volunteer whom we called “Hero,” who signed up to participate in a documentary-style project about the legal system in Los Angeles. What he didn’t know was that everyone he would interact with was an actor, and that the trial was a recreation.
The scale and scope of the project (which I’d describe as “The Truman Show in real life”) were unprecedented. We had to convince the volunteer he was involved in a jury trial in an actual courthouse over a period of three weeks. We had to control whom he spoke to, what he saw or read, and where he went; naturally, he’d have no access to any electronic devices, to keep him from finding out what was happening in the real world. As far as I know, this had not been attempted before.

To be honest (I hope the EPs don’t read this), I had my doubts. The challenges, especially for the Sound Department, were many. We had two weeks of prep at the courthouse, rehearsing with our cast, who stayed in character the whole day but were also amazing improvisers. Finally, we brought in our Hero for the three-week shoot. What could possibly go wrong?

Two mixers, one utility

My first request to production was to hire a second mixer. I assumed that we’d have to deploy multiple rigs in different sets, requiring two mixers at all times.

I’m incredibly fortunate Dan Kelly was available. He’s a very experienced mixer with many “live-to-tape” projects under his belt. Those typically contend with multiple cameras and elaborate technology, and also, more to the point, few opportunities for a second take. Besides, he’s such a calming presence on set, a welcome contrast to my hyperactive, somewhat neurotic self.

I had to find a new Utility Sound person as my “usual suspects” were already working on other shows. I decided to try out Jennifer Zhang based on a colleague’s recommendation. Jen lived up to her accolades; she’s extremely organized and efficient, and always with a smile on her face. We were lucky to have her, considering all the moving parts we had to deal with every single day.

Several other Local 695 members helped us along the way. John Maynard was with us for a couple of days during prep, and Denis Perez, Raam Brousard, and Ethan Molomut joined the team on some of the “big” days. Tad Chamberlain also jumped onboard for a day to fill in for Dan. I feel extremely lucky to work with such experienced and dedicated professionals.

Location, location, location

Our script called for a courthouse, a hotel for sequestered jurors, a restaurant, and a city park. Only the courthouse and the hotel were locked-down sets. The others were open to the public while we filmed, which, as you can imagine, made those days that much more interesting.

The courthouse was located in Huntington Park, which is five miles south of Los Angeles. It was decommissioned for budgetary reasons years ago, and it’s been in disuse since. Most recently, it was used as, wait for it, a Halloween funhouse. The Construction Department did an amazing job of bringing the building back to its former glory.

The building sits between city hall and the police station, both still active, which caused us unending grief with RF issues (more on that later).

All the sets had hidden cameras installed. We considered hiding plant microphones in a couple of the sets, but we soon abandoned the idea, as it’s one thing to hide a mic from the cameras for a few hours, and quite another for someone who might be sitting only a few inches away, day after day.

Most of the story beats take place in the jury deliberation room. The cast and our Hero would start the day there before entering the courtroom, and later return to discuss the case or spend their lunch breaks.

The other main set was the courtroom. We had several MKH50 microphones on fixed mounts, hardwired to the main cart. As a backup, both the judge and the attorneys had their own lavaliers feeding recording packs, which we downloaded at the end of every day.

We also had interview spaces on the north side of the building. They were pre-lit, with booms set up for proper sit-down interviews. The windows in the interview rooms faced the police department, just one hundred yards north of the courthouse, which blasted twenty watts of RF at random times. We had constant issues here, which kept me from recording clean audio at the main cart.

This prompted us to run hundreds of feet of antenna cable, from one end of the courthouse to the other; even to the roof, to cover the action in a couple of exterior scenes. We installed an RF Venue 4 Zone antenna combiner in the cart, which I had to “operate” when the action moved from one set to another. In order to avoid overloading the antennas, I had to switch different zones on and off, depending on where the actors were headed.

The 4 Zone combiner wasn’t designed for active switching in that manner. It took several button pushes to switch each antenna bank. Naturally, a handful of times in the heat of battle, I managed to keep the wrong zone on, causing dropouts and other issues…

We soon came up with a plan B: a Super Zuca cart (created by fellow 695 member Eric Ballew) containing two bags, one with a Sound Devices 688 and another with a 633, for a total of eighteen wireless channels. Dan, as part of the “documentary” crew, wheeled it into different locations. During interviews, I would turn off the Comtek feed in the main cart and Dan would switch on a transmitter installed on the Super Zuca, so the creative team at video village could listen to a clean mix devoid of RF hits.

Filming in a restaurant teeming with customers

When I saw “INT. MARGARITAVILLE – NIGHT” in one of the scripts, I immediately called our PM to confirm whether that name was just a placeholder or referred to the well-known restaurant located at Universal CityWalk. I was obviously concerned about all the logistics involved.

The answer was “Yeah, the restaurant will be open to the public.” In the script, this would be a personal outing, away from the court and the “documentary” crew. I was extremely nervous about our ability to capture all the dialogue without lavs, to say nothing of where our rig would be located, and how we’d route feeds for the earwig channels, Comteks, antenna placement, etc.

We did have a pretty thorough location scout a few weeks before, where we had a chance to formulate a plan of attack. To my relief, it was decided the “documentary” crew would tag along with the group of jurors, at least for the first hour or so, which helped bring down my blood pressure a bit.

This was definitely the most difficult location for us. It came after an already long day, starting at the hotel and then at a garment factory downtown. Lots of moving parts, additional day players, duplicate sound teams leapfrogging the cast bus in order to prepare for the next scene, wiring talent with seconds to spare. At the end of the night, we were all exhausted, but also exhilarated because we were able to pull off the seemingly impossible.

Home away from home

In order for The Truman Show conceit to work, we needed a plausible reason to cut off all communication with the outside world. Our Hero would have to relinquish his connected devices, and would not be allowed to go home. In the story, James Marsden’s celebrity became a distraction to the proceedings, which gave our judge a reason to sequester the jury.

Production found a hotel that was closed for renovations, perfect for our needs. The first two or three floors had been remodeled, we could have a mini-production office in one of the rooms, and there would be no other guests that would get in our way.

There was a common area with books, a TV set, and gaming consoles, where the jurors could spend time together, and it would be the setting for a couple of key plot points. We did experiment with a DPA 4097 connected to a recording pack, which we hid high up behind a curtain fold—but, as we suspected, it never provided more than a fuzzy ambient perspective. You can’t fault us for trying!

Another major location involved a birthday party at a city park, which the “documentary” crew would not attend. All the actors would be wearing lavs—except for the Hero. We planned on having him wear a micro recorder and tested a few different models. We asked Prop Master Jason Phillips to come up with a hat or some accessory that could house the recording device.

One of the characters, Barbara, played by Susan Berger, crocheted during court hearings. Jason suggested we make crochet lapel pin buttons in which we could hide the miniature recorder. The story would be that Barbara had made a few crocheted buttons the night before as party favors, which she would pin on various people, one of whom, of course, would be the Hero. Ultimately, our EPs decided the risk of the recorder being noticed was too high, and we shelved it.

The park was a couple of miles from the hotel. It had plenty of trees and a nearby parking lot, where a video village van could be staged relatively inconspicuously. Two “porta-potties” were set up very close to the set; one was occupied by yours truly, operating the Super Zuca, and the other by Steve Canas, our Video Tech Supervisor. We’ll always look back at this shoot as the time when our careers, literally, ended up in the toilet…

Digital wireless—a steep learning curve

We needed more RF channels to cover the show’s needs. I briefly debated whether to get a third Venue VRM, but I also realized that having digital channels in such a hostile RF environment would improve our reliability. I chose Lectrosonics products: a DSQD receiver and DBSM transmitters.

Dan had already worked with DSQDs and DBSMs, and he had learned a couple of valuable tips that run contrary to what we’re used to with hybrid wireless devices. The DSQD tends to react poorly when digital signals arrive from a powered antenna at full strength; you get intermittent garbling in the dialogue. We tested extensively, adjusting the antenna bias power to keep signal strength at around 75%-80%.

Perhaps more important: reducing noise is far more effective than boosting the TX signal. We scanned several times, reducing the bias power to the antennas, until the histogram looked pristine. Once the detected noise was minimal or nonexistent, coordinating frequencies became much easier.

Considering that the wires would be in close proximity most of the time, we decided to keep all analog TXs at 50 mW, and the four digital TXs at 25 mW. We also switched the step size on all channels to 25 kHz, which helps fine-tune the coordination—100 kHz steps are too far apart when coordinating that many channels.

For scanning and coordinating, we used Wireless Designer. It’s quite powerful and has many useful features. Besides the talent wires, we had a handful of earwig and comms channels which we had to coordinate as well. My receivers are mounted on the back of my cart, so I keep Wireless Designer open on my laptop at all times. This allows me to check the signal strength and battery health on each channel regularly.
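The finer 25 kHz grid matters because coordination is largely about dodging the other carriers and their third-order intermodulation products (2 x f1 - f2). Tools like Wireless Designer automate all of this; purely as an illustration of the idea, here is a naive greedy pass over a band. The band edges, guard spacing, and channel count are made up for the example, and this is not Wireless Designer’s actual algorithm.

```python
# Illustration only: a greedy frequency-coordination pass that dodges
# other carriers and their third-order intermod products (2*f1 - f2).
# All values in kHz; band edges, guard spacing, and channel count are
# invented for the example. Real coordination tools do far more.

def intermod_products(freqs):
    """Third-order products 2*f1 - f2 for every ordered pair."""
    return {2 * f1 - f2 for f1 in freqs for f2 in freqs if f1 != f2}

def is_clear(candidate, assigned, guard_khz=100):
    """True if adding `candidate` keeps every carrier at least
    guard_khz away from every other carrier and every intermod."""
    if any(abs(candidate - f) < guard_khz for f in assigned):
        return False
    trial = assigned + [candidate]
    return all(abs(p - f) >= guard_khz
               for p in intermod_products(trial) for f in trial)

def coordinate(start_khz, stop_khz, step_khz, count):
    """Walk the band on a fixed grid, accepting clear candidates.
    A finer grid (25 kHz vs. 100 kHz steps) offers more candidates."""
    assigned = []
    f = start_khz
    while f <= stop_khz and len(assigned) < count:
        if is_clear(f, assigned):
            assigned.append(f)
        f += step_khz
    return assigned

plan = coordinate(start_khz=470_000, stop_khz=471_000, step_khz=25, count=5)
# plan → [470000, 470100, 470300, 470400, 470900]
```

Note how few of the 25 kHz grid points survive: most candidates fall on, or too close to, an intermod product of the channels already assigned, which is why coordinating “that many channels” gets slower with each one added.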

An old dog learns new tricks

Many of my colleagues have been using Dante for years. In case you’re not familiar, Dante is an AoIP (Audio over IP) protocol that lets you transmit full-bandwidth audio over long distances, as well as route signals to multiple devices. It requires Dante-enabled devices, a laptop running Audinate’s Dante Controller, and an Ethernet switch.

Before this project, I simply did not have a reason to use Dante. But when we looked at this project’s needs, it was obvious we’d have no choice: we’d be recording upward of twenty tracks at a time, not counting comms and other inputs. We chose the Sound Devices Scorpio recorder. It has sixteen physical inputs, but with Dante, you can access all thirty-two.

We started looking for a rack-mounted, DC-powered 16×16 Dante interface to feed the analog inputs from my two Lectrosonics Venue VRM’s into the recorder. Unfortunately, there weren’t many options. Worse yet, availability in late 2021 was very limited, thanks to the pervasive COVID-related parts shortage.

We settled on the Audio Science Iyo Dante 16.16MD. I ordered it in early November, and kept my fingers crossed. The expected delivery date stretched into December, then January. I was getting nervous as we still had to install the unit, program the network, and test routing. Fortunately, Aaron “Cujo” Cooley in Atlanta had an Iyo Dante 8.8MD, which he kindly rented to me for the duration of the show. It wasn’t a permanent solution, but it got us through.

The Scorpio, the DSQD, the Iyo interface, and my MacBook Pro laptop were all connected via Dante, while the two Lectrosonics Venue VRM’s were connected to the Iyo.
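On paper, that topology is just channel bookkeeping: the analog Venue outputs enter the network through the Iyo, the DSQD’s channels arrive natively, and everything lands on a numbered Scorpio input. A toy sketch of that kind of plan follows; the device labels and channel counts are illustrative, not Dante Controller syntax or the production’s actual patch.

```python
# Toy model of a Dante channel plan: map each source channel to a
# unique recorder input and refuse double-patches. Device labels and
# channel counts are illustrative, not taken from the production.

scorpio_inputs = {}

def patch(source, channels, first_input):
    """Assign `channels` consecutive Scorpio inputs to a source."""
    for i in range(channels):
        dest = first_input + i
        if dest in scorpio_inputs:
            raise ValueError(f"Scorpio input {dest} double-patched")
        scorpio_inputs[dest] = (source, i + 1)

patch("Venue A via Iyo", 8, first_input=1)   # analog wireless, A/D in the Iyo
patch("Venue B via Iyo", 8, first_input=9)
patch("DSQD", 4, first_input=17)             # digital wireless, native Dante
```

The payoff of doing this once during prep is exactly what the article describes: after the initial test period, the routing never needs to be touched again.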

Besides running the network via Dante Controller, the laptop also ran Boom Recorder by Pokitec, which served as our backup recorder. On it, we recorded a mono mix, the first thirteen ISO tracks, plus a sub-mix of all the plants in the courtroom.

Once we completed the initial test/prep period, all our inputs and routing were pretty much set, and there would be no need to fiddle with anything. It proved to be a reliable, solid Dante system, and it gave us zero problems through the run of the show.

Wireless overload

We had our hands full with our wireless channels: sixteen tracks for talent, three earwig channels, and a couple of comms. Coordinating them all took longer and longer, as we tried to optimize our system. We had reached the practical limits for that location.

As the story developed, we saw the need to cover more actors. Dan proposed bringing in additional DBSMs to be used as recording-only devices for day players with one or two lines. To distinguish them from the other DBSMs, we marked them with bright red plastic covers on the SMA connectors. We jammed them with timecode in the morning. To ID the file, we would record a verbal ID at the beginning of the clip: Jen would whisper the date, time, and character name into the lav while placing the transmitter in an ankle strap.

Can you hear me now?

Earwigs would be a key element of the production. The Phonak Roger system became the de facto standard after the original 216 MHz Invisity system was discontinued. It broadcasts in the 2.4 GHz range, which can be unreliable, because film sets nowadays are full of RF devices using the same frequencies.

I knew there were repeaters and other solutions out there, but none felt like a winner. Once again, having Dan on our team proved to be a blessing. He showed me his own earwig setup: a 2.5-watt signal booster for the Roger base station, feeding the signal through a CP Beam antenna, all in a small, lightweight sound bag. The range is much better and more reliable than the stock unit’s.

Since we needed two discrete earwig channels, I shamelessly copied Dan’s bag. The two looked so much alike, we christened them “Thing 1” and “Thing 2.”

The Roger earwig system was used by the judge, the defense attorney, and one of the jurors. The judge, played with gusto by Alan Barinholtz, was far enough away from our Hero that we were not concerned the earwig would be visible. The other two cast members were female; their hair covered their ears, so they were safe even at close quarters.

The creative team requested earwigs for other cast members, as the plot progressed. Regular earwigs would be noticed up close, however. Jen mentioned some micro earwigs she had worked with in the past, so small that you couldn’t see them. I asked her to order a couple of different models for us to test.

These units are small because they’re fed from an induction loop, so they aren’t as simple or quick to deploy as the traditional Roger units. The actor cannot wear a thin top, as that would reveal the loop around the neck.

These actors had to wear two devices, a single-battery Lectrosonics transmitter for their mic, and a Sennheiser G3 receiver on their ankle, feeding the induction loop. The micro earwigs were a success by virtue of being so small. In fact, they were embedded so deep in the ear, we had to order a couple of rubber-tipped tweezers to pull them out!

Jake Szymanski, our Director/EP, guided the talent from a producer’s console, which allowed him to address individual earwig wearers by choosing one of three push-to-talk buttons. The earwig channels were routed to my cart before being fed to the transmitters, so I could monitor them. I knew there’d be quite a few “audibles” and unexpected changes, so I wanted to make sure our team would always be one step ahead.

Synching sound and picture

We shot with twelve cameras of various models and specs. Seven of them had SMPTE timecode ports, while the rest were either DSLRs or GoPro-type cameras. We deployed seven Denecke JB-1 sync boxes. These have been the most reliable and convenient timecode devices I’ve ever used. They’re small and light, they have a clear readout, and their battery life is exceptional.

For the cameras that can’t take timecode, we had two Microframe Timecode Sync Masters. The camera team called them “pillbox slates,” and the name stuck. They’re small timecode displays powered by a 9V battery, without a clapper, which can be stored in a pouch or pocket, and flashed in front of cameras at the start of a take.

I’ve tested the Sync Masters and they’re pretty accurate but, because I haven’t quite tamed my OCD, we kept them jammed with Tentacle Syncs taped to the back, rather than trusting their internal clocks. We installed a 27” video monitor on top of my cart, with a nine-camera split screen. I checked their timecode readouts several times per hour. I’m happy to report that, in three months, I only saw a few scenes in which a camera was out of sync—and it was fixed within minutes.
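Jamming everything from a common source only helps if each clock holds; the case for re-jamming rather than trusting an internal clock is simple parts-per-million arithmetic. A back-of-envelope sketch, where the ppm figures are illustrative and not specs for the Sync Masters or Tentacles:

```python
# Back-of-envelope: why re-jamming beats trusting internal clocks.
# A free-running timecode generator with a given ppm accuracy slowly
# walks away from the master; at 24 fps one frame is ~41.7 ms.
# The ppm figures below are illustrative, not manufacturer specs.

def drift_frames(ppm, hours, fps=24.0):
    """Worst-case drift, in frames, after free-running for `hours`."""
    drift_seconds = hours * 3600.0 * ppm / 1_000_000.0
    return drift_seconds * fps

tight = drift_frames(ppm=1, hours=12)    # ~1 frame over a 12-hour day
loose = drift_frames(ppm=50, hours=12)   # ~52 frames: visibly out of sync
```

Even a very good clock accumulates about a frame over a long shoot day, which is why the habit of re-jamming (and of eyeballing the readouts on a split screen) pays off.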

The proof is in the pudding

I’m still amazed that Ronald, our Hero, never suspected any foul play, even though there were a couple of moments of utter panic when we thought he’d figured it all out. But, as improbable as it would seem, he never did. I think that’s a testament to how professional and dedicated every member of the cast and crew was.

A year later, thinking back, I’m very proud of what we were able to achieve. Even when I was frustrated with our results, Dan constantly reassured me that I had unreasonable expectations. He had a mantra, “We’ll try it again tomorrow.” It’s an apt phrase, considering what we were up against, and it helped me relax a bit, and enjoy the ride.

Obi-Wan Kenobi

by Julian Howarth

In a galaxy far, far away, and back in January 2021, I first got the call about a new Lucasfilm series to be made at Manhattan Beach Studios. I can remember how excited I was and the rush of being asked to work on a Star Wars project. This was the reason I wanted to work in film in the first place. I couldn’t believe that this might happen. Would the Force be with me?

This series would also showcase Industrial Light & Magic’s (ILM) virtual production platform, StageCraft, to help bring the worlds of Star Wars to life.

I immediately knew I had to glean what information and tips I could from fellow Production Sound Mixer Shawn Holden, in terms of methodology and how to cope with the excessive reverb we would face on this type of stage: a circular LED volume surrounding the set, a very reflective surface around and above us at all times. Shawn had been working on The Mandalorian for the last two years, and I wanted to emulate her work as best as possible. Shawn was incredibly open, and as I’ve learned over the few years I’ve been in the US, our sound community is utterly supportive, helpful, encouraging, and just downright lovely.

First thing was to make sure we had the same sound baffles that we could deploy in ILM’s volume. After that, it came down to a negotiation between Chung-hoon Chung, our brilliantly talented Director of Photography, and myself as to where and when we could bring the baffles onto set, so as not to affect the lighting or performances. “Get them in front of the actors and as close as you can,” said Shawn, and she was not wrong. A lot of carpet was utilized, and we also ensured that set builds incorporated sound-dampening materials, which made for a much less reverberant experience.

Soft prep meant putting together a crew that was keen and capable, and the size of this project meant we had to have a 4-person crew, sometimes expanding should the occasion call for it.

Ben & Yisel trying to keep cool
Yisel, Julian, Ben, and Yohannes
Yohannes Skoda and Ben Greaves

First, as with every project I have been on in the last ten years, the call went to 1st Assistant Sound, Best Boy Sound, Boom Operator, 2nd Unit Mixer, and problem solver Ben Greaves. Ben has been my partner in all things film sound for the majority of my time here in the US. He is solid on every level. I can make the equipment talk and sing, but Ben can turn a film crew into a collaborative family all intent on helping the Sound Department out, and vice versa. He is the greatest at that.

Next was our Utility Sound Technician, Yohannes Skoda, who came with us from Avatar: The Way of Water. He was to join us as Utility and all-around super sweeper. As a personality, he fits right in; I know of no harder worker or closer friend.

This would then be supplemented with a sound trainee. The decision to use Local 695’s Y16a training program was a simple one to make, and an even easier one for production to green-light. The trainees would alleviate the workload, keep us all in check, and keep our feet on the ground. Given such a long period of work, we decided from the jump-off to have two trainees join us, one for each half of the show.

Yisel Pupo Calles and Chris Burr would fill these shoes admirably. Yisel joined for the first half; Chris would finish the show off. They started with basic tasks that expanded as they became more experienced and able. They were responsible for cable runs, a huge amount of carpet and baffle wrangling, and looking after two VOG systems that leapfrogged across the stages we were filming on, amongst a long list of other responsibilities. It is my feeling that any trainees working with me should get a good lesson in preparedness, learn to always be on the lookout for potential problems, and know that what you put in is what you get out. Yisel and Chris both graduated after the show and are now established and hugely talented Utilities. If anyone hasn’t worked with one of Local 695’s vetted trainees, I cannot recommend it highly enough.

Finally, we put together a supporting sound crew for dailies and second units to supplement us when needed. Erik Altstadt, Scott Solan, Cole Chamberlain, and Terrell Woodard filled those shoes admirably. I was very lucky to have them.

There are a lot of hero costumes on a Star Wars set and some difficult rigs for radio mics, so during prep, Ben and I spent valuable time with the Costume Department deciding how we should rig them. At no point were we as a department deciding how to rig a mic while setting up for the scene on set—it was all decided weeks before that point.

During filming, Yohannes was three scenes ahead of what we were shooting. We were prepared for every eventuality and scenario. I really don’t like surprises, and this forethought and planning from the whole team meant that surprises never happened. Hero costumes all had their own dedicated mic and a spare, just in case. These mics were sewn into costumes, fitted in helmets and breastplates, and squeezed into body-hugging suits. The on-set dressers were amazing and hugely experienced; without them, we couldn’t have done such an amazing job.

Waiting for the light
Ben and Yohannes carve it up
Yohannes Skoda – Boom (literally)
Top team: Cole, Ben, Yisel, Dan Moore (photobombing), Julian, Chris, Yohannes and Scott. Sadly not pictured here are Terrell and Erik.

The people. It’s always the people.

Let’s face it: sound is dead in the water without everyone else’s help, whether that’s being quiet around set, making camera rigs, dollies, and props silent, working with costumes to fit radio mics, or generally giving us the time and space to do our work properly and professionally.

Grip Department, Lighting, Camera, Costume, Props, Art Department, ADs, PAs, Background Artists, Puppeteers, and Visual Effects, in fact, the entire crew as a whole. We owe them all so much for their patience and understanding of the importance of great sound to a film or TV show. I cannot thank the incredible crew enough for all their understanding and that they were so willing to help us out in any way. I put this down to a great crew but also to the fact that Ben, Yohannes, Yisel, and Chris make a huge impression around the set. Their affability and presence turn this kind of collaboration into an art form. It’s a marvel to see and watch.

I am not a ninja Sound Mixer. I thoroughly disapprove of that term. It minimizes what we do and what we can give to a production. We are a hugely creative addition to storytelling. I never just try to get through the day unnoticed, unseen, and unheard. I always make sure I am in sight and earshot of the Director and the DP. I am there to answer questions and offer solutions. I am there to cheer the crew along too, but with that said, I also understand that there are days when we as a Sound Department should take a less prominent seat. Knowing the difference, I believe, is what makes our involvement and input valued and often requested.

Skywalker Sound’s Randy Thom once wrote a little piece on his Facebook page, and I can’t stop thinking about it. He said, “In a new book on creativity, Rick Rubin says, ‘No matter what tools you use to create, the true instrument is you.’ This is precisely why mixers should be asking fewer questions about gear, and more questions about creativity as it relates to their own thought processes. When nearly all your questions are about tech, it means that you think you have the other bases covered when in fact, you don’t.”

This became apparent when discussing filming with our Director and fearless leader, Deborah Chow. During prep, Deborah talked about wanting to use quite a lot of music and sound FX playback on set, for motivation and to set a scene.

Deborah would ask for moody, marching, hero, scary, the light and dark side. Every description was an emotion or pace she desired. She was specific and so well-prepared, it was a joy to be asked for this type of collaboration. Every question I asked back was answered thoughtfully and in precise detail.

I remember the first time we used music: Darth Vader’s entrance in the show. He was walking down a street flanked by stormtroopers, handing out cruel punishments to the planet’s inhabitants, and on the lookout for Obi-Wan. It was a massive setup, multiple cameras, hundreds of background actors, and a large crew. I found a moody and dark “Imperial March” section and it worked beautifully. The stormtroopers stood straighter and with menace, Darth Vader had a rhythm and dark purpose to his walk, even the crew felt an emotional connection while it played out, and I know it made a huge difference to the final shot.

This was used many more times and became a fun and hugely enjoyable part of our days, right up to the final duel between Obi-Wan Kenobi and Darth Vader, where we played a version of John Williams’ “Duel of the Fates” while they sparred. I know Ewan McGregor and Hayden Christensen were hugely appreciative of the additional motivation and emotive push that it gave to the scene.

Playback sometimes ruins whatever dialog and sound effects we want to get from a shot, and our primary job is to protect the performances of our actors. But we are also there to help a director elicit something different and exciting, too. I always get at least one clean take and a wild track with grunts, Foley, and breaths, for the action track. For the rest, we let the music play out.

Now for the kit and set up…

Digital recorders and wireless systems don’t make creative decisions. We do. My driving message is this: “Does it sound good?” That’s it. Not what I use, but how I use it and what my ears think. Through constant dialog from the beginning with post-production, the editorial team, and Deborah Chow, I was able to confirm throughout filming that we were doing well and getting them everything the show and the Director needed.

My digital system can record up to twenty-four tracks, using two to three booms combined with plant mics, and can run up to twenty individual radio mics at one time. Add discrete IEMs when needed, underwater speakers for use in water tank scenes, and many more little gadgets that help me get through any given day or the problems that usually come up.
We had two VOG systems. PTT mics were made available to the 1st AD and Deborah. We also provided comms for all our helmet-wearing actors and BG artists, prosthetic-wearing aliens, and their puppeteers. They could receive individual feeds, or any combination, of the ADs, the director, music, sound FX, or program sound.

I used Avid Pro Tools and Ableton Live for editing, sound design, FX, and music playback.

I know this is not the kit list you wanted but none of this would have worked if it wasn’t for the amazing people I had the honor to work alongside and learn from every day.

Yisel is originally from Cuba and moved to Los Angeles five years ago to pursue the dream of working on major motion pictures and TV shows that would allow her to work with sound mixers, actors, and directors that she admired. After her graduation from the International School of Film and Television, where she specialized in production and post-production sound, she headed to Barcelona to work in significant post-production houses for nearly four years. There she gained experience as a Sound Designer, Foley Recording Engineer, and Dialog Editor, knowledge that makes her a much better Sound Technician on set today.

Chris Burr, born and raised in northeast Mississippi, in a small town called Columbus, just outside of Tupelo, moved to Los Angeles in 2020, right before the pandemic, in the hopes of starting a career in the industry. He grew up as an only child, with his imagination and creativity running deep. He believes that if you dream it—and put in the work—you can achieve anything. Chris’s first sound job was the feature film Dog, and he hopes to continue this journey and keep encouraging others along the way.

Yohannes came into motion picture sound from a background in music production. Additionally, he worked as a Production Assistant and stand-in for many years, giving him an insider’s view of the collaborative nature of filmmaking. Over the last year, he has worked primarily as a Boom Operator, having spent the three years prior honing his craft as a Sound Utility. He is currently working as a Boom Operator on a TV show and appreciates the unique set of challenges that come with the position. He really enjoys the energy of being right on set, working with camera, grips, and electric to solve problems and get the job done. Going forward, he would love to continue working as a Boom Operator for movies and TV shows.

Ben Greaves is my closest friend, ally, and confidant. Ben came to the US at the same time I did; however, we didn’t know each other then. Ben started his audio journey as a child, accompanying his father to the Manor Studios. He would play among the cables and mixing desks while his dad was recording his latest album. Ben started his film career working with UK mixers, including Simon Hayes and Jamie Gambell, soaking up everything he could. He is dedicated to great sound and one of the best. Creative, collaborative, and concise.

Me? Well, I’m having a hoot and absolutely love what I do. I listen with my heart and my ears and sincerely hope I can continue to do so for many years to come.

Another Day, Another Dahmer

by Amanda Beggs CAS

Dahmer. Monster: The Jeffrey Dahmer Story. Evan Peters as Jeffrey Dahmer in episode 102 of Dahmer. Monster: The Jeffrey Dahmer Story. Cr. Courtesy Of Netflix © 2022

Without fail, one of the first questions I get asked when someone finds out I worked on Dahmer – Monster: The Jeffrey Dahmer Story is, "How did you handle that subject matter?" Even for people who work in this industry, in sound, the top question hasn't been anything technical, like "What mics did you use?" or "How did you film all the driving work?" People have been more curious about how I and the rest of the crew survived six months working on a relatively accurate show about one of the most prolific serial killers in the United States. With good reason: at times, the subject matter got very dark, the scenes very intense, and the prop food very … realistic. It was also a grueling shoot in terms of locations, night shoots, and multiple units shooting simultaneously.

Dahmer. Monster: The Jeffrey Dahmer Story. (L to R) Shaun J. Brown as Tracy Edwards, Evan Peters as Jeffrey Dahmer in episode 101 of Dahmer. Monster: The Jeffrey Dahmer Story. Cr. Courtesy Of Netflix © 2022
Dahmer. Monster: The Jeffrey Dahmer Story. (L to R) Michael Beach as Detective Murphy, Colby French as Detective Kennedy, Evan Peters as Jeffrey Dahmer in episode 105 of Dahmer. Monster: The Jeffrey Dahmer Story. Cr. Courtesy Of Netflix © 2022
Dahmer. Monster: The Jeffrey Dahmer Story. (L to R) Richard Jenkins as Lionel Dahmer, Molly Ringwald as Shari, Penelope Ann Miller as Joyce Dahmer in episode 108 of Dahmer. Monster: The Jeffrey Dahmer Story. Cr. Courtesy Of Netflix © 2022

Before I go any further, I have to acknowledge and thank my crew of Boom Operator Zach Wrobel and Utility Sound Technician Saif Parkar, as well as the mixers and crew who came in to handle our second units. I was also very fortunate that Netflix responded positively to my request for a Y-16A trainee as a member of our Sound Department. Due to the length of the show, I was able to have three trainees (Britney Darrett, Leslie Metts, and Brandyn Johnson) cycle through for about two months each, as well as host some incredible day-player trainees. Almost every single one of those trainees has gone on to become a full-time union Utility or Boom Operator, so I want to stress the importance of pushing for a trainee as a normalized member of the Sound Department! This is how we train the next generation of sound professionals.

With such intense scripts, I knew we were in for some emotional performances from our cast. Like any sound mixer, I place the utmost importance on capturing an actor's dialog as authentically as possible, to avoid the need for ADR or looping. The biggest challenge for me was staying alert and ready for an actor to jump from a whisper to a scream with no forewarning, and to have that volume change from take to take. Boom Op Zach had to work with the same level of attention, but with the added challenge of staying out of eyelines while avoiding the countless reflections and shadows on our sets, which were lit mostly with practicals. The all-metal gold bookshelf in Glenda's apartment was a favorite for reflections. Saif, the Utility Sound Technician, was given the challenging task of wiring a main actor dressed only in a fitted white T-shirt, and preparing for lots of physical exertion. During our first week, we realized the actor's heartbeat was being picked up by the wire, and it was substantial enough that I was a little concerned. Luckily, a quick conversation with post let me know the heartbeat was removable.

Zach booming from the roof
Zach Wrobel booming
“Brokaw” mic POV
Saif Parkar booming from an unusual spot

As on any show, we have to work within the confines of the shot to capture quality sound. This show definitely gave us some challenges in regards to nontraditional coverage and shot design. This meant we had to rely on good-sounding wires, creative booming, and many, many plant mics. On several occasions, there wasn't even enough room to fit a boom pole to get the mic in the best spot, so Zach and Saif would resort to hand-holding the mics in the shock mounts, a move that was dubbed "the Brokaw." Zach also had what I assume was a very exciting day of being strapped into a harness so that he could boom from the roof of one of our house locations. One of the benefits of having a trainee on the team is that the trainees could gain useful booming practice, under the supervision of the Boom Op, when grabbing off-camera lines. Post is always happy to get as many off-camera lines as possible, and they can sometimes cover the scene the same way wild lines would, while saving production time. It's also the perfect way to have a trainee work on skills that can only be acquired through physical practice, but in a lower-stakes environment than capturing on-camera sound.

One of the Sound Department's worst challenges is always the dreaded "wide and tight" shot when multiple cameras are in play. Luckily, the option to either paint out the boom or simply do a plate shot for an upper-third replacement has become not only more common, but also better received by cinematographers, directors, and producers. I had a conversation with our producers early on about whether we'd have the ability and budget to plan on painting out booms for wide shots where getting a clean boom track was critical. They were very open and receptive. I always try to make that option the last resort, as I am aware of the potential costs each time we ask to break the frame, but there were a good number of moments on Dahmer where we absolutely needed permission to be in the shot, and luckily, we were given it. The interrogation/interview scene of Jeff by the two police officers was held in a room built on one of our stages, complete with two-way mirrors and windows. Because of the amount of dialog and the emotional performances, the director wanted to cover the scene with multiple cameras, which of course meant a wide two-shot as well as singles. Breaking the frame allowed us to get the booms where we needed them and still complete the scene as the director wanted.

I've been working with my Utility Saif for more than a decade. On one of our earlier movies together, the Boom Operator gave Saif the nickname "The Gardener" because of his ability to hide a plant mic pretty much anywhere. This is a skill I find invaluable, and I lucked out twice on Dahmer because Zach is also a master gardener. DPA 4098s have become an integral part of my gear. With their small size but directional pickup, a well-placed 4098 can rival a boom mic in some instances. The obvious choice is to place them in cars, which we did plenty of times, but we also hid them around set in various locations. Then there was the "desk stand" setup: exactly what it sounds like, a desk stand with an Ambient QuickLok attached to the end where a mic clip would normally live. This allowed us to quickly drop any mic on a shock mount onto the set and place it on the ground, behind doors, on or under furniture, etc.

Plant mics everywhere!

Hiding plant mics is a skill, but it’s even better when you can get away with having a plant mic “hidden” in plain sight. On this show, that required collaboration with our Props Department. Because this was a period piece, and there were many scenes that required prop microphones, I met up with our props team early on and we discussed where it would be helpful to have working mics that also looked appropriate for the time period and scene. We had tabletop mics in the many courtroom scenes, handheld mics for reporters on the scene when Dahmer’s apartment was being emptied by the police, and lavalier mics for the many recreations of historical interviews that were planned, from Geraldo to 60 Minutes.

For the handheld mics used by reporters, my job was to source real working mics that would closely match the prop mics our Prop Master had already rented. A silver Shure Beta 87A was the winner. The tabletop mics picked for the courtroom scenes were luckily already working mics. These came in handy because our judges' robes were made of a surprisingly loud material, so a wire placed on a judge picked up a lot of clothing noise. The courtroom scenes also lent themselves to a lot of big wides to showcase the entire room, so having some working mics directly in front of certain characters was very advantageous. Recreating the few sit-down interviews that Jeff and his family did was relatively easy; we just had to find a lav and clips that looked close enough to the ones originally worn by the real people.

To circle back to the main question I’m always asked, how did we keep our spirits up and push through the six-month shooting schedule? I think it really came down to the makeup of the department, and how we would take turns lifting each other up. Everyone would cycle through days that just really wore them down, and so the rest of the team was always there to try and provide either a moment of levity or just an ear for venting. We always made sure everyone was hydrated or had snacks, and we quoted dialog from the show incessantly. We’d all latch onto some phrase or line that sounded particularly ridiculous when said out of context, and then we’d just repeat it unremittingly, usually in a passable-to-awful Wisconsin accent. Then of course, we had to start or end each day with our favorite phrase, which I believe Saif came up with: “Another day, another Dahmer!”

Amanda’s main cart
Amanda’s insert car setup
The Amanda Beggs CAS underwater mic—everyone’s favorite
The Key Grip replaced my chair with a slightly more expensive one

Our Set Lighting Technicians started one of our favorite silly traditions—the rubber chickens. Our Dimmer Board Operator had one on his cart and would squeeze it at random times throughout the day, and eventually he brought in a whole bag of mini-chickens and handed them out. I kept mine zip-tied to the front of my cart. Everyone would be spread out across the stage, or location, and you’d hear one go off, and then this cascade of multiple chickens screaming would echo in return. It never failed to make us all feel better, as silly and dumb as it was. Our final shooting day was a fun one filled with John Wayne Gacy drowning someone in a bathtub—end on a high note, they say! As a wrap gift, I gave our Dimmer Board Op a giant chicken, and when squeezed, the chicken would yell for forty seconds uninterrupted. Needless to say … it was glorious.

But that’s how you do it, that’s how you survive half a year working on a project that highlights the worst and darkest of humanity. You surround yourself with good and talented people and you allow them to have the natural ebb and flow of human emotions without holding them to some ridiculous and impossible standard of perfection. I am very proud of the work we did on Dahmer, and I owe that absolutely to my team.

Our Dimmer Board Op and his giant chicken
(L-R) Britney Darrett, Saif, Amanda, Zach. The camera team took this fantastic group shot.

An Interview with Halter Technical—Featuring Doc Justice

by James Delhauer

Equipment in the film and television industries is highly specialized. The tools that we use are custom designed to their purpose, making it difficult to “shop off the rack,” as it were. This can make gear an expensive investment and, what’s more, many of the products we buy as part of our kits or equipment rental packages are not designed by the people who are going to use them. To be sure, they are developed with end users in mind and the most successful vendors have found success because of their ability to internalize feedback and incorporate it into their products. But a disconnect between developer and customer is not uncommon. That cannot be said for the products of Halter Technical, which have been developed for production sound workers by production sound workers. Following this year’s NAB Trade Show in Las Vegas, I had the opportunity to sit down with Halter Technical CEO and Founder (and Local 695 member) Doc Justice—who shared some insight into new production sound products like the Microsone Discreet Audio Monitoring System. 

Q: Alright, tell me about you. What’s your story? 

Growing up in Philly, my start in sound came as a DJ as a teenager in the mid-’90s. At first it was mobile parties, Bar/Bat Mitzvahs, weddings, and country club events. That led to nightclubs and even a stint in commercial radio. After college, MTV’s The Real World came to town and I got my first taste of production as a PA. From there, I moved out to LA to put my full efforts into mixing sound for unscripted TV. I worked mostly in large-scale house reality, competition shows, dating, and cooking shows. My specialty was working with large track counts with a lot of RF channels.
 
Q: This was before Halter Technical, right? 

Halter Technical was born on set. In reality TV, handing out an IFB meant giving Producers and Directors a coiled headset that so many of them just hated. These headsets weren't made for IFBs; they were designed as "listen-only" walkie-talkie headsets. They sound terrible, they're not comfortable, and they're just not made for producing TV. When I couldn't find a better offering, I made one myself. That's how the Field Monitor came to be. Once that started to take off, other sound pros asked me to make something for scripted work. That led to the release of the Scene Monitor. Then, people wanted something more substantial for scripted Directors and Producers. That became the Elite Monitor. Now, we have a line of headphones built specifically for these different jobs on set.
 
Q: And that led to the development of the Microsone? 

Each piece of gear is a tool to accomplish a job on set. The Microsone Discreet Audio Monitoring System is our new take on an earwig, so on-screen or onstage talent can monitor audio, take cues, be fed lines, and listen to playback, all without having to stop the action.

Q: An earwig is pretty common in an audio kit, isn’t it? What makes this unit different from all the other devices on the market?

Microsone was born out of frustration. I set out to solve as many of the issues of previous systems as I possibly could. Our system works by connecting the Microsone (the earbud, itself) to our Control Pack via Bluetooth. The Control Pack is an IFB receiver that can be fed by any analog transmitter that you currently use, so you’re not tied down to a proprietary base station or frequency limited technology. So your long-range transmission comes from your transmitter to the Control Pack, and that audio is then retransmitted up to the user’s ear. We were able to really modernize the whole earwig concept and pack it with advanced features.
 
Q: Like what? Give me some examples.

The Control Pack can receive VHF (174 MHz-217 MHz) and UHF (470 MHz-608 MHz) audio. This enables you to have as many isolated channels as you can coordinate. There are four banks of frequencies with seven channels per bank you can manually program. The top-seated 3.5mm jack works as an output to be used with a wired headphone like a typical IFB, but it can also be used as a line input jack to feed a source directly in without RF. Since we use Bluetooth, you could pair the Control Pack to any Bluetooth headphone to use as a wireless IFB. Or you could pair it to a Bluetooth speaker and have an instant wireless video village speaker setup.
 
Q: So you can use the Microsone to listen to tunes at work?

We have one customer who purchased a system because they work with a method actor who likes to have music fed into his ear to keep him in character. Now, he can have a Microsone paired directly with his own phone, and control his own music, even if he’s off set in his trailer.
 
Q: Awesome. How do you handle volume control so you're not blowing out an actor's ear?

The Control Pack has a volume knob that allows the user to set their own volume. They don’t need to call out to the Sound Mixer to raise or lower their volume. If the production or talent’s wardrobe doesn’t allow them to wear the Control Pack on their person, it can just be stashed nearby since the distance from it to the Microsone is typical Bluetooth range (10’-25’).
 
Q: And what's the power situation like? What kind of batteries does the system use?

The Control Pack powers off two AA batteries. With the Control Pack receiving UHF audio and retransmitting it over Bluetooth, you can expect about twenty-four hours of use. If you’re using rechargeable AA batteries, those can be recharged internally using the USB-C port on the side of the Control Pack. That USB port can also power the Control Pack without batteries, which is great for permanent installations or powering off a bag kit. Anyone who has used an earpiece that takes hearing-aid batteries knows how frustrating they can be. The Microsone has a built-in rechargeable lithium ion battery that lasts for five hours of continuous music. It recharges in the charging case from 0%-100% in just forty minutes. Two Microsones are included with each system. You can transmit to both Microsones simultaneously or have a spare ready to be deployed on demand.
 
Q: What else makes this a better investment than something from one of the other audio companies out there?

The Microsone itself is built as one completely sealed device. It can withstand a drop without anything breaking off of it. It can be completely painted with makeup to match the talent without worrying about sealing a battery door or corroding the inside. Beyond that, you've got all the versatility and power of the Control Pack as well.
 
Q: Oh, that's handy. I assume you don't wash it in water afterward?

Cleaning it is as easy as wiping it with an alcohol pad. 
 

Q: That makes more sense. But let’s bottom line it. How much does the system cost? 

The entire Microsone D.A.M.S. kit, which includes two Microsones, the Control Pack, the charging case, a wall charger, and a USB Type-C cable, retails for US $1,200. Production sound professionals know that this is an incredible value, and something they can earn a significant rental on.

Q: How does the Microsone fit in with the rest of the products you make at Halter Technical? 

All our audio monitoring solutions are built for professional use. Everything we do, and everything I do personally, is done with the goal of solving problems on set. Microsone is a great problem solver, and fits in place with a line of great tools that are developed specifically for us.
 
Q: That all sounds awesome. And where can folks buy this gear?

All our products are available from our wonderful retail partners. The full list of dealers can be found at our website at https://www.haltertechnical.com
 
Q: Any sneak peeks at whatever your team is working on next?

I can't give away any secret recipes or anything, but I will say that all of our products exist thanks to user feedback. The only way we can solve users' needs is if people tell us what those needs are. From my own experience on set, I know which products I think would benefit people, but everyone's experiences are different, and diverse feedback is paramount for the company. That's why we've tried to make ourselves extremely reachable through social media and through the website. For anyone who wants to talk shop, please reach out!
 
I would like to thank Doc Justice for his time and for sharing the latest from his company with us here at Local 695. From the time I spent with the Microsone, I would say that it really is a useful tool for a production sound mixer’s kit and, if this is an example of what’s to come at Halter Technical, I look forward to seeing what Doc and his team will bring to the table next. 

Maintaining Peak Performance

by Bryan Cahill

Pec Minor Stretch
Scalene Stretch
Shoulder Raise
Thoracic Extension

Like my 2003 Buell Lightning motorcycle, I still have some good days in me, but I need a lot of maintenance and occasionally, the replacement of an expensive part.

Whether you’re like me and have put in a few miles or you’re more like a late-model Honda CBR, you still need constant maintenance to stay at peak performance.

As I wrote in the 2022 Winter edition of Production Sound & Video, jobs that require raised arms, such as boom operating, may cause the development of thoracic outlet syndrome, or TOS. Symptoms of TOS include pain or weakness in the shoulder and arm, tingling or discomfort in the fingers, and arms that tire quickly.

Luke Kelly of Elemental Movement Personal Training recommends a few stretches that can benefit everyone, but especially Boom Operators, and can be performed on or near set in about the time it takes to grab a cup of coffee. I perform them daily.

First is the thoracic extension:
Place your hands on a wall, a little wider than shoulder width. Think of squeezing your shoulder blades together at the bottom, raising the backs of your hands upward, and trying to sink your sternum toward the wall.

Next is the pec minor stretch:
Stand with one arm against a door frame; start with your shoulder at about a 90-degree angle, now turn your body away from that arm, even turning your toes to face the opposite direction if needed. Repeat with the other arm.

And finally, the scalene stretch:
Open the palms so that they face forward, extend your fingertips firmly toward the ground, look over one shoulder, and breathe. Repeat for the other side.

Additionally, you can add this related technique.
This time one shoulder will raise up toward the ear, and the ear will attempt to meet it. From here, leading from the chin, we turn our head to face the opposite direction of the raised shoulder.

Like regularly changing the oil on my bike, keeping the chain lubed and checking the tires, these stretches can help performance and might help prolong your career.

Safety First: Introducing the XO-Boom

by Eli Moskowitz

SAFETY FIRST
We hear these words on set all the time. Then we turn around and see production ignore safety to accommodate the pace at which the producers would prefer we work. For many Fishpole Boom Operators, this is a physical safety issue. Many Directors have stopped worrying about the length of a take because, now that cameras have gone digital, they no longer need to account for the cost and supply of every foot of film. Keeping your arms and hands up above your head is a taxing workout for even the most avid gym-goer. Now add to that workout the weight of the fishpole, the weight of the microphone and mount, and the weight of the zeppelin and other wind protection if you are shooting outside, plus the transmitter, which, small as transmitters are now, adds weight too. Now consider all that weight and the effort of holding it above your head for long periods while you are also moving around to cover multiple actors, trying to stay out of the shot, and avoiding throwing any shadows. Phew! It's exhausting just talking about it.

ENTER THE XO-BOOM
Over the years, there have been many attempts to create helpful safety rigs for those long takes when you can't put your arms down with a long extension on your pole. Some of those rigs were so over-designed by a specific operator that no one else could really use them, while others limited the strain on the body but left the user almost locked into position. The XO-Boom from Cinema Devices is designed for everyone, with safety and versatility in mind, to assist on a modern production. Cinema Devices has long offered a wide variety of camera rigs for handheld stabilization, and many Steadicam operators have used their products on set for years. Now the company has brought the Sound Department a safety rig of our own. At its core, the XO-Boom works as a "steadiboom," with the padded vest distributing the weight evenly on your body. The height-adjustable mast at the center of the design does the heavy lifting of holding the fishpole in the air, leaving the operator free to focus on getting the sound without worrying that their arms will start shaking during a long take. The tensioned quick-release clamshell can accommodate the diameter of the standard fishpoles on the market, with soft sound-deadening foam on the inside of the shell. For counterforce, there is a rubber foam-inset hook connected to latex elastomer tubing that secures to the vest and balances your pole when using the longer extensions. A pair of custom-designed Squid clamps made by Cinema Devices keeps that hook from sliding up and down the pole.

We are always on the move, and even though you may prefer to boom right- or left-hand dominant, the spaces we find ourselves shooting in don't always allow that luxury. So the XO-Boom is just as versatile: the mast and over-the-shoulder padded strap can be moved from one side to the other as fits your needs.

If it stopped there, that would be enough but Adam Teichman, the designer of the vest, took it a step further and thought of the ENG bag mixers out in the field, too. There are a pair of posts that screw into the bottom of the front of the vest to support your bag and a pair of clips that help keep your bag secure to the rig—all without adding any additional tension or strain to the user’s back or neck. The weight is distributed evenly to rest on the hips and avoid the risk of injury.

MY EXPERIENCE
I am a second-generation sound man who was lucky enough to grow up on Hollywood sets going to work with my dad, Edward L. Moskowitz CAS. My fascination with the magic of movies and television began with me sitting in the sound booth with my dad on Golden Girls and Empty Nest, watching the pros make television. I am one of the few lucky sound men doing the job that they imagined they would have when they were a kid. In my teens, I attended the performing arts magnet program at Pacoima Middle School and worked with other kids my age to bring our own short stories to life on the school’s audiovisual equipment. Looking back, I see what a great learning experience I had at a young age.

I feel privileged I had the opportunity to learn from my own father and worked with him on several shows at the end of his career, including sitcoms Anger Management and Ground Floor and the single-camera series The Guest Book. There were many other talented mixers, boom operators, and utilities who also taught me the tricks of the trade along the way. I moved between sitcoms, such as Call Me Kat with Dana McClure before he retired and continued with Elyse Pecora when she took over this past season, to joining Bruce Peter’s crew for the last few seasons of his career on The Conners and Bob Hearts Abishola. Whenever possible, I worked on single-camera series like Supergirl and Lethal Weapon, filling in on feature films, numerous commercials, pilots, and low-budget projects to hone my skills.

I joined the union in April 2012 after completing most of my Y-16A hours on iRob! (thanks to the UPM Tony Carey). Working my way up from a trainee utility, I took any opportunity offered to me as a second or third boom, which many times gave me the chance to work fishpole scenes. At the beginning of my career, I was usually the youngest man on the crew and, like most young men, I never thought about my back or shoulders. I grabbed that pole, got out there, and showed them what I had. Luckily, I never injured myself over the 11-year career I have enjoyed so far. And I plan to use the XO-Boom to make sure that I never do: to protect myself from injury and ensure that I can continue my career for the many years I still have ahead of me.

DEVELOPMENT
I first met Adam Teichman and his business partner Ariel Benarroch at the sound mixers' swap meet hosted at Film Tools back in the summer of 2020. After the lockdown began, many of us found time to finally go through the gear we'd collected. The parking lot at the swap meet that morning was filled with great deals on equipment, both useful pieces and collector's items. It was also the first time many of us had seen each other since the lockdown began. Adam and Ariel showed up later that day to talk about this new vest rig for the Sound Department. At the time, they lovingly referred to the vest as the "OK boomer." I quickly saw the potential in what this new safety rig had to offer. Over the past ten years as a union member, I have worked in many different formats, from sitcoms to single-camera and small-budget ENG-style features, all of which have had long fishpoled scenes. I saw the opportunity for utilizing this incredible new device.

I told them I would speak to the mixers I was working with and hoped that I would have an opportunity to try it out on set. Working with sound mixer Bruce Peters made everything easier. He excitedly gave me the chance to bring the XO-Boom onto CBS’s Bob Hearts Abishola several times while we were on the backlot filming large exterior scenes. After each opportunity to use the experimental rig, I gave Adam my thoughts and suggestions for improvement, and he would update the prototype to incorporate my suggestions before asking when I could use the newer prototype on set again.

One day, we were doing a Nigerian funeral procession out on the backlot of Warner Bros. Studios, using a pair of Schoeps microphones with elbows in a custom shock mount to form the XY stereo pair needed to capture the singing and instruments used in the scene. While the Camera Department had their standard four studio cameras that day, they also needed a Steadicam operator to shoot the processional. When the Steadicam operator donned his vest rig that day, I pulled out the newest XO-Boom prototype to use with my fishpole. I was able to move and glide alongside the Steadicam with ease, with the fishpole at almost its full 22-foot extension to let me reach over the camera to the singers. Immediate relief and enthusiasm for this device came when I did not have to strain my arms or back to keep that long pole up in the air for more than a half-hour. After the first take, I checked in with Bruce to see how it sounded and he was very happy; none of my footsteps had transferred any noise to the vest, boom, or mics as I moved with the camera. The production team at Bob Hearts Abishola has been very positive about my use of the prototype on set, and I look forward to bringing the newly built production-line model out next season.

The XO-Boom is not just for use as a field rig. I recently needed it on the new Mike O'Malley NBC show We Thought We Were Done. When I arrived on set one Thursday morning, our AD came to speak to me about a scene that we would be shooting in two sets simultaneously. Normally, of course, that's no problem. We are equipped for that: one Fisher Boom system in each set. But this was different. We needed both Fishers in the large apartment set, where one character comes down the stairs and moves all the way across the set. Then someone would need to use a fishpole to pick up Jon Cryer's dialogue once we transitioned into the second room, where he was making a "quick phone call." We all knew that "quick phone call" meant I would be standing there for a while with the cameras running take after take. So we offered production a few choices: they could either rent another ped from Fisher and bring in another boom operator and pusher, or they could agree to a cost-effective safety-rig option that would allow me to work the scene as a boom operator, moving at the pace the production demands, without having to worry about the back or shoulder injuries that become more likely when a take exceeds a few minutes. The producers were happy to hear about this new cost-effective and safer option and authorized my use of the XO-Boom. Everyone involved was pleased with the solution.

This past spring, the 100th annual NAB Show in Las Vegas hosted the world debut of the XO-Boom from Cinema Devices. I went out and walked around the show floor for several hours while wearing a full XO-Boom system, complete with an extended fishpole from K-Tek. The strain on my shoulders and back was negligible and I was no worse for wear the next day. The reaction from the show-goers was enthusiastic and excited; this is something we have all been waiting for.

Sound Awards 2023: 59th CAS Awards

The awards for outstanding sound mixing in film went to:

MOTION PICTURE – LIVE-ACTION

Accepting the award, Foley Mixer Blake Collins CAS

Top Gun: Maverick
Production Mixer: Mark Weingarten
Re-recording Mixer: Chris Burdon
Re-recording Mixer: Mark Taylor
Scoring Mixer: Al Clay
Scoring Mixer: Stephen Lipson
Foley Mixer: Blake Collins CAS
Production Sound Team: Tom (Huck) Caton Boom Operator,
Kevin Becker Sound Utility, Jeff Haddad Additional Sound Mixer,
Eric Ballew Additional Sound Mixer, Zach Wrobel Additional Utility, Cara Kovach Additional Utility

MOTION PICTURE – ANIMATED

Guillermo del Toro’s Pinocchio
Original Dialogue Mixer: Carlos Sotolongo
Re-recording Mixer: Jon Taylor CAS
Re-recording Mixer: Frank Montaño
Scoring Mixer: Peter Cobbin
Scoring Mixer: Kirsty Whalley
Foley Mixer: Tavish Grade

MOTION PICTURE – DOCUMENTARY

Baz Luhrmann, Jens Rosenlund, David Giammarco, Paul Massey and Andy Nelson attend the Cinema Audio Society Awards at the InterContinental Hotel in Los Angeles, CA on Saturday, March 4, 2023 (photo: Alex J. Berliner/ABImages)

Moonage Daydream
Re-recording Mixer: Paul Massey CAS
Re-recording Mixer: David Giammarco CAS
ADR Mixer: Jens Rosenlund Petersen

NON-THEATRICAL MOTION PICTURES or LIMITED SERIES

Obi-Wan Kenobi Part 1 EP. 6
Production Mixer: Julian Howarth CAS
Re-recording Mixer: Bonnie Wild
Re-recording Mixer: Danielle Dupre
Re-recording Mixer: Scott R. Lewis
ADR Mixer: Doc Kane CAS
Foley Mixer: Jason Butler
Production Sound Team: Ben Greaves Boom Op and 2nd Unit Sound Mixer, confidant, best friend, Erik Altstadt Boom Op, Yohannes Skoda Utility, Chris Burr and Yisel Pupo Calles Sound Trainees,
Scott Solan Boom, Cole Chamberlain Boom

TELEVISION SERIES – ONE HOUR

CAS Award Winners Phillip W. Palmer CAS, Stacey Michaels CAS, Larry Benjamin CAS, Kevin Valentine and Tara Paul attend the Cinema Audio Society Awards at the InterContinental Hotel in Los Angeles, CA on Saturday, March 4, 2023 (photo: Alex J. Berliner/ABImages)

Better Call Saul S6 Ep. 13 “Saul Gone”
Production Mixer: Phillip W. Palmer CAS
Re-recording Mixer: Larry Benjamin CAS
Re-recording Mixer: Kevin Valentine
ADR Mixer: Chris Navarro CAS
Foley Mixer: Stacey Michaels CAS
Production Sound Team: Mitchell Gebhard Boom Operator,
Andrew Chavez Utility Sound Technician

TELEVISION SERIES – HALF-HOUR

Kiowa Gordon, Andrew Garrett Lange CAS, Penny Harold CAS, Olivia Liang, Erika Kosi and Ed Moskowitz attend the Cinema Audio Society Awards at the InterContinental Hotel in Los Angeles, CA on Saturday, March 4, 2023 (photo: Alex J. Berliner/ABImages)

Only Murders in the Building
S2 Ep. 5 “The Tell”

Production Mixer: Joseph White Jr. CAS
Re-recording Mixer: Penny Harold CAS
Re-recording Mixer: Andrew Garrett Lange CAS
Scoring Mixer: Alan Demoss
ADR Mixer: Chris Navarro CAS
Foley Mixer: Erika Koski
Production Sound Team: Jason Benjamin, Timothy R. Boyce Jr.

TELEVISION NON-FICTION, VARIETY OR MUSIC – SERIES or SPECIALS

Marisa Davila, Charles Dayton CAS and Cheyenne Isabel Wells attend the Cinema Audio Society Awards at the InterContinental Hotel in Los Angeles, CA on Saturday, March 4, 2023 (photo: Alex J. Berliner/ABImages)

Formula 1: Drive to Survive
S4 Ep. 9 “Gloves Are Off”

Re-recording Mixer: Nick Fry
Re-recording Mixer: Steve Speed

CAS FILMMAKER AWARD

CAS Award Winner Alejandro González Iñárritu at the Cinema Audio Society Awards at the InterContinental Hotel in Los Angeles, CA on Saturday, March 4, 2023 (photo: Al Seib/ABImages)

Director Alejandro González Iñárritu

CAS CAREER ACHIEVEMENT AWARD

Career Achievement Award Winner Peter Devlin attends the Cinema Audio Society Awards at the InterContinental Hotel in Los Angeles, CA on Saturday, March 4, 2023 (photo: Alex J. Berliner/ABImages)

Production Sound Mixer
Peter J. Devlin CAS

STUDENT RECOGNITION AWARD

Brandyn Johnson, Sherry Klein, Timo Nelson, Colette Grob, Sophia White, Maria Clara Calle and Chelsea Rae Adams attend the Cinema Audio Society Awards at the InterContinental Hotel in Los Angeles, CA on Saturday, March 4, 2023 (photo: Alex J. Berliner/ABImages)

Timo Nelson
from The University of Texas at Austin

AMPS AWARD WINNER

All Quiet on the Western Front
Production Mixer: Viktor Prášil,
Re-recording Mixer: Lars Ginzel,
Re-recording Mixer: Stefan Korte,
Scoring Mixer: Daniel Kresco,
ADR Mixer: Jan Meyerdierks,
Foley Mixer: Hanse Warns
Production Sound Team: Ondrej Vondracek Boom Operator, Lukas Kuchar Sound Utility, Jan Mesany Trainee, Jan Sulcek & Peter Hilcansky Additional Sound Mixers

BAFTA WINNER

LONDON, ENGLAND – FEBRUARY 19: (L-R) Lars Ginzel, Viktor Prasil and Frank Kruse accept the Sound Award for ‘All Quiet on the Western Front’ during the 2023 EE BAFTA Film Awards, held at the Royal Festival Hall on February 19, 2023 in London, England. (Photo by Stuart Wilson/BAFTA/Getty Images for BAFTA)

All Quiet on the Western Front
Production Mixer: Viktor Prášil
Re-recording Mixer: Lars Ginzel
Re-recording Mixer: Stefan Korte
Scoring Mixer: Daniel Kresco
ADR Mixer: Jan Meyerdierks
Foley Mixer: Hanse Warns
Production Sound Team: Ondrej Vondracek Boom Operator,
Lukas Kuchar Sound Utility, Jan Mesany Trainee, Jan Sulcek
& Peter Hilcansky Additional Sound Mixers

OSCAR WINNER

Mark Weingarten, James Mather, Al Nelson, Chris Burdon, and Mark Taylor pose backstage with the Oscar® for Sound during the live ABC telecast of the 95th Oscars® at Dolby® Theatre at Ovation Hollywood on Sunday, March 12, 2023.

Top Gun: Maverick
Production Mixer: Mark Weingarten
Re-recording Mixer: Chris Burdon
Re-recording Mixer: Mark Taylor
Scoring Mixer: Al Clay
Scoring Mixer: Stephen Lipson
Foley Mixer: Blake Collins CAS
Production Sound Team: Tom (Huck) Caton Boom Operator,
Kevin Becker Sound Utility, Jeff Haddad Additional Sound Mixer,
Eric Ballew Additional Sound Mixer, Zach Wrobel Additional Utility,
Cara Kovach Additional Utility

Names in bold are Local 695 members

Ric Rambles

by Ric Teller

Satchel Paige gave this advice on how to stay young: “Don’t look back. Something might be gaining on you.” My advice: “Take a look, it’s gonna gain on you anyway.” For the record, I got to see him employing the Bat Dodger, the Hurry Up Ball, and other unique pitches at Duncan Field, in Hastings, Nebraska, while on a barnstorming tour.
Yes, I am that old.

In February, Murray Siegel, A2 emeritus (a word that is etymologically related to merit), and I were talking while on the way into the Grammys at Crypto.com Arena. I mentioned that it might be my last one. Don’t hold me to that; I’ve been fooled before … by myself. But dancing around moving band carts and hopping over a stage full of cables becomes more difficult year after year. Anyway, Murray reminded me that we are at a station in life and work where “lasts” are a reality. We agreed that being aware of those situations gives us a perspective on where we have been. So, here’s to the last Grammys as a band guy, the last long wrap, the last terrible catered meal, the last 1,500-foot piece of fiber-optic cable, tangled to the point that it should be used as prop spaghetti for Godzilla, if Godzilla eats spaghetti. There is no photographic evidence of that massive tangle. Keith Hall thought about taking a picture but was dissuaded. Fiber-optic technology has changed the way we make our shows and, for better or worse, has extended my career by lightening the physical workload. I suppose soon, fiber will run from a central hub to all locations on a production, linking to a magic decoder box that will provide connections to video of any flavor, audio (both directions), comms, and timecode, and featuring a spigot serving a hot cuppa Joe from Eric Johnston’s Single Batch Coffee Roasters. If that was indeed my last Grammy wrap, I felt it. The next morning, I was able to get up and go to work on a Beach Boys tribute, but I’m not sure I was any help to Ray, Ozzie, Henry, and my friend, Robyn. Don’t ask that lovely crew, they’re too kind to tell you the truth.

Pronunciation guide: Emeritus—put the accent on the second syllable. If you accent the third syllable, it sounds like an illness. It isn’t.

As I am writing this, the Oscars are approaching. That I am allowed to work with the excellent group of people that takes on that massive undertaking and that I get to be around the terrific Oscar orchestra is always a treat. Once upon a time, I was a musician. Not great, but good enough that I have a true appreciation for the amazing players that will gather at the Dolby. Some I have known for a very long time. Of my regular annual shows, only The Oscars and The Kennedy Center Honors have orchestras. I am grateful to get to work with both talented groups. The Orchestra Whisperer.

The Oscars and the Grammys employ an impressive number of our Local 695 members both on the main show and on the many red carpet shows. I’ve managed to avoid red carpet shows for many years and won’t do another, but about thirty years ago, while we were doing the Oscars at the Dorothy Chandler Pavilion, mixer Paul Sandweiss called the late Evan Adelman and me into his booth. If you never had the pleasure of working with Evan, you missed out. He was not only a terrific sound guy but an excellent person, missed by all who knew him. We had just finished dress rehearsal and Paul informed us that we were going to do a show on the red carpet before the Oscars began. I was the A2, and Evan mixed (that might have been the first time he mixed live on the air). I grabbed two Vega RF transmitters and receivers and two Sennheiser 416 mics. The RF’s were the main and backup hand mics for our host, Oprah Winfrey, and I lovingly tossed the 416’s up into a nearby tree to catch some of the pre-Oscar crowd noise. That was it. The whole setup. When we finished, Paul took the mixing chair for the main show, Evan ran inside to A2 with Murray, and I quickly wrapped the Oprah red carpet show and hurried inside to join them. Needless to say, the red carpet has become a bit more complicated since then.

There are a couple of reasons that these rambles are not technical in nature. The obvious one is that I’m not smart in that way. When it comes to understanding how things really work, I often don’t. Recently, I was patching a show and met with an unfamiliar issue. We ran a Tac-12 fiber cable from the Denali silver remote truck to the stage, where we connected it with a Calrec Hydra, the stage box that connects with the Calrec console in the truck. Normally, when connected, there are blinking indicator lights on the Hydra that we call heartbeats. I was under the impression that when the heartbeats were blinking, the Hydra was connected. It turns out that in this case, we did have heartbeats and fiber connectivity, but data was not passing. Fortunately, Matt Herchko, one of the terrific Denali engineers, showed us some persistent troubleshooting and before long, we had heartbeats and data. It took an engineer, not an A2, to figure this out, which supports Joe’s Third Axiom: Once you know where the electrons go, you can’t work on the floor no more.

A couple of days ago, as Patricia and I were driving to the Valley (of no return) to meet friends for lunch, my wife asked if I had ever worked with Eric Clapton. Short answer, yes. Now, sitting at my Ashley Discount Furniture Hecho en China desk, I recall the first time. The mixer, Don Worsham, called to inquire about my availability on April 15. In 1987. He told me he couldn’t find any A2’s. Was I busy that day? Back then I was often available. He indicated that it would be a relatively easy show (we had done the Grammys about six weeks prior). I would set one band and hand out a few RF mics. I arrived early at the venue, The Ebony Showcase Theatre on Washington Boulevard. Like so many, that building is gone. I ran cables from the Greene Crowe truck. BT-1, I suppose. Who remembers? Then set up the band mics, including all guest instruments, a handful of RF’s for vocals, and some audience mics. Soon, we were ready for soundcheck and rehearsal. The day flew by. That show, B.B. King: A Blues Session, was produced and directed by Ken Ehrlich, who has been involved with many of the memorable music shows in my career. The tight, eight-piece B.B. King band accompanied an all-star roster of guests. Phil Collins on drums, Dr. John on keyboards and vocals, Paul Butterfield played harp (the harmonica, not the instrument featured in Marx Bros. films), and vocals were provided by B.B. King, Gladys Knight, Etta James, Chaka Khan, and Billy Ocean. The ensemble was rounded out by a trio of guest guitarists: Albert King, Stevie Ray Vaughan, and Eric Clapton. That was a full day. A heck of a day.

AC/DC picks. Photo: Patricia Pittington Teller

I’m not big on band swag and never ask to take a photo (although I did sneak one at the Grammys a while back; if you know, you know), but in early February 2015, the opportunity to ask for something presented itself, and I took full advantage. The aforementioned Ken Ehrlich booked AC/DC, one of Patricia’s favorite bands, to open The Grammys that year, performing “Rock or Bust” and “Highway to Hell.” It was the best opening in my thirty times on that show. After the band rehearsed, I asked a guitar tech if I could take a pick for my wife. A bit later, he presented me with three, from Cliff Williams, Malcolm Young, and Angus Young. She says it is one of her favorite Valentine’s Day gifts.

My ramble in the spring issue extolled the virtues of Oboz Low Sawtooth hiking shoes, my footwear for busy shows. I regret to admit that I failed to acknowledge the friend who suggested these might be a good choice. I would like to publicly thank Patty Scripter’s husband for leading me down the path of comfort.

A few weeks back, I stopped by the Local 695 office where our president presented my 40-year pin.

Pin presentation with Jillian Arnold and Joe Aredas, Jr.

Thank you, Jillian, James, and Joe (I believe your dad was at my initiation). Forty is kind of a big deal for me. Real math. Craig will think that’s funny. Contemplating retirement, forty will be my last milestone. I started working shows in the fall of ’81 but waited to be initiated by a fellow Nebraskan, Roy Brewer, who was a friend of my dad’s. For those who missed my spring ramble, there is a terrific photo of my dad with a popcorn machine at one of his theaters. Given the chance, I’m sure some of my early coworkers at KTLA would have bet the farm against me making it this far. Forty years is a long time. A long, very enjoyable adventure. It would take an entire issue to list the mentors, coworkers, teachers, and especially the friends who have made the forty so gratifying, a word that is an etymological cousin to grateful. I am. Truly.

“Sound! Camera! Plates! Action!”

by Omar Cruz Rodriguez

Prep for TV series All Rise, Season 3.

My Hollywood journey began in high school, when my history teacher assigned a presentation project to our class. We had three options: we could give an oral presentation in front of everyone, we could write a paper, or we could make a video. I was one of the few who chose to make a video and, quite by accident, I fell in love with the process. After a buddy of mine showed me how to use Windows Movie Maker on my family’s home computer, I was hooked. After graduation, I signed up for the film program at my local community college and eventually got my first industry job at a rental house. I worked my way up from delivery driver to prep tech, where I got lots of experience working on specialty rigs, VR setups, and multi-cam Bullet Time rigs. As I became more proficient with specialty equipment, my manager began sending me on set to engineer the gear for a variety of high-profile music videos for artists, including Selena Gomez, Bad Bunny, and Migos. As it turned out, this work in specialty rigs became a primer for the sort of video wall engineering work that I do today.

Prepping the LED Walls at Paramount Studios

Since 2019, a growing number of film and television productions have been making use of video wall technology. By building on the sort of projection and playback work Local 695 engineers have been doing for decades, productions can incorporate large walls made up of interconnected LED panels into the set. Rather than utilizing green screens or going out on location, an LED wall allows filmmakers to simulate nearly any environment in a controlled setting. Not only does this reduce the need for expensive and time-consuming post-production visual effects work, but it also allows, for example, a magic hour sunset that can last an entire shooting day, or the ability to place a full moon in the exact spot you want. This gives the Director of Photography more creative freedom and allows for a consistent look between setups. For actors, the immersive nature of virtual production gives them something to play off of during performances. It’s a lot easier to react to a beautiful, awe-inspiring vista when it’s being simulated right in front of you. More importantly, the director now has the ability to see something much closer to the final product when capturing scenes. It helps productions feel confident they’ve shot the necessary footage to assemble a given scene, and it can even be used as a tool during pickups to recreate an exact environment as it existed months or years prior.

Our Flag Means Death Season 1, filmed at Warner Bros. Studios.

Video wall work makes up the majority of the work that I do today. In the past two years, I’ve worked on a variety of productions, including Station 19, The Idol, Sausalito, Bishop’s Birthday/Sugar, All Rise, After Party, The Deliverance, Good Trouble, Our Flag Means Death, and White Noise. The Deliverance, directed by Lee Daniels with DP Eli Arenson, was probably the most significant experience so far. I was entrusted with taking over the role of a lead LED playback operator. During the shooting process, many responsibilities fall on the playback operator and I needed to be on the ball in order to do my job correctly. That said, the experience was a huge success and it felt phenomenal to work with such an awesome team and to know how integral my contributions were to the process.

Most recently, I’ve started working with Stargate Studios—a VFX house that has expanded into virtual production work. Working with Stargate has been a great experience. It’s a welcoming environment focused on innovation and growth. Stargate CEO Sam Nicholson and his team have a long history of creating solutions for various productions, the most noteworthy of which include The Walking Dead and Grey’s Anatomy. I was hired by them to work as a media playback operator for the HBO series Our Flag Means Death, a comedy executive produced by and starring Taika Waititi. This show was incredibly unique because the majority of it was shot utilizing a huge LED wall and virtual production techniques. The show had terabytes of ocean plates to manage and a 30-by-160-foot LED wall that wrapped around a huge pirate ship holding the entire cast and crew. Stargate was pivotal in engineering the demo that green-lit the virtual production aspect of the series. It really showcased how virtual production can help immerse actors and save production a lot of money in CGI and extensive post work.

Prep & setup for the TV series Station 19.
On set for the TV series Good Trouble.
A Car Process setup from Station 19.
Virtual production on the Apple TV series Sugar.

However, operating these video walls can present quite a challenge and requires input from multiple departments across the set. This means that a video playback engineer or LED technician needs to be something of a jack of all trades. Getting an image up on the screen and playing it back involves an understanding of cameras, their color sciences, video color spaces, lighting workflows, playback software, nonlinear editing programs, and so much more. The newest setups utilize advanced camera tracking and 3D animation, so even a little understanding of Blender, Unity, and Unreal is helpful. With so many different variables, it’s important to understand how different units interact with each other in order to produce different types of images.

The video wall workflow starts after a DP has selected the plates for the shoot and relays that selection to the LED wall team. It helps to know how the plates were shot, in case they need a corresponding camera LUT applied, and to ensure that they are optimized for playback. Every camera has its own unique color science, which in turn requires its own unique LUT. A LUT, or look-up table, is a file of mathematical values that map input colors to output colors, used to create a specific visual look or to work within a specific color space. Often, the effects are not interchangeable between cameras, meaning that an understanding of multiple cameras and their color sciences is required. Otherwise, you might throw an Alexa Day4Nite LUT on some Red footage and find that the LED wall looks like a scene from Mars. Then comes the setup of the hardware: the LED walls and the playback server. Setting up the LED walls takes a team; installing, leveling, and networking the panels is tedious but crucial. The playback server has deep technical aspects of its own, but its main purpose is to provide tools for the DP and DIT to make any visual changes before and during the shooting process. Every show is different and no DP lights their scene the same way, but no matter what show it is, I strive to be organized, prepared to make changes, and equipped with the tools needed to achieve a desired look.
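For readers curious what a LUT actually does under the hood, here is a minimal sketch in Python. The table values and the helper name are purely illustrative, not any camera vendor's actual color science: a real workflow would use a calibrated 1D or 3D LUT file (such as a .cube) applied by the playback software, but the core idea of mapping each pixel value through a table with interpolation is the same.

```python
# Illustrative sketch of applying a 1D LUT to normalized RGB pixel values.
# The toy LUT below is invented for demonstration; it is not a real
# camera transform.

def apply_1d_lut(pixel, lut):
    """Map each normalized channel value (0.0-1.0) through the LUT,
    linearly interpolating between the nearest table entries."""
    out = []
    n = len(lut) - 1
    for channel in pixel:
        pos = min(max(channel, 0.0), 1.0) * n  # position within the table
        lo = int(pos)
        hi = min(lo + 1, n)
        frac = pos - lo
        out.append(lut[lo] * (1.0 - frac) + lut[hi] * frac)
    return out

# A toy 5-entry "lift the shadows" LUT: dark values are raised,
# highlights are left alone.
toy_lut = [0.05, 0.30, 0.55, 0.78, 1.00]

# Black, middle gray, and white in, graded values out.
graded = apply_1d_lut([0.0, 0.5, 1.0], toy_lut)
```

Because every table encodes one specific transform, feeding footage from a different camera through it produces the wrong colors, which is why, as noted above, LUTs are generally not interchangeable between cameras.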

In the two years that I’ve been doing this sort of work, the process has been a continually evolving one. New panels, new playback solutions, and new network integration options are coming out all of the time. Moreover, as productions become more familiar with the capabilities of virtual production, even greater demands are being made of the technology. This creates something of an endless race for playback specialists, as we have to keep up with these innovations as our careers progress.

Pickup shots for The Idol on HBO.
L-R: Prep work for The After Party, Season 2.
Building the LED Wall for All Rise.
LED calibration & setup from All Rise.
A Car Process setup from All Rise.

Speaking of my career, there are a few people that I would like to shout out. Seth Fine, Shahrouz “Shawn” Nooshinfar, Storm Flejter, Lucas Solomon, and the rest of the crew at Lightning LED really took my experience to the next level. Seth got me into the union. His belief in proper pay and union representation helped me feel that my work and skills really mattered. Storm, along with Seth, helped me become an all-around LED technician, not only running playback but setting up the walls as well. Shawn is the lead engineer, who has also mentored me a great deal, taking deep dives into playback systems whose ins and outs I’m still learning today. Lucas, head of operations, and his son Connor Solomon are some of the hardest-working people you’ll ever meet in this industry.

My journey in production also wouldn’t have been possible without my family. My siblings, parents, in-laws, and especially my wife have all been vital in giving me the support and opportunity to pursue this taxing career. Long hours and an unpredictable schedule make it hard to find reliable day care options, which has made family crucial in caring for our son. Finding the balance between work and life is easier when you have a strong community to pick you up when you need a hand. I am fortunate to have such a strong support system, both at work and at home.


IATSE LOCAL 695
5439 Cahuenga Boulevard
North Hollywood, CA 91601

phone  (818) 985-9204
email  info@local695.com


Copyright © 2025 · IATSE Local 695 · All Rights Reserved