
IATSE Local 695

Production Sound, Video Engineers & Studio Projectionists


Features

Beginnings of Local 695 Part 3

by Scott D. Smith, CAS

This piece is a continuation of the article from the winter 2011 issue of the 695 Quarterly, which examined the early beginnings of Local 695. For those who toiled behind the scenes at the various studios during the mid-to-late 1930s, times were tumultuous. With the economy still reeling from the effects of the 1929 stock market crash, and unemployment in the double digits, Hollywood was not exempt from the crisis that gripped the rest of the nation. With much at stake for both workers and producers alike, a fierce (and bloody) battle ensued for the control of craft unions engaged in film production. In the end, the studios would be the ultimate winners, but there was no shortage of embarrassing moments for both sides.

While much ink has been spilled pertaining to charges of influence peddling during this period, I have tried to steer clear of any conjecture. Any opinions expressed herein are those of the author, and should not be construed as representative of the IATSE.

1935

Still reeling from the effects of the strike actions of 1933, Local 695 (and the IATSE West Coast locals in general) continued in their quest to negotiate a contract with producers. It was tough going. IBEW Local 40 continued to be a thorn in the side of 695, and they had lost a significant number of members to IBEW as a result. With membership dwindling and the possible extinction of the West Coast locals looming large, the International played the only card they had left—bring in the boys from Chicago.

The Chicago Connection

George E. Browne began his show business career in Chicago, having been elected in 1932 as the head of Stagehands Local 2. His assistant and right-hand man was one William “Willie” Bioff, who had an illustrious career as a small-time criminal, running prostitution and minor protection rackets in Chicago’s Levee district.

In the early 1930s, after hitting up a local theater chain for $20,000 in exchange for labor peace, Bioff and Browne went to a local club to celebrate their coup. It was during this drunken outing they had the misfortune of running into a gentleman by the name of Nick Circella, a member of Frank Nitti’s gang, who, along with Al Capone, controlled much of the Chicago mob during the Prohibition years. With the end of Prohibition in 1933 causing a severe dent in their cash flow, the Syndicate needed to come up with some creative ways to keep their empire afloat. The film business suited their needs perfectly. Bioff and Browne were subsequently invited to join the organization. The only acceptable answer was “yes.” Using his position as head of Local 2, Browne was able to exert control over local theater owners by threatening action by the projectionists.

Industry cartoon from the 1930s. From The Story of the Hollywood Film Strike in Cartoons. Cartoons by Gene Price, book by Jack Kistner. From the collection of Dr. Andrea Siegel.

During this period, most of the major theater chains were still owned by the studios. In 1934, Browne, with the backing of the Chicago mob, ran in an uncontested election to head the International. Bioff, as his right-hand man, would accompany him to New York.

Having managed to seize control of the International, Bioff and Browne then went to the heads of Hollywood studios, threatening to disrupt the operations of studio-owned theaters unless they bowed to their demands.

Studio heads, having just lived through an expensive halt in production, were anxious to avoid any more labor problems. A previous, albeit brief, projectionists strike in Chicago had already cost the studios a significant amount of money and they didn’t relish the thought of further disruptions in either production or exhibition.

Studios Go Closed Shop Jan. 2

Thus read the headlines in the December 16, 1935, issue of Variety. After months of wrangling with the National Labor Relations Board and IBEW Local 40 over jurisdiction of soundmen, Local 695 and the International managed to regain representation of studio workers for most crafts.

This was a major coup on the part of the International, and brought at least 4,000 members back into the fold of the IATSE. While the tactics associated with this action would come back to haunt them, it did, at least for the time being, put the question of representation to rest. The move apparently caught many by surprise, including the cameramen, who just 10 days previously had still been trying to sign members of camera Local 659 into the ASC guild.

However, the closed shop conditions did not remain in place very long. By April of 1939, the leaders of the International announced the return of an open shop policy on studio lots. This move was designed to head off a looming battle over charges that the IATSE was acting in collusion with producers to control labor rates and conditions.

1936—The Deal

In 1936, with the events of the previous year still looming large in his mind, Joseph Schenck, head of 20th Century Fox, as well as the producers’ liaison for the Hollywood majors, was called to a meeting in New York with Willie Bioff and George Browne. At that meeting, Bioff declared: “I’m the boss—I elected Mr. Browne—and I want from the movie industry $2 million.” Schenck, astounded by the demand, began to protest, but Bioff warned him: “Stop this nonsense. It will cost you a lot more if you don’t do it.”

Two days later, at a second meeting, Bioff took him aside and confided: “Maybe $2 million is a little too much… I decided I’ll take a million.” In the end, Schenck agreed to pony up $50,000 a year from each of the majors and $25,000 from the smaller studios. Mr. Schenck later took a small bundle containing $50,000 in large bills to the Waldorf-Astoria hotel, dropped it on a bed, and looked out the window. Sidney R. Kent, president of Twentieth Century-Fox Film, came in and did likewise.

A year later, Schenck received another call from Bioff, and repeated the routine. This would continue until May of 1941, at which point Bioff and Browne were indicted and found guilty of extortion in federal court. They were subsequently given sentences of eight and ten years respectively, along with a fine of $20,000. Richard Walsh took over as President of the International. Joseph Schenck, for his part in the scandal, received a sentence of a year and a day, but received a Presidential pardon after serving four months. When faced with charges for his participation in the scandal, Nitti put two .32 caliber bullets in his head while standing in a suburban rail yard. Bioff, not long after his release, was blown up, along with his car, in the driveway of his home in Phoenix. Thus came to an end one of the most scandal-ridden periods in the history of the IATSE.

Local 695 Survives

While the actions of Bioff and Browne brought disgrace to the IATSE, the members of the individual locals continued their fight for fair wages and working conditions. This effort on the part of the members would result in a new, more democratic IATSE Constitution. In addition, to their credit, some members spoke out against the rigged election of Browne as head of the International. For their trouble, they were frequently subjected to beatings by Bioff’s henchmen and “blacklisted” from working.

Tommy Malloy (no angel himself), who headed Projectionists Local 110 in Chicago, was one of those who had protested the influence of the mob during the wildcat projectionists strike of 1935. In response, his Packard, with him at the wheel, was riddled with machine-gun fire on Lake Shore Drive. The message was clear to both studio owners and union employees alike: go along with the program, or face the consequences.

With the issue of jurisdiction settled, at least for the time being, Local 695 went back to the task of organizing its membership, and signing up new members who worked in areas related to sound recording and reproduction. This included not only production sound and re-recording crews, but maintenance technicians and theater sound personnel, as well as those working at laboratory facilities.

One such group was the engineers and technicians who worked for ERPI (Electrical Research Products, Inc.), which was the engineering arm of Western Electric. Most of these men were part of the Western Electric engineering group which handled installation of sound equipment in studio facilities, and the installation and maintenance of theater sound equipment provided by Western Electric. Local 695 had previously signed many of the men who worked for RCA Photophone, and the signing of the ERPI engineers in June of 1936 further bolstered their ranks.

While these hard-won gains helped to establish Local 695 as the primary bargaining agent for production and re-recording soundmen, they would continue the fight for the representation of all soundmen working at theaters and laboratory facilities well into December of 1936.

1937

While Local 695 continued in its efforts to organize those working in sound-related crafts, the fight to maintain representation of soundmen was far from over. On April 30th of 1937, the Federation of Motion Picture Crafts (FMPC) staged a surprise walkout. The FMPC was essentially a coalition of unions under the leadership of Jeff Kibre and covered about 6,000 members in various crafts, including art directors, costume designers, lab engineers, technical directors, set designers, scenic artists, hair and makeup artists, painters, plasterers, cooks and plumbers.

Kibre was a second-generation studio worker. His mother, a divorcée who had moved from Philadelphia in 1908, worked in the art department of some of the studios. After studying English at UCLA, and failing in his bid to become a screenwriter, Kibre joined Local 37 and took a job as a prop maker. He was reportedly a likable man and had a talent for making those around him feel as though he understood their problems. He was also an avowed Marxist and Communist, but apparently did not follow the party line, preferring to make his own determinations as to the correct course of action. As such, the Communist Party leadership refused to support his actions, which left him on the periphery when it came to organizing.

While the April 30th walkout against the studios eventually failed, Kibre was not totally out of the picture. With the help of attorney Carey McWilliams, Kibre reorganized under the banner of the IATSE Progressives, and began a campaign to investigate the mob ties of the International.

While Kibre’s efforts to clear the IATSE of mob influence may have been laudable, his ties (however loose) to the Communist Party ultimately worked against him. To his credit, however, Kibre’s actions led to the resignation of Willie Bioff, as well as the end of the 2% assessment fee levied on all members of the IATSE by George Browne after he had been installed as head of the International.

In the end, Kibre’s attempt to organize various crafts failed amidst the continued allegations of Communist influence, which were picked up on and exploited by the media during the late ’30s and early ’40s. He also received numerous death threats during this period, to the extent that he required a personal bodyguard around the clock. Despite his failure at fully organizing studio workers, he did manage to negotiate a deal to leave town if the IATSE leadership agreed not to persecute the membership of the democratically oriented United Studio Technicians Guild. Upon his departure, Kibre went to work for the CIO fishermen’s union.

Unfortunately, the media attention surrounding Kibre’s Communist Party affiliation provided a further distraction for the studios to exploit, serving to deflect attention from their own role in influencing labor negotiations, as well as their mob ties. This unfortunate scenario played right into the hands of the producers, who were only too happy to instigate any unrest within the labor movement.

It was probably due in part to this unwarranted attention (along with Jeff Kibre’s continued actions against the IATSE) that the membership of Local 695 took the unprecedented step of voting to give up autonomous local leadership during a meeting held on December 22, 1937. Apparently, members felt that they had a better chance of maintaining their current wage structure (paltry as it was) if they let the International handle bargaining with the producers.

It wasn’t until a contentious three-hour meeting, held nine months later on September 16th of 1938, that more than 400 members of Local 695 would finally nominate a new set of officers to the Local, thereby returning control to the officers and members (although the actual election was deferred until the 28th of the month). Likewise, three other key IA locals (Camera Local 659, Laboratory Technicians Local 683, and Studio Mechanics Local 37), also voted to return control of their unions to local leadership. Once again, Harold Smith was voted business representative for Local 695.

The question of certification of Local 695 as the exclusive bargaining agent for soundmen, which was initially filed with the National Labor Relations Board (NLRB) on October 12th of 1937, would continue to drag on into 1939, with no clear resolution.

Keeping Score—A Look at Wages

Given the current economic times we are living in, it is instructive to make a quick comparison of wages during the late 1930s. Below is an illustration of what a sound crew might expect to make on studio-based productions after new wage scales were put into effect in April of 1937, with equivalent comparisons to 2010.

Clearly, nobody was getting rich at these wages, especially when one takes into account that only a very few of the members working in 1937 would be fortunate enough to work 42 weeks a year.

In comparison, it was reported in the September 17th issue of Variety that director Frank Capra received a salary of $100,000 each for three pictures, two bonuses of $50,000 each, plus 25% of the profits. While Capra was certainly an exception, director Rouben Mamoulian was reported to make $50,000 per picture, which is still nothing to sneer at.

Likewise, it is interesting to note that in September of 1938, Technicolor reported gross earnings for the first eight months of $862,612 (approximately $13.2M in 2010 dollars), which was nearly double the earnings for the same period in 1937. Somebody was making money—despite a national economy that was still faltering. (The national unemployment rate in 1937 stood at 14.3%, rising to 19.0% in 1938.)

It is therefore understandable that, when stories such as these hit the press, some crew members who toiled long hours in production might have begun to feel that they were being taken advantage of. A parallel exists today when comparing the salaries of corporate CEOs to those of the workers who produce value for their companies.

1939 and Beyond

After having just approved the return to autonomous control of Local 695 by its newly elected board in September of 1938, the members would reverse this decision six months later. Fearful of losing the gains that had been made over the past years in wages and working conditions, the membership felt that the only leverage they had with studio management was the threat of a walkout by the projectionists.

Therefore, the members of 695 (along with Business Agent and International West Coast rep Harold Smith) felt letting the International handle the bargaining for a new Studio Basic Agreement would offer greater leverage than what they might be able to muster on their own. However, in a nod to local membership, it was agreed that any contract negotiated by the International would be ratified by the membership of the individual locals.

While the tactic of having the International control the negotiations may have been a good move in the short run (it took a threatened walkout of projectionists on April 16th of 1939 to even get producers to agree to come to the table), ultimately it placed a lot of power in the hands of the International, which at this time was still headed up by George Browne.

However, despite the events that would take place in federal court two years later, it is probably fair to say that Local 695, as well as most of the West Coast IATSE locals, would not have been able to survive the union-busting tactics of producers without the support of the projectionists. While some of the tactics employed by IA leaders during this period may be questionable, one must also remember that the studios employed their own “goon squads,” which were equally unsavory in their tactics.

Ultimately, the greed of studio bosses was the factor that forced the rank-and-file membership of craft unions (regardless of their affiliation) to vote for measures that they might otherwise think twice about. Surely, most members of 695 would not have willingly handed over control of their local to the International unless they felt that was the only option left open to them.

While both the International and individual locals have to share some of the blame for the events that took place during this time period, if studio bosses had come to the bargaining table instead of trying to circumvent the rights of workers, things may have turned out differently.

© 2011 Scott D. Smith, CAS

Radio Mike Redux

by Jim Tanenbaum, CAS

NOTE: At the bottom of this online article, you will find the Appendix that is referenced here and in the print edition of the 695 Quarterly.

If you haven’t read David Waelder’s excellent articles in the last two issues of this magazine, please do so at once. In my 44 years of mixing, I’ve watched radio mikes evolve from almost unusable to amazingly reliable, but they still require a knowledgeable sound person to perform properly. David’s information is exactly what is needed, and I would like to add a few more points. Also, the other end of the system (transmitter and mike) needs some explaining too.

RECEIVERS

The directional characteristics of log-periodic (sometimes erroneously called “Yagi”) antennas are different in the vertical and horizontal planes. (Log-periodic antennas are wideband; Yagis are fixed frequency – see Sections 3.1 and 3.2 in the online Appendix.) They are more directional in the plane of the elements, thus, when the antenna is mounted with the elements vertical (as it usually is), the gain falls off more rapidly at about 30 degrees to 45 degrees above and below the horizontal. This is desirable because the actors are not often located high above the ground. The horizontal pattern is much broader, sometimes down only 5-6 dB at ± 90 degrees. As a result, it is not necessary to “track” the actors with the antenna if they move slightly, as I have seen some people do. (Note that TV antennas are oriented horizontally, because of the need to precisely aim them at the TV station’s transmitting antenna, and to reduce reflected signals from other directions – “ghosting”, although that is no longer so much of a problem with digital TV.)

If you have an interfering signal, you can swing the receiver’s antenna and try to null it out. Chances are, the actor will still be within the front lobe of the antenna’s pattern. If not, you can relocate the antenna to get the actor ‘in front’ of it while keeping the interference in the lowest gain direction. This works better than reorienting the antenna horizontally because the null is no deeper, and now the actor may have to be tracked. Important: the greatest null direction is not directly to the sides or rear of the antenna; the pattern is more like a hyper-cardioid or short shotgun mike’s, with nulls at about 135 degrees rearward to the left and right. When you have some free time, set up a transmitter in a fixed position and then rotate the receiver antenna while watching the receiver’s signal strength meter. This will give you a feel for your particular antenna’s pattern. Be sure to do this outside in an open area, so reflections won’t confuse the results. And, if you have more time, move the transmitter to another location and repeat the procedure. Check for the front acceptance angle as well as the location of the rearward nulls on both sides.

Circularly-polarized antennas are indeed good at receiving signals that have had their polarization angle changed by reflection(s), but there is a low-cost alternative. If you are using two 1/4-wave whip antennas, simply orient one 45 degrees to the left and the other 45 degrees to the right, instead of both vertically. Right-angle BNC or SMA adapters are the easiest way to do this if the antennas do not have right-angle connectors themselves. For a pair of sharkfins, modify their mounting brackets to angle their upper edges outward by the same amount. This puts the antennas at a right angle to each other, so at least one will pick up the signal strongly no matter what its polarization angle.
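The ±45-degree arrangement can be sanity-checked with the standard polarization-mismatch loss, 20·log10(cos θ). This short Python sketch (the formula is standard antenna theory; the specific angles are illustrative, not from the article) shows why the worst case for the angled pair is only about 3 dB:

```python
import math

def polarization_loss_db(mismatch_deg):
    """Loss when a linearly polarized wave arrives at an angle
    (mismatch_deg) to the antenna's own polarization axis."""
    return -20 * math.log10(math.cos(math.radians(mismatch_deg)))

# Both whips vertical: a reflection rotated to horizontal polarization
# (90-degree mismatch) is nearly a dead null for BOTH antennas.
# Whips angled +/-45 degrees: whatever the incoming polarization angle,
# at least one whip is mismatched by 45 degrees or less.
worst_case = polarization_loss_db(45)   # about 3 dB down
mild_case = polarization_loss_db(10)    # a small fraction of a dB
```

In other words, the angled pair trades a possible total dropout for a guaranteed, modest 3 dB worst case on one of the two diversity antennas.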

Regardless of what type of antenna you use, keep the cable connecting it to the receiver as short as possible because most coaxial cable has a greater loss than sending the radio signal an equal distance through the air. See Section 4.2 in the Appendix.
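The point about cable loss can be illustrated with rough numbers: free-space loss grows only logarithmically with distance, so the *extra* loss from a slightly longer air path is small, while coax attenuation accumulates linearly along every metre of cable. A hedged Python sketch (the 0.4 dB/m figure is an assumed value for thin coax at UHF; it is cable- and frequency-dependent and not from the article):

```python
import math

ATTEN_DB_PER_M = 0.4  # assumed UHF attenuation; check your cable's datasheet

def coax_loss_db(length_m):
    """Cable loss grows linearly with cable length."""
    return length_m * ATTEN_DB_PER_M

def extra_path_loss_db(base_m, extra_m):
    """Additional free-space loss from lengthening the air path
    from base_m to base_m + extra_m (inverse-square spreading)."""
    return 20 * math.log10((base_m + extra_m) / base_m)

# Actor 50 m from the cart: 5 m of added coax costs ~2 dB,
# while 5 m of added air path costs well under 1 dB.
cable_penalty = coax_loss_db(5)
air_penalty = extra_path_loss_db(50, 5)
```

This is why mounting the receiver (or an in-line amplifier) close to the antenna usually beats running a long antenna cable down to the cart.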

TRANSMITTERS

As to transmitters, there are a number of things you can do to improve the signal that arrives at the receiver antenna:

1. Most intervening objects block the direct signal path, and, since UHF waves are small (about one foot), it doesn’t take a very large object. This includes people, especially the actor wearing the mike. If the actor will be facing you throughout the scene (i.e. facing the receiver antenna on your sound cart), mount the transmitter or at least the antenna (see 3. below) on the front of the actor’s body.

2. Another improvement comes from spacing the bodypack’s antenna as far from the actor’s body as possible. In addition to mounting the transmitter under the outer layer of wardrobe if possible, slipping a length of rubber or plastic tubing over its antenna will increase the radiated power considerably. Automotive supply stores sell tubing for windshield washer fluid that is the correct size: about 1/4-inch O.D. x 1/8-inch I.D.

3. As David mentioned, raising the receiver’s antenna helps. This is also true of the transmitter’s antenna. If you have to mount the transmitter on the actor’s ankle, use an extension to get the antenna higher on the body.

A simple extension antenna can be made from a length of miniature coaxial cable: RG-174 type, with a braided shield and a stranded center conductor.

Start by stripping off several inches of the outer jacket at the end of the coax, being careful not to cut or even nick any of the shield braid wire strands. The length removed should be about an inch and a half more than the length of the whip antenna for the frequency block you are using. Don’t include the length of the connector’s metal shell. (Or you can use the antenna-length Table in the online Appendix. Pick the center frequency of your block.)

Next, carefully push the cut end of the braided shield back to expand it, and continue pushing the shield until it inverts over the remaining outer jacket. Smooth the inverted shield braid out – it should now be the correct length (or slightly longer, in which case trim it back). Cut the now-exposed insulated inner conductor to the correct length, then cover the shield braid and inner conductor with a length of shrink tubing.

After you have successfully completed these steps, cut the coax to a length of five to six feet (to reach from an ankle-mounted transmitter to the shoulder-mounted antenna), and attach the appropriate transmitter-antenna connector to the other end.
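The stripped-shield length in the steps above is simply a quarter-wave whip cut for your block's center frequency, so it can be computed directly as a cross-check on the Appendix table. In this sketch the 0.95 end-effect shortening factor and the 590 MHz example frequency are illustrative assumptions; use your transmitter's documented block frequencies:

```python
C_M_PER_S = 299_792_458  # speed of light in m/s

def quarter_wave_in(freq_mhz, velocity_factor=0.95):
    """Quarter-wavelength in inches, shortened slightly (assumed 0.95
    factor) for the end effect of a thin whip antenna."""
    wavelength_m = C_M_PER_S / (freq_mhz * 1e6)
    return wavelength_m / 4 * velocity_factor / 0.0254

whip_in = quarter_wave_in(590)      # ~4.75 in at an example 590 MHz
jacket_to_strip_in = whip_in + 1.5  # per the text: whip length + ~1.5 in
```

Note that lower blocks need proportionally longer whips, which is worth checking before you cut the shrink tubing.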

4. It also helps to raise the boom operator’s transmitter antenna if using a wireless link. Butt plugs are one solution. If a bodypack transmitter is being used, the extension antenna described above can be mounted on the boom operator’s headphones. I use this method and often get a solid 1,000-foot range. (Zaxcom makes a filtered remote antenna for specific blocks, which also helps to reduce interference with receivers used in a bag.) It is also possible to mount the transmitter as well as its whip antenna to the headphones, although this adds more weight and bulk.

5. One more caution: recently, large (12′ x 12′) metalized cloth scrims (silver or gold) have come into widespread use. Although coated with metal, they absorb radio signals rather than reflect them. Not only will they completely block the signal from an actor behind them, but actors standing in front of one (with transmitters mounted on their backs) will have almost all of the radiated signals absorbed, with resultant R.F. dropouts. This caused me no end of trouble until I figured things out. (For the technically inclined, the characteristic impedance of the metalized fabric is about 50 ohms – see Section 4.3 in the Appendix.)

MICROPHONES

Once the transmission and reception of the radio signal has been optimized, there are also techniques to improve the quality of the audio:

1. Mike mounting position: Basically there are two choices: torso or head.

Torso: Usually, the lavalier is mounted on the chest, located over the sternum (breastbone). This position is a good compromise – any lower and there is too much ambient sound; any higher and the upper voice frequencies are reduced by the “chin shadow,” and there is also an excessive drop in level if the head is turned to the side.

Head: Extra-small lavs like the Countryman B-6 can be hidden in the hair above the forehead. This keeps them “on mike” regardless of any head turns. If the actor wears glasses, concealing the tiny mike at the hinge point is another possibility. If a baseball cap is part of the actor’s wardrobe, the mike can be mounted under the visor. A plastic hard hat is even better because the transmitter can be secured inside the hat, just above the suspension. With both hats, the mike can be concealed under a sheet of felt (see 5. below) that is glued under the visor or brim. If the bump from the mike is visible (be sure to remove any EQ sleeves from the B-6), use two layers of felt, with the inner layer cut out to accommodate the mike and cable.

2. Cable strain relief: A taut cable can pull on the lav and cause it to rub against the clothing. Even if it doesn’t, mechanical noise introduced anywhere along the stretched cable will travel to the mike, where it will be heard. A full 360-degree loop in the cable, secured with strips of tape both below and above it, will break this transmission path. Sennheiser makes a line of lavaliers, such as the MKE-2, that use stainless steel wire instead of copper in the cable. While this construction is extremely rugged and reliable, the stiff steel conductors can carry mechanical noise down the entire length of the cable. Even two loops sometimes do not prevent it from reaching the mike. Using these mikes on studio news anchors usually presents no problems, since they speak up and are relatively motionless. Actors in a dramatic scene, with lowered voices and extensive body motion, often cannot be recorded successfully with these lavs.

3. Mounting lavs directly on the actor’s skin: Individually-packed alcohol swabs are useful in removing skin oils before taping down the mike. There are three types of medical tape available that work well for different situations. The one I use most often is “3M Micropore,” a plastic tape perforated with many tiny holes. These serve to allow perspiration to escape rather than lift the tape by hydraulic pressure. They also make the tape easy to tear cleanly. While all three types are hypoallergenic, for actors who express a concern about their “sensitive skin,” a version of tape made from paper with a less aggressive adhesive may be used, but will require a greater area of contact to remain in place. It is porous but not as much as Micropore. For applications involving abrupt and vigorous body motions, or where the transmitter must be taped to the body, there is a cloth tape that has a much greater tensile strength and a much stronger adhesive. (Avoid body hair if at all possible with this tape.)

Most men have a depression in the center of their chest that is a good spot for the lav. For women, between the breasts (unless they’re pushed together) is ideal, possibly attaching the lav to the center of the bra. If the clothing rubs against the mike, there are two choices: double-sided tape between the cloth and the skin, or attaching one or more “bumpers” to the skin near the mike to keep the cloth away from it. A piece of makeup foam works well for this purpose. Trim the foam to a smoothly-rounded contour on the side where the fabric will contact it and use “TopStick” double-sided adhesive toupee tape on the flat side to attach the bumper to the skin.

4. Chest hair: Some men have a thick mat of chest hair with the consistency of steel wool that rubs on the back of the lav. (Robert Urich was extremely cooperative and shaved a patch of his pelt down to the bare skin every day for me, but you are unlikely to encounter such generosity.) The best solution is to have the actor wear a cotton T-shirt or tank top, but if that is not possible, tape a 6″ square of felt (see 5. below) to the body hair behind the mike, using the paper tape mentioned above (see 3.). You will need lots of tape; use the alcohol swabs liberally. If the actor won’t go along with this, taping two or more layers of felt to the wardrobe so that they cover the back of the mike will help to a certain degree.

5. Windscreens: Foam windscreen material is not very effective when used in thin layers next to the mike. The mesh “ball” windscreen provided with some lavaliers (e.g. Sanken COS-11) is better, but is too large to hide under most wardrobe. I have found that a layer of wool felt provides considerable protection without attenuating high frequencies excessively. Important: you must use 100% wool felt; wool-polyester blends or 100% polyester felt is very noisy. (See Illustration on page 22.)

For most installations, cut the felt into strips about 3/8″x 1″ for B-6s and 5/8″ x 1-1/2″ for Sankens and Trams/Sonotrims.

Next, cover exactly one-half of the strip with a piece of TopStick double-sided tape, notched to clear the business end of the lav. Place the mike on top of the tape, with its end just shy of the middle of the felt strip and the cable running down the center of the strip.

Finally, fold the strip over the mike and press the edges together along its length. This will space the fold in the felt slightly away from the end of the mike to improve the windscreen’s performance.

Buy as many different colors of felt as you can – this will help in concealing the mike, especially when a leather jacket (or other sound muffling material) is involved. If you can match the color of the jacket’s lining, it is often possible to position the mike very near the opening. The various shades of felt are also useful for windscreening and/or concealing planted mikes.

Tram, Sonotrim, and other flat lavs that mount with “vampire clips” have a grill on one side that can be mounted facing the clip so the solid back of the mike faces forward and helps block the wind. The gap between the grill and the clip can be filled with a thin sheet of foam windscreen material, or felt for even more protection.

6. Clothing noises: If you have any input in preproduction as to wardrobe materials, natural fibers such as cotton, linen, wool, and even silk, are preferable to synthetics like polyester. These plastic fibers are much more rigid and will carry sound through the fabric much more readily. Unfortunately, wardrobe people like synthetics because they are wrinkle-resistant and easier to clean. If you encounter this problem on the set, isolating the lav with a piece of makeup foam will help. Latex works best but has recently been replaced by a synthetic to avoid allergic reactions. There are also commercially-produced cylindrical mike sleeves available in black or white foam.

TopStick works well to tack rubbing layers of clothing together. A supply of various sizes of safety pins is also useful. Neckties have multiple layers that can rub together and be picked up by a lav mounted underneath. To complicate matters, the backs of most ties are sewn shut, so you cannot get inside to tape the layers together. You can use a safety pin to immobilize all but the front layer, and sometimes the tie’s pattern will allow you to snag the front layer as well. There is a “silk” safety pin available from dressmakers’ supply stores that is very small and has a flat-black coating, which is ideal for this purpose. (White, pink, and other painted colors are also available for use with sheer wardrobe.)

For completely intractable clothing noise, it is sometimes possible to stick a B-6 out through a button hole and support it on its cable, half an inch away from the fabric. This technique works especially well if you have B-6s in all the available colors. You can also use colored markers on a white mike to match various colors. “Dry-erase” markers are the easiest to remove, but be careful that the color does not rub off before the shot is over.

Two often-neglected sources of noise are flapping zipper tags and the circular springs inside the female part of snaps that rattle when the snap is unfastened. These can be amazingly loud when the lav is nearby. A small piece of double-sided tape will secure the zipper tag to the body of the zipper, and another piece can be wadded up and stuck inside the snap opening. Warning: be sure to remove all the tape from wardrobe items when the shot is over.

7. “Soundproof” wardrobe: Zipped-up leather jackets (when under the collar is not an option) and down-filled parkas are two of the most difficult items to deal with. It is sometimes possible to locate the lav behind the zipper, so the sound can reach it through the gaps in the zipper teeth. If the teeth rub against each other audibly, a small amount of Krazy Glue applied to the teeth immediately in front of the mike will stop that. Another possibility, if the wardrobe person will permit it, is to cut a short section of the stitching that fastens the zipper to the jacket and bring a B-6 out through the gap, leaving the end of the mike flush with the edge of the leather bordering the zipper. Down-filled parkas (or other insulation) are almost impossible to mike successfully, especially nylon ones. The audible noise made by the sleeves rubbing against the torso is so loud that even using a boom mike it is often impossible to get an acceptable track. The muffling effect of the insulation adds to the problem because any part of it that gets between the mike and the actor’s mouth will absorb most of the high- and upper-midrange frequencies. The only saving grace is that most scenes involving such heavily-insulated clothes usually have the actor also wearing some kind of headgear, with the possibility of hiding the mike there.

MYSTERY NOISES

1. If metal objects in the vicinity of the transmitter antenna happen to rub against each other, they can produce static in the audio signal. This occurs because they act like antennas and pick up some of the RF energy from the transmitter. This produces microscopic sparks between them where they touch, and this in turn produces a static radio signal over a wide range of frequencies, including the audio band. This signal can enter the transmitter’s audio circuits where it will be combined with the audio from the mike. Lavs and transmitters with plastic cases are particularly susceptible to this problem. Either separate the offending objects or insulate them where they meet with a piece of tape. (You could also solder or clamp them firmly together.) Some car seats have internal metal springs that rub together. Moving the transmitter from the actor’s back to the front of the body usually solves the problem. A bag transmitter can cause this problem too, unless its antenna is located far away from the other items in the bag, such as on your headphones.

2. Modern automobiles and trucks are equipped with special resistive spark plug wire to suppress ignition interference. But many hot rodders replace it with solid copper ignition wire to improve performance, and this causes the vehicle to radiate a considerable amount of radio interference. Unfortunately, I have encountered this on some camera cars. Motorcycles with magneto ignition systems also produce this type of interference, unless they’re upscale models with a built-in radio. Auto stores sell plug-in suppressor resistors that you can temporarily install between the spark plugs and the cables that attach to them. (Unfortunately, some recent vehicles have the spark plugs hidden under plastic shrouds, or worse, buried under miles of smog control or other plumbing.)

3. A single AC- or battery-power supply can transfer interference between multiple units connected to it unless the individual outputs are isolated with EMI filters. Most commercial power distribution systems incorporate filters, but not all do. The audio input cable to a transmitter used for a camera hop can carry RF energy down its length to whatever is feeding it. (So can Comtek transmitters.) A cylindrical ferrite RF choke snapped over the cable will block most of this, and should be located as close to the transmitter end as possible. Keep it in place with a nylon cable tie, and cover it with shrink tubing.

4. Be sure that the mounting hardware for all transmitter mike input connectors is tightened securely. A loose collet nut on the mike plug can also cause problems. Broken shield wires anywhere along the cable are another point of entry for interference. Periodically check your lavs by listening as you wiggle the cables down their entire length, from mike to plug, while they are connected to the transmitter.

5. Interference from other transmitters (taxicabs, local paging systems, walkie-talkies, etc.) can cause several types of problems. Audible noise, either whistles or the actual program material, affects analog radio mikes. Muting (audio dropouts) occurs in digital systems, both hybrid (Lectrosonics) and full digital (Zaxcom). Both analog and digital systems can suffer RF dropouts if the interfering signal is powerful enough to swamp your receiver’s front end, and analog radio mikes can also have distortion introduced in their audio if they don’t lose your transmitter’s signal entirely. I have found it very useful to carry a small handheld analog scanner receiver to help identify the source of the interference when using digital radios.

In closing, let me tell you a secret: radio mikes work partially by magic, and I have found that a few drops of goat blood applied to the receiver antennas at midnight under a full moon improves their performance by at least 20%. The color, sex, and age of the goat don’t seem to matter, but the animal must be alive when you obtain the blood.

Text and photos © 2011 Jim Tanenbaum, all rights reserved.


APPENDIX
BASIC RADIO ANTENNA TECHNOLOGY

1.0 Radio waves are a form of electro-magnetic energy, like light or gamma radiation. They consist of rapidly varying transverse electric and magnetic fields, oriented at right angles to each other. In a vacuum, they travel at the speed of light, about 300,000 km/sec or 186,000 miles/second, denoted by the letter “c”. In air or other substances, they travel slightly slower. The length of a radio wave is given by λ = c/f, where “λ” (the Greek letter lambda) is the wavelength, and “f” is the frequency. Frequency is measured in Hertz (cycles/sec) and multiples of 1,000: kilohertz (kHz), megahertz (MHz), gigahertz (GHz), etc.

For example, a 300 MHz signal has a wavelength of 300,000 km/sec divided by 300 MHz, or 300,000,000 m/sec divided by 300,000,000 cycles/sec. The dimensions are (m/sec) / (cycles/sec), or m/cycle, so λ = 1 meter, or about 40 inches.
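The arithmetic above is easy to verify with a few lines of Python (an illustrative sketch, not part of the original article; the function names are my own):

```python
# Wavelength from frequency: lambda = c / f
C_M_PER_SEC = 300_000_000  # speed of light, about 300,000 km/sec

def wavelength_meters(freq_hz):
    """Wavelength in meters for a frequency in Hz."""
    return C_M_PER_SEC / freq_hz

def wavelength_inches(freq_hz):
    """Wavelength in inches (1 meter = 39.37 inches)."""
    return wavelength_meters(freq_hz) * 39.37

print(wavelength_meters(300e6))   # 300 MHz -> 1.0 meter
print(wavelength_inches(300e6))   # -> 39.37, i.e., "about 40 inches"
```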

1.1 In addition to frequency/wavelength, radio waves have another parameter known as “polarization”. This refers to the orientation of the axis of the electric field to some reference such as the surface of the earth. A vertical whip transmitter antenna produces radio waves with a vertical polarization, and this signal will be most effectively received by another vertically-oriented antenna. However, the polarization of a signal can be changed by reflection. Reflection off a horizontal surface can rotate the polarization axis up to 90 degrees from the vertical. Reflection off a vertical surface can also rotate the polarization axis, depending on the angles involved.

It is also possible to generate a circularly-polarized radio wave, whose polarization axis rotates (either CW or CCW) as the wave travels.  Because the required antenna is large, it cannot be used with concealed bodypack transmitters, but it is sometimes used with receivers – see Section 3.3 below.

1.2 The “Inverse-Square Law” describes energy that falls off as the inverse square of the distance, or E ∝ 1/d². (E.g., 1/4 the power at twice the distance, 1/9 the power at three times the distance, 1/16 the power at four times the distance, and so on.) Theoretically, this law applies to a radio signal transmitted by any antenna, whether omni- or highly-directional, but in the real world, the ground and other nearby objects (including people) can absorb some of the signal and reduce its level even faster. At best, the Inverse-Square Law can be used for an estimate of the minimum loss.

The signals transmitted by both omni- and directional antennas obey the 1/d² rule, but at any given distance the signal level from the directional transmitter antenna will be higher along its axis.
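The falloff figures in the parenthetical above work out like this (an illustrative sketch, not from the article):

```python
def relative_power(distance_multiple):
    """Inverse-Square Law: fraction of the received power remaining
    when the distance is multiplied by `distance_multiple`."""
    return 1.0 / distance_multiple ** 2

for d in (2, 3, 4):
    print(f"{d}x distance -> 1/{d * d} of the power:", relative_power(d))
# 2x -> 0.25, 3x -> about 0.111, 4x -> 0.0625
```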

1.3 The power level of the transmitter also has an effect on range, but not as great as some mixers believe.  To double the range (using the I-S Law) requires four times the power, or going from 50 mW to 200 mW, with the resultant reduction in battery life.  (Doubling the power is only a 3 dB increase, because of the nature of decibel arithmetic.)  Moving the receiving antenna closer is preferable, if you can move the receiver along with it to avoid a long run of antenna cable – see Section 4.2 below.
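The decibel arithmetic mentioned above can be checked in a couple of lines (a sketch for illustration only; the function name is my own):

```python
import math

def power_gain_db(p_new_mw, p_old_mw):
    """Power ratio expressed in decibels: 10 * log10(new/old)."""
    return 10 * math.log10(p_new_mw / p_old_mw)

print(round(power_gain_db(100, 50), 1))  # doubling power: 3.0 dB
print(round(power_gain_db(200, 50), 1))  # 4x power (doubles the range): 6.0 dB
```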

2.0 Radio transmitters and receivers couple to radio waves by means of antennas.  There are two types of antennas: electric and magnetic.

2.1 The simplest electric antennas are constructed of a straight length of wire (called a whip) connected at one end to the transmitter or receiver and free on the other end. The length is ¼ of the wavelength of the desired frequency, usually written as λ/4. If you look at a single cycle of a sinewave, you will see that it starts at zero, reaches its maximum positive value at λ/4, returns to zero at 2λ/4, reaches its maximum negative value at 3λ/4, and finally ends back at zero at λ. Thus a λ/4 whip antenna will produce the maximum voltage at its end, and have the largest current induced in it. A shorter or longer antenna will produce less voltage and current at the same frequency because it intercepts the wave at a point before or after the maximum. A vertical λ/4 antenna is omni-directional in the horizontal plane, with a null at the top and reduced output below the horizontal. If space is limited, the full length of wire can be coiled up in a helical configuration to shorten the overall length (the typical “rubber ducky”). It will appear to be the same length (and frequency) to the radio signal, but the output will be somewhat less than an equivalent straight antenna’s because its smaller size doesn’t capture as much of the signal’s energy.

The performance of a simple λ/4 whip can be improved by the addition of a second λ/4 whip underneath, pointing downward. This configuration is known as a “dipole” (and now is a half-wave antenna, or λ/2), and has an omni-directional pattern in the horizontal plane and a figure-8 pattern in the vertical plane, with the nulls at the top and bottom. (This dipole element is at the heart of many advanced designs, to be discussed below.)

If a λ/4 whip antenna is mounted on top of a metallic surface (called a “ground plane”) whose dimensions are at least as large as the antenna’s, it will appear to be “reflected”, and act similar to a physical dipole.  It is also possible to have several ground wires sticking out radially from the base of the whip instead of a solid sheet of metal for a ground plane.

2.2 The following table gives the length for λ/4 antenna elements.  Look in the row designated by the first digit of the desired frequency, and then in the column headed by the remaining digits.  (E.g. for the cell in the third column of the fifth row, 7.0 inches, the frequency is 400 + 20, or 420 MHz.)  For exact lower frequencies between the table’s 10 MHz intervals (e.g. 14 MHz), or frequencies outside the table’s 10 – 990 MHz range, divide 2,946 inches by the desired frequency in MHz.  (E.g. for 1,000 MHz: 2,946/1,000 = 2.9 inches.)

λ/4-ANTENNA LENGTH IN INCHES

MHz     00      10      20      30      40      50      60      70      80      90
000     n/a     294.6   147.3   98.2    73.6    58.9    49.1    42.0    36.8    32.7
100     29.5    26.8    24.6    22.7    21.0    19.6    18.4    17.3    16.4    15.5
200     14.7    14.0    13.4    12.8    12.3    11.8    11.3    10.9    10.5    10.2
300     9.8     9.5     9.2     8.9     8.7     8.4     8.2     8.0     7.8     7.6
400     7.4     7.2     7.0     6.9     6.7     6.5     6.4     6.3     6.1     6.0
500     5.9     5.8     5.7     5.6     5.5     5.4     5.3     5.2     5.1     5.0
600     4.9     4.8     4.8     4.7     4.6     4.5     4.5     4.4     4.3     4.3
700     4.2     4.1     4.1     4.0     4.0     3.9     3.9     3.8     3.8     3.7
800     3.7     3.6     3.6     3.5     3.5     3.5     3.4     3.4     3.3     3.3
900     3.3     3.2     3.2     3.2     3.1     3.1     3.1     3.0     3.0     3.0
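The whole table boils down to the 2,946/f rule given in Section 2.2, which is simple to put into code (an illustrative sketch, not from the article):

```python
def quarter_wave_inches(freq_mhz):
    """Quarter-wavelength antenna element length in inches,
    per the rule in the text: 2,946 divided by the frequency in MHz."""
    return round(2946 / freq_mhz, 1)

print(quarter_wave_inches(420))   # 7.0 (the 420 MHz example from the text)
print(quarter_wave_inches(1000))  # 2.9
```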

2.3 For low-frequency applications (<50 MHz), magnetic antennas are preferred because they are much smaller than a λ/4 electric antenna, which would be 60 inches or longer at those frequencies.  Magnetic antennas are coils of wire, wound on a plastic or ferrite core.  They are directional with a 3-dimensional figure-8 pattern, whose nulls are aligned with the winding axis.  These antennas are most commonly encountered in portable A.M. radios (about 0.5 – 1.5 MHz), and will not be discussed further here.

3.0 More complicated (electric) antenna designs are said to have “gain”, but this is not the same as the gain produced by an amplifier.  Instead, it refers to comparing their performance to a simple λ/2 dipole antenna.  Two of the most common types are Yagi and Log-periodic.  These two antenna types are different in both construction and functioning.

3.1 Yagis are tuned to a specific frequency and have very little performance above or below it.  This type of antenna has a “driven element”, a λ/2 dipole with one or both arms insulated from the supporting strut (unbalanced or balanced circuit) and at right angles to it.  Behind them is a “reflector” element: two arms of slightly greater (about 5-10%) length that are both grounded to the strut.  In front of the driven element are “directors”: pairs of slightly shorter (about 5-10%) arms also grounded to the strut.  All the directors are the same (shorter) length, and the more of them there are and the longer the strut, the more directional and higher gain the antenna is.  The spacing of the reflector, driven element, and directors is constant, but can vary greatly, from λ/10 to λ/2 in different designs.  Gain ranges from 4 dB for shorter antennas to 12 dB for ones 10 λ long.

Here is a very basic (and over-simplified) explanation of how a Yagi antenna functions.  The reflectors are longer (lower frequency) than the driven elements, thus the incoming radio wave is too small to go around them, and is reflected back toward the driven element.  Technically speaking, the reflector reacts “inductively”, causing the phase of the current induced in it by the radio wave to lag behind that of the voltage.  The directors are shorter (higher frequency), so the radio waves coming in at a small angle are “snagged” as they pass around them, and are bent toward the driven elements.  Waves coming in at progressively greater angles are deflected away.  Waves arriving directly on axis are not affected.  The directors react “capacitively”, causing the phase of the induced current to lead that of the voltage.  Like the microphones mentioned in the article, there is some sensitivity to waves arriving from the rear.

The use of Yagis in production is limited to fixed-frequency applications, such as a remote feed link, and for this application the elements are usually oriented horizontally.

3.2 Log-Periodic antennas are designed to operate over a range of frequencies, 2:1 or even 3:1.  There are a number of pairs of elements, of differing lengths, arranged on the supporting strut with the shortest element at the front and the longest at the rear.  Unlike the Yagi, the spacing between the elements decreases toward the front.  This type of antenna does not have particular elements with assigned functions.  Instead, all the elements are insulated from the strut and connected together, with alternating pairs connected out of phase.  It is easy to see this on a “shark fin” printed circuit model.  On one side, the upper elements of pairs 1, 3, 5… are connected together, and also connected to the lower elements of pairs 2, 4, 6…  The other side has the upper elements of pairs 2, 4, 6… connected together and also connected to the lower elements of pairs 1, 3, 5…  This effectively “shorts out” all the antenna elements, so every pair of elements can act like a reflector or director as in the Yagi configuration.

With one exception: the pair of elements whose length most closely matches the desired frequency – they will now become the driven element.  In their case, the RF signal they intercept will be so much larger than the out-of-phase signals from all the other pairs that it overwhelms them.  Now all the longer elements behind the driven element act like reflectors, and the shorter ones in front act like directors.  As in the Yagi, the more directors there are, the greater the gain and directivity, and this occurs at the lower frequency end of the range.  The total number of element pairs in the antenna design has a significant effect on the gain for another reason: with more pairs of elements, the length of the pair acting as driven elements will more closely match the wavelength of the desired frequency, and have a voltage closer to the maximum.

The compromise in making the Log-Periodic antenna frequency agile is that it has less gain than a Yagi of equal spar length, typically 3-6 dB for a frequency range of 2:1.  Also, the gain is not constant over the frequency range.  You should check your particular antenna’s performance (as mounted on your cart) at different frequencies to find if there are any “dead spots” (gain of only 1-2 dB, or sometimes even less).

3.3 “Circularly polarized” and “helical” designs are currently the most expensive types, and offer the ability to handle radio signals with polarizations of any angle. However, simply angling the two conventional antennas of a diversity pair outward about 45 degrees each will accomplish much the same effect at no additional cost.

4.0 “Impedance” is a characteristic of electrical components and/or circuits that contain capacitance and/or inductance in addition to resistance.

4.1 Radio mike antenna systems are designed to have an impedance of 50 ohms (Ω, the Greek letter omega), and use 50 Ω coaxial cables, most commonly type RG-58A/U.

Video systems are designed as 75 Ω systems and use 75 Ω type RG-59 A/U cable.  If video cables are used to extend radio mike antennas, there will be a reflection of some of the signal away from the mismatch point with the resultant small loss of power.  (Standard RG-59 cable can be recognized because it is slightly larger in diameter than RG-58.)  There may also be a mechanical interference between 50 Ω and 75 Ω male and female connectors that can damage one or both of them if they are interconnected.

4.2 In addition to impedance, coax has a characteristic signal loss of so many dB per foot, and this may sometimes exceed the loss caused by transmission of the radio signal through the air.  For any given cable type, the loss increases with frequency.  For standard RG-58 50 Ω cable types, this loss at 400 MHz ranges from 8 to 12 dB per 100 feet.  At 700 MHz, the loss increases to 12 to 15 dB.   If you must have a long coax run, use RG-8 low-loss 50 Ω cable.  It is much larger in diameter, but has only about 2.5 dB loss per 100 feet at 400 MHz and 3.5 dB at 700MHz.
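Because coax loss scales linearly with length, a long run adds up quickly. A quick comparison in Python, using the per-100-foot figures from the text (the RG-58 values are taken at the middle of the quoted ranges; this is a sketch, not from the article):

```python
# Approximate coax attenuation in dB per 100 feet, from the text above.
LOSS_DB_PER_100FT = {
    ("RG-58", 400): 10.0,  # text gives 8 to 12 dB
    ("RG-58", 700): 13.5,  # text gives 12 to 15 dB
    ("RG-8",  400): 2.5,
    ("RG-8",  700): 3.5,
}

def run_loss_db(cable, freq_mhz, length_ft):
    """Total loss for a cable run; loss scales linearly with length."""
    return LOSS_DB_PER_100FT[(cable, freq_mhz)] * length_ft / 100.0

# A 150-foot antenna run at 400 MHz:
print(run_loss_db("RG-58", 400, 150))  # 15.0 dB
print(run_loss_db("RG-8",  400, 150))  # 3.75 dB
```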

Instead of extending the antenna cables, moving the receiver with its directly-connected antennas closer to the transmitter and sending the audio back on XLR cable will avoid all the cable losses.  The only drawback is that you do not have immediate access to the receiver’s meters and controls.

4.3 Most objects in the real world tend to be either insulators (very high impedance) or conductors (near zero impedance).  Insulators will generally allow radio signals to pass through without too much attenuation.  Conductors will block radio waves by reflecting them away, but again without too much loss, only a change in direction and sometimes polarization angle.  The problem is with certain substances that have an impedance near 50 Ω as they will absorb a large amount of the radio signal.  The human body, some vegetation, and the metalized scrims mentioned in the article are prime examples.


The Cable Connection – Part 3
Interconnecting equipment & wiring

by Jim Tanenbaum, CAS

Interconnecting your battery-powered equipment with other departments’ (or your own) AC-powered units is another major trouble spot. Hard-line feeds to video assist are the main offenders. Their AC power often leaks back and produces a hum or buzz in your audio. A supply of 1:1 line-level audio isolation transformers should be in your kit. They are also useful when tapping into the output of an existing house P.A. system. (Sescom offers units with plastic housings, so the previously-mentioned problem of grounded connector shells is nonexistent.)

A video hard-line feed to your cart can also cause a problem, but video isolation transformers are available to correct it (though they cost considerably more). IMPORTANT: Do not confuse video isolation transformers with video hum-bucking transformers designed to eliminate disturbances in the video image caused by AC power leakage, as many of them do not provide isolation of the output signal from the input.

In general, isolation transformers should be inserted as close to your cart as possible, both to reduce the capacitive loading of the secondary and to minimize pickup of additional interference with the cables on your side of the transformer. Additionally, this practice reduces the chance of other departments accidentally collecting your transformers along with their gear during wrap.

You may require mike-level splitter/isolation transformers if you need to get a separate feed from a particular mike before it goes into the house mixer board (or if their system is too low quality). Remember that a splitter transformer will drop each output 3 dB below the input, so the house mix panel gain will have to be adjusted accordingly.

An audio ground loop problem area concerns the “duplex” cables run between the cart and the boom operator. The cable contains two circuits: a feed from the boom mike and an audio return to the operator’s headphones. A typical duplex cable has two individually-shielded 2-conductor cables inside. The shields are covered with a plastic jacket so they are insulated from each other. Duplex cables are often terminated in 5-pin XLRs, or in some cases, the boom operator’s end is permanently attached at the connector box. This box has a 3-pin female XLR to receive the fishpole plug, and a quarter-inch phone jack for the headphones. If the duplex cable is not attached directly to the box, there will be an additional female 5-pin XLR for the cable and a male panel connector on the box. The most common wiring scheme is:

Pin 1 = Both Shields
Pin 2 = + Mike
Pin 3 = – Mike
Pin 4 = + Phones
Pin 5 = – Phones
Shell = Connected to Pin 1

For starters, the line-level headphone return is about 60 dB (1,000x) above the mike signal. Next, headphones are unbalanced, using a TS plug. Finally, the metal connector box has the metal collar (sleeve contact) of the phone jack mounted directly to it, and also the shells of the XLR-3 (and the XLR-5 if used) connected to it. Usually, this scheme works okay because the boom operator is being fed the same audio coming from the mike, so any ground-loop-induced crosstalk is inaudible. Or at worst, it adds a small amount of additional high-frequency boost from capacitive coupling, which can easily be dealt with by a little HF roll-off on the mix panel.
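That 60 dB figure is a voltage ratio, which follows from ordinary decibel arithmetic (an illustrative sketch, not from the article):

```python
import math

def voltage_ratio_db(ratio):
    """Voltage ratio expressed in decibels: 20 * log10(ratio)."""
    return 20 * math.log10(ratio)

# Line level is roughly 1,000 times the mike-level voltage:
print(round(voltage_ratio_db(1000)))  # 60 dB
```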

I wasn’t aware of any of this when I first encountered the problem. It was on a show where I had two boom operators, each receiving a common mix headphone return. When I PFLed (soloed) each boom mike, I could hear a faint crosstalk from the other mike. I immediately knew it was capacitive coupling, because the low frequencies were missing, but wrongly assumed it was occurring within my mix panel. I quickly unplugged one of the duplex cables and the crosstalk disappeared, so my suspicions were confirmed (wrongly). Murphy was insidiously at work here—the duplex cables were easy to reach; the 3-pin XLRs at the rear of the mixer were not. Fortunately, the next thing I did was to pull the mixer all the way out and move the 2nd boom from Channel 5 to Channel 1, leaving the 1st boom in Channel 6. To my surprise, this didn’t affect the crosstalk. Then I unplugged the mike from Channel 1 and left it unconnected—when I listened to Channel 6 again the crosstalk was still there, unchanged. Aha! The crosstalk is in the duplex! As a check, I plugged the 2nd boom back into Channel 5 and unplugged the 1st boom. The crosstalk was in that duplex too (see the circuit diagram below).

Here’s what happened: the IR drop in the two wires of the headphone circuit, which has much more current as well as much more voltage, was raising the far end of the headphone feed above ground. Since the headphone jack effectively connected one side of the circuit (the sleeve) to the metal housing of the connector box, the excess voltage drove a ground loop current back down both shields to the mix panel on the sound cart, where the headphone circuit was also grounded to its chassis. This was the signal that was capacitively (and slightly inductively) coupling from the shield to the inner conductors of the mike cable. The mike circuit was balanced, but balanced circuits are never perfect. The solution was simple—I replaced the headphone jack with one that had a plastic mounting collar and “floated” (insulated from the metal housing) the headphone ground. No more crosstalk. (Naturally, the plastic jacks are not as durable and reliable as the metal ones. If you have the room, however, you can enlarge the jack mounting hole from 3/8-inch to 1/2-inch diameter with a stepped single-flute drill bit, and insulate the metal jack collar with two 3/8″ x 1/2″ plastic shoulder washers, one on each side of the mounting surface.)

If you send a stereo feed to the boom headsets, using the two inner conductors for the left and right channels and their shield for the common, insulating the jack will only work if the cable is wired directly into the box, so the headphone circuit shield can be kept isolated from the mike shield. If there is a 5-pin XLR at the box, and both shields are connected to Pin 1, insulating the jack won’t make any difference, because the headphone circuit shield and the mike circuit shield are connected together at the connector. You can, however, float that end of the mike circuit shield, which will help somewhat, because now the headphone shield will have to capacitively couple to the mike shield first, reducing the amount of voltage on the mike shield. Also, there will be no headphone return current flowing down the mike shield, so there will be no IR drop, and thus, even less voltage. (Of course, since now the headphone current is only flowing down one shield instead of two, its IR drop will be twice as much, but the crosstalk to the mike circuit will still be less.)

For me, this is now all academic, because I use radio links coming and going, and there is no longer any possibility of ground loop crosstalk. Or so I thought.

Recently, I had another crosstalk problem. Because the various pieces of equipment on my cart are still connected with cables, I wasn’t home free. Eventually I found the problem, using the techniques described below. I’d made some mistakes early on when building my cart, and never thought to go back and look for potential trouble spots after I learned more.

SOUND CART CONSTRUCTION TECHNIQUES TO MINIMIZE PROBLEMS

1. (Assuming your sound cart has a metal frame.) Make sure that all portions of the structure of your cart are firmly bonded together. Welding or brazing is best, but securely tightened bolts and nuts will do—be sure to use lockwashers under the nuts. (Or elastic-insert nuts that are inherently vibration-resistant.) Periodically inspect fasteners for looseness and retighten. If your cart has portions that fold on a hinge or pivot, especially if there are non-metallic anti-friction washers in the rotating joint, bypass it with a short flexible wire jumper, as described in Item 6 below. If you are using a plastic rack mount case, make sure all rack mount strips are solidly bonded with heavy wire jumpers to any separate connector strips or other metal panels containing circuit components or devices. This is especially true for RF or video circuits.

2. If equipment added over the years has resulted in a rat’s-nest of cables, it’s time to disconnect everything and rewire neatly. Keep power cables away from audio ones, and timecode or other digital signal cables away from analog signal ones. Keep video and RF coax cables away from everything else. Interconnecting cables should not be any longer than necessary, but take into consideration that you might have to pull a unit out for troubleshooting or maintenance while it is still connected.

3. Unfortunately, there is no standard among the various equipment manufacturers regarding the wiring of their audio input and output connectors. Some of them tie Pin 1 to the chassis/case of their equipment, and some do not. You can check each connector with an ohmmeter—a reading of 0-1 Ω indicates a solid connection and a reading over 10 MΩ indicates no connection. Also unfortunately, you will often get a reading of 10-100 KΩ or higher, or a reading that initially is near zero, but quickly kicks upscale to some high value. These two conditions obtain when there are electronic components connected between the circuit ground and the device’s chassis, usually for RF interference suppression. When you first set up your cart, you can treat this situation as a floating ground.

4. If you have a patchbay for mike-level signals, be sure that the backside is shielded by being completely enclosed in a metal housing.

5. “Empty” braided shielding is available from professional electronics parts distributors. It comes as a flattened-out tube, and in this form, is often used for high-current jumper or ground straps. You can open it up and run cables that need to be shielded through the central opening, to keep interference either in or out as the case may be. The hollow cylinder of shielding increases to its maximum diameter when the ends are pushed together, and decreases to its minimum when they are pulled apart. Maximum flexibility occurs somewhere between these two extremes. I shield all my powering cables this way, with the shield grounded to the equipment case and floating at the battery or power supply end. (To be honest, for low-current devices I simply use shielded heavy-duty 20-gauge mike cable instead.)

6. If you don’t have rack-mounted equipment, with the front panels securely bolted in place, ground all your equipment enclosures to the frame of your sound cart. (Or to each other if you don’t have a metal cart.) You can use one of the device’s enclosure screws to make the connection. Use the shortest possible length of 12-gauge or 14-gauge stranded cable with terminal lugs crimped on each end. I use a “ring” terminal for security on the cart end, and a “hook” terminal at the equipment end so the screw just needs to be loosened rather than removed completely (with the possibility of being dropped and lost) to disconnect the cable. Then insulate all of the enclosures where they might touch the metal frame of your cart (see Item 8 below).

7. Additional shielding is occasionally required. Aluminum foil tape (used for sealing metal air ducts) is readily available from heating and air-conditioning equipment dealers in 2-inch or 3-inch widths. This can be used to seal joints in equipment cases where interfering signals are entering or escaping. It can also be wrapped around a bundle of cables that are permanently installed in your cart. The only problem is in securing a good ground connection to the foil. Several inches of tape at the end of the wrap can be folded over to make a tab which is then screwed to the grounding point. Unfortunately, aluminum oxidizes and becomes an insulator, so the screws will have to be periodically loosened and retightened. Using copper foil adhesive-backed tape solves this problem, because copper oxide is not an insulator. Also, ground wires can be soldered directly to the tape. The only drawback is that the copper tape is difficult to find, and may have to be bought online.

8. An often-overlooked source of static comes from rubbing metal contacts near sources of RF energy such as Comtek or radio mike transmitters. An equipment chassis or even an isolated piece of metal, especially if it has a dimension close to a quarter-wavelength of the RF, will pick up some of the radiated energy. Now, if this piece rubs against another piece of metal (whether grounded or not), there will be tiny (invisible) sparks between them that will re-radiate the single-frequency energy as wideband static, and this can infiltrate the wiring on your cart. Rubber or plastic mats on the metal shelves of your cart will help to prevent this, but two metal objects on the mat that touch each other can still cause trouble. I first experienced this with two large screwdrivers sitting on a wooden workbench next to a radio mike transmitter I was testing. They produced static in the audio whenever their shafts touched. The RF noise they radiated was picked up by the transmitter’s audio circuits. NOTE: Some car seats have internal metal springs that rub, and cause static in radio mikes worn on an actor’s back. Relocating the transmitter to the front of the actor’s body usually eliminates the problem, as well as increasing the radiated RF power.

9. One other insidious RF problem occurs because of the “skin effect” in which RF energy rides along the outside of cable shielding without penetrating into the inner conductors. Whatever is feeding the transmitter may be affected by this. The audio input to a Comtek transmitter is a good example. The factory-supplied input cable incorporates a ferrite RF choke (which acts like an inductance) at the transmitter end, but you can buy ferrite hollow cylinder chokes from electronic parts dealers for making up your own cables. With an existing cable, you can either remove the connector at one end to slip on the choke and then reattach it, or buy a “split” choke where the ferrite cylinder comes as two halves in a plastic housing that snaps around the cable.

10. The RF bypass capacitor and resistor combination mentioned earlier can also be used on specific cables to deal with pickup from transmitters located on the cart.

11. Secure all the loose cables with nylon cable ties. For ease of servicing, you can purchase reusable ties with manual release tabs.

The next issue will provide specific troubleshooting advice for crosstalk and also address safety considerations. This will conclude the series.

Text and pictures ©2012 by James Tanenbaum. All rights reserved.

“International Sound Technician” – November 1953

Evolution of the “MIKE” Boom
by William R. Edmondson (M.G.M.)

Read the complete original November 1953 article here.

In this issue of the 695 Quarterly, we take another look at the technology that helped form the basis of film sound recording as we know it today.

While today’s microphones are smaller and lighter compared to the monster ribbon and condenser mikes used during the early years of film sound recording, the issue of how to get those mikes where they need to be is one that still vexes us. With this in mind, it is interesting to take a look back at an article authored by William R. Edmondson of M.G.M., titled “Evolution of the ‘Mike’ Boom,” excerpted from the November 1953 issue of the International Sound Technician (the forerunner of the current Quarterly). In this article, the author outlines the daunting task faced by studio sound engineers in their quest to respond to the needs of film production, which required microphone booms and rigging systems designed to follow the actors on set.

In this fascinating look at the early attempts made by studio sound technicians, you will find photos and references to the early boom designs that later became the basis for the Mole-Richardson Model 103B studio boom, as well as the amazingly lightweight (for 1953!) J.L. Fisher Model 2 location boom, designed by James Fisher while he was on staff at the Republic Studios sound department in the early 1950s.

Sixty years later, many of these booms (with some modifications and upgrades) are still in daily use on soundstages around the world, a testament to the skill of their designers. While not as prevalent as they once were during the early years of film and television production, when a scene calls for a mike that can be moved easily around a set and rotated on a continuous 360-degree axis, there is nothing that can substitute for a perambulator boom with a good operator.


–Scott D. Smith, CAS

Digital Asset Management for Sound

by Scott D. Smith, CAS

Introduction

As a production mixer, sooner or later (if it hasn’t happened already), you will receive a call that goes something like this: “Hello, this is Charlie (usually some overworked and underpaid editorial assistant) calling from the editorial room of Clueless Pictures. We are going through the sound elements for delivery to sound editorial for the show Mission: Impossible XXXVII, and it appears we are missing the iso tracks for shoot days number 200 to 225. We wanted to check to see if there is any possibility that you might have backup files for those days.”

A brief silence ensues while you try to remember exactly what show he is talking about, as it has now been about six months since production wrapped. You respond, “Geez, I don’t know, I will have to check and see. That was some time ago—there might be a backup at the shop. Didn’t they make backups of the dailies in editorial?” More silence, and Charlie replies, “Um, I guess not. I don’t know—I was hired on after the fact. We only have what was delivered to us for ingest into the Avid. We were under the impression that backups were being made on set.”

What the Hell Is Digital Asset Management—and Why Should I Care?

Virtually unheard of 15 years ago, Digital Asset Management (referred to as “DAM” in the trade) is the catch-all term used to describe the processes relating to the storage, access, retrieval and migration of digital media files. While “Digital Management” systems have been in existence since the invention of IBM punch cards and magnetic data tape systems, the terminology related to Digital Asset Management systems typically involves files described as “Rich Data” or “Rich Media.” These could include image files, video files, audio files, CAD files, animation and the like.

In the “bad old days” of analog sound recording (including that of film-based cinematography), the “assets” of a production typically consisted of sound elements recorded on magnetic tape or film and optical sound negatives, along with various picture elements (such as camera negatives, interpositives, internegatives, opticals, etc.).

Sound Elements

1. 3M 1/4” audio tape (on hub)
2. External hard drive
3. Quantegy 480 1/4” audio tape (7” reel)
4. FPC 16mm magnetic film
5. Audio Devices 35mm magnetic film (1000’)
6. Western Digital pocket hard drive 
7. SanDisk CF card caddy
8. Maxell DVD-RAM disk
9. Zaxcom Deva hard drive
10. Maxell DAT tapes
11. Jaz drive & cartridge
12. 3M 200 1/4” audio tape (7” reel)

Properly stored, these elements could last for many years, allowing for the restoration and “versioning” of films. They are, however, prone to degradation. Photographic elements in particular are notorious for issues related to color dye fading (with the exception of Technicolor IB), and the base materials used for both film and tape suffer from problems of shrinkage and warping. Further, triacetate base film and tape stocks suffer from problems related to “vinegar syndrome.” There is also the well-known issue of “binder hydrolysis” (known as “sticky shed”), which can render a magnetic recording virtually unplayable unless treated. These problems are not confined to just analog recordings either. All tape-based digital recordings (PCM, DAT, DASH, DTRS, etc.) suffer from similar issues. The only difference in the case of these formats is that problems in reproduction will render the recording completely unplayable; the digital converters simply mute when they encounter data past the threshold of error correction. Analog recordings, on the other hand, have a much better chance of being recovered (albeit degraded), even in situations where the carrier material is damaged.

For example, a 35mm magnetic recording could suffer issues related to base warp, incorrect head azimuth, and vinegar syndrome, but in the hands of an experienced sound archivist, will still provide a reasonable facsimile of the original recording. Conversely, base damage to a digital tape can render it totally unplayable, with no chance of recovering any part of the signal!

While file-based digital media avoids the pitfalls noted above, it is not without its problems. The most obvious of these is that if the physical carrier containing the data (hard drive, LTO tape, optical media) becomes damaged even slightly, it could result in the total loss of the program. Therefore, any successful digital-based archival strategy requires at least one backup of all the assets deemed to be important. This means that the storage requirements are virtually doubled, as is the storage cost. Further, rapid changes in media file formats and conversion technologies can quickly render both files and management systems obsolete, further adding to the overall costs pertaining to both migration and storage.

What Is a “Rich Media” File?

A “Rich Media” file is distinguished from more traditional digital files in that it typically contains visual or audio data of some type, as opposed to files which are primarily text based (such as a Word document), or contain only binary code. While these file types are not mutually exclusive (for example, a Word file might have embedded images contained along with the text), Rich Media files are usually composed mostly of visual and/or audio elements, and are sometimes contained within a file “wrapper” or container. This “wrapper” might contain additional metadata or data that interfaces with a specific program used in conjunction with the file being addressed.

A prime example of a file wrapper is the MXF (Material eXchange Format) file standard, which allows for additional metadata (such as timecode) to be embedded along with video and audio data. While the video portion of the file could be encoded with any number of codecs, the wrapper itself is designed (at least in theory) to allow exchange among any systems which support the MXF file platform.

In a similar fashion, there also exist variations of some basic media file types (such as variations of audio WAV files) that can be played without the need for a specific proprietary program, but may contain additional “chunks” of data within the file header. The BWF file format that is now used almost universally in audio recording for film and video is typical of this kind of “extended” file structure. Further, a file such as a PDF may contain embedded photos or Flash video, in addition to text elements. This blurring of the distinction between what might be construed as a “Rich Media” file and a basic “document” or text file has further muddied the terminology used to describe DAM systems.
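The “extended” chunk structure is easy to see for yourself. The sketch below (Python; the function name is my own, and real files may carry more chunk types than shown) walks the RIFF structure of a WAV/BWF file and lists each chunk ID — a Broadcast Wave file will typically show a `bext` chunk, and often `iXML`, alongside the standard `fmt ` and `data` chunks.

```python
import struct

def list_wav_chunks(path):
    """Walk the RIFF chunk structure of a WAV/BWF file and return a
    list of (chunk_id, size) tuples. A plain WAV shows only 'fmt ' and
    'data'; a Broadcast Wave file adds 'bext' (and often 'iXML')."""
    chunks = []
    with open(path, "rb") as f:
        riff, _size, wave = struct.unpack("<4sI4s", f.read(12))
        if riff != b"RIFF" or wave != b"WAVE":
            raise ValueError("not a RIFF/WAVE file")
        while True:
            header = f.read(8)
            if len(header) < 8:
                break  # end of file
            cid, csize = struct.unpack("<4sI", header)
            chunks.append((cid.decode("ascii", "replace"), csize))
            # Chunk bodies are padded to even byte boundaries.
            f.seek(csize + (csize & 1), 1)
    return chunks
```

Running this over a day’s media is a quick sanity check that the recorder actually wrote the metadata chunks post-production will be expecting.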

In practical use, however, DAM systems are typically employed to manage large files encoded with visual or audio data, anything from a simple MP3 or Flash video, all the way up to full uncompressed high-definition video. So while there could be a variety of file formats and codecs contained within a DAM system, an overall integrating structure is still needed to manage the competing file types. This has given rise to some highly complex DAM systems, which in many instances are expensive proprietary solutions designed for large clients such as broadcast media outlets.

The Dalet News Suite, manufactured by Dalet Digital Media systems, is a typical example of a dedicated system. This system employs a unique “container”-based approach to handling news content that might be repurposed for various channel outlets, allowing a producer or editor to access raw footage, and re-edit it for subsequent distribution in different markets. Similar collaborative tools exist in the world of post-production, such as the Avid Unity MediaNetwork system.

While these systems may vary in the way they are designed to access and move data based on user needs, they share a common thread in that they rely on a standardized file structure (using either external descriptive data or embedded metadata) to handle the task of determining exactly what a file contains. Therefore, while a cursory look at the file names contained in a project folder may reveal just a useless array of arbitrary letters and numbers, the database tied to those files allows the system to provide the user with a wide array of pertinent data relating to that specific file. In the world of film or video, the data might include such things as scene and take number, timecode, shot description, the date it was shot, camera metadata and comments from the director. There could also be additional data files such as lookup tables (or LUTs), which allow the look of a shot to be controlled during the final color-grading of the production.

For these systems to function as intended, it is crucial that the metadata coding, file-naming conventions and folder structures be followed without any variations. Without this, the system will be incapable of tying together the various descriptive data with the corresponding files. If this correlation is lost, then the system will be unable to manage the tracking and movement of data as it moves from program origination through distribution.

That’s Great—My Head Hurts. What Does This Have to Do With Sound?

As film and video production has moved from the analog world to the realm of digital, the workflows of both image production and sound recording have changed radically. The tools used to manage workflow in the analog world are wholly unsuited to digital production. A typical example is how films are edited. No longer is there a bevy of assistant editors charged with tracking film elements using edge coding and a code book. Instead, all of this information is contained within a database linked to each of the individual audio and video files that make up a finished show. These may include raw picture files from set, associated production sound files, picture FX files, music files, title files, etc.

Just as all of these elements needed to be tracked in a master code book in the analog world, along with the editor’s cutting copy of the script, the same provisions apply to the myriad files that constitute a final program in the digital realm. It is therefore of crucial importance that a consistent overall structure for handling these elements be adhered to. Without this, at best, files will be impossible to manage, and at worst, may not play at all, due to file incompatibilities.

With file formats and systems constantly evolving, what used to be a pretty straightforward task 15 years ago has now become a minefield, with any number of problems lurking to trip up the unwary. In the analog days, a production mixer could pretty much rest assured that a tape submitted to post would get handled properly (assuming that the log was marked clearly). Worst case, maybe the heads were out of azimuth, or perhaps the wrong track would get transferred. This would happen long before sound editorial was involved in the show.

Despite some areas of standardization for audio file formats (with BWF mono or poly files being the generally agreed-upon format for most production), there still exists a wide variety of standards for items pertaining to timecode, sample rates and bit depth. Further, there is not as yet a fully defined format for how metadata is encoded in the BWF file header. Nor is there any industry standardization regarding file-naming conventions.

Therefore, it is vitally important that an agreed-upon set of conventions be established prior to production, and be adhered to throughout the run of the show. This is especially important when multiple units are involved in production, as material from all units will still need to be ingested into a common platform for editorial. Typically, these specifications will be supplied by editorial. If a show is just starting, it is good policy to shoot a sync test with the cameras and recorders to be used during actual production, and have editorial verify that everything plays well together. At the very least, it is important to get the required information supplied in writing from post-production.

In addition to defining such issues as file format, timecode, sample rate, etc., it is of equal importance to determine exactly who will be responsible for the task of file backups. Despite the endless meetings and phone calls that usually precede a production, this seems to be one area that no one wants to deal with. With so many people involved in the handling of files from the set to post-production, the area of data backup (especially of audio files) seems to get lost in the shuffle. While there are a variety of ways to approach the issue, as a mixer, it is important to define exactly what is expected from you in relation to making file backups of daily production material.

Despite the fact that file-based recording for production sound has been around for at least 15 years now, it is worth noting that most studios still don’t have a clear-cut policy as to how audio and video files are to be archived. In the days of analog production, no one would expect the mixer to maintain a set of backup tapes for a show. Yet somehow, due to the perception surrounding file-based recording, the expectations have changed. While this makes absolutely no sense, post-production frequently makes certain assumptions about production audio file backups.

Liability and Piracy—Why You Should Cover Your Ass**s

Despite the reams of documents that accompany the start of most productions (deal memos, non-disclosure agreements, safety policies, non-discrimination policies, auto mileage reimbursement, cell phone usage and hordes of other items the studio attorneys have dreamed up), there is seldom anything pertaining to how production files are handled. While many productions prohibit the taking of personal photos on set, there is almost never any mention of what becomes of the files recorded by the sound department, which typically reside on one or more hard drives or removable media. Although it is understood that such recordings are the property of the production company, what exactly becomes of them is frequently ignored completely.

While this may be a non-issue in most production situations, there have been a few cases that might give one pause. Stories abound regarding instances where digital audio workstations have been rented from a supplier for the recording of a music artist, and subsequently returned with all the session files left intact! This allows anyone with access to the drives to simply copy the files and distribute them as they wish. This is essentially the equivalent of handing over the multitrack master session tapes. Occasionally, this oversight works in one’s favor (such as the instance where Michael Jackson’s recordings were found on a DAW hard drive after his death), but it can also become a major headache when the material ends up in the wrong hands.

Likewise, the failure to have a clear backup strategy in place can have equally heart-stopping consequences. There is no shortage of stories regarding hard drives containing crucial production elements being lost, damaged or stolen, resulting in days (or weeks) of work being lost. You do not want this happening on your watch.

Strategies to Help You Sleep Better

Despite the fact that many productions don’t have a clear-cut approach to file management, this does not mean that you shouldn’t take an active role in defining your responsibilities when it comes to the delivery and backup of production files. As everyone knows, when the manure hits the fan (and it will!), production will come looking for a fall guy. Don’t be that guy.

If production has not provided clear guidelines for how files are to be managed, you need to take it upon yourself to define your role in terms of how files are to be stored and delivered. Despite the fact that this is not exactly part of the job description, it is necessary to protect yourself when things blow up. To not take an active role in outlining your responsibilities is to leave yourself open to liability, which is not a situation you want to be in.

So, what are the specific steps you need to take in this regard?

1. If production has not already outlined all the steps for file handling, prepare a basic memo that outlines what you intend to do. This should include the medium the files are recorded on (e.g., hard drive, CF card, or both) and how they are delivered on the day of production (e.g., CF cards handed off to the DIT, CF cards delivered to camera, DVD-RAM disks, etc.). If a film break is done during the day, will files be appended to the same roll or will a new roll be started?

2. State how logs will be delivered (paper copy, file, or both).

3. Outline what steps (if any) you will take in regard to making file backups, i.e., if recording to hard drive, will you make a daily backup, a weekly backup or none at all?

4. If you are expected to make incremental backups on a daily or weekly basis, make note of how much additional time you expect this to take, so that you don’t start receiving questions from payroll about your timecard.

5. If files are being recorded to hard drives (belonging to either yourself or a rental company), state what you intend to do at the end of production. If a production expects you to keep files after the end of shooting, clearly state what your liability is in this regard. You do not want to put yourself in the position of being liable in a case where production may come asking for backups, and you discover that you don’t have the files they are asking for.

If you are renting equipment from an outside rental company, it should be clearly stated that all data will be wiped from the hard drives before the equipment is returned. This will cover you in a situation whereby something from a shoot may suddenly turn up on the Internet. Additionally, if you are expected to hold onto files after production, you need to state that you are doing this as a courtesy to production, but in no way are you liable for their safety or piracy. (This is SOP for labs and post-production houses.)

6. If you do keep files after production has wrapped, state for how long you will keep them. (In this regard, it is also a good policy to notify post-production of your intent to wipe drives before you do it.) No matter what the strategy, do not load files on any computer or drive connected to the Internet! No matter how strong your protection from hacking, keeping the files offline will prevent you from becoming a casualty of data theft. Files should always be stored on a separate hard drive, preferably kept in a safe place.
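The backup-making in item 3 pairs naturally with a verify-after-copy habit. The sketch below is only an illustration of the principle — the function name, directory layout, and manifest filename are my own choices, not anything a production will specify — but it shows the essential step many people skip: re-read the copy and refuse to trust it until its checksum matches the original.

```python
import hashlib
import shutil
from pathlib import Path

def backup_day(source_dir, backup_dir):
    """Copy every file from the day's media to a backup location,
    verifying each copy against a SHA-256 checksum of the original,
    and write a manifest so the copies can be re-verified later."""
    src = Path(source_dir)
    dst = Path(backup_dir)
    dst.mkdir(parents=True, exist_ok=True)
    manifest = []
    for f in sorted(src.rglob("*")):
        if not f.is_file():
            continue
        digest = hashlib.sha256(f.read_bytes()).hexdigest()
        target = dst / f.relative_to(src)
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)
        # Refuse to trust the copy until its checksum matches the original.
        if hashlib.sha256(target.read_bytes()).hexdigest() != digest:
            raise IOError(f"verification failed for {f.name}")
        manifest.append(f"{digest}  {f.relative_to(src)}")
    (dst / "MANIFEST.sha256").write_text("\n".join(manifest) + "\n")
    return len(manifest)
```

Keeping the manifest with the backup drive also gives you something concrete to hand over (or point to) if post-production ever questions whether the files they received are intact.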

If you are operating under some kind of company structure (LLC, LLP or Corporation), you should submit these guidelines under the auspices of your company, so as to limit your personal liability. In no circumstances should you sign any document from production which holds you personally liable for the loss or piracy of media!

This memo should be delivered to the unit manager, the production supervisor, and the editor. If delivering by email, make sure that they acknowledge receipt! (Personally, I prefer to deliver a printed copy as well. This will save you in situations where somebody says, “I never got the memo.”)

When delivering files to the production office, be sure to have the recipient sign to acknowledge delivery. This will provide a clear chain of custody in situations where something gets lost. If sending media by a courier or shipping company, it is important that you request a signature upon delivery.

This may seem like a lot of extra effort, but the digital landscape has completely changed the way we operate and allows scenarios that would seldom occur in the world of analog recording. (For example, a production company would never expect the mixer to maintain copies of 1/4” production tapes.)

Having said this, you will of course be a hero if you take it upon yourself to make file backups of your own accord, and receive the call from post-production looking for them six months later! In this regard, however, you do not want to accidentally open yourself to liability in cases where files may end up in the wrong hands.

Housekeeping

The move to digital has not relieved us of the burden of paperwork (in some ways, it has made it worse). We still need to submit a sound report, whether as a paper log or a digital file. In addition, it is now expected that we include basic file metadata in the header of each take. This usually consists of scene number, take number and track name, along with any basic notes. Unfortunately, this arrangement doesn’t always allow for easy changes after the fact.

While some recorders allow the metadata header to be edited after the fact, there are occasionally limitations as to exactly what fields can be changed. For example, in a quick scene change, you may accidentally forget to change the name of a track, so the file will bear the names from previous scenes. While tools such as BWF Widget allow the user to modify the metadata on external media after the fact, it does not change what is contained on the hard drive. Therefore, if you produce a file backup from the hard drive, it will contain the same errors, forcing you to make corrections on both the daily file media and the backup. Not how you want to be spending your weekend!

Further, in most current file-based recorders, metadata is stored as both a bext data chunk and an extended iXML header. If changes are to be made, they will usually need to be done separately for both. Therefore, it is always helpful to pay attention to the metadata that is recorded during shooting, so as to prevent the hassles of trying to correct it after the fact (easier said than done when it’s hour 14 of a grueling production). Hopefully, new tools will be introduced soon which will allow for easier modification of file metadata after the fact.

The delivery of logs is equally fraught with complications that we never had to deal with in the days of analog. If paper logs are delivered, they frequently get separated from the data files during post-production (or get delivered after the fact). This is especially the case when sound files are delivered from picture editorial to sound editorial, which may be done over network drives, with no physical delivery of media. All the careful notes you made for sound editorial are now stuffed away in a box somewhere.
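To make the dual storage concrete, here is a minimal sketch (Python; the helper name and the two fields shown are my own choices — real iXML headers carry many more fields) that pulls the scene and take values out of a file’s iXML chunk. Keep in mind that the same values also live in the bext Description text, so a genuine correction has to touch both places.

```python
import struct
import xml.etree.ElementTree as ET

def read_ixml_scene_take(path):
    """Return (scene, take) from a BWF file's iXML chunk, or None if
    the file has no iXML chunk. The bext chunk, not read here, usually
    duplicates this information in its Description field."""
    with open(path, "rb") as f:
        f.seek(12)  # skip the RIFF/size/WAVE header
        while True:
            header = f.read(8)
            if len(header) < 8:
                return None  # reached end of file without finding iXML
            cid, csize = struct.unpack("<4sI", header)
            if cid == b"iXML":
                root = ET.fromstring(f.read(csize).rstrip(b"\x00"))
                return (root.findtext("SCENE"), root.findtext("TAKE"))
            # Skip this chunk (bodies are padded to even byte boundaries).
            f.seek(csize + (csize & 1), 1)
```

A quick scan like this at wrap, comparing what the files say against the sound report, catches a mislabeled take while it is still cheap to fix.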

To keep yourself from being a victim of this scenario, it is helpful to provide a digital log of some sort along with the audio files (this could be in the form of a scan of a paper log, a PDF of a machine-generated log, an Excel file or a text file). No matter what route you choose, having a log file kept with the media will always be appreciated by the folks in post. However, it is best to stay away from formats that are dependent on specific operating system platforms, as it is impossible to know in advance what systems might be employed down the line.
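A plain CSV file is one platform-neutral option that any system down the line can open. The sketch below (Python; the column names are illustrative examples, not an industry standard) writes one row per take:

```python
import csv

def write_sound_log(takes, out_path):
    """Write a machine-readable sound report as a CSV file, one row
    per take. The column names here are examples only -- agree on the
    actual fields with post-production before the show starts."""
    fields = ["filename", "scene", "take", "tracks", "notes"]
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(takes)
```

Because the log is plain text, it travels with the media over network drives just as easily as the audio files themselves, instead of ending up in a box somewhere.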

Further, as studios begin to archive productions on mass storage systems for repurposing of content, such systems will allow for the easy retrieval of the sound files along with their associated logs, without resorting to searching through paper logs.

Summary

As the digital landscape continues to evolve, it will become increasingly important to be cognizant of how the material recorded during production will be handled down the line. While practices put in place during the analog era generally remained the same for decades at a time, the same cannot be said for digital media. New technologies for both production and post-production can change almost overnight, with subsequent impact on how the scenario for production sound is played out. This is especially true when it comes to the physical media that data is being stored on, both during production, as well as subsequent archiving. Already, we have seen at least three major transitions for the physical delivery of sound files in the past 15 years (Jaz drive, DVD-RAM, and CF cards), with more to come. It will be increasingly important for sound crews to be well versed in how data is recorded and delivered on various media, each of which has its own idiosyncrasies. As the production world becomes more “data-centric,” our role in how sound is recorded and delivered will have a major impact on how accessible it will be for future generations.

© Scott D. Smith, CAS

The Cable Connection: Balanced and Unbalanced

by Jim Tanenbaum, CAS

As mentioned earlier, there are two basic cable types: balanced and unbalanced. But there are many variations on these two themes.

BALANCED

Balanced cables, which can be used for either balanced mike-level or line-level signals (or unbalanced mike or line signals, for that matter), consist of two (or three) insulated conductors surrounded by a metallic shield and an outer jacket of rubber or plastic. They do not have a standard impedance, but are usually close to 110 Ω. The inner conductors are composed of many thin individual wires twisted around each other for flexibility. The outer shield is available in various configurations; two of the most common are twisted and braided.

Twisted shields have many thin wires wrapped spirally around the insulated inner conductors. While this arrangement is more flexible than braiding, with repeated flexing the shield wires tend to separate, creating gaps for interference to enter and/or the inner conductors to bulge out. Their initial 95% coverage can fall to 70% or less. One attempt to overcome this is to have two layers of spiral wrapping, in opposite directions. Still, separations manage to occur, with the same problems as the single-wound shields. To further enhance their flexibility, most twisted shield cables have thin outer jackets of PVC plastic. While more supple, PVC-jacketed cables are also more easily damaged by abrasion, cutting or crushing. And they get really stiff in cold weather.

Braided shields have many thin wires woven (in an alternately over-and-under pattern) into a tube that encloses the inner conductors. This type of construction is durable, but somewhat less flexible than a twisted shield. The effective coverage ranges from about 85% to 95%. With repeated use, the individual shield wires will break, eventually causing increased susceptibility to interference and static when the cable is moved, particularly when phantom power is present. The Belden Company offers a line of mike cables that have a more open braid for flexibility, and then underneath, a layer of cloth impregnated with a conductive carbon compound to provide almost 100% shielding. The only drawback is that the black goo sticks to the shield and makes it difficult to solder. Finally, many braided shield cables are offered with rubber jackets that do not get as stiff when cold. IMPORTANT: Natural rubber quickly cracks when exposed to oil or smog — be sure to buy synthetic rubber (e.g. EPDM, Neoprene, Hypalon) jacketed cables.

A third type of shielding involves a wrapping of aluminum foil or aluminized plastic film, with one or more bare ground wires running alongside to provide a means of connecting to it. This type of cable is limited to permanent installations, as it is not very flexible, and sharp or repeated bends in the same area can cause the aluminum shield material to tear.

As mentioned above, some balanced cables have a third inner conductor. This will be discussed later.

A special class of balanced cable has recently been introduced for digital signals: Unshielded Twisted Pair (UTP) cables (such as CAT-5 and the newer CAT-5e), which consist of four twisted pairs of conductors covered in a plastic jacket, without an overall shield, because digital signals can tolerate much more interference. For really difficult EMI (Electro-Magnetic Interference) environments, shielded twisted pair (STP) cable is available. The insulation thickness and spacing of the conductors is rigidly controlled, and these cables have an impedance of 100 Ω ± 15%. Each of the four pairs has a different twist pitch to minimize crosstalk. They are nowhere near as flexible as some of the cables described above and are chiefly intended for fixed installations. Nevertheless, many mixers use them in production for digital audio, timecode, and even video for their monitors. IMPORTANT: If you are working with an existing installation, or cables with preattached RJ-45 connectors, be aware that there are two different color-coding “standards”: T568A and T568B. They differ by interchanging the plug pin positions of the green and orange wires, and the white-w/green-stripe and white-w/orange-stripe ones. Since the conductors at the cable ends are connected to the plug contacts one-for-one, either type of cable may be used with either type of jack. The confusion arises if you try to wire a CAT-5 cable directly into a circuit board and use the wrong color-code chart for the plug pin connections at the free end of the cable.
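For readers who like to double-check the chart, here is a minimal Python sketch of the two color codes (pin assignments as commonly published; verify against a current TIA-568 chart before wiring anything):

```python
# T568A and T568B pin-to-color assignments for an RJ-45 plug,
# pins 1-8. Reference sketch only -- check a current chart.
T568A = {
    1: "white/green",  2: "green",
    3: "white/orange", 4: "blue",
    5: "white/blue",   6: "orange",
    7: "white/brown",  8: "brown",
}
T568B = {
    1: "white/orange", 2: "orange",
    3: "white/green",  4: "blue",
    5: "white/blue",   6: "green",
    7: "white/brown",  8: "brown",
}

# The two standards differ only on pins 1, 2, 3, and 6 -- the
# green and orange pairs trade places, as described above.
differing_pins = sorted(p for p in T568A if T568A[p] != T568B[p])
```

Since both ends of a factory cable use the same chart, the wire-to-pin mapping is one-for-one either way, which is why the cable itself works with either style of jack.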

UNBALANCED

Unbalanced mike cables consist of a single insulated inner conductor surrounded by a metallic shield. These cables do not have a standard impedance, but can range from about 50 to 250 Ω. Shielding may be spiral or braided, and jackets plastic or rubber.

A second type of unbalanced cable is coax (coaxial). Like the mike cable, it has a single center conductor surrounded by a metallic shield. Unlike the mike cable, however, the physical dimensions and insulation composition are rigidly controlled, in order to maintain a constant impedance along its entire length. This is necessary because coax is used for very high-frequency signals, and a change in impedance can cause a loss of power by reflecting some of it back down the cable. Even with a constant impedance, the high-frequency signals are attenuated significantly as they travel, so in addition to an impedance specification, coax is rated for signal loss, in dB/100ft at various frequencies. Coax cable is identified as belonging to various groups or “Types” primarily by impedance and Outside Diameter (O.D.). Within a given Type, there are cables with stranded or solid center conductors, foam or solid dielectric (the plastic insulation surrounding the inner conductor), and braided or foil shielding. The first term in each of the preceding three pairs represents the more flexible construction.

RG-8 Type is 50 Ω, low loss (≈ 3 dB/100ft @ 450 MHz), and about 3/8-inch O.D.

RG-6 Type is 75 Ω, low loss (≈ 4 dB/100ft @ 400 MHz), and about 3/8-inch O.D.

RG-58 Type is 50 Ω, medium loss (≈ 7 dB/100ft @ 450 MHz), and about 1/4-inch O.D.

RG-59 Type is 75 Ω, medium loss (≈ 7 dB/100ft @ 400 MHz), and about 1/4-inch O.D.

RG-174 Type is 50 Ω, high loss (≈ 15 dB/100ft @ 450 MHz), and about 1/8-inch O.D.

RG-58 is sometimes used to extend radio mike receiver antennas, but its loss is often more than the inverse-square loss of the radio signal traveling the same distance through the air. For this application, RG-8 would be a better choice if more than five to 10 feet is needed.

IMPORTANT: There are subtypes: RG-58A/U Type is slightly different than RG-58 Type, and RG-8X is considerably different from RG-8. Read the manufacturer’s data sheets carefully for the particular cables you are considering.
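Because loss scales linearly with length, the ratings above make run-length comparisons easy. A short Python sketch using the approximate figures quoted above (real cables vary by subtype; always check the data sheet):

```python
def cable_loss_db(db_per_100ft: float, length_ft: float) -> float:
    """Total attenuation of a coax run; loss scales linearly with length."""
    return db_per_100ft * length_ft / 100.0

# Approximate ratings at 450 MHz, from the list above:
RG8_LOSS = 3.0    # dB/100ft
RG58_LOSS = 7.0   # dB/100ft

# A 50-foot antenna extension:
loss_rg8 = cable_loss_db(RG8_LOSS, 50)    # 1.5 dB
loss_rg58 = cable_loss_db(RG58_LOSS, 50)  # 3.5 dB
```

This is why RG-8 is the better choice for radio mike antenna extensions beyond a few feet: over the same 50-foot run, it gives up less than half the signal that RG-58 does.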

EVERY CABLE HAS TWO ENDS

A cable must be terminated with some kind of connector (unless it’s soldered directly to a circuit board).

BALANCED

The most commonly encountered (balanced) microphone connectors are 3-pin XLR (originally a model designation in the Cannon Brand, but now used generically). These connectors have a metal shell and three insulated contacts. The “standard” wiring is:

Pin 1 = Shield

Pin 2 = + Audio (a.k.a. Hi, In-phase)

Pin 3 = – Audio (a.k.a. Lo, Out-of-phase)

Shell = Ground (most models offer a way to connect the plug shell to Pin 1, with the exception of the old Cannon XLRs. If you use these, solder a length of bare busbar wire to Pin 1 and run it out the back, between the rubber strain relief and the U-clamp.)

This standard is based on Pin 2 of the microphone going positive with respect to Pin 3 when pressure on the front of the mike is increasing. A further standardization is that outputs are on male (having solid pin contacts) connectors (called “plugs” when on a cable) and inputs are on female (having hollow receptacle contacts) connectors (called “sockets” when mounted on a panel).

Murphyfs Law ensures that things are not so simple. In the 1960s,the first Nagra recorders used male mike-input connectors, necessitating the use of mike cables with female connectors on both ends. (Extension cables were female-male.) Some European equipment manufacturers followed suit, with male in and female out. Other Euro devices have female inputs and male outputs. There also are places in the eastern U.S. (and elsewhere) where the functions of Pins 2 and 3 are reversed, so check carefully when using equipment not your own. (There also are pin-swapping issues with normal and gred doth T-power microphones, but that is beyond the scope of this article.)

More recently, the Switchcraft Company brought out the “TA” line of miniature connectors, originally intended to follow the U.S. standard practice of male out and female in. But the panel-mounted female TA connectors were so much larger than the male ones that radio mike manufacturers were forced to use male TAs for their microphone inputs, and put the females on the mike cable. WARNING: Since some brands also use male panel connectors for receiver outputs, the possibility exists of accidentally plugging an electret lavaliere microphone into a line-level output and destroying the mike.

UNBALANCED

Unbalanced microphones normally use ¼-inch mono phone plugs (TS, or Tip and Sleeve). They often are high impedance, and are not usually encountered in professional work, although you may have occasion to tie into them when they are used as props, or if you have to make a field recording of a local small-town musical group or public speaker. There is no industry-wide standard, and some of the microphones may be quite high impedance. Impedance-matching transformers are available, and may include a housing with a ¼-inch phone jack in and an XLR male plug out.

Unbalanced line-level signals may also use ¼-inch mono phone plugs, or the smaller and flimsier phono (RCA) plugs. (There are some high-quality semi-pro phono plugs, but even they become unreliable after repeated insertions.) The RETMA (consumer) line-level standard is -10 dB at 47 KΩ, but many manufacturers ignore it. You can make up simple wired adapters to interconnect unbalanced and balanced devices, but using a balun transformer (see below) will allow longer runs of cable and block common mode interference.
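The consumer “-10 dB” figure is usually taken as dBV, referenced to 1 volt rms (an assumption; the article does not specify the reference). Under that assumption, the nominal voltage works out as follows:

```python
import math

def dbv_to_volts(level_dbv: float) -> float:
    """Convert a level in dBV (referenced to 1 V rms) to volts rms."""
    return 10 ** (level_dbv / 20.0)

# Consumer "-10 dB" line level, assuming the common dBV reference:
consumer_v = dbv_to_volts(-10.0)   # roughly 0.316 V rms
```

That third of a volt is well below professional line level, which is one more reason a simple wired adapter between consumer and pro gear often sounds weak without active gain or a step-up transformer.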

IMPORTANT: Studio patch panels use a plug that resembles a standard ¼-inch stereo phone plug (TRS, or Tip, Ring, and Sleeve), but there are dimensional differences (particularly at the tip), and you can damage a patch panel if you attempt to plug a standard TS or TRS into it. You may also get an intermittent connection. It’s a good idea to make up (or buy) several adapters so you will be able to tie in to a patch panel if the need arises. A good configuration is a TRS patch plug with its send circuit wired to a female XLR and the return circuit wired to a male XLR.

While on the subject of different types of ¼-inch plugs, if you need to patch into an aircraft pilot’s headset, their ¼-plugs are much shorter (about ¾ inch) and have two ring contacts (TRRS or Tip, Ring, Ring, and Sleeve). You will have to make or rent/buy an adapter in advance.

The wiring is:

Tip = + Mike

Ring 1 = + Headphones (mono)

Ring 2 = – Mike

Sleeve = – Headphones

Another dimensional problem involves 1/8-inch (3.5 mm) phone plugs. The mono plug is slightly larger in diameter than the stereo plug, especially the tip portion. Depending on the particular manufacturer, a mono plug may not enter a stereo jack, or if it does, it may bend the contacts so a stereo plug will no longer work properly. (The jacks on Comtek receivers are designed to accept either type.) The smaller 3/32-inch (2.5 mm) plugs do not seem to have this incompatibility. Both these sizes of stereo plugs are used with some cell phone headsets, and some of them use a double-ring plug. Again, you will need an appropriate adapter to patch in.

Coaxial cables are usually terminated with BNC (bayonet-style) connectors. IMPORTANT: Because of the relationship between size and impedance, BNC connectors for 50 Ω and 75 Ω cables are slightly different in dimension. Using connectors of one impedance on a cable with a different impedance can not only cause signal reflections from the impedance mismatch, but the connectors can also be damaged when a 50 Ω connector is mated with a 75 Ω one. For limited-space applications such as radio mikes, SMA and even smaller SSMA threaded-style connectors are used. IMPORTANT: Radio mikes use “normal” SMA connectors, with a male pin in the cable-mounted connector (the one with the threaded collet). The more common SMA connectors used on computer Wi-Fi equipment are “reverse,” with the cable-mounted connector having a female receptacle for the male pin in the panel-mounted connector. Therefore, you cannot use a Wi-Fi SMA cable to extend a radio mike antenna.

TRANSFORMERS (NOT THE MOVIE)

Transformers have many uses, but here we are concerned with only four of them: changing impedance, converting between balanced and unbalanced circuits, blocking some kinds of noise, and splitting signals. A particular transformer may be designed to perform one, two, three, or all four of these functions.

A basic transformer consists of two coils of insulated wire wound around the same (usually iron alloy) core. Laminated iron sheets are used for low (e.g. audio) frequency cores; powdered iron alloy (ferrite) for medium to high frequencies. Air cores (wound on a plastic bobbin if the wire is not stiff enough to keep its shape) are used for even higher (radio) frequencies. If both coils have the same number of turns, a signal fed into one coil (the primary) will appear at the terminals of the other coil (the secondary) relatively unchanged. The second signal will, however, be electrically isolated from the original circuit. This removes most C-M noise.

An isolation transformer is usually 1:1, and will have an additional layer of non-magnetic metallic shielding over the secondary winding to block capacitive coupling of the electrical field produced by the noise on the primary winding. The entire transformer may be mounted in a shielded enclosure, with input and output connectors. In this case, the shell of the input connector must be electrically isolated from the shell of the output connector to block transmission of the C-M noise by this route, because XLR (and many other type) connectors often have their metal shells connected to the cable shielding, and thus ground loop current could bypass the electrical isolation of the transformer by flowing through its metal housing.

A transformer designed to change impedance will have a differing number of turns on the primary and secondary. The formula is: NP/NS = √(ZP/ZS) (N is the number of turns, Z the impedance, and the subscripts P and S denote the Primary and Secondary windings). For example, to change 600 Ω to 150 Ω, an impedance ratio of 4:1: the square root of 4 is 2, so the primary will have to have twice as many turns as the secondary. (The actual number of turns required is determined by the impedance, frequency, core characteristics, power level, and other factors, again beyond the scope of this article.) NOTE: The turns-ratio is always defined as primary (input) turns divided by secondary (output) turns.
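The turns-ratio formula can be checked with a few lines of Python, using the 600 Ω-to-150 Ω example from the text:

```python
import math

def turns_ratio(z_primary: float, z_secondary: float) -> float:
    """N_P/N_S = sqrt(Z_P/Z_S): impedance transforms as the
    square of the turns ratio."""
    return math.sqrt(z_primary / z_secondary)

# Matching a 600-ohm circuit down to 150 ohms (4:1 impedance ratio):
ratio = turns_ratio(600.0, 150.0)   # 2.0 -- twice the primary turns
```

Running the ratio the other way, `turns_ratio(150.0, 600.0)` gives 0.5, i.e., a step-up transformer with half as many primary turns as secondary turns.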

A balun (BALanced-UNbalanced) transformer is used to convert a balanced circuit to an unbalanced one, or vice versa. At the same time, it can also change impedance if required. A typical application is connecting a 75-Ω coax (unbalanced) to a 100-Ω CAT-5 twisted pair (balanced). It may also provide the functions of isolation and blocking C-M noise. Changing a circuit from unbalanced to balanced will not remove T-M noise that is already there, but may prevent more from entering. IMPORTANT: Always put the balun as close as possible to the unbalanced source, so the cable run is made in balanced format.

A simple balun will have the two ends of one winding connected to the two conductors of the balanced circuit, and the two ends of the other winding connected to the center conductor and shield of the unbalanced circuit. The shield of the balanced circuit may or may not be connected to the case of the balun and/or the shield of the unbalanced circuit. An even simpler balun has only a single winding, with a center tap. The balanced circuit is connected to the two ends of the winding and the unbalanced circuit has the shield connected to a center tap of the winding and the inner conductor also connected to one of the winding ends. Obviously, this type of balun does not provide any isolation or blocking of C-M interference, and is mainly used in antenna circuits.

A splitter transformer has a single primary winding and two identical 1:1 secondaries. Since a splitter transformer is a passive device, each output will be -3 dB down from the input. As with isolation transformers, the two outputs will be electrically isolated from the input and from each other, but only if the transformer’s XLR connector shells are insulated from its case. Many commercial units do not have this feature, but it is possible to remove the connectors, enlarge the hole if it contacts the protruding back part of the connector, place an insulating plastic film between the back of the mounting flange of the connector and the splitter case, and reattach it with plastic screws. NOTE: Many splitters have a “ground-lift” switch, but this breaks only the connection between Pins 1 of their input and output connectors. Unless you have cables with the connector shells floating, or insulate the splitter connectors as just described, the ground-lift switch will be ineffective.
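The -3 dB figure follows directly from the power split; a one-line Python check:

```python
import math

# A passive 1:2 splitter can only divide the input power, so each
# output gets at most half of it. In decibels that is:
per_output_db = 10 * math.log10(0.5)   # about -3.01 dB per output
```

(Real splitters lose a little more than the theoretical 3.01 dB to winding resistance and core losses, but 3 dB is the figure to budget for.)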

WARNING: If you use a simple Y-cable instead of a transformer to split an audio signal to feed two other devices (e.g. a recorder and a Comtek transmitter), there will be no isolation, so signals from one can get into the other (RF in this case), and the audio may be completely corrupted.

All types of transformers have certain parameters that must match their intended application. Only the ones relevant to this article are discussed here.

Transformer parameters:

1. Level: Mike or line. Mike-level transformers will overload and distort if used with line-level signals because the core will be completely saturated with magnetic flux lines well before the input signal reaches its maximum voltage. Line-level transformers can be used with mike-level signals, but the higher winding impedance might cause loss of high-frequency response when connected to certain types of output circuits.

2. Impedance: ranges from low (50 Ω) to high (>10 KΩ). Impedance matching is more or less critical depending on the nature of the circuits involved.

Typical values are: Input/Output Impedance, isolation: Mike-level = 150 Ω/150 Ω. Line-level = 600 Ω/600 Ω.

Input/Output Impedance, impedance matching: Hi-Z Mike to Lo-Z Mike input = 6 KΩ/150 Ω.

Input/Output Impedance, balun: Twisted-Pair to Video Coax = 100 Ω/75 Ω.

NOTE: The “impedance rating” of a transformer does not refer to the actual impedance of the windings inside the transformer itself, but rather the impedance of the input and output circuits it is designed to work with. The input impedance of a transformer will be the actual impedance of whatever the output winding is connected to, multiplied by the square of the turns-ratio. The output impedance is the input-circuit impedance (such as a 150 Ω microphone) divided by the square of the same turns-ratio.
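Taking the turns-ratio n as primary turns over secondary turns, impedance reflects by n²: a load on the secondary appears at the primary multiplied by n², and a source on the primary appears at the secondary divided by n². A short Python sketch with illustrative values:

```python
def reflected_input_z(load_z: float, turns_ratio: float) -> float:
    """Impedance seen at the primary: the secondary load times
    the square of the (primary/secondary) turns ratio."""
    return load_z * turns_ratio ** 2

def reflected_output_z(source_z: float, turns_ratio: float) -> float:
    """Impedance seen at the secondary: the primary-side source
    impedance divided by the square of the turns ratio."""
    return source_z / turns_ratio ** 2

# A 2:1 transformer with a 150-ohm load on its secondary looks
# like 600 ohms at the primary; a 150-ohm mike on the primary
# looks like 37.5 ohms at the secondary.
z_in = reflected_input_z(150.0, 2.0)     # 600.0
z_out = reflected_output_z(150.0, 2.0)   # 37.5
```

This is why the same physical transformer has no fixed impedance of its own; it simply scales whatever is connected to the other winding.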

3. Power Handling: The higher the power, the larger the diameter of the coil wire and the larger the core cross section, in order to handle the larger magnetic flux.

4. Frequency Response: Transformers do not respond equally to all frequencies. To give good performance over a range of frequencies requires certain design parameters. The lower the frequency, the larger the core must be. The higher the frequency, the lower the winding inductance and distributed capacitance must be. These two factors oppose each other, so transformer design must of necessity involve trade-offs. The particular core material is also a function of frequency. Professional transformers are easily flat within ±0.5 dB from 20 Hz to 20 KHz.

5. Distortion: All transformers produce some amount of distortion, primarily because of core saturation, hysteresis, and signal phase shift. A “good” transformer will have 0.01% distortion, an “excellent” one will have 0.003% or less. Most audiences aren’t aware of even 0.1% distortion in a movie soundtrack so this is usually not a problem.

6. Isolation: Electrically shielding the secondary winding from C-M noise on the primary is tricky because the alternating magnetic field will induce “eddy currents” in the metallic shield. Any design features that reduce this will decrease the efficiency of the shielding. However, most isolation transformers you will encounter provide adequate isolation.

7. Shielding: Overall electrical and magnetic shielding to protect the transformer from outside interference is somewhat easier, because that shield can be placed far enough from the core to avoid most of the external flux lines. IMPORTANT: Most inline transformers (e.g. isolation) are not magnetically shielded, so be careful where you place them. Avoid motors and power transformers. If magnetic interference is a problem, rotating the transformer 90 degrees to the magnetic field may reduce it sufficiently, if not, move it farther away.

IMPORTANT: Remember that a transformer is a passive device; it cannot give out more power than it receives. The input signal is characterized by voltage, current, and its circuit’s impedance. You may choose any one of these to change at will, but then the others will automatically alter to compensate. For example, you can raise the voltage of a 150-Ω mike-level signal a thousand times to that of line-level, but now the output impedance will be so high (150 Ω × 1,000² = 150,000,000 Ω) that a 600-Ω line input would effectively short-circuit it. You could use a so-called infinite-impedance device to “see” the full higher voltage, but now the extra power comes from its amplifier, not the input signal.
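The bookkeeping can be verified numerically. In this Python sketch (the 1 mV input is an illustrative value, not from the article), stepping the voltage up 1,000× multiplies the impedance by 1,000², leaving the available power unchanged:

```python
# A 1 mV mike-level signal across 150 ohms, stepped up 1,000x in
# voltage by an ideal (lossless) transformer.
v_in, z_in = 0.001, 150.0
n = 1000.0                   # voltage step-up ratio
v_out = v_in * n             # 1 V -- line level and then some
z_out = z_in * n ** 2        # 150,000,000 ohms

# Power (V^2/Z) is the same on both sides: the transformer is
# passive and cannot create energy.
p_in = v_in ** 2 / z_in
p_out = v_out ** 2 / z_out
```

The voltage really is 1,000 times higher, but any realistic load dragged across that 150-MΩ source impedance would collapse it immediately, which is the article's point.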

SO HOW DOES THIS ALL WORK IN THE REEL WORLD?

Let’s start with the XLR cables. I have most of mine 50 feet in length, with some 25-footers for shorter runs. Also, an assortment of 1-, 2-, 5-, and 10-footers. If a longer cable gets damaged in a single area, it can be cut there and turned into several shorter ones. When cables have been in use for some time, they will develop so many breaks in their shield wires that they become susceptible to picking up interference or creating static when they are moved. Discard them, even if the problem seems to be in just one or two spots; the rest of the cable will fail shortly thereafter. Whether or not to reuse their XLR connectors depends on how much wear and tear they have accumulated. One thing that can be done preemptively to extend the life of cables is to periodically “circumcise” them, cutting off the connectors and about two inches of cable, and then reattaching the connectors. Cables tend to fail at the flex point where they enter the connector much sooner than elsewhere. As soon as two or three of your cables have gone bad at their plugs, it’s time to service the lot.

Actually soldering the cables to the plugs is a skill beyond the scope of this article, but Local 695 offers an excellent training class. One thing to keep in mind is that shrink-tubing does NOT make good strain reliefs, because when shrunk it is too stiff and simply transfers the stress point to the far end of the shrink-tubing. Use plain PVC tubing (available in many sizes from electronic supply stores) instead. It is much more flexible and will form a smooth curve to more evenly distribute the stress. I save the sections of the outer plastic jacket I strip off various cables while attaching plugs, and use them for strain reliefs on smaller diameter cables.

Some brands of microphone connectors offer a means of connecting to the metal shell and some do not. There is still a considerable controversy over whether to ground the connector shell (sometimes called body) or not, and if grounded, whether to ground the shell at only one end of the cable. There is no simple, always-correct answer.

Here are the possibilities (using 2-conductor cable with the balanced audio always connected to Pins 2 and 3 at both ends):

1. Shield connected to male and female Pin 1; male and female connector shells floating.

2. Shield connected to male and female Pin 1; male connector shell connected to Pin 1; female shell floating.

3. Shield connected to male and female Pin 1; female connector shell connected to Pin 1; male shell floating.

4. Shield connected to male and female Pin 1; both male and female connector shells connected to Pin 1.

If 3-conductor cable is used, there are three more possibilities:

5. Third wire connected to male and female Pin 1; shield connected to male connector shell; female connector shell floating.

6. Third wire connected to male and female Pin 1; shield connected to female connector shell; male connector shell floating.

7. Third wire connected to male and female Pin 1; shield connected to both male and female connector shells.

IMPORTANT: Some people advocate not connecting the shield (and/or the third inner conductor if present) to Pin 1 at both ends of the cable, but then differ among themselves as to whether the sole connection should be made at the male or female end. In the following discussion, I will assume the standard configuration in which a male plug will be connected to an input and a female to an output. To begin with, if the cable is to be used with phantom-powered mikes, there must be a current path between both Pin 1s, so any further discussion is moot. If phantom powering is never a consideration (WARNING: “never” is not a valid term in Hollywood), connecting the shield to the male’s Pin 1 will usually provide the greatest protection from EMI (e.g. radio station) pickup, but it can also increase the amount of T-M noise which had previously been C-M. Connecting the shield to the female’s Pin 1 will usually provide the greatest protection against continuing the transmission of C-M noise without converting it to T-M, but at the cost of increased susceptibility to EMI. I do not believe these purported “benefits” of breaking the Pin 1 interconnection outweigh the potential disadvantages in the production environment, especially the loss of compatibility with phantom power. But if you really, really must break the Pin 1 circuit, connect the shield to Pin 1 at the female (output) end.

What to do? Consult a Ouija board. Actually, you could do worse. Or you could make up cables in each of these configurations, and try them one by one.

Here’s what I do: most of my cables are 2-conductor, and wired as per Number 4. I have made up several 3-conductor cables wired as per Number 5. On those occasions when I have encountered problems with the 2-conductor cable, the first thing I try is replacing the T-power microphone (e.g. a Sennheiser MKH406) with a phantom-power one (MKH40), or vice versa. This usually eliminates the trouble. IMPORTANT: Sennheiser’s new aluminum-cased mikes have a problem that is often attributed to a bad cable: The case is grounded by a screw near the plug that tightens against a bare patch of aluminum. In about a year or so, the aluminum oxidizes and forms an insulating layer, destroying the integrity of this grounding function and ability of the case to intercept interference. Loosening and retightening the screw a couple of times restores the effectiveness of the connection.

On those occasions when swapping mikes didn’t remedy the problem, I have substituted a 3-conductor cable. But in only two instances was there any improvement. Most of the situations occurred in proximity to AM radio broadcast towers (antennas), and the signal strength was simply so high that nothing could keep it out. One time it was possible to move the recorder very close to the mike, and connect it with a much shorter cable. Interestingly, grounding the sound cart’s chassis to a nearby cold-water pipe made matters far worse. I haven’t used a full digital system in this environment yet, so I don’t know if it will be any more resistant.

NOTE: To help block AM radio or other high-frequency interference, inline 50 to 70 KHz low-pass filters are available that can be inserted next to the mixer’s or recorder’s mike input receptacle. Some sound mixers RF bypass the inner conductors with 0.01-0.02 µF capacitors inside the male XLR plug. You need a disc ceramic type (low internal inductance) and to keep the two leads as short and straight as possible. Adding a 1/10-watt 50-Ω resistor in series with the 0.01 µF capacitor will help match the cable impedance and reduce the amount of energy reflected back into the cable. Solder one capacitor-resistor combination between Pin 2 and Pin 1, and another between Pin 3 and Pin 1. If you can get “chip” capacitors and resistors (as used on SMT circuit boards), they have no leads at all, just tinned ends, and are even smaller, which makes the parts much easier to install.

I have been able to conduct some experiments on stage with buzzes from H.M.I. lights, and found that both 2- and 3-conductor cables were almost equally affected. Crossing these power cables at right angles was of no help. Only separating the two cables with an apple box worked, but there is always the danger of having the mike cable pulled off the box to land back on the H.M.I. cable. It is better to re-route your audio cable to avoid crossing any electric cables if at all possible.

Another common problem occurs with outdoor cable runs. Electricity always takes the path of least resistance—literally. (However, some of the current will still flow through other paths that have higher resistance). For example, if lighting units are set on the bare ground, there may be a flow of leakage current through the soil between the lamp stands and the grounding point of the generator. Now, if you have a run of interconnected mike cables lying on the ground along this path, some of the AC current will leave the soil where one of the cable connectors is located and flow along the mike cable’s shield until it leaves at the connector at the other end of the cable, closer to the generator. This is a case where having the connector shells floating would protect you, but it is easy enough to cover the connectors with gaffer’s tape. (IMPORTANT: Be sure to leave a folded-over tab to make removal of the tape quick and easy.)

While damp ground can be dealt with by gaffer-taping the connectors, protecting them from actual liquid water requires more extreme measures. The best one is not to do it in the first place: if you know in advance that you will need a long cable run underwater, make up a single continuous length of cable. (You can always make several shorter ones out of it afterward.) If only mud or dirt is the problem, Neutrik makes a line of heavy-duty mike connectors. The male has a stainless-steel barrel which resists deforming when stepped on or run over, and the female has an external rubber boot that mates with the open end of the male shell and also covers the latch button. This combination keeps out non-liquid contamination, and if you apply some silicone sealant inside the cable strain relief, will handle liquid splashes as well (as long as the sealing lip of the rubber boot is not damaged).

For last-minute emergency waterproofing of a pair of mated connectors, “Rescue Tape” brand silicone self-fusing tape can be used (www.rescuetape.com). Start a spiral wrap around the cable, about six inches from a connector, pulling the tape until it is fully stretched (about three times its original length). Completely overlap the first turn, then overlap each of the remaining turns by almost half the tape’s width (being careful that bumps don’t create a third layer). Wrap over the two connectors, maintaining the almost 50% overlap. Continue wrapping six inches onto the next cable. Finish with the last turn completely overlapping the previous one. Squeeze all the tape with your hands to ensure complete adhesion of the layers. If you’ve done this properly, the connection should be good for submersion under several feet of water, at least for a short time. WARNING: Test your technique in advance. Unfortunately, removing the fused mass afterward is difficult. Slice through it with a sharp blade, gradually going deeper with each pass, and being careful not to nick the cable jacket or connector shells.

Text and pictures ©2012 by James Tanenbaum. All rights reserved.

Editor’s Note: The next installment will take up issues of interconnecting equipment and optimal sound cart wiring.

Up “The River”: An Hawai’ian excursion

by Steve Nelson, CAS

The common wisdom tells us that Puerto Rico is a great place to shoot a pilot but not the series. It’s too far, the weather is dicey with a hurricane season, not enough infrastructure, language issues, too exotic, and, in our case, the landscape required too much expensive CGI to give it the right Amazonian look; all the usual complaints that make me wonder why so much production has left Los Angeles. So when we got our midseason pickup order, no one was surprised that we would take the show to Oahu, despite the very generous 40% incentive offered by Puerto Rico to stay.

There were some good reasons to go west: only a five-hour direct flight from Los Angeles, a relative lack of hurricanes, more English, miles not kilometers, better sushi and cycling, more infrastructure, including a recently built actual soundstage and enclosed water tank (which the network wants to tie up), a better jungly look and more varied, easily accessible locations. In Puerto Rico, however, we had an actual navigable river—once we hauled the boat, at high tide during the full moon, over the sand bar at the river’s mouth—while it is well known that there are no rivers in Hawai’i, at least not on Oahu. Hawai’i’s incentive is only 15%–20%. The unvoiced thought was that our show is a supernatural thriller kind of like that other show whose name we tried to avoid mentioning, the one that shot for six seasons on Oahu, and since no one could think of a similarly successful show coming out of Puerto Rico … aloha! (I prefer Puerto Rican rum, but I guess that’s not enough reason to stay.)

IF WE’D KNOWN THEN WHAT WE KNOW NOW or THE SAME ONLY DIFFERENT

As a seven-episode midseason replacement, we would start a bit later than the rest of the network season, late in August, and go for about three months on Oahu.

There weren’t many returning to continue the voyage up river: our DP, John Leonetti, and his key grip, key makeup, accountant, producers and line producer, execs, our Puerto Rican 2nd 2nd AD (long story), our director, Jaume Collet-Serra (for the first episode), and me. Knox White was not available to make the Hawai’ian scene so I enlisted my old friend, Tom Hartig, to join me. I’ve traveled and worked in many faraway places with Tom and there are few better companions, and no better boom operator and set runner, so I was excited to have him back. On the recommendation of Richard Lightstone, I contacted local sound utility Jon Mumper, who, since he wasn’t working on the other show going at that time, Hawaii Five-O, was happy to join us. Jon was the only member of the sound department to survive all six seasons of Lost and, like so many of the crew there, his career and skill set were forged in that crucible. He came through it fine and was a solid asset to my crew. He is amazingly stoic about things; I guess after six years of that show—so many stories!—everything else seems easy. We also had local John Reynolds on hand for our second units.

This was not my first visit to Hawai’i, but it was my first time working there. I had resisted the calls to work on Lost; now would be my time. P.R. was a fun place to work; though always exotic, it was not always easy. There has been so much filmmaking for so many years in our 50th state that it felt very normal to be there. Normal, but not to be taken for granted; there were so many locations where the natural beauty was absolutely stunning. Even when you’re stuck in the usual horrible traffic jam, there’s something nice to look at, at least a rainbow or something.

We would not be staying in a hotel this time; that was not an option. Rather, we would be given a housing allowance, a rented car and per diems, and invited to find our own place to live. Negotiating with landlords can be a little tricky considering all the uncertainty of production work. Tom and I decided that we would pool our resources to get a house together. With the help of our production staff we found one in Kailua, the nice beachside town where President Obama stays, on the windward side about midway between the stage near Diamond Head in Honolulu and the Kahana Valley where our “river” was located. The out-of-town crew was mostly split between Honolulu/Waikiki and our side of the island. A word of advice to anyone who is driving there: watch and obey the speed limits on Oahu. They change suddenly and arbitrarily and there are many cops in unmarked cars lurking about. And on the advice of our teamsters, peel off the rental company bar-code ID sticker and put on a local bumper sticker so as not to look totally like a tourist and invite break-ins.

Our friends in Post-Production and our bosses were quite happy with the work we’d done on the pilot, so I decided to keep the same basic approach, fine-tune it, and anticipate some of the challenges a new season and location might bring. On the pilot, Production had been willing to cover a week or so of rental for my over-the-shoulder rig, but for the longer run, the network was unwilling to subsidize the additional equipment I thought necessary to achieve the results they liked in the pilot. The network, in this case ABC, picks a number that someone believes is appropriate for sound equipment rental, with little regard for actual job requirements. Then you must provide competitive bids to justify the number they gave you. As “employee vendors” we are in a special category; it seems that the networks would rather pay more to an outside, approved vendor than to us. There are exceptions, but it is becoming more challenging to provide adequate technical support at these rates. Nevertheless, we all love a good excuse to buy new equipment, especially if we know it will be necessary for a job and good to have for the future. So when it came time to gear up for the Hawai’i shoot, I found a great deal on a slightly used Zaxcom Fusion 10, which is basically a Deva 5.8 without the DVD burner. The Nomad was not yet available, and while it makes some tremendous advances in a very small package, I definitely needed all eight knobs that the Fusion sports, so it was a good choice for the job.

When using Zaxcom recorders in a bag, you want to avoid using their very noisy onboard slate mike. Robert Kennedy, former Coffey Sound specialist, helped with a clever solution that works really well with the Fusion and Deva and requires no modifications. Using the Disk Mix and Output matrices, it is easy to route an external headset mike to slate, public Comteks and also to IFB for a private line to my crew. This requires only a custom cable using the “Camera” connector on the Deva. Although I had to give up one mike input, this is much more versatile than on the pilot when I had only the Comtek for everybody, and it avoids using the bulky 25-pin output connector and snake.

I also acquired a Venue Field and loaded it with the appropriate VRT modules, along with a larger Petrol bag that fit it better than the smaller one I’d had in Puerto Rico. For Hawai’i, I got a couple of dipole antennas instead of the sharkfins, thinking they would still give me some gain without taking up so much space. I also discovered that Petrol uses little clips to attach all those pouches to the bag; I got the matching clips from Petrol, screwed them to the dipole mounting hardware, and could then easily mount (and unmount) the antennas right on the bag. Ultimate range was not quite as good as with the log periodics, but the lack of directionality is helpful, especially when talent is moving in three dimensions all around me and I’m not sure exactly where they might be.

Since the Fusion lacks a DVD burner, I thought this would be a good opportunity to begin to wean myself off DVD-RAMs and move toward other media. At this point, the network still wanted archivable media. (I don’t blame them; it is to my way of thinking a significant leap of faith to hand over your day’s work on a tiny piece of media that will come back blank, having been downloaded to a drive. As our work evolves, this will be an ongoing discussion, but for now it seems to be someone else’s problem.) The compromise was that when I worked off the cart I would deliver, as usual, two DVDs, one multi-track and one single-mix track, and when I was in mobile mode, I would deliver a multi-track CompactFlash card. Since the picture and sound transfer was happening back at the production office, the media turnaround was pretty quick. Certainly digesting all the picture data was the main concern. (On the pilot I just finished as of this writing, I used no DVDs at all. I delivered to the DIT the two flashcards and within minutes he’d downloaded them onto the drive with picture and off they went for syncing.) Although I’ve always admired the robust nature and flexibility of our DVD-RAM disks, these CF cards are fast, flexible, and in the long run, very cost efficient.

NEVER THE SAME RIVER TWICE

In Puerto Rico, the boat playing the part of the hero boat, the Magus (think poor man’s Calypso), was a real watercraft, artfully aged. But that was another ocean. We also had something more like an actual river there and we could steam for quite a while before turning around. On Oahu, our main river location was in the beautiful Kahana Valley, up the windward coast, which features not a river but the Kahana Stream. It comes down from the wet mountains and runs a short way to the sea, suitable mostly for kayakers and stand-up paddlers. Though not very deep, this stream is prone to flooding during the rainy season and there is a low bridge under the road at its mouth. The valley is incredibly lush with suitably tropical foliage and gorgeous mountains rising dramatically. The solution for getting the Magus in here was to build our own and assemble it on site. Hawai’i is the birthplace of surfboards, so that technology was adopted. The construction crew shaped blocks of Styrofoam and covered them with fiberglass—just like a surfboard. They built a steel superstructure on top and constructed our floating set, which bore a striking resemblance to the original. The whole thing, about 60 feet long, with a main and upper deck, only drew about 18 inches of water, but it lacked power, a rudder and a keel. It also had no head and no smoking section (much to the distress of Jaume Collet-Serra, our director). Nevertheless, fully loaded with cast, crew and equipment, we could push or pull this rig in a similar fashion as in Puerto Rico, about a quarter mile up and down our beautiful river before we ran out of room. (You can easily see it on Google Earth.)

There were some advantages to working on our floating set (not really a boat). It was more spacious and easier to work on than the original; it had fewer sharp metal pieces to bark your shins on and wider passage along the gunwales. The first time it rained we discovered it was not really watertight, but that was quickly remedied. With less metal, it was more transparent to RF. It was the lack of steering that gave me the most grief. Lacking a rudder and keel to help keep it on course, plus a less than symmetrical hull, it tended to move like a pig on ice when pushed, that is, anywhere but where you wanted, and often aground. The remedy? A pair of Zodiacs on either side of the bow keeping us on course. This was not a happy solution for sound; those little outboards are noisy and the operators were not too clued in to the dialog, being more concerned with keeping things moving forward and out of the weeds. In addition, the main powerboat pushing us was not the same gentle giant that we had back east, and the generator it carried was also louder, giving us more noise to bury in the final mix. So began the ongoing negotiations with Marine and Electric: changing the positions of the Zodiacs, building housings for the generator (worked great until it overheated), swapping generators. Constant vigilance was required to keep things to a manageable roar, as were fingers on faders to keep it out of open mikes. Next season, if there is one, I will demand a steerable floating set!

I brought along one of those cute Backstage Mini-Magliners, thinking that it might be nice to have a platform when on the Magus or when four-wheeling through the bush. We did bring it on board and it was helpful, but as we were constantly moving around the boat, it seemed that there were always too many people aboard and not enough space, so it was often in the way. In prep there had been talk about a Sound Gator for getting to difficult locations. In that case, we would have tied the Mini down and used the Gator as a mobile unit, but as it turned out they were, for the most part, very gentle with our locations. We shot a lot in what we called “parking lot jungles”: park the truck, roll the gear a little ways, and there you are. If you worked on Lost, you would be familiar with many of these places. We did use the Mini for some of these locations, particularly because it was easy to just grab the bag, leave the cart and go. Next time, I’d like to try the Zaxcom Mix-8 with the Fusion. That way, if I am able to find a comfortable place to sit and work, I’ll get the ease of working with faders instead of knobs, and if I have to run, I can just unplug it and go. Plus, I think it’ll be great for insert car work.

Sooner or later in our line of work, you are going to encounter a situation where, one way or another, your actors are going to get wet. If your show takes place in a boat—on a river—in the tropics—the odds go way up. This is rarely not a problem for us. There are many ways to get actors wet but probably only two main categories: submersion (or, I suppose, immersion) and water from above, i.e., rain. Both are special sources of pain for us. Let’s take the case of rain, the sort generally provided by our special effects brothers. How does the rain get up into those towers? By using a very noisy pump. So if you are able to keep your boom mike dry and free of raindrop impact noise (Remote Audio’s Rainman is a very nice upgrade from the old hog’s hair special), then chances are you’ll be fighting SFX noise and the ambient sound of “rain,” but you might be lucky enough to get a nice clean close-up. If you’ve been paying attention (see last issue, Part 1), you’re aware that this is probably not an option for us on The River. We’re pretty much left with Plan A: Wire ’em all.

The climactic scene of one episode involved a rainstorm of biblical proportion, at night, on the boat. The saving grace was that the boat would be docked. As the weather gods would have it, rain was forecast for that Friday night. Another opportunity to show off the beauty of the Aviom system as I stayed in the dry comfort of our truck (with my wife and daughter who were visiting), while we rain-bagged the RF Cart and put it at the water’s edge near the Magus.

The scene involved much physical action and much water and cameras in all the usual places—handheld and mounted. Thanks to Tom we were well stocked with industrial-strength condoms, which keep the transmitters dry under most circumstances, but given the expected deluge, we decided to take it to the next level of water resistance and rented some Lectrosonics MMs. (The new Lectro WM looks like a great new alternative for a water-resistant transmitter.) Choice of lavalieres in this wet situation is important, though in my opinion not as important as one might think. The Sanken COS-11 is perhaps more water resistant than advertised. I had learned this on the pilot, where an actor (no names on location!) accidentally put one in the water during an unexpected dunking in an improvised scene. It made quite a sound when it hit and then went apparently inert, but after carefully drying it and leaving it be for a while, it actually came back to us. (Sanken now has the COS-11D, designed for moisture resistance and reduced digital interference.) Nevertheless, the COS-11 is not my first choice for getting wet; that honor would go to the Countryman lavs. Their Classic Omni (like a smaller Tram) is good for water work, as is the B-6, which most of us carry, though probably without the right screw-in connector for the MM. Another choice, the Lectrosonics M-150 or 152, the lav that Lectro used to include with the purchase of a transmitter, is surprisingly good in the wet and if it goes down, it won’t feel like such a loss.

After this dissertation on which lavalieres to use in the rain, I will share this: If you must have the mike hidden under their clothes, it doesn’t much matter which lavaliere you deploy in a scene where the actors get really wet. Once the wardrobe becomes saturated, although the waterproof mike will be safe, it will not deliver natural sounding audio. As the clothing approaches saturation the material becomes less acoustically transparent and, though it might not sound “underwater,” the frequency response is far from flat. You might get lucky with take one, or if the actors start each take with dry clothes, you’ll get another chance. Some might not get as wet as others, so perhaps you can pick up dialog on another actor’s mike, or maybe it’s possible to sneak in a plant mike. With a tiny lav like the B-6 it might be possible to hide it in plain sight, out from under the wardrobe, especially at night, but then, of course, it is susceptible to wind noise and the possibility of taking a direct hit from a raindrop. In this case, the action was meant to be wild and chaotic in the dark with a King Lear storm blowing, which actually gave us some latitude. With a combination of “all of the above” and some good luck we managed to get what we needed. So much of what we do requires multiple options, quick reactions and improvisational skills. On top of all that, layering in more wind and rain fx in the mix can help cover a multitude of sins.

During our relatively short season—seven episodes, eight days per—we had some fun telling scary stories. We got to hit some classic horror tropes: jealous ghosts, animated dolls, mysterious and unfriendly natives, zombies, demonic possession and the series finale, an exorcism. There was also the plot development of some secret quasi-governmental organization that was behind all the mystery as Dr. Cole searched for the source of the magic in this beautiful but threatening place where the laws of nature have no sway. We used all kinds of special effects: rain, wind, fog (not easy on a river on the windward side of the island), water tank and more, and quite the arsenal of visual effects. As always, the mission of the sound department, producing appropriate and useable tracks, is complicated by all this trickery. In the case of The River, we really had to step it up because of the multitude of cameras. We got to do a little recording session, luckily on stage, for playback on the moving boat. It featured guitar, accordion and a vocal duet. We also provided playback of scary creature sfx and eerie music for actor motivation, nothing too complicated but rendered more challenging considering the location and the cameras. As was the case on the pilot, our work was made easier with the help of a very talented and supportive crew and production staff and a wonderful cast who themselves had to go through some pretty rigorous paces. Also, we were working in some of the most beautiful places ever!

The virtues of Hawai’i are well known to anyone who has traveled there; most of us go for vacation, but there are many far worse places to be shipped off to for work. One hears so many stories about the challenges of a location, and the specter of Lost looms large there (“Do you remember the season it rained for 40 days straight?”), but we were fortunate with the weather, late summer to Thanksgiving, and it wasn’t quite the mudfest I had feared. To be sure, we had rain, but as in Puerto Rico, the locals know how to deal with it and you’ll be covered almost before the first raindrop lands. Heat and humidity are always a problem in the tropics, and shooting on the windward side meant that we had some wind as well. Okay, and when I was on my bicycle, sometimes it felt like I was riding through a carwash: hot, then rain, then wind, and always sweat. Of course, I was commuting on a bicycle along the beach, over some mountains, through some towns on one of the world’s great destinations. Really, who can complain? You’re in Hawai’i, and I’m happy to say that it hasn’t been ruined for me as a vacation destination.

The circumstances of working on a distant island are similar to other faraway locations, only sometimes more so. There are limitations: you can’t just send a driver to the rental house to get what you need, so you make do with what you can find there—or plan ahead. For us, there is only one suitable sound truck on the island and it was already in use by Hawaii Five-O. We got by with a (slightly) modified cube truck. Well, actually two, as the lift gate on the first was deemed unsafe. Then the supposedly safer gate on its successor broke while in use. Somehow no one was injured, the gear escaped unscathed and the gate was repaired. Which brings me to a major difference between shooting in Hawai’i and Los Angeles. Just before leaving I had completed the latest required Safety Passport classes. I arrived ready to implement all the good new rules and guidelines I’d learned, only to find that not only were they not required there, but even normal safety guidelines are ignored, if not scoffed at, by the locals. Is it really such an imposition, for example, to keep the fire lane on stage clear of obstructions? I am told that this is a concern at many of the newly popular production centers outside Los Angeles. I would like to see all union crews and signatory productions held to the same safety standards. As I write, it has been determined that despite all our best efforts we will not be returning for another voyage up The River. I guess we’ll never find “the source.” We had our fans but alas, not enough to make the network cut. My wife is particularly disappointed, as she was looking forward to spending her teaching sabbatical in our house in Kailua.

Doing this show was quite an adventure, both in terms of location and the challenging nature of the work. Admittedly, it was a bit surprising at this point in my career to strap on all this gear and go running around like an ENG or reality show guy. It wasn’t that we in Sound were doing anything especially innovative, just more of it than would be considered normal in the context of a one-hour network episodic show. (Not quite like American History X, where Tom and I had to “invent” the wireless boom to compensate for the antics of director/camera operator Tony Kaye. Seems pretty basic these days, but with the non-diversity VHF RF of yesteryear it was pretty challenging!) Of course, when you consider the up-to-14 cameras we used on The River, the whole notion of “normal” is left far behind. (We might have shot a million feet of film on AHX, but it was all one camera.) Once you get your head around it and commit to this crazy way of working, you just keep moving forward. It wasn’t full-on every day; the reality TV mode was often interrupted by the old normal, which meant working off the cart, albeit with more lenses than seems right. And now that I have this studio-in-a-bag setup, it is amazing how appropriate it is for other gigs.

I’m happy to say that the unorthodox working style resulted in tracks that were a component of what I think is the best sounding television show I’ve ever done. The supernatural themes provided a broad canvas for our very creative cousins in post, Paula Fairfield, supervising sound editor, and Dan Hiland and Gary Rogers on the board at Warner Bros., plus a very suspenseful score by the great Graeme Revell made for a very effective sonic environment. Our dialog editor, Jill Purdy, was very happy with the tracks we provided and used only the bare minimum of ADR.

Thanks to all for the challenging and fun adventure and the excellent opportunity for growth in my craft! Adiós and aloha!

When Sound Was Reel-9: Digital comes to the cinema

by Scott D. Smith, CAS

In the previous issue of “When Sound Was Reel,” we examined the development of optical recording technologies for general 35mm film releases. In this installment, we cover the next generation of technology designed for discrete multi-channel theatrical exhibition.

Digital Comes to the Cinema

Although the development of Dolby Stereo, as conceived for encoding four channels of audio onto a two-channel release print, was a huge step forward in terms of sound quality for standard 35mm film releases, it still could not rival the quality of a good four-track or six-track magnetic release (at least when the film and heads were still in good condition). Besides the advantages of a superior signal-to-noise ratio and wider bandwidth, discrete magnetic recording systems were also free of the compromises inherent in the 4-2-4 matrix, which meant that Dolby Stereo would always have some crosstalk between channels.
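The crosstalk can be made concrete with a toy model of the 4-2-4 matrix. The sketch below is a deliberate simplification (real Dolby Stereo encoding also applies a 90-degree phase shift to the surround channel, and real decoders add active steering, both omitted here), but it shows the essential compromise: a signal panned hard left still leaks into the decoded center and surround.

```python
K = 0.7071  # -3 dB mixing coefficient used for center and surround

def encode(L, C, R, S):
    """Fold four channels (Left, Center, Right, Surround) into Lt/Rt."""
    Lt = L + K * C + K * S
    Rt = R + K * C - K * S
    return Lt, Rt

def decode(Lt, Rt):
    """Naive passive 2-4 decode back to L, C, R, S."""
    L, R = Lt, Rt
    C = K * (Lt + Rt)   # center reconstructed from the sum
    S = K * (Lt - Rt)   # surround reconstructed from the difference
    return L, C, R, S

# A signal sent ONLY to Left comes back with -3 dB leakage
# in both the decoded Center and Surround channels:
L, C, R, S = decode(*encode(1.0, 0.0, 0.0, 0.0))
```

A discrete magnetic track has no such leakage, which is the point the text is making.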

While this issue was minimized by careful channel placement during re-recording, it was still a far cry from the luxury offered by independent mag tracks on either 35mm or 70mm film. This was not lost on the engineers at leading suppliers to the film industry, who realized that further development would be needed if they were going to maintain a lead in the marketplace.

Kodak Gets Into the Sound Business (Again)

Kodak had made early efforts at stereo analog optical soundtracks, work that later became the basis for Dolby SVA. In the late 1980s, the company once again saw an opportunity to advance the state of the art in sound for release prints. As 16-bit digital audio became an accepted standard for consumer audio, Kodak, in partnership with Optical Radiation Corporation, invested a significant amount of money in developing a six-channel system that could record discrete digital soundtracks onto standard 35mm prints.

Determining how much data could be jammed into the area presently occupied by the analog soundtrack was the first design hurdle. Kodak engineers worked on developing a new sound negative film stock with sufficient resolution to encode a data block only 14 micrometers across. While the print stock used during this era could support the minuscule size of the data block, a new fine-grain negative stock (2374) was needed to handle the recording of the data. With this aspect of the system solved, they determined that a 16-bit PCM signal could be reliably encoded using data compression. In practice, the system eventually employed a delta-modulation scheme, whereby the original 16-bit audio was compressed into 12-bit words. Even with this compression scheme, though, the bit stream rate worked out to 5.8 megabits per second, nearly four times the data rate of a standard audio CD. The resulting system was named the Kodak Cinema Digital Sound system (CDS for short).
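For readers unfamiliar with delta modulation, the toy encoder below illustrates the general idea of storing differences between samples rather than the full samples themselves; it is only a naive illustration of the principle, not Kodak's actual 12-bit codec.

```python
def delta_encode(samples, bits=12):
    """Store each 16-bit sample as a difference from the previously
    reconstructed value, clipped to a signed `bits`-bit word."""
    qmax = 2 ** (bits - 1) - 1
    prev, words = 0, []
    for s in samples:
        d = max(-qmax - 1, min(qmax, s - prev))
        words.append(d)
        prev += d          # track what the decoder will reconstruct
    return words

def delta_decode(words):
    """Rebuild the sample stream by accumulating the differences."""
    prev, out = 0, []
    for d in words:
        prev += d
        out.append(prev)
    return out

# Slowly varying audio survives intact even though each word is smaller:
src = [0, 100, 250, 300, 280]
assert delta_decode(delta_encode(src)) == src
```

When successive samples differ by more than a 12-bit word can express, this naive encoder slews and momentarily distorts; a production codec would presumably handle such transients far more gracefully. Even storing differences rather than full samples, the CDS stream still ran at 5.8 megabits per second.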

A data rate this high presented a real challenge when it came to reliably streaming data from a reader on a standard projector. Because of this, early installations incorporated modifications to the projector transports to provide more stable scanning of the optical track across the sound head.

After a period of initial development, Kodak and ORC premiered their system with the release of the film Dick Tracy in June of 1990, in both 35mm and 70mm versions, two years before the release of Batman Returns in Dolby Digital. While the system was generally well received, it had one fatal flaw: no backup track. Since the digital soundtrack occupied the full area previously inhabited by the analog soundtrack, this meant that any failure of the reader would result in no sound being heard at all. (It should be pointed out that the engineers involved in the development were in strong opposition to this approach, but management dismissed their concerns.) It was this aspect of the system (along with the nearly $20,000 theater conversion costs) that would ultimately spell its demise two years later, with only nine films having been released using the system.

Thus came to an end Kodak’s second foray into sound recording systems for film.

Dolby Digital 1.0

In about 1988, more than a decade after the release of Star Wars, Dolby engineers began development work on a completely new soundtrack format, one that would no longer rely solely on analog recording for release prints. At this juncture, nearly five years had passed since the introduction of CD players into the consumer market, notably Sony’s CDP-101. Just as the quality of consumer audio systems outpaced the typical sound system found in theaters during the 1950s and 1960s, the introduction of digital audio to the marketplace would once again lead the film industry into a new series of engineering challenges.

As part of their engineering mandate (no doubt strengthened upon witnessing the demise of the Kodak/ORC system), Dolby made two decisions pertaining to the Dolby Digital system design:

• The system had to be backwards compatible, and

• The soundtrack had to be carried on the film itself (i.e., not on a separate medium, such as an interlocked player or dubber).

These mandates posed some serious constraints as to how much data could be recorded onto the film. Since the format had to be backwards compatible, the existing optical soundtrack had to remain in place. Since there was no option for moving the picture image, the only significant areas left were either between the perforations and the outside edge of the film (an area about 3.4mm wide), or between the perforations themselves (about 2.8mm wide). Dolby engineers chose the latter as the area that held the most promise, reasoning that the area outside the sprockets was more prone to damage.

Moreover, Kodak (and others) used the area outside the sprockets for latent image key codes, making it unsuitable for soundtrack imaging. While the theory that the space between the perforations was more protected had some merit, the unfortunate reality (as Kodak had experienced with their CDS system) was that many film projectors still produced a significant amount of wear in the perforation area, making robust error correction and a backup analog soundtrack a must.

Despite these hurdles, Dolby engineers managed to produce a system that was quite reliable, given the constraints that they had to work with. In its original configuration, Dolby Digital consisted of a bit stream encoded at a constant rate of 320 kilobits per second, with a bit depth of 16 bits. While the bit rate was only about one-quarter that of a standard audio CD, it did, for the first time, make possible a system which could record six discrete audio channels on a 35mm release print while remaining backwards compatible with existing 35mm analog soundtracks.
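The bit-rate comparisons in this and the preceding section are easy to verify. The figures below come from the text; the CD reference rate is simply 44.1 kHz at 16 bits over two channels.

```python
# Sanity-check of the bit-rate comparisons against a standard audio CD.
cd_rate = 44_100 * 16 * 2   # audio CD: 1,411,200 bits per second
cds_rate = 5.8e6            # Kodak CDS stream (from the earlier section)
dolby_rate = 320_000        # original Dolby Digital: 320 kilobits/second

print(round(cds_rate / cd_rate, 1))    # ~4.1: "nearly four times" a CD
print(round(dolby_rate / cd_rate, 2))  # ~0.23: "about one-quarter" of a CD
```

Both ratios agree with the claims in the article, which also confirms that the rates in question are bits, not bytes, per second.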

However, despite Kodak’s exit from the market, Dolby engineers were not the only ones in the game when it came to digital soundtrack development.

DTS Throws Down the Gauntlet

While engineers were toiling away at Dolby, other entrepreneurs were looking at similar possibilities for marrying a digital soundtrack to film. A notable example was Terry Beard, who ran a small company called Nuoptix, which had specialized in producing upgraded recording electronics for analog optical recorders. These systems became the basis for many of the Dolby Stereo variable-area optical recording systems installed as Dolby Stereo achieved a greater market penetration.

Beard chose to take a slightly different approach to the conundrum of how to fit sufficient data into the minuscule area available on standard 35mm prints. Instead of recording the audio signal directly onto the film, he chose to simply record timecode in the very small section between the picture frame and the inside edge of the analog soundtrack. This timecode was in turn used to slave playback from a specially modified CD player that could carry up to six discrete channels of high-quality audio. Known as “double system” in industry parlance, this approach had been used in the past for specialty releases like Fantasia and all the Cinerama films, as well as the original Vitaphone disk releases from Warner Bros. In general, the approach was not well received in the industry, due to the problems associated with separate elements for picture and sound. Besides the possibility of an element becoming lost or separated, there were huge synchronization headaches.

Beard, however, was convinced of the viability of the format, and continued to press on in development of what would become known as DTS. After a chance encounter with Steven Spielberg, Beard had the opportunity to showcase the system to him in August of 1992. After some further work and demos to execs at Universal, Spielberg was convinced of the viability of the system, and by February of 1993, Digital Theater Systems was officially formed, with Spielberg himself signing on as one of the investors.

With Jurassic Park scheduled for release in June of that year, Beard and his team had a scant four months to assemble enough units to supply theaters for the planned wide opening. Undaunted by this nearly impossible deadline, Beard and his staff managed to deliver 900 processors to theaters by the second week of the film’s run! Fortunately, most of the technology needed to go into production had already been vetted, so the primary hurdle was simply building enough units to supply the theaters.

While this “double system” approach was still not generally well received by distributors (disks could be lost or damaged), it did provide for high-quality reproduction of discrete soundtracks with a minimal amount of data compression. In the early 1990s, there were few options available for compressing audio data onto limited carriers. After reviewing the options, Beard chose a system developed by Audio Processing Technology (APT) of Belfast, Northern Ireland. Data reduction is a tricky business. The APT system was unique in that it used only a predictive mathematical table to encode and reconstruct the data, as opposed to techniques employing “masking” of the signal. Because it was an off-the-shelf solution, the APT system was very attractive to DTS, sparing them the expense of developing their own data-reduction system.
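To illustrate the predictive idea in general terms (this is a toy first-order predictor, not APT’s actual algorithm), a coder can transmit only the difference between each sample and a prediction made from the previous sample. For smooth audio, those residuals span a much smaller range than the raw samples, so they can be coded with fewer bits:

```python
import math

def predictive_encode(samples):
    """Encode each sample as its difference from a first-order
    prediction (the previous sample)."""
    residuals, prediction = [], 0
    for s in samples:
        residuals.append(s - prediction)
        prediction = s                 # next prediction = current sample
    return residuals

def predictive_decode(residuals):
    """Rebuild the original samples by accumulating residuals."""
    samples, prediction = [], 0
    for r in residuals:
        s = prediction + r
        samples.append(s)
        prediction = s
    return samples

# A smooth 1 KHz tone sampled at 44.1 KHz, as 16-bit integer samples.
tone = [round(32767 * math.sin(2 * math.pi * 1000 * n / 44100))
        for n in range(441)]

residuals = predictive_encode(tone)
assert predictive_decode(residuals) == tone   # the prediction step is lossless

# Residuals are far smaller than the raw samples, hence the bit savings:
print(max(abs(s) for s in tone), max(abs(r) for r in residuals))
```

Real perceptual coders add quantization and other machinery on top of this, but the prediction step is where much of the data reduction comes from.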

A further feature of DTS was that it could seamlessly compensate for missing frames in the film: a large buffer between the disk and the system output allowed playback to continue smoothly over gaps in the timecode.
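The coasting behavior can be sketched in a few lines (a hypothetical illustration of the idea only; the actual DTS reader logic is proprietary): whenever a timecode frame fails to read, the player assumes the film advanced one frame and keeps going, resynchronizing as soon as valid code returns.

```python
def play_with_coasting(timecodes):
    """Follow a stream of film-frame timecodes, coasting over
    unreadable frames (None) by incrementing an internal counter.
    Returns the frame positions actually played."""
    played = []
    expected = None
    for tc in timecodes:
        if tc is not None:
            expected = tc        # resync whenever valid code is read
        elif expected is not None:
            expected += 1        # coast: assume the film advanced one frame
        if expected is not None:
            played.append(expected)
    return played

# Frames 100-107 with three unreadable frames in the middle:
reader_output = [100, 101, None, None, None, 105, 106, 107]
print(play_with_coasting(reader_output))
# -> [100, 101, 102, 103, 104, 105, 106, 107]
```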

In its original configuration, the DTS system had been designed in two versions; a full six-channel discrete system, as well as an economy two-channel version, which could utilize the same encoded Lt/Rt signal as analog Dolby Stereo. This was, in fact, the version that was delivered to most theaters during the initial Jurassic Park run. However, there were some problems involved with properly setting up the processors in this configuration, and in the end, it was decided that only six-channel systems would be installed.

The deployment of tracks was the same 5.1 approach as used by Dolby Digital, so the only expense incurred by theaters was the installation of the timecode reader on each projector, along with the DTS disk player.

OK, That’s Three. Let’s Add Another Format!

While most industry observers would likely contend that jamming three audio formats onto a single piece of film was probably sufficient, that is not the way the film business works. Not wanting to be left behind, execs at Sony/Columbia decided that they too needed to develop a multi-channel digital sound format for theater exhibition. However, by this point, space was running out on the print, so the only real estate left was the area between the sprockets and the outside film edge, along with a very small space between the picture frame and the inside edge of the analog soundtrack (already occupied by the DTS timecode).

Undaunted by these constraints, Sony engineers contracted with Semetex Corp., a manufacturer of high-precision photodiode array devices, to design a system which could resolve minuscule amounts of data from the area between the sprockets and the outside edge of the film. This system would become known as Sony Dynamic Digital Sound, or SDDS. Unlike Dolby Digital and DTS, the system boasted eight independent channels of audio! In practice, however, few films ever took full advantage of the eight-channel capability, due to the costs associated with both mixing and equipping theaters with additional speaker systems.

In its original implementation, the Sony system used a 7.1 speaker layout. However, unlike 5.1, the Sony system utilized five full-range screen channels along with stereo surrounds, a layout similar to the original Todd-AO six-track 70mm format (which had five screen channels but only mono surrounds). This was quite different from what eventually evolved into Dolby Digital 7.1.

Similar to both Dolby Digital and DTS, the system also required some data compression. To achieve the needed data rate, Sony utilized the ATRAC data compression scheme, which allowed for a compression ratio of about 5:1. Sony also provided for redundancy of the primary eight channels by including four backup channels, in case damage to the film caused data dropouts on the main channels. In practice, this proved to be a necessary feature of the system, as prints were frequently damaged by careless handling on platter systems.

Although Sony had originally planned to deploy SDDS in December of 1991 for the release of Hook, the work needed to refine the system delayed its release an additional year and a half. It premiered instead with the release of Last Action Hero. Since Sony at that time owned its own theater chain (later sold to Loews), it could leverage its exhibition position to gain market penetration that would otherwise have been difficult to garner in the face of competition from Dolby and DTS. Further, via its ownership of Columbia/TriStar, it could create demand for the system by releasing all of its films with SDDS.

Alas, despite its advantage in the exhibition market, the only other theater chain that signed on to SDDS was AMC, which struck a deal with Sony in 1994 to include the system in the new auditoriums it was constructing during its expansion phase. While the much-touted eight-channel format could theoretically offer an improved theater sound experience, the reality was that fewer than 100 films ultimately used the full capabilities of the format. Further, theaters were reluctant to invest in the hardware and speaker-system upgrades necessary to realize the system’s full potential.

Although the system did gain favor with many of the studios for the release of bigger-budget films, most independent films during this period were being released primarily in Dolby Digital (and possibly DTS), which meant that the capabilities of the SDDS theaters went underutilized.

With market penetration stalled, and facing strong competition from Dolby and DTS, Sony ultimately made the decision to abandon manufacturing of the system in 2004. However, support for existing systems will continue until 2014, and new titles are still being released utilizing the SDDS format.


Review

With the variety of competing formats, it is interesting to take note of the bit stream rates and channel configurations employed by each of the competing systems:

Kodak CDS System (for both 35mm and 70mm film):

Data Rate: 5.8 Mb/sec
Sample Rate: 44.1 kHz
Bit Depth: 16 Bits
Data Compression: Delta Modulation

Channel Configuration:
Five Channel (5.1 with LFE) (Left/Center/Right/Left Surround/Right Surround)

Dolby Digital (for 35mm film):

Data Rate: 320 kb/sec
Sample Rate: 48 kHz
Bit Depth: 16 Bits
Data Compression: AC-3

Channel Configuration:
Mono (Center Channel)
Two-channel stereo (Left + Right)
Three-channel stereo (Left/Center/Right)
Three-channel with mono surround (Left/Right/Surround)
Four-channel with mono surround (Left/Center/Right/Surround)
Four-channel quadraphonic (Left/Right/Left Surround/Right Surround)
Five-channel surround (Left/Center/Right/Left Surround/Right Surround)

Additionally, each of these formats can utilize a low-frequency effects channel (designated the “.1” channel), which is usually assigned to a separate subwoofer.

Dolby also provides for 6.1 and 7.1 formats in the Dolby Digital Surround EX format, which implement mono rear surrounds and stereo rear surrounds respectively.

DTS

Data Rate: 1.536 Mb/sec
Sample Rate: 44.1 kHz
Bit Depth: 16 Bits
Data Compression: APT-X100

Channel Configuration:
Five Channel (5.1 with LFE) (Left/Center/Right/Left Surround/Right Surround)

SDDS

Data Rate: 2.2 Mb/sec
Sample Rate: 44.1 kHz
Bit Depth: 20 Bits
Data Compression: ATRAC2

Channel Configuration:
Five Channel (5.1 with LFE) (Left/Center/Right/Left Surround/Right Surround)
Seven Channel (7.1 with LFE) (Left/Left Center/Center/Right Center/Right/Left Surround/Right Surround)
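As a quick sanity check on these figures, a few lines of Python can compare each format’s carried bit rate against the raw PCM rate of its channels. This is a rough calculation: it treats the LFE as a full-bandwidth channel and ignores framing and error-correction overhead, so the implied ratios differ from the codecs’ nominal per-channel figures.

```python
def raw_pcm_rate(channels, sample_rate, bits):
    """Uncompressed PCM bit rate in bits per second."""
    return channels * sample_rate * bits

# Figures from the tables above: (channels, sample rate, bit depth,
# data rate as carried on film in b/sec).
formats = {
    "Dolby Digital": (6, 48_000, 16, 320_000),
    "DTS":           (6, 44_100, 16, 1_536_000),
    "SDDS":          (8, 44_100, 20, 2_200_000),
}

for name, (ch, fs, bits, carried) in formats.items():
    raw = raw_pcm_rate(ch, fs, bits)
    print(f"{name}: raw {raw/1e6:.2f} Mb/sec, carried {carried/1e6:.3f} Mb/sec, "
          f"implied reduction {raw/carried:.1f}:1")
```

The exercise makes Dolby Digital’s position clear: it squeezes 5.1 channels into roughly a fifth of the bit rate DTS carries on its disks.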

Virtually all of the systems boasted a bandwidth of 20 Hz to 20 KHz (for the primary channels), along with a noise floor that was inaudible in even the best theaters. Further, they did away with the compromises inherent in the 4-2-4 matrix used for analog Dolby Stereo. While debate still rages among aficionados as to which of the systems sounds best, there can be no doubt that all of them provided a major step forward for sound reproduction in a theatrical environment.

© 2012 Scott D. Smith, CAS

Making the Cable Connection

by Jim Tanenbaum, CAS

For many years, my sound cart has been a “cable-free zone.” Besides the talent’s radio mikes, both my boom operators are wireless, as are any plant mikes. My cart runs on batteries (105 amp-hours worth), including the built-in worklights. I have two UHF video transmitters that I connect to the video assist system so I don’t need coax cables from it to my monitors, and I send the audio out by Comtek. Director, script, producers, etc., all get Comteks, and my boom ops have their Comteks on a separate frequency. (I have a third channel available if each of them needs to hear only their own mike.) As a result, buzzes from H.I.D. (High Intensity Discharge) lights, such as H.M.I. or Xenon, 60-Hz hum from power cables, RF (Radio Frequency) pickup from radio/TV stations, audio/timecode crosstalk, and static from moving bad cables are all a thing of the past.

Or are they? Unfortunately, my cart is not, in fact, “cable free,” because all my equipment is interconnected with … (wait for it) … CABLES. And some of you still use cabled mikes, connect to video assist with cables, run some or all of your gear on AC, send/receive timecode (TC) by cable, etc. Here’s what I’ve learned in 45 years about dealing with these problems: like radio mikes, cables also work partially by magic. What appear to be similar problems often do not respond to the same solution, and equipment that is trouble-free one day may not be the next, even though everything is still in the same place.

The cables interconnecting various pieces of equipment on your cart are the easiest to deal with because they are under your complete control, do not change position (usually), and do not have to be connected and disconnected as much. Cables from your cart to somewhere else are subject to the “slings and arrows of outrageous fortune” in the outside world.

Most of this article is written at a fairly basic level, not requiring a great deal of electronics knowledge. There are a few advanced discussions here and there that can be safely skipped, or discussed with a more “techie” mixer. If the text becomes incomprehensible—just keep reading and it will soon clear up. (The other end of the cable is that I have overly simplified a few things, so I ask the technically literate to please bear with me.) Local 695’s website has an excellent online “Ground Loop” seminar by Bill Whitlock that is incredibly detailed and technical. It is almost exclusively dedicated to AC power noise problems in fixed installations, but it is very useful nevertheless and well worth several hours to watch. I have made sure to cover his relevant main points in this article as well.

A final note: cables have a much higher retail markup than recorders or radio mikes. Rental cables, in spite of Herculean efforts by the rental companies, are not 100% reliable. Therefore, if you know which end of a soldering iron to grab, make your own. If not, get your local techno-nerd teenager to teach you, or take the Local 695 cable construction class.

Some Basic Information to Start With

1. Mike-level signals are in the 5 to 50 mV (millivolt = 1/1000 volt) range. Line-level signals are in the 0.5 to 5 V (volt) range. AC power is in the 100 to 200 V range. If you are involved with H.I.D. lighting units, the cable connecting the lamp to the ballast (called a “head feeder”) carries thousands of volts (kV). The higher the voltage, the easier it is for the current to “leak” into some other circuit, like your mike cables.

2. Civilian AC power frequencies are in the 50 to 60 Hz range. (Hz or hertz, formerly called “cycles-per-second,” or “CPS,” a much more descriptive term.) Audio frequencies are conventionally said to be in the 20 Hz to 20 KHz (KiloHertz = 1,000 Hz) range, although few people today, especially anyone over 16, can hear anywhere near the top or bottom of that. RF (Radio Frequency, but not limited to “radio” signals) starts around 500 KHz with the AM radio band, and goes upward from there. MHz = 1,000 KHz and GHz = 1,000 MHz. The RF spectrum is further subdivided into HF (High Freq) = 3-30 MHz, VHF (Very High Freq) = 30-300 MHz, and UHF (Ultra High Freq) = 300-3,000 MHz. There is also ULF, VLF, LF, and MF, but they probably won’t concern you. Frequencies in the GHz range are often called “microwave” or “MW.”

3. All electrical conductors (except for superconductors, which you won’t be dealing with) have some amount of resistance, measured in ohms (Ω). When an electrical current flows through a resistor it loses some of its voltage, although the amount of current (measured in amperes) remains the same. The amount of this “voltage drop” is given by Ohm’s law: E = IR. (E is voltage; I is current; R is resistance; and IR means I times R.) For example, a table lamp with a 100-watt light bulb draws about 0.9 A (ampere or amp) at 110 V. If the 2-wire power cord has a resistance of 1 Ω in each wire, there will be a 0.9 V drop in each wire (0.9 A times 1 Ω), for a total of 1.8 V, so the light bulb will get only 108.2 V across the terminals of its socket. IMPORTANT: This drop is distributed such that the “hot” terminal of the socket will be at 109.1 V with respect to “ground” (the meaning of which we will discuss later), and the other (“neutral”) terminal will be 0.9 V above ground. This “IR drop” phenomenon is the cause of most of our woes with “ground loop” problems, as we shall eventually see.
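The table-lamp arithmetic can be checked in a couple of lines (a sketch; it treats the load current as fixed at 100 W / 110 V, ignoring the tiny effect the drop itself has on the current):

```python
def ir_drop(volts, watts, wire_resistance_ohms):
    """Voltage actually reaching a load fed through a 2-wire cord,
    each wire having the given resistance (Ohm's law: E = I * R)."""
    current = watts / volts                       # approx. load current, amps
    drop_per_wire = current * wire_resistance_ohms
    return current, drop_per_wire, volts - 2 * drop_per_wire

current, drop, at_socket = ir_drop(110, 100, 1.0)
print(f"{current:.1f} A, {drop:.1f} V drop per wire, {at_socket:.1f} V at socket")
# -> 0.9 A, 0.9 V drop per wire, 108.2 V at socket
```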

4. Whenever two wires run in proximity, energy can transfer between them by two mechanisms. This is true whether they are internal circuitry in equipment or inner conductors in a cable. Shielding may reduce or increase the effect, and the shield can even serve as a conductor itself. Current flowing through a wire produces a magnetic field around it, and if the flow varies, the varying magnetic field can inductively couple to the other wire(s) and induce a current flow. In addition, if a wire has a voltage on it, even if there is no current flow, a voltage can be capacitively coupled to the other wire(s). Sometimes the energy source is the pair of wires of a circuit, with the signal current flowing up one wire and back down the other, or a static condition with the voltage on one wire positive and the other negative. In these cases, the corresponding magnetic or electric fields around each wire are opposite polarity and cancel out, theoretically. In reality, the physical arrangement of the two wires is never exactly identical so the cancellation is never complete. There will still be some residual field left to interact with the remaining conductor(s).

5. Cables have a characteristic impedance, also measured in ohms but using the symbol “Z” to represent impedance, instead of “R” (used for resistance). Impedance is a more complex form of resistance, having capacitive and inductive components in addition to resistive. The main thing you need to know is that for analog audio cable, its impedance is relatively unimportant. For digital audio and timecode, it may be necessary to consider impedance, especially for long cable runs (see baluns later in the article). For video and RF antenna cables, impedance definitely needs to be taken into account.

6. Input and output circuits also have a characteristic impedance, and some of the same considerations mentioned above apply. Most professional dynamic mikes are Lo-Z, about 150 Ω, but some ribbon mikes are 50 Ω. Semi-Pro Hi-Z mikes are about 1,000 Ω (1 KΩ). Professional line-level circuits are 600 Ω. High-impedance circuits are many thousands of ohms (47 KΩ is common). In general, you can connect the output of a low-impedance device to a high-impedance input without distorting the signal, though the Hi-Z input circuit may not have enough gain. (And certain types of 600 Ω line-level outputs may not deliver the full level of low frequencies to a Hi-Z load. You can usually fix this by connecting a 600 Ω resistor across the input terminals.)
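The level change from loading an output with a given input impedance can be estimated with a simple voltage-divider calculation (a resistive approximation that ignores the reactive effects mentioned above):

```python
import math

def load_loss_db(source_z, load_z):
    """Level lost (in dB) when a source with output impedance source_z
    drives an input of impedance load_z: a resistive voltage divider."""
    ratio = load_z / (source_z + load_z)
    return 20 * math.log10(ratio)

print(f"600 ohm out into 600 ohm in:   {load_loss_db(600, 600):.1f} dB")
print(f"600 ohm out into 5 Kohm bridge: {load_loss_db(600, 5000):.1f} dB")
# -> -6.0 dB matched, about -1.0 dB bridging
```

This is why bridging (Hi-Z) inputs barely load the source at all, while a matched 600 Ω termination costs a full 6 dB of level.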

7. To summarize the above, you can connect a 50-Ω ribbon mike with a 110-Ω mike cable to the 5 KΩ “bridging” input of an audio amplifier with no problems. If you are sending AES/EBU digital audio to a device with a 75-Ω BNC input, but use a 110-Ω mike cable and a simple mechanical XLR to BNC adapter at the end of the run, you may or may not have a “jitter” problem, depending on various things including the length of the cable. But if you use a 50-foot length of 75-Ω video coaxial cable to connect your 50-Ω wireless mike receiver to a 50-Ω sharkfin antenna, you will definitely notice a loss of range compared to the proper 50-Ω coax.
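The severity of an impedance mismatch on an RF cable can be quantified with the reflection coefficient and VSWR (a simplified resistive calculation; it ignores cable attenuation, which also contributes to the lost range):

```python
def mismatch(z_load, z_cable):
    """Reflection coefficient and VSWR for a resistive load on a cable
    of the given characteristic impedance."""
    gamma = abs(z_load - z_cable) / (z_load + z_cable)
    vswr = (1 + gamma) / (1 - gamma)
    return gamma, vswr

# A 50-ohm antenna fed with 75-ohm video coax, as described above:
gamma, vswr = mismatch(50, 75)
print(f"reflection coefficient {gamma:.2f}, VSWR {vswr:.1f}:1")
# -> reflection coefficient 0.20, VSWR 1.5:1
```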

8. “Crosstalk” is the transfer of a desired signal to another circuit where it is not wanted. Factors that increase crosstalk are: higher voltage, higher frequency, closer proximity, less effective shielding, and ground loops. Note that digital signals (audio and timecode) are a type of “square wave” that have high-frequency components (over 20 KHz) and can more easily crosstalk into other circuits compared to analog audio signals. Crosstalk can occur between external cables, between components of multi-cable snakes, or between the wiring inside equipment. (e.g., if you’re recording TC on one audio channel of a video camera, attenuate the TC signal to at least 30 dB below full scale with an external in-line pad to prevent TC crosstalk inside the camera to the audio channel.) Even though some of these signals are above the audible range, they can still cause audible problems, as discussed in the next section for RFI (Radio Frequency Interference, but used for any type of noise signals in the MHz range.)
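For reference, a dB attenuation figure converts to a voltage ratio as 10^(dB/20), which is handy when sizing an in-line pad like the 30 dB one suggested above (just arithmetic, shown here in Python):

```python
def db_to_voltage_ratio(db):
    """Voltage ratio corresponding to a given attenuation in dB."""
    return 10 ** (db / 20)

# A 30 dB pad divides the signal voltage by about 31.6:
print(f"{db_to_voltage_ratio(30):.1f}")
# -> 31.6
```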

9. “Noise” refers to any unwanted addition to a signal (whether it is immediately audible or not). Besides audio and TC signal crosstalk, interference from AC power is another major offender.

AC power noise consists of hum, a low-frequency (usually the AC power frequency) tone, composed of a single, pure sine wave signal, or buzz, which adds harmonics to the basic hum. Because hum is a pure tone (60 Hz, or 50 Hz in some other countries), it can be more or less easily filtered out in post. A buzz, with its harmonics, can sometimes still be removed with a sequence of filters at 60, 120, 180, 240, 300 Hz… (or 50, 100, 150, 200 Hz…), but if there are nonlinear elements in the source, there will be non-harmonic components that cannot be readily eliminated. WARNING: Many military vehicles and installations use 400 Hz AC power or even higher frequencies. Working in this environment is extremely challenging because any AC pickup is almost impossible to filter out.
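The comb-of-notches idea can be sketched numerically (a toy illustration in Python, not a post-production tool): build a “buzz” from 60 Hz plus its first two harmonics, then cascade one narrow notch filter per harmonic and watch the residual collapse. The notch coefficients below follow the widely used RBJ audio-EQ cookbook form.

```python
import math

def notch_coeffs(f0, fs, q=30.0):
    """Biquad notch coefficients (RBJ cookbook), normalized so a[0] == 1."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    cos_w0 = math.cos(w0)
    a0 = 1 + alpha
    b = [1 / a0, -2 * cos_w0 / a0, 1 / a0]
    a = [1.0, -2 * cos_w0 / a0, (1 - alpha) / a0]
    return b, a

def biquad(samples, b, a):
    """Direct-form I filtering."""
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out

fs = 8000
# One second of "buzz": 60 Hz hum plus its first two harmonics.
buzz = [sum(math.sin(2 * math.pi * f * n / fs) for f in (60, 120, 180))
        for n in range(fs)]

filtered = buzz
for f in (60, 120, 180):                  # one notch per harmonic
    b, a = notch_coeffs(f, fs)
    filtered = biquad(filtered, b, a)

def rms(x):
    tail = x[3 * len(x) // 4:]            # measure after the filters settle
    return math.sqrt(sum(v * v for v in tail) / len(tail))

print(f"buzz RMS {rms(buzz):.2f} -> filtered RMS {rms(filtered):.3f}")
```

A non-harmonic component, as the warning above notes, would sail right between the notches untouched.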

When AC power circuits have a bad connection point (loose or corroded; whether visibly/audibly arcing or not), it can create “static,” which is heard as a sputtering or ripping sound.

Static (actually a form of radio signal) is created whenever electrical current flows through a mechanically imperfect (e.g., a rubbing or lightly touching) joint, as compared to a solidly clamped or soldered one. Note that static can also be created if one of your own cables has an intermittent connection at a plug, or a broken conductor or shield wire(s).

Finally, “inaudible” RF audio and video signals can produce audible noises, especially from high-powered commercial radio and television stations, even at a considerable distance. (Or a nearby video assist or remote control transmitter.) While their frequency is far above the audio range (MHz vs. KHz), if they infiltrate audio equipment, nonlinear components in the audio circuits can “detect” these signals and produce audible interference based on their AM modulation. RF signals can also interact with wireless mikes by heterodyning, creating sum and difference frequencies that fall in the audible range. And if their level is great enough, RF signals can overload lower frequency circuits, causing distortion or complete muting.

A Balancing Act—It’s Easier to Balance on Two Wires Than One

Audio circuits can be either balanced or unbalanced. Balanced circuits have their signal carried by two conductors, neither of which is connected to “ground” (to be defined later, but we’re getting closer). The two wires can be surrounded by a metallic shield, or not. Unbalanced circuits have only one conductor, surrounded by a metallic shield that is used to complete the circuit as well as to keep out interference. Almost always, balanced circuits are less susceptible to noise than unbalanced.

There are two forms of noise: common-mode (C-M) and transverse-mode (T-M). (Transverse-mode is sometimes called differential-mode, or normal-mode, from the geometric term “normal,” which means “perpendicular to,” but I will use “transverse” in this tutorial as it is less confusing.) Transverse-mode noise involves the interfering signal appearing between the single conductor of an unbalanced circuit and ground (or some other point), or between the two conductors of a balanced circuit. Common-mode noise involves the interfering signal having identical voltages (referenced to ground or some other point) on both the single conductor and the shield of an unbalanced circuit, or on both conductors of a balanced circuit. Like phantom mike powering voltage, C-M noise will not affect a balanced signal, but unless blocked, it can travel along with the signal until it reaches a susceptible circuit component and causes trouble there.
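The common-mode cancellation described above can be demonstrated numerically (a toy model with perfectly matched conductors; real-world rejection is limited by how well the pair is matched):

```python
import math

# A balanced pair carries the signal antiphase on the two conductors,
# while interference couples (nearly) equally onto both of them.
n = 480
signal = [0.01 * math.sin(2 * math.pi * 1000 * t / 48000) for t in range(n)]
hum    = [0.5  * math.sin(2 * math.pi * 60   * t / 48000) for t in range(n)]

wire_hot  = [ s / 2 + h for s, h in zip(signal, hum)]   # signal + hum
wire_cold = [-s / 2 + h for s, h in zip(signal, hum)]   # -signal + same hum

# A differential input takes the difference between the two conductors:
received = [hot - cold for hot, cold in zip(wire_hot, wire_cold)]

# The hum (common-mode) cancels; only the wanted signal remains.
assert max(abs(r - s) for r, s in zip(received, signal)) < 1e-9
```

The hum here is fifty times larger than the signal, yet the differential input recovers the signal cleanly, which is exactly why balanced lines are the professional standard.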

Noise can get into audio circuits by various methods: directly, by means of an electrical connection (or leakage through insufficient insulation); or indirectly, by means of an electric field (capacitive coupling) or a magnetic field (inductive coupling), or both. Any metallic substance can shield against electric fields, but only certain magnetically-conductive materials can block magnetic fields. A radio wave consists of crossed electric and magnetic fields, and it is sufficient to block just the electric field to shield against it. (Or just the magnetic field, but that’s much harder to do.) An isolated electric field is most often encountered as “static electricity,” such as when you pull off a sweater on a cold dry day and your hair stands on end. Magnetic interference was a common problem in the days of tape recording, when the recorder’s heads would pick up hums from nearby AC motors or transformers. Today, the problem usually occurs when dynamic microphones, or certain condenser microphones that use audio transformers, get too close to an overhead fluorescent light fixture with a magnetic ballast.

On the Ground at Last

The term “ground” has many meanings (the U.K. uses “earth” for some of them):

1. The physical substance on the surface of our planet. Most AC electrical power systems have their neutral conductor “grounded” at the service entrance, usually by bonding to the underground metal water-main piping and/or an eight-foot metal stake driven into the earth next to the structure.

2. The metal case and/or chassis of a piece of equipment. Often called chassis ground.

3. The zero-potential circuitry of a piece of equipment. Often called circuit ground. This may or may not be connected to the unit’s case/chassis.

4. The metallic shielding of cables or other components.

5. A separate wire included in a cable or conduit for grounding purposes. This is done in some AC power cables as well as some audio cables. In the United States, power ground wires are colored green; other countries may use other colors. The ground wire may also be bare (un-insulated) or replaced by the metal conduit through which the wires are run.

6. A specific terminal on a device to which “ground wires” are connected. In theory, a ground has no voltage on it, and can accept unlimited amounts of unwanted signals and dispose of them completely. Unfortunately, things in the real world are not so easy. In AC power outlets, there is a third opening (in the United States, it is round to distinguish it from the power slots) for the safety ground circuit. Any leakage current from connected devices flows through it to the bonding point at the service entrance, where it returns to the neutral wire. But this current flow creates an IR drop of hundreds of millivolts or higher, and this voltage will be different at each outlet. Where the electrical conduit serves this purpose, it often has high resistances at the mechanical joints, which may increase further with age, so the IR drop can be several volts. In addition, normal AC current flowing through the power and neutral wires can inductively couple to the ground wire and raise its voltage even more.

Ground Loops (Not the kind an airplane does when it crashes on takeoff)

Since any current flowing through a conductor (whether AC power, a desired audio signal or unwanted noise) will experience an IR voltage drop, if the audio and noise share a common circuit path at any point, their voltage drops may be added together, with unpleasant results for the audio. Let’s take a simple example. A small nightclub has a singer performing in front of a stand mike. The Hi-Z mike is connected to the house sound system with an unbalanced mike cable. The PA amplifier is connected to an AC outlet with a three-wire power cord, which grounds the amp’s chassis to the electrical service entrance ground. The shield of the mike cable is also connected to the amp’s chassis. The singer sings and all is well. Then she grabs the mike with one hand to remove it from the plastic clip, while she steadies the metal mike stand with her other hand. A loud hum blasts out into the audience. What happened?

WARNING: Using 3-to-2 pin adapters without connecting their ground lug/wire to “lift the ground” of all the 3-wire cord units may sometimes actually stop the hum/buzz, or reduce it to usable levels. DO NOT DO THIS! NO SHOW IS WORTH YOUR LIFE OR THAT OF A FELLOW CREWPERSON! Floating safety grounds are a disaster waiting to happen—even if only a “mild” shock results, it can cause someone to fall or jerk back into a serious or fatal accident.

The mike was in a plastic clip that had insulated it from the metal stand. When the singer touched both it and the stand, her body now provided an electrical connection from the mike to the stand, which was resting on the concrete floor (concrete is not a very good insulator). While it is true that the AC-powered amplifier’s case and chassis were grounded, the ground connection was implemented through a long length of wire running down conduits in the building, and the AC leakage current flowing through it produced an IR drop of almost one volt, raising the amp’s chassis above ground by that much. This “hot chassis” voltage in turn caused current to flow through the alternate path to ground created by the mike cable shield, the metal mike housing, the singer’s body, the mike stand, the concrete floor, and the damp soil beneath it. Remember that the mike’s unbalanced cable has a center conductor surrounded by a metallic shield. The few millivolts of audio from the mike travel down both of them to the amp. But now a far larger voltage is driving 10 or 100 times more 60 Hz current down the cable shield, and its IR drop adds to the mike’s audio signal—indeed it completely overpowers it.

In fact, the singer was lucky that a loud noise was all that resulted from her actions. If the amplifier wasn’t properly grounded (e.g., the plug’s third prong broken off, a mis-wired outlet, or a 3-to-2 pin adapter used without attaching the grounding wire), she could have received a severe electrical shock instead of just a nervous one from the sudden loud noise.

If you plug several pieces of your equipment (with 3-wire power cords) into different AC outlets, the outlets’ “grounds” will be at different voltages, and therefore, so will the equipment chassis. Now, when you interconnect the gear with audio cables, AC current will flow through the cables’ shields to equalize the difference. 100 mA (milliamp) flowing through a 1 Ω shield will give an IR drop of 0.1 V. This is about a third of consumer or “semi-pro” line level, and only about 25 dB below professional. If it gets into the audio circuits… Using equipment with 2-wire cords won’t necessarily prevent problems because they still have electrical leakage, and it will flow down the audio cables to any 3-wire units. Using all 2-wire gear can still give trouble because their leakage voltages will be different and will produce equalizing currents. Furthermore, 2-wire equipment is required to have its leakage current limited by higher impedance, to protect you from electrocution hazards. (Unless there’s a design defect that got by the UL inspectors. You do trust them absolutely, don’t you?) But because of the higher impedances involved, touching a particular unit can increase or decrease the hum or buzz, which terribly complicates troubleshooting.

IMPORTANT: If you crimp a ring or hook lug to your wires, be sure that the crimper is properly sized for the gauge of wire, AND the particular type of lug. Lugs for the same gauge wire can have barrels of different wall thickness, and using a thin-wall lug in a crimper designed for thick-wall lugs will result in too little crimping pressure and an unsatisfactory joint. Make a test crimp and try to pull the wire out of the lug. The wire should break instead of pulling out. As an added protection, some mixers will solder the lug after crimping the wire in it.

Ground loops can also occur with the low-voltage DC current that powers your equipment, or even audio signals themselves. If you have an audio signal from one device passing through an unbalanced cable to another device powered from the same DC source, and the power cables to these pieces of equipment do not have exactly the same current times resistance, the IR voltage drop in the power cables will be different and the ground voltage at the equipment end will be different. If the equipment’s cases are connected to one side of the power (as most are, usually the negative), they will be at different potentials, and DC current will flow through the shields of the audio cables to equalize them. Unbalanced audio circuits can be severely disturbed by this situation, and even balanced audio circuits may be affected.

This DC current flow can upset circuits regardless of whether they have active or transformer inputs. A direct-coupled active input can have a DC offset introduced that is sufficient to swamp it, or at least, add to the audio to the point of overload.

Capacitor-coupled inputs can have the same problems if there is sufficient leakage current through the capacitor, or transient charge/discharge currents if the DC voltage level changes. DC current flowing through a transformer input can fully or partially saturate the magnetic core, producing much the same kind of distortion. Another problem occurs when HF noise such as from a switching-type (inverter) power supply travels out of the device over the DC power wiring, and then into another device over its power cable because both units are connected to the same battery. Using a power distribution panel that has filters at each output socket will help to eliminate this noise source. To be most effective, the filters should incorporate a parallel capacitor (to short out most of the higher frequencies before they can pass through to the next stage), followed by a series inductor (to block the remainder of the higher frequencies from passing through). The lower the high frequencies you want to block, the larger (physically and electrically) these two elements must be. You may have seen ferrite traps, split ferrite cores in a plastic housing designed to snap around the outside of a cable, but they are suitable for stopping only radio frequency interference.

While not a ground loop problem per se, if one piece of equipment has inadequate power supply filter capacitors, it may have its current drain modulated by the audio (or some other function) and cause the output voltage of the common battery to fluctuate. This variation may in turn affect other devices connected to the same battery (or power supply if you’re running on AC). You can solve this problem with “decoupling caps,” large electrolytic capacitors (1,000-5,000 MFD @ 20 VDC) installed across the power circuit, as close to the offending device as possible. WARNING: These capacitors are polarized and must be connected correctly. They may explode, or at least vent hot gases, if hooked up backward.
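As a back-of-the-envelope check on why the caps need to be that large (values invented for illustration; the real sag is smaller because the battery keeps supplying current alongside the capacitor):

```python
# First-order capacitor sag: dV = I * dt / C. This is a worst-case
# estimate that pretends the capacitor alone supplies the transient.

def cap_sag_v(current_a, duration_s, capacitance_f):
    """Voltage droop if the capacitor alone supplied the transient."""
    return current_a * duration_s / capacitance_f

# A 1 A, 10 ms transient drawn from a 4,700 MFD (microfarad) cap:
print(round(cap_sag_v(1.0, 0.010, 4700e-6), 3))   # 2.128 V
```

That is why a few hundred microfarads won’t do: holding the droop to tens of millivolts against amp-scale transients takes thousands of microfarads.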

Be sure to use adequately large wires to carry DC power to the various gear on your cart. A 1-volt drop is not significant in a 110-volt circuit, but definitely is in a 12-volt one. Also, stranded wire can have several strands broken in the process of stripping off the insulation, further increasing the voltage drop. Be careful to avoid this when making connections. As an example, 10-gauge wire is rated at 30 A for a 100-foot run of household electrical wiring, but for 12 VDC it should be limited to no more than 10 A for runs of 10 feet. (Easy to remember: 10-10-10.) 10-gauge copper wire has a resistance of about 1 milliohm (mΩ) per foot, so 10 A and 10 feet gives a voltage drop of one tenth of a volt (0.001 x 10 x 10 = 0.1), or slightly less than 1% of 12 V. But remember that there will be this drop in both power wires, for a total loss of 0.2 V. (Another example: a smaller 20-ga wire is about 10 mΩ/ft, so it is good only for 1 A @ 10 ft, or 2 A @ 5 ft, or 4 A @ 2.5 ft, etc.)
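The 10-10-10 rule and the worked examples can be checked with a few lines. This is just a sketch using the approximate per-foot resistances quoted above:

```python
# Round-trip (both conductors) DC voltage drop for a cable run.
# Per-foot resistances are the approximate figures quoted in the text.

RES_PER_FOOT = {10: 0.001, 20: 0.010}   # ohms per foot, by wire gauge

def round_trip_drop_v(gauge, current_a, run_ft):
    """Total drop across the positive and negative conductors."""
    return 2 * RES_PER_FOOT[gauge] * current_a * run_ft

# 10-10-10: 10 ga at 10 A over 10 ft loses 0.1 V per conductor, 0.2 V total.
print(round(round_trip_drop_v(10, 10, 10), 3))   # 0.2
# 20 ga is good for the same total drop only at 2 A over 5 ft:
print(round(round_trip_drop_v(20, 2, 5), 3))     # 0.2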

The most common connector used for low-voltage power distribution is the four-pin XLR, with Pin 4 positive and Pin 1 negative. Each pin is rated for 5 amps. This is sufficient to operate most devices, but may not be for recharging larger 12-V batteries. On my battery charger cable and the mating receptacle on the battery pack, I jumper Pin 1 to Pin 2, and Pin 3 to Pin 4, thus doubling the current-carrying capacity. WARNING: Some power systems use Pin 2 and/or Pin 3 for other voltages or charger inputs, so be careful not to use this high-current jumper format with them.

Editor’s note: Subsequent installments will deal with transformer balancing, optimal cable wiring practices, sound cart construction to minimize grounding problems and other practices to assure safe and noise-free operation.

Text and pictures ©2012 by James Tanenbaum. All Rights Reserved.

Up “The River”: A Puerto Rican adventure

by Steve Nelson, CAS

I had some idea what to expect when I heard that the pilot we shot last spring for DreamWorks Television had been picked up by ABC. The one-hour pilot episode of The River was shot in Puerto Rico (PR) and, while it was no surprise that the new episodes would shoot in Hawaii, it was a little disturbing that we were nowhere to be found in the fall schedule. Eventually, the other shoe dropped and it was revealed that we would be a midseason replacement with a seven-episode order.

THE STORY

Dr. Emmet Cole (Bruce Greenwood), famous television explorer and family man watched and loved by millions—think Steve Irwin crossed with Jacques Cousteau, dosed on acid, “There’s magic out there!” his tag line—has gone missing along with his crew, presumed dead somewhere up a mysterious and increasingly bizarre Amazon tributary. When his locator beacon is suddenly detected, a search is mounted, spearheaded by his wife and son (Leslie Hope and Joe Anderson) and their producer (Paul Blackthorne) with the twist that it is to be filmed for broadcast. The show was co-created by Oren Peli of Paranormal Activity fame and master of the “found footage” concept. His concept was to make a show that, while strongly written and acted, would have the look and feel of a documentary, but in the supernatural thriller genre. Scary stories of ghosts and magic shot like unscripted reality TV. On a river, in a boat, in the jungle.

THE CHALLENGE

What does this mean for the Sound Department? The first thing we learned before we even left for Puerto Rico was that it meant shooting with many cameras. Up to 14, in fact, and almost every type of HD camera: a couple of Alexas, the hero cameras, Sony EX-3s for the on-camera/actor cameramen, a consumer handicam and Canon 5D and 7D for mounting and random hand-held, and GoPros anywhere you could hide them. (That’s right: We’re shooting broadcast-quality video for a prime-time network show with the same $200 camera you buy to strap onto your board, your bike helmet, your skateboard, your dog. Welcome to our brave new world!) Not to mention the camera mounted on the miniature radio-controlled helicopter, the so-called Laser Beak. Besides the actual camera operators, whether cast and/or crew, Dr. Cole’s boat, the Magus, is wired for video, with a switching/edit room amidships. There are cameras and apparently microphones everywhere imaginable, including a few places not so imaginable. With so many cameras running, we captured so much, let’s call it visual data, that the problems I had sneaking in a plant mike surely paled in comparison with the work the editors had to do just to get through dailies. The visual equivalent of recording all eight tracks all the time. Oh wait; I did that!

With shooting underway in San Juan, I learned from Oren Peli, our co-creator and executive producer, that, in his world of “found footage” movies (The Blair Witch Project, Cloverfield, Paranormal Activity), the only microphone needed was right there on the camera, so why was I even there? We had several half-joking conversations about this, and I think I was able to persuade him that despite our flagrant deviation from his strict philosophy and aesthetic of this relatively new genre, I was bringing something better than that camera mike and that the studio and the network would be much happier this way. The company had already spent quite a lot getting both me and Knox White, my excellent and endlessly entertaining boom operator, plus several pallets of sound equipment, to this island location. I reckoned we weren’t in imminent danger of replacement by that camera mike.

SO MANY CAMERAS
SO LITTLE TIME

Next we learned that with up to 14 cameras working, there really isn’t any place for a boom operator, much less an actual overhead microphone. Most often all we could get with the boom were the slates. Fourteen cameras means 14 IDs and markers. (Including the GoPros, which, at 25 fps only, were not technically sync cameras since everything else was at 23.976 fps.) Sometimes the best entertainment was watching the slating unfold. When I told the camera assistant that he had to voice ID each camera, his job got a lot harder, remembering each camera in its sometimes obscure location and rattling off each ID in a charming Puerto Rican accent. By the end of the season in Hawaii, the guys were so fast that I even had to slow them down a bit so the poor dailies folks could sort it out. If the cameras were too spread out over the boat for Knox, I might have to sacrifice a radio channel or two just for slates.

In this new hybrid world of fake-umentary (scripted posing as reality, the better to scare you), we would have actors who would play the camera operators and be prominently featured. In subsequent passes, we would float in one actual camera operator and then the other to get the shots we actually wanted. If one of the actual operators got into another’s shot, it was accepted that he would be cut out. As it was acceptable to reveal the cinematic process by showing the cameras, the question of sound naturally arose. There was no sound person in the script I read, but for about a minute we entertained the idea that it would be OK to see the lavalier mikes as part of the documentary process. This concept didn’t make it to Puerto Rico before the myth of the on-camera microphone prevailed. Business as usual: hide the lavs, forget about how quality sound is actually recorded, even in our make-believe world. Cameras and cameramen are apparently sexy enough for network television, not so much sound guys and their gear. (Personally, I think it all changed when we put away our quarter-inch machines; Nagras were hot! John Travolta didn’t have a digital recorder in Blow Out; there’s just something about those spinning reels.)

Our fate was becoming clear: The River would be a wireless show pretty much all the way. Wire everybody always unless they were going in the water. Forget about sneaking in the boom for the close-ups or even hiding a mike; it just didn’t work like that. This wasn’t “tight and wide;” this was tight and tight and wide and wider and tight and one up on the rigging and so on until we ran out of cameras. At this point, all I could do was embrace it. Forget about mixing for perspective; that is so last century! If their lips are moving, put them on an iso track, make sure the fader is open, record the words on the pages and be ready for more. Even if you could see all the monitors, who has time to watch? Just mix, baby!

ONE IF BY LAND, TWO IF BY SEA

My work would be divided into two categories: land-based, which meant I could work from the comfort of my cart with all 14 channels of wireless and the big Yamaha board, and water-based, which called for a much more portable setup. Land-based mode would work if we were on the boats, dockside. Our first two days of shooting were interiors on docked tugboats, which, being built entirely of steel, require special consideration for radio work. We also built sets for some of the boat interiors. The scenes in Zodiac boats zipping through the mangroves, on our practical boats underway, or traipsing through some inaccessible jungle obviously called for a portable, studio-in-a-bag modality.

THERE WILL BE TRAVEL

The story unfolds in two parts: the pilot in Puerto Rico and the seven episodes shot on Oahu. Pilots are always a bit special and often quite memorable. This one was. Although PR has been a location for many movies and television shows, this was the first time for many of us. We were gifted with the participation of Luis “Peco” Landrau, our utility person. Peco (“Freckles,” he’s a rare Puerto Rican redhead) is an experienced mixer and boom operator who mostly does utility and second units for visiting productions. His uncle is a highly respected mixer there and taught him well; he is extremely professional, prepared for anything, speaks English and gets along with everybody. We were fortunate to be working with DP John Leonetti, whom I’ve known for many years; he is very talented and a real gentleman. John brought enough of his people to feel comfortable, and the locals who rounded out our company were very experienced and competent. Of course, when it’s not too busy in a place like this, the top crew is available and we had them. Our production staff was strong, led by line producer Bob Simon, local unit manager Ellen Gordon, and first AD Dan Shaw.

Our director, Jaume Collet-Serra, a Catalan of no fixed address, known for films like Orphan and Unknown, was embarking on his first adventure in directing for television. The pilot sets the look and the tone of the series and Jaume really went for it, creating a multi-faceted, jittery, sweeping pace that never let up in its intensity. He loves to grab a camera and operate and was fearless, while still showing concern for the quality of our sound and the intelligibility of the dialog. The profusion of accents among our cast was of some concern. We had U.S., Canadian, three different flavors of British (including one doing American), German, Mexican (her character doesn’t actually speak English), and an American doing an unspecified South American accent. We discovered in post that there is no sound problem that can’t be fixed with subtitles. With The River’s “documentary” feel and its use of chyrons for dates and locations and characters, this technique worked very well both for translation and getting us through a few rough spots. I want subtitles on all my shows!

A word here about our lovely cast. This was a truly international ensemble and a more generous, spirited, cooperative, talented, friendly and respectful group could not be found. They all seemed to understand that if this endeavor were to be successful, all the parts would have to work, including sound. Under the circumstances, with so much wiring going on and so much action, this was key. No complaints, endless patience and a costume designer who really got it; the wardrobe was very forgiving and easy to work with.

TOOLS OF THE TRADE

It had been a long time since I’d had to do extensive over-the-shoulder work. Back in the (very) old days at Entertainment Tonight with a BVU 110 over the shoulder, a couple of lavs and a short stick, tethered to a running cameraman chasing stars down the red carpet. Or doing docs with a Nagra, hoping that the take-up reel hadn’t jammed leaving me with a massive pile of spaghetti under the lid. Since the breakout of reality TV and the digitization and further miniaturization of our gear, bag work has made major strides.

How then to gear up for maximum flexibility, ease of use under challenging conditions, best use of equipment I already own, and cost efficiency? The goal was to have two separate recording packages so that I could quickly transition from one to the other. Since I have 14 channels of Lectrosonics UHF wireless spread across four frequency blocks, it was an easy choice to stay with Lectro, enabling me to use a subset of my transmitters in all situations. But what about receivers? The Octo-Pak is very desirable but perhaps a bit over budget for a pilot; for this gig it would be the Venue Field, giving me six tunable receivers in a single box. Of course, if there were more than six speaking parts in a scene, choices would be made; no matter how many wireless you have, sooner or later that will happen. A little research told me which blocks would give us the least interference (almost anything is good where we were going). I would install the appropriate modules and be ready to make a seamless transition from cart to bag mode. (One note about the Venue Field: about the only thing that can go wrong with this all-but-bulletproof device is that there is a chance of damaging the battery contact, a flimsy metal tab. Not a big deal, but it is a good idea to have a spare in the kit.) An eight-track, fully featured recorder/mixer would be required; for the pilot I tried a Sound Devices 788T with a CL-8 controller. As a longtime Deva user, it took a little getting used to the different interface. It is a brilliant unit and, once I gained confidence that I could work it in a pressure situation, the Deva 5.8 stayed on the cart. Since I was using the Venue, I didn’t have the forest of antennas that results from a bag full of receivers, but antennas are still required. On the pilot I used two log periodic sharkfins, mounted on plastic rods, which got me good range.
I also picked up a Comtek M-216 Option 7 transmitter so that all Comteks would work from either setup. Shove it all into a Petrol bag and it is very easy to pick it up and go, potentially without changing actors’ wires or Comteks.

RUM DIARIES

Yet another camera operator, Laser Beak. They could really thread a needle with this. (Photo caption)

Puerto Rico is far. From the West Coast anyway. First you fly somewhere far, say, D.C., then you fly quite a ways further. We were put up in the capital, San Juan, which gave us good access to our dry-land locations and wasn’t too far a drive to Rio Grande, near El Yunque rain forest, and our little river. A U.S. Territory, Puerto Rico is a spicy mix of Spanish, Caribbean, and U.S. American. Spanish is prevalent, of course, but English is also spoken and taught. The currency is familiar but distances are measured in kilometers. With its Spanish colonial heritage and architecture and tropical setting, you feel like you’re in another country, but there are four Costcos on the island and many strip malls with all the familiar chains. It should be noted that the Costcos here carry an excellent selection of local rums, quite reasonably priced. As the city grew, parts of the infrastructure did not keep up; there is little in the way of reliable mass transport, everyone has a car, so traffic can be bad. It is painful to be stuck in the morning rush hour on your way home after working all night, but that is really not so different from anywhere, and at least we were in a chauffeured van. There are many great restaurants in San Juan, just none close to our hotel. We were at the Hotel Caribe, one of the original tourist destinations from the swingin’ Rat Pack ’50s, refurbished nicely, located a long walk from Old San Juan and a shorter walk from the newer strip where most of the hotels, casinos, and restaurants are and where film crews usually stay.

The small slice of Puerto Rico that we experienced on our days off during our three weeks there was lovely and entertaining. Tropical climate, nice beaches, diving, snorkeling, sightseeing, eating & drinking, music, clubs. Knox and I even managed to work in an excursion to the beautiful island of Vieques to see the amazing bioluminescent bay. (That only involved a drive to the marina on the east end, a boat to the island, a van ride to the nature center and dinner, a bus to the bay, an electric boat on the bay, then finally, a dip in the psychedelic water, and then the whole thing in reverse. But so worth it!)

Working there was great. There is decent infrastructure and good crew used to working with Hollywood productions. We had a cozy but well-outfitted truck to share with video assist and a very attentive owner/driver. We were lucky to have Peco, who had been working with all these people forever; it felt very much like family. Working in a tropical place, it is not a matter of if it’s going to rain, but when, and the locals certainly get that. The first thing off the truck is the easy-up, which is automatically sandbagged: rain protection always. If it’s not raining, the sun will scorch you, so you’re covered both ways. The question is why does it always seem to rain right at wrap? Since we were working near or on the water in a pretty wet area, mosquitoes were plentiful, but nothing that a healthy dose of DEET wouldn’t solve. (Sorry, but I’ve learned that none of those alternative repellants, from Avon Skin So Soft to whatever natural product, are at all effective.) However, even after slathering on the DEET for our frequent night work—scary things do happen at night—I found myself terribly bitten. I thought I had bed bugs, which was ruled out after the hotel tore apart my room in search of them. It turns out they have some particularly nasty no-see-ums, biting midges. The best solution I found was long pants and long sleeves at night.

There is no soundstage in Puerto Rico outside of a television station. Instead we built sets, boat interiors mostly, in a dank and airless warehouse. The best that can be said about this place is that it was right next to one of the aforementioned Costcos and wasn’t too far from the hotel.

Our river location was on the east side of the island, below El Yunque rainforest, near the town of Río Grande, on the Río Espíritu Santo, about 45 minutes to an hour from our hotel. It’s not much of a river; at points you could throw a quarter across it, but it is quite navigable and gave us long runs in either direction with little spurs and mangrove-choked banks. It feels isolated even though it is close to the road, a couple of villages, and apparently an airfield. It is lush and green without the constant rain and restrictions of shooting in the actual jungle, but the wide shots would require some digital set extension and augmentation to really sell the Amazon. The boat playing the Magus was a full-sized, 60-foot working vessel not designed for this kind of environment. The art department did a great job of dressing it way down to give it a profoundly abandoned look, but it was very challenging to get it over the sandbar at the river’s mouth. We were lucky to be there during a full moon that was actually closer to the Earth than usual, which created a higher than normal high tide and allowed us to make it upriver.

THINGS THAT GO BUMP IN THE NIGHT

Once we got underway, and once we got our heads around the concept of such a multiplicity of image capturing devices and their effect on our work, it was business as usual—almost. All departments share many common problems working on boats: cramped space, not really production-friendly. Kind of like a big insert car on water, it is hard to make changes once you’re moving, much less stop or return for something you need. One issue that particularly concerns the sound department: what motivates the boat? Since we were using practical boats, there was the self-powered option. However, the few times that we fired up the twin diesels on the Magus, they were so loud that it was difficult to think, much less record dialog. Fortunately, we had Dan Malone heading our marine department. Dan is a guy who makes you feel safe in all situations and confident that he understands your needs. He provided a large skiff which was used as a working platform and to push or pull the Magus and the smaller S.S. Hopewell—the boat that takes our intrepid travelers to find the abandoned Magus. Its engines were powerful enough to do the job, yet quiet enough, and the skiff long enough, to keep them at a respectful distance. The conceit was that the relatively quiet motor noise could be justified and blended with actual motor noise in the final mix. As these were working boats, we could have a pilot in the wheelhouse actually steering; this was not the case in Hawaii, as you will see.

If the boats were moving or parked offshore or if we had a rugged and distant jungle location, I needed to grab my studio in a bag, the appropriate transmitters and mikes and go mobile. Knox would join me to manage the wiring and get the slates and maybe even boom a shot. Onboard, it could be challenging just to stay out of shot—did I mention that there were lots of cameras and never a tripod? I could often find a safe perch on the upper deck, which gave me a nice view of the surroundings, if not the action, and lots of fresh air. Sometimes the ferrous construction would limit my radio range; we might remote the antennas or I might have to scamper down from my crow’s nest to get closer. We did have a video assist operator who was responsible for wrangling the many monitors attached to the many cameras, but once on board, he had his problems too. If it was feasible to set myself in view of the monitors, I would take advantage, but usually they were crammed in a corner with too many people so it seemed best to find a relatively comfortable spot, keep my eyes on the script, and imagine what it would look like. This worked well; it kind of took me back to the very old, pre-video assist days.

When the boats were tied up to shore, it was possible to work from the cart. I could park in a comfortable spot and push my RF cart close to the water’s edge. I keep all the radio equipment, receivers and transmitters (Comtek and IFB) on this cart, which is tethered to my mixer’s cart by means of an Aviom digital snake. This allows me 16 channels in both directions via a piece of Cat-5 network cable up to 300 feet long. (I’ve gone even longer than that with no ill effects.) Of course, this adds another cart to the package, but I find the increase in flexibility more than worth it. Rather than remote all those antennas, we move the whole package—antennas, receivers, transmitters, power—drop anchor, connect the Cat-5 and you’re good to go. Aviom makes a good product, born of rock and roll, surprisingly sturdy and very transparent.

We had a few days when we were working with the inflatable Zodiac boats. Four boats, two with actors, an actor cameraman or an actual camera operator on each, simultaneously shooting as we zipped down our little river through the mangroves to where we emerge to find the Magus. We were in a follow boat trying to stay out of shot and yet within range of the video transmitters and my audio transmitters. Those Zodiacs are loud and the very dense mangroves and the water really do soak up the RF. This would have been a good opportunity for the Zaxcom TRX series of recording transmitters. When things get stretched beyond the limits of RF transmission, there is always the option of putting the studio in a bag in the boat with the camera and the talent, pushing “record” and sending them on their way. Sometimes that is the best choice; mostly we were able to keep our link by virtue of good driving. Reminding the actors to speak above the motor noise also helped. The situation is made more complicated when the actors are not only piloting the boats but operating the cameras, but with the coverage shot by the real camera operators and our persistence, we got the scenes.

As night falls on the river, along with the bugs there is a rising chorus, an onslaught of sound, quite formidable and immediately recognizable once you’ve heard it. It is los coquís (onomatopoeically: accent on the second syllable, rising pitch), the little frogs, the unofficial mascot of Puerto Rico, out looking for love every night. Barely an inch long, they raise quite a racket and there is no controlling them. I went to some trouble to get some nice clean coquí tracks, heading out on a small boat away from our encampment with a small Olympus pocket recorder. It was surprising how noisy it was out there when all you’re recording is frog walla. It is a distinctive sound, quite lovely and, for Puerto Ricans, very nostalgic, and it is all over the soundtrack of the pilot. I had thought this unique sound would be used throughout the series for continuity’s sake, but it was left in the Caribbean. Although coquís had been introduced to Hawaii back in the ’80s, they were considered an invasive pest (along with almost every other animal and plant on the islands) and eradication programs were undertaken. I never heard one over there.

The water work fell into the middle of our schedule; once we finished out in the wild, we fell back to our stage work for a few days and then finished with one of our few days shot on location in the city. Then pack it up, ship it home and adiós. A final word of caution regarding departure from Puerto Rico: My equipment had been air freighted out of Los Angeles by a reputable firm and had arrived intact at our stages in San Juan. If you find yourself working down there, do not assume that your outgoing shipment will be treated with the same respectful diligence. Absolutely be sure to supervise any packing for the return trip.

WE’LL BE RIGHT BACK AFTER THIS BREAK

To be continued in “Up The River in Hawaii.” What? There are no actual rivers in Hawaii? Good point but we didn’t let that stop us. Learn how in the next issue.

Picket Lines: Organizing efforts on a reality show

In the fourth season of 1000 Ways to Die, a show produced for Spike TV, crewmembers voted unanimously for union representation. The company responded by immediately firing all the workers.

On February 27, the IATSE and Teamsters Local 399, supported by the Writers Guild, SAG, AFTRA and the Los Angeles County Federation of Labor, organized a picket line at the offices of 1000 Ways. On February 29, picketing expanded to include the Burbank offices of Original Productions, the parent company of the 1000 Ways production company. Original Productions also makes Deadliest Catch, Storage Wars and Ice Road Truckers.

Local 695 members provided active support at both locations. Soon thereafter, Spike TV declared that it would purchase no episodes made with scab labor and production was halted.

The process has now entered a new phase and the International has “bannered” several companies affiliated with Original Productions. This is a lower profile demonstration to inform related companies that Original Productions continues to be subject to organizing efforts. It also serves notice to Original Productions that they remain a target.

When Sound Was Reel 8: Dolby noise reduction in the ’70s

by Scott D. Smith CAS

In the previous installment of “WSWR,” we examined the development of Dolby noise reduction and its application to film sound recording. This issue looks at the further work done by engineers at Eastman Kodak, RCA and Dolby Labs in relation to film sound during the 1970s.

The Problem

Ever since the production of Disney’s Fantasia in the 1930s, studios and film producers had been looking for a low-cost method to distribute release prints accompanied by high-quality multi-channel soundtracks. The first multi-channel optical sound system developed for Fantasia in 1940 proved so costly (at least $45,000, in 1940 dollars) and cumbersome that only about a dozen road show engagements were mounted utilizing the full stereo sound system (outside of its initial 57-week run at New York’s Broadway Theater). While Fox’s CinemaScope system with four-channel magnetic stripes offered a less costly alternative (about $25,000), it still required the striping and sounding of prints on special “Foxhole Perf” film base and constant maintenance of the projection system mag heads. This applied to Todd-AO 70mm six-track systems as well, which only saw use in road show engagements of big-budget studio releases. By the end of the 1950s, research pertaining to improved film soundtracks was pretty much at a standstill and interest in multi-channel film exhibition waned. The exceptions were a few landmark films such as Woodstock and 2001: A Space Odyssey.

Original RCA stereo variable area optical recorder used at Elstree Studios in 1974. This was the same recorder used at Eastman Kodak for the 16mm stereo experiments, but converted to 35mm. (Photo courtesy of Ron Uhlig/SMPTE)

In the meantime, the general public was enjoying high-quality album fare on quarter-inch tape in the comfort of their homes. With the advent of consumer reel-to-reel recorders, record companies began releasing albums in quarter-inch stereo and even quadraphonic tape formats. The quality of a standard 35mm mono Academy soundtrack reproduction was rather dismal in comparison, especially when played through loudspeaker systems designed in the 1940s. Clearly, there was a real disconnect in the marketplace.

Sample of 16mm stereo optical negative (Photo courtesy of Ioan Allen-Dolby Laboratories)

Nearly a decade would pass before any further work pertaining to multi-channel sound was mounted. The medium of optical soundtracks hadn’t really seen any significant improvement since the 1940s, except for incremental improvements in film stocks and minor upgrades to the recording chain. The optical film recording transports made by both RCA and Westrex all dated back to original systems designed in the 1930s and ’40s.

While the Academy Research Council had conducted work on a push-pull color three-channel optical sound system in 1973, for various reasons it was deemed impractical at the time. At the same time, efforts at improving the quality of 16mm optical sound were undertaken by Ron Uhlig at Eastman Kodak, with help from Jack Leahy and RCA engineers. Their focus, however, was on technology for producing 16mm stereo variable area optical soundtracks, which was intended to compete with Sony’s three-quarter-inch U-Matic video format in the industrial film and educational markets (still a substantial source of revenue for Kodak). Despite the fact that the system never gained any traction in the commercial marketplace, their work did pave the way for similar developments pertaining to 35mm optical sound recording.

One of the key problems related to 16mm stereo optical tracks was the dismal signal-to-noise ratio which, even in mono, was pretty poor: about 50 dB “A” weighted on a good day. Splitting the area used for the 60-mil wide soundtrack into two 25-mil tracks (with 10-mil track separation) just exacerbated the issue. The application of the consumer Dolby B improved upon this by 6 dB, but it was still a far cry from the quality of even an average mag stripe print. Although the Dolby A multi-band processor could improve on this, it would mean a significant additional cost for the projection systems typically employed in the industrial/educational markets. In addition, the inherent constraints of 16mm optical limited the response of release prints to about 7 kHz at best. With few market prospects for the system, RCA and Kodak eventually abandoned their work for 16mm soundtracks.

35mm Stereo Optical Sound… A Long Time Coming

While stereo optical sound was nothing new (Alan Blumlein had developed a system in 1934 and John Frayne did further work in 1953), successfully marrying a two-channel soundtrack to a composite release print for commercial release was a daunting task. Among the issues of the day was the fact that theater projection systems would need to be modified with dual photocell tubes, using a prism optical system of the type normally encountered only on (expensive) studio reproducers. Further, standard mono soundtracks of the era had little HF response past 12.5 kHz, and the signal-to-noise ratio was still abysmal compared with magnetic soundtracks.

Ray Dolby and Ioan Allen, along with engineers at Dolby Labs, had already tackled these issues as they pertained to mono soundtracks, and they applied the same engineering approach to a two-channel variable-area soundtrack on 35mm film. Improved HF response was achieved by reducing the height of the scanning slit on the optical reproducer and by removing the low-pass filter typically employed during the recording of optical soundtracks. In addition, the advent of the solar cell allowed for an improved optical reproducer that did not require expensive optical splitters in the projector penthouse.

The first optical recorders (one in the UK and one in the United States) used for striking stereo negatives were based on the RCA dual galvo system as originally conceived by Uhlig and Leahy. Later on, Westrex RA-1231 optical recorders were rebuilt using a modified version of the four-string light valve that dated back to 1938.

At the same time, newer electronics for the optical recorders were developed, allowing better control of exposure as well as improved drive of the low-impedance ribbon light valves. Further improvement came from using analog delay lines to compensate for the slow response of the “ground noise reduction” (GNR) system employed in optical recording. The GNR system was required to control the area of exposed track, reducing the unmodulated track area to a narrow “bias line.” Adding a delay to the program audio fed to the galvo gave the NR shutters sufficient time to open completely during heavily modulated passages, thereby preventing clipping of the first few cycles of the waveform.

The 3 dB of S/N lost by halving the soundtrack area for stereo was compensated for by the application of Dolby A noise reduction. Film labs made additional efforts to control “printer slip” and irregularities in soundtrack print densities, factors that contribute to both HF loss and intermodulation distortion. While the resulting soundtrack was still not as good as a magnetic track, it was a marked improvement over the 40-year-old Academy tracks. Most importantly, laboratories could strike high-volume release prints using conventional printers and processing equipment.
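That 3 dB figure can be sanity-checked with a little arithmetic. The sketch below is a back-of-envelope illustration (not taken from any Dolby document), assuming signal amplitude scales with track width while film-grain noise amplitude scales with the square root of the scanned area:

```python
import math

def db(amplitude_ratio: float) -> float:
    # Convert an amplitude ratio to decibels.
    return 20 * math.log10(amplitude_ratio)

# Halving the track width for stereo:
width_ratio = 0.5
signal_change = db(width_ratio)             # signal amplitude ~ track width
noise_change = db(math.sqrt(width_ratio))   # grain noise ~ sqrt of scanned area
sn_change = signal_change - noise_change    # net change in S/N

print(round(signal_change, 1))  # -6.0 dB of signal
print(round(noise_change, 1))   # -3.0 dB of noise
print(round(sn_change, 1))      # -3.0 dB net S/N loss
```

Dolby A's roughly 10 dB of broadband noise reduction more than covered this loss, which is why the stereo prints could still beat the old mono Academy tracks.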

That’s Great; Now How About a Couple More Channels?

At this point, engineers had a working model for a standard two-track, three-channel stereo recording that could be recorded and reproduced on film, using a basic decoding matrix to extract the center channel. Looking back at these developments from the vantage point of 2012, it may not seem like much, but the engineering effort involved in upgrading a nearly 45-year-old format was no mean feat. For real-world application in the cinema, however, it was lacking in one crucial area: it needed four channels.

While consumer stereo systems of the era employed only two channels of program material (or, in the case of quad, four), film reproduction in large cinemas required at least three channels, and preferably four: a center channel speaker to anchor dialog, plus coverage for audiences in large venues. The Fox Cinemascope magnetic system had four channels and the Todd-AO system had six. If stereo optical soundtracks were to be commercially viable, they would need to accommodate at least four independent channels to match the Left/Center/Right/Surround speaker systems already in place in many theaters.

Fortunately, thanks to the “Quadraphonic” fad of the early 1970s, a solution was at hand. For those not familiar with the consumer Quadraphonic (usually just “Quad”) systems of the era, a bit of background is required. Thinking that consumers were yearning for something beyond two channels, record companies began experimenting with four-channel surround sound for home reproduction. As in the VHS-versus-Beta war soon to follow, competing systems were introduced in the early 1970s, all of them incompatible with one another. Of these, only two matrix-based systems achieved any real commercial acceptance. The first to be introduced was the “SQ” system, based on work originally done by Peter Scheiber in the late 1960s; it was first used by CBS for selected record releases in 1971 and was later adopted by at least 11 other record labels.

The competing system was called “QS,” for Quadraphonic Sound (later referred to as “RM,” for “Regular Matrix”). How’s that for generating confusion in the marketplace?! Developed by engineer Isao Itoh at Sansui Electronics, it was conceptually similar to the Scheiber system but utilized a different algorithm to extract the encoded four channels from the two-channel source. Alas, the general public was not ready for quadraphonic sound, and except among a diehard group of audio enthusiasts, interest in the format died out after about five years.

However, both systems allowed four channels to be encoded onto a two-channel carrier, which meant they could be adapted to the standard L/C/R/S speaker configuration already in place in theaters that had been converted to magnetic. (Note, however, that the speaker configuration and recording techniques used for consumer quadraphonic systems were completely different from cinema loudspeaker layouts.)
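The arithmetic behind such a four-into-two matrix can be sketched in a few lines. This is an illustrative simplification, not the actual QS or MP algorithm: real Scheiber-style encoders also apply 90-degree phase shifts to the surround channel, which are omitted here, and the passive sum/difference decode shown is what later logic-steering circuits were built to improve upon.

```python
import math

K = 1 / math.sqrt(2)  # the -3 dB attenuation applied to the shared channels

def encode(L, C, R, S):
    """Fold four channels into a two-channel (Lt/Rt) carrier.

    Simplified Scheiber-style matrix; the real systems add
    +/-90-degree phase shifts to S, omitted for clarity.
    """
    Lt = L + K * C + K * S
    Rt = R + K * C - K * S
    return Lt, Rt

def passive_decode(Lt, Rt):
    # L and R pass straight through; the scaled sum recovers center
    # and the scaled difference recovers surround.
    return Lt, Rt, K * (Lt + Rt), K * (Lt - Rt)

# A signal sent only to the surround channel:
Lt, Rt = encode(0, 0, 0, 1.0)
L, R, C, S = passive_decode(Lt, Rt)
print(round(S, 3), round(C, 3))  # surround recovered at full level, none in center
```

Feeding a surround-only signal through shows both the strength and the weakness of the approach: the difference decode recovers the surround at full level with nothing leaking into the center, but the same signal also appears in the decoded left and right outputs at -3 dB, which is exactly the channel-separation problem that logic steering was later employed to mitigate.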

In the beginning, Dolby opted to use the QS matrix technology, resulting in the “3:2:3 matrix” system; the entire process was dubbed “Dolby Stereo.” The QS matrix was originally employed by Dolby for the release of Lisztomania and several other films. 1976 saw the release of A Star Is Born, the first film to employ Dolby Stereo surround technology. QS was also used to generate the five-channel Quintaphonic magnetic soundtrack for the film Tommy in 1975. In 1978, however, Dolby abandoned the QS matrix in favor of a custom-built matrix employing a variation on the Scheiber system, referred to as the “MP Matrix.” This was the system employed for the release of Hair.

Star Wars

While movies like Lisztomania, A Star Is Born and a handful of others helped to generate industry buzz for the Dolby Stereo format, theaters were slow to jump on the bandwagon. Owners were reluctant to invest in yet another sound system after the demise of the Cinemascope four-track magnetic and 70mm six-track systems.

This all changed, however, with the release of George Lucas’s groundbreaking Star Wars in May of 1977, followed by Spielberg’s Close Encounters of the Third Kind in November of the same year. Now theater owners began to sit up and take notice. With more than $353,668,000 in combined domestic rentals, these two films were largely responsible for the sudden interest on the part of theater owners in adopting the Dolby Stereo system. (It should be noted that both films were also released in Dolby Stereo 70mm six-track magnetic for their road show engagements, with 35mm Dolby Stereo optical prints struck for smaller markets and second-run houses. Besides ramping up sales of Dolby SVA processors, this helped to reinvigorate interest in 70mm for road show releases, using the modified format of Dolby Six Track Magnetic.)

When Star Wars opened in May of 1977, there were only 46 theaters in the U.S. equipped for Dolby Stereo. By the time Richard Donner’s Superman opened in December of 1978, there were 200 theaters and, within three years, that number increased tenfold to 2,000. Clearly, the folks at Dolby were onto something, and they went on to establish a firm foothold in both the domestic and international theatrical market, a position that they would enjoy exclusively until the release of the competing “Ultra Stereo” system in 1984.

One of the key aspects in acceptance of the Dolby Stereo SVA (for stereo variable area) system was the fact that prints struck in the format were backward compatible (in varying degrees) with the original mono Academy optical format. This reduced the need for dual print (optical and magnetic) inventories, greatly simplifying distribution. While it is likely that the format would still have won out even without this compatibility aspect, it went a long way in helping to convince both studios and theater owners of the long-term viability of the system.

A Few Little Problems…

Despite the relative success of the Dolby Stereo variable-area system, a few issues remained. Chief among these was the poor separation between channels imposed by the limitations of the surround decoder technology. To overcome this, Dolby employed a logic steering system, which helped “steer” the signal to the appropriate channel, curtailing some of the crosstalk between channels. However, this system required diligence during the mix to ensure that nothing ended up where it shouldn’t due to the random phase relationships between channels. Therefore, all mixes destined for matrixed Dolby Stereo release employ a combination encoder/decoder on the dub stage so that the final result can be monitored through the matrix.

A further issue involved the quality of the surround channel, which is produced by taking the difference of the encoded left and right signals (called Lt and Rt in Dolby parlance) and routing it through a delay network in the playback processor to the auditorium surrounds. (The delay is always applied during reproduction rather than in the final mix, as every theater has different delay characteristics.)
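Why the delay lives in the theater processor rather than in the mix comes down to geometry: the surround speakers sit much closer to most listeners than the screen speakers do, so without a delay the surrounds would arrive first and pull the sonic image off the screen. A rough sketch of the calculation, using hypothetical auditorium dimensions (the 15 m, 2 m and 20 ms figures below are illustrative, not from the article):

```python
# Hypothetical figures: a listener 15 m from the screen speakers but
# only 2 m from the nearest surround speaker; sound travels ~343 m/s.
speed_of_sound = 343.0              # m/s
extra_path = 15.0 - 2.0             # m: how much farther the screen sound travels
precedence_margin_ms = 20.0         # extra delay so screen sound arrives first

# Delay the surround feed by the path difference plus a safety margin,
# so dialog and effects localize to the screen (precedence effect).
delay_ms = 1000 * extra_path / speed_of_sound + precedence_margin_ms
print(round(delay_ms, 1))  # about 57.9 ms for this auditorium
```

A small shoebox screening room and a cavernous movie palace would plainly need very different settings, which is why the delay is dialed in per installation.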

The limitations of most auditorium surround speakers also made it necessary to limit the surround bandwidth to 100 Hz–7 kHz to avoid overloading the drivers. Due to the proximity of the speakers to the listener, combined with issues of HF sibilance “splash” (caused by projection reproducer azimuth errors and uneven slit illumination), it was necessary to apply a further 6 dB of noise reduction to the surround channel in the form of Dolby B.

While all this processing and signal manipulation may look inordinately complicated, it must be remembered that in 1975 the only alternative for getting four channels onto a film print involved expensive mag striping and sounding of prints, which also meant a dual inventory for distribution.

While various improvements have been made to the original MP matrix technology over the years, the system remains backward compatible with films made after about 1978.

What About the Folks at Home?

Having firmly established their brand in the cinema industry, Dolby wasted no time in applying the technology developed for theaters to the consumer market. The arrival of the improved Hi-Fi versions of both Betamax and VHS in 1982 provided the first practical mass-market opportunity to distribute feature films with surround sound for home viewing. With a catalog of film releases dating back about seven years, Dolby saw an opportunity to serve movie aficionados craving something beyond standard two-channel reproduction (the audio on early versions of both VHS and Betamax was of rather poor quality). Thus was born “Dolby Surround,” the moniker used for the consumer version of the Dolby Stereo cinema system. The major difference between the early consumer systems and the more sophisticated theater version was that the consumer version provided only three output channels: Left, Right and Mono Surround.

While this approach helped to provide a sense of spatial imaging for Dolby Stereo films, the lack of a separate center channel to anchor the dialog was a noticeable deficiency, which Dolby addressed with the release of the consumer version of Dolby Surround with Pro Logic in 1987. This system was licensed to various consumer audio manufacturers, who created packaged audio receivers employing the Pro Logic decoder. Further refinements were offered in 2000, when Dolby adopted technology developed by Jim Fosgate to encode and extract five channels of audio (Left, Center, Right and Stereo Surround).

While the Pro Logic system was the precursor to the discrete 5.1 systems we enjoy today, the actual 5.1-channel format was first employed for the six-track 70mm releases of Superman and Apocalypse Now in 1979. The first use of 5.1 on 35mm film was the release of Batman Returns in Dolby Digital in 1992.

One reason for the rapid adoption of the Dolby Surround system was its backward compatibility with the earlier Academy mono systems that were still prevalent in the ’70s. This meant that films mixed in Dolby Stereo for cinema release could be directly mastered for home video using the matrixed two-channel (Lt/Rt) soundtrack masters. The only work typically required was to decode the Dolby A noise reduction used on the Lt/Rt magnetic print masters. This saved a considerable amount of money for studios wishing to license their back catalogs into the burgeoning home-video market. Were it not for this, it is doubtful that Dolby Surround would have gained the market acceptance it did, and the public would likely have had to wait at least a few more years to enjoy at home the quality of sound already available in theaters.

Next: Digital sound comes to the movies


Many thanks to Ioan Allen of Dolby Labs and Ron Uhlig for their contributions to this article.

All photos and diagrams credited to Ron Uhlig/SMPTE were originally published in the SMPTE Journal, April 1973, Volume 82, pp. 292–295. DOI 10.5594/JO8887. Copyright 1973 SMPTE. Permission to reprint is gratefully acknowledged to the SMPTE and to Eastman Kodak, Ron Uhlig’s employer.

© 2012 Scott D. Smith CAS

A Profile of Courtney Goodin
Part Two: Digits and Widgets

by David Waelder

The project that shaped the direction of Courtney Goodin’s career began on a dare. In 1982, he and Laurence Abrams were working on a commercial for Members Only clothing and the spokesperson was being cued with a QTV Teleprompter. Cuing devices (then and now) work by reflecting an image of the words onto a partially silvered pane of glass mounted in front of the camera lens at a 45-degree angle. The speaker can read the text while looking directly at the camera. It’s an ingenious device but the process of scrolling the text was quite primitive at that time.

The copy was printed onto rolls of paper with tractor-feed perforations at the sides. A mechanism with motors, belts and pulleys would roll the paper from one spool to another to move text across the screen. The potential for intrusive noise was obvious and the clattering machinery was a particular nuisance on that assignment. At lunch, Courtney sat down with the prompter operator and asked him, “Is anyone making a computer prompting system?” He replied, “Nah, it’s impossible to do on a computer. Computers just aren’t fast enough. There’s no way to do it on a computer.” Courtney just said, “OK,” but he took that as a challenge.

Courtney was familiar with the graphic capabilities of the Atari computer from his experience developing the Graphic Master image-editing program that he was marketing. He thought the advanced graphics chipset used by the Atari was up to the task, but there were programming challenges. An effective prompting system must work smoothly; glitches or herky-jerky movement would be a distraction and undo the benefit of the prompting. There is a natural hiccup as the computer periodically refills its memory cache, and Courtney had to devise a workaround that would permit smooth scrolling through lengthy passages. It was also important that the text scroll smoothly both forward and backward and permit quick, jerk-free changes of direction. When he had that worked out, he developed a remote control using the same boxes he had used for the Goodsound Talk Back units. An operator would be able to use the remote to scroll the text forward and back and smoothly ramp the speed up and down to match the reading pace of the talent.

With the software issues largely resolved, he and Abrams, his partner in this venture, turned their attention to designing a marketable product. Courtney redesigned the camera-mounted beamsplitter and devised more flexible support hardware to accommodate 16mm and 35mm film cameras and smaller video cameras as well as pedestal-mounted studio TV cameras. Previous teleprompters were designed for permanent installation in environments like news studios but their device could be quickly attached to a wide range of cameras without adapters or a bulky three-foot base plate. They also designed a custom case that opened into a freestanding operator’s console for quick setup. They called their new product “Compu=Prompt.” It was the first personal computer-based prompting system. This accomplishment was honored with an Emmy Award in 2010.

Courtney also devised a digital dongle to provide copy protection. A dongle is a device that must be plugged in to a computer to authorize use of a software program. Very little was available at the time to protect against an unscrupulous user simply copying the software and building a unit. While the use of an electronic key to restrict unauthorized use of software was fairly common, most of these devices were a simple resistor circuit and were easily circumvented. Courtney’s use of a digital code in the dongle was sophisticated for its time.

Producers were hesitant to adopt the new device, but Courtney and Laurence knew they had a winner when they wrapped a commercial several hours early. Looking back on the day, the only factor that was different was the performance of the prompter. There was no downtime while copy was reprinted and threaded onto a roll, no crinkling paper spoiling takes, no difficulty matching the pace of the copy. When they needed to re-cue to the beginning, it was accomplished instantly with the push of a button; no more waiting for the prompter to roll the long paper script back to the start. In short, there were no prompter delays at all.

It was a triumph of clever programming and engineering. A $150 Atari computer with a hand controller and Courtney’s proprietary software replaced a $12,000 mechanical prompter and did so with much greater efficiency. Courtney was able to sell the Compu=Prompt system for $4,000, an enviable markup.

As producers became aware of the advantages of this more flexible system, demand grew and Courtney had inquiries from all over. He even managed to sell several of his Atari-powered devices to IBM! He demonstrated a prompter for the Reagan White House but had no answer for their dilemma of what to do when the President would wander off script; for someone determined to go their own way, even a prompter was no help.

Regrettably, Courtney was a much better inventor than businessman. Although he had some interest from Panavision, the industry-leading camera company, he elected to license the Compu=Prompt technology to Dreamdata, a partnership with two other people originally formed to market computer video games. But Dreamdata had no vision, no data, only dreams; sales and rentals of Compu=Prompt products became its only source of revenue. After several years of increasing sales and expansion, the other partners conspired to divert that income to themselves. It took a lawsuit to wrest the rights away from his former partners, and Courtney and Laurence were unable to develop the product in a competitive market until the suit was settled 2½ years later. ProPrompt, the new company they formed, was successful in the rental market but found sales difficult, since the Compu=Prompt system was still being marketed by the former partners during the lawsuit. By the end of the lawsuit, more than seven years had passed since the introduction of Compu=Prompt, and competing computer systems from other prompting companies had been able to establish a foothold.

The bad experience recovering the rights to his prompting invention discouraged Courtney from developing other products that would need to be built and sold. He continues to provide the C-stand to microphone adapters sold through LSC and Coffey but is otherwise out of the hardware market. Our very small market didn’t support volume manufacturing so Courtney needed to fit components into boards and solder them by hand to make his preamps and talk-back boxes. Writing software offered the same satisfaction of finding a solution to a problem without the necessity of breathing toxic fumes from a hot iron.

He would soon have an opportunity to practice those code-writing skills. He purchased a Sound Devices recorder to transition from DAT to nonlinear sound. Right away, he was frustrated by his inability to listen to the recorded audio while also seeing a timecode display as the file played. This could be accomplished by playing the file in the recorder, of course, but not once the file was transferred and played in a computer. When he inquired, he was told that Pro Tools would do this, but he balked at purchasing a $1,200+ program just to listen to his own audio and check timecode.

Learning that there was no reasonably priced software for the job, he turned his attention to making his own. A healthy part of success in any venture is recognizing exactly what needs to be accomplished and what is already available. Microsoft Windows comes bundled with Media Player software that can play any standard audio file, and Microsoft permits users to devise programs that use the core features of the Windows software so long as they don’t change the control interface. He wrote code that used the Media Player engine for audio reproduction while reading the metadata and displaying running timecode in a large window that could double as a digital slate. It consolidated the metadata from many files into a human-readable, spreadsheet-like format that could be saved to disk. He also envisioned an application that might free him from keeping sound logs, a laborious chore with his cramped handwriting.
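The timecode he wanted to display is carried in the Broadcast Wave file’s bext chunk, whose 64-bit TimeReference field holds the number of samples elapsed since midnight. A minimal sketch of the conversion (integer frame rates only, drop-frame handling omitted; in practice the field would be parsed out of the file rather than constructed by hand as below):

```python
def timecode_from_bwf(time_reference: int, sample_rate: int, fps: int = 24) -> str:
    """Convert a BWF bext TimeReference (samples since midnight)
    into an hh:mm:ss:ff timecode string (non-drop-frame rates only)."""
    total_seconds, remainder = divmod(time_reference, sample_rate)
    frames = remainder * fps // sample_rate
    hours, rest = divmod(total_seconds, 3600)
    minutes, seconds = divmod(rest, 60)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

# A file stamped 10:30:15 and 12 frames, recorded at 48 kHz, 24 fps
# (2,000 samples per frame at that rate):
tr = (10 * 3600 + 30 * 60 + 15) * 48000 + 12 * 2000
print(timecode_from_bwf(tr, 48000, 24))  # 10:30:15:12
```

The same sample count is what lets editorial conform audio to picture, which is why consistent interpretation of this field across manufacturers mattered so much.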

The resulting software, BWF Widget, has become an essential tool for both editors and sound professionals and is now available in a suite of variations. Licensed to Sound Devices, it is the basis for their original Wave Agent program.

In the course of developing BWF Widget, Courtney confronted the problem of an absence of file standards among audio equipment manufacturers. The various companies, from Aaton to Zaxcom, would interpret timecode implementation protocols differently and each was loath to bend their practices to match the other. Acting as an outsider with no personal agenda or competitive hardware to market, Courtney helped persuade the various companies to come to an agreement on how timecode would be applied in various frame rates and at different sample frequencies. Attempting to bring order to this Babel may be his greatest single contribution to professional audio.

The limited success of ProPrompt meant that Courtney had to continue working regular sound and video playback assignments to earn a living. Recently, he has been doing the video playback for Dexter. In some ways, his need for regular work has been our gain as he is presently serving his third term on the Local 695 Board of Directors and he has served several times on the Constitution and By-Laws Committee. He continues to refine BWF Widget and remains active in the field so there is no telling where his fertile imagination may yet take us.

The Workflow of OZ: Video challenges in Munchkinland

Facing high-stakes challenges on a big-budget digital 3D feature film, Local 695 Engineer Ian Kelley makes it sound easy as he describes the unique production workflow he designed for the film Oz: The Great and Powerful.

by Ian Kelley

The way it was…

In the days of film, everyone knew what the workflow would be, from the producers to the lowliest camera assistant. Film went in the camera, was exposed and then sent off to the lab, and everyone watched the dailies the next day. Sometimes magic happened, sometimes it didn’t. But for everyone, the process of capturing and manipulating the image had been refined over time, right through to making release prints. Not anymore. New cameras and workflows emerge on a regular basis and, unlike with film, there is no one correct way to make a movie. Added to that, the massive increase in visual effects means new ways to support the process have to be invented, and 3D has further complicated workflows. Pity the poor producers, with only a sketchy knowledge of the digital processes involved, having to make very expensive decisions. For me, the most interesting part of a project is the design and planning. It is the time when everyone’s needs on the production have to be assessed: what cameras are being used, what the deliverables are, and what the workflow will have to be to carry the production through post. I usually try to map this out well ahead of time, but it frequently changes as more information emerges.

Oz: The Great and Powerful

This film, which just wrapped principal photography, is a case in point. Directed by Sam Raimi with Peter Deming as DP, it is a prequel to The Wizard of Oz and was shot on recently completed soundstages in Pontiac, Michigan, thanks to generous tax incentives provided by the state. Sam and his editor, Bob Murawski, are both from the area, which added further incentive to go there. My job was production video supervisor, a post I have occupied before on Alice in Wonderland for Tim Burton, as well as The Polar Express, Beowulf and A Christmas Carol for Bob Zemeckis.

My work on the production began some three months before shooting, starting with a test shoot in April of last year. For Oz, we tested both Red Epic and Arri Alexa 3D rigs over three days at Universal on their new virtual stage, evaluating cameras, recording systems and data management systems.

Once the test shoot was over and the dailies viewed, it was time to really start planning the workflows for the cameras and recording, and for data management once shooting stopped. The DP obviously makes the decision as to which camera system to use, but that then dictates which hardware and which software management systems will be needed. Oz is a Disney film and the studio is very hands-on with regard to systems management, so several meetings were held to discuss the best approach.

For the test, the Red Epics recorded to on-board SSD cards and the Alexa to a portable Codex recorder, and there was considerable discussion about data management. For Alice, we had recorded to the big Codex studio deck and archived to LTO4 tape. I really like the virtual filing system on that deck, although the machine is a bit of a boat anchor out in the field. Online storage was too expensive at that point (the movie was shot using Genesis and Dalsa cameras—file sizes were 8MB and 16MB per frame respectively) and largely untried, so the movie was completely tape-based on LTO data tape, a somewhat painful way of working. For Oz, because of the number of VFX shots and the need to pull clips for turnovers on a weekly basis, it was decided that we would use online storage to keep all the media available, although we would still archive to LTO5 data tape. For data management, we tested both FotoKem’s nextLAB and Light Iron Digital’s OUTPOST systems. Both have their strengths but ultimately, we chose nextLAB.

For the actual shoot, it was decided that Panavision would supply three Red Epic 3D rigs plus two 2D Epic rigs—two on the 1st unit and one on the 2nd unit, with the 2D rigs for VFX shots such as reflection passes. Recording was to be on-board camera to SSD cards. Both units would have 3D video assist as well as 2D monitors for use on the set.

To add to the complications, there would also be three or more witness cameras, plus two remote recording booths for two of the characters who would later be animated, with reference camera recordings made for use by the animators in post. We also had Encodacam, which tracked the camera moves and drove Motion Builder backgrounds for real-time compositing—essential when so much of the movie was shot against blue screen. Both the composite and the background plates needed to be recorded.


Planning

All of this has to be tied together in post, of course, and this is where I insisted on central sync and timecode for all cameras and recorders, plus sound and witness cameras. We achieved this by using an Evertz 5600MSC master clock for each shooting unit, with distribution amplifiers and patch panels built into each DIT cart. We then made custom camera looms that carried sync and timecode, left- and right-eye pictures, the return SIP (stereo imaging processor) feed and 3D rig control cables. I wanted to use fiber optic links to the cameras, as we had done on Hugo, but production ruled that out on expense grounds. As the movie was to be shot entirely on soundstages, that didn’t prove to be a big deal. I did make “all-in-one” re-clocking boxes for long cable runs, with BNCs on each side and AJA HD10DAs inside, and these worked well.

I also wanted the different carts involved—DIT, stereographer, video assist, sound, Encodacam and puppetcam—to be tied together in a logical way, with a minimum of separate cables between them, to speed up setups. More cable looms and lots of sleeving later, I am now a familiar face at Pacific Radio in Burbank.

Prepping was done at Panavision through late May and June and much liaising was needed between camera, sound, video assist, VFX, Fotokem and Disney to make sure all the details were in place. We would be shooting in Pontiac—a long way from our usual suppliers of last-minute ‘stuff’—and Production wanted to know how much it would all cost. Finally, just after July 4, we loaded a 53-foot semi-trailer full to the brim with all the gear and sent it on its way.

Shooting crew

As well as the regular complement of Local 600 camera crew (who did a first-rate job of keeping all the gear organized), we had a DIT, a 3D rig technician and a stereographer for each unit. Working with DITs Ryan Nguyen, whom I had previously worked with on Alice, and Paul Maletich, I pushed the idea of central sync and timecode distribution, as that would really help our post workflow.

Originally, the color was to be graded after shooting (and after the SSD cards had been pulled from the cameras), but our DP wanted real-time color correction, so Fotokem figured out a way to incorporate 3CP-created CDL file metadata into their workflow.

Kyle Spicer did media management on-set and Bryan and Eric from Fotokem did the work off-set in day and night shifts. All three did an excellent job.

The video assist was ably handled by the Local 695 team of Mike Herron and Roger Johnson on the main unit and Sam Harrison on the 2nd unit. Mike had opted for a Raptor 3D rig with four machines—one each for A and B cameras plus one for composites and one machine for playback reference. Mike could also drive the puppetcam recorder (also a Raptor 3D in dual recording mode) and the Encodacam background recorder with the big advantage that all the files were named and he could start and stop all of them centrally for playback to the director. Naming the files per the slate and take saved me hours in post!

Mike was kept extremely busy, as Sam Raimi, our director, made good use of video assist and of reference material from editorial, since much of the movie had been storyboarded.

The witness cameras were handled by the VFX crew and were mainly used to provide shooting references for Sony Pictures Imageworks, who would be doing all the VFX work. These cameras all had Lockit boxes that were jammed on a regular basis so at least the timecodes were correct even if the files had to be named later. But the guys proved to be very good at slating shots and keeping good reports even if some of them were quite hilarious.

The puppetcam setup was very complicated. The idea was that the actors providing the voices and facial performances for the two animated characters would be outside the stages in a trailer with soundproof booths. They would watch the action taking place on stage on large-screen monitors via cameras mounted on lightweight Eurocrane cranes operated by puppeteers. It would be their character's POV, if you like. The on-stage actors would see the off-stage performers via small screens mounted on the crane arms under the cameras. The pictures came from Sony EX3 cameras mounted vertically in the booths. They shot the performers via half-silvered mirrors with the camera images flipped vertically, rather like a teleprompter arrangement. The pictures were recorded in HD and then downconverted to send to the set. It was a good idea and it worked quite well. But the twelve-foot-long Eurocranes, operated by puppeteers wearing Steadicam belts, were awkward and got in everyone's way, so we didn't use them much. We did, however, use the HD recordings of the actors in the booths, who mainly watched the shooting camera's images. The director was able to watch their performances on the set, as Mike, the video assist technician, was in full control of the Raptor HD recorders and could play back everything simultaneously.

We used Sony EX3 cameras for all VFX-related reference material as they could take external tri-level sync and timecode. An early test I sent to Imageworks proved that Avid DNxHD115 files were quite acceptable and would be used for turning over shots. I could have hugged them all for that decision, as conforming the original media to the editor's cut would have been very painful.

Sound wasn’t a problem—it was well taken care of. Our sound mixer was Petur Hliddal from Local 695, who I had previously worked with on Batman Returns with Tim Burton and on Old School. And I’ve worked with Local 695 microphone boom operator Peggy Names many times—always a delight to work with. They were ably assisted by Local 695 members Gail Caroll Coe, also a microphone boom operator, and John DeMonaco, working as the utility sound technician.

What could possibly go wrong?

Well, first off, during prep, the Red Epic cameras would lose sync occasionally. The rig techs said they had a lot of trouble with this on a previous show, and here it was happening again. I had my 'scope with me and, on checking the tri-level sync through the chain, all was well except for the output of the little distribution amp feeding the cameras. It was low by 0.1 volt, despite getting the correct level in. I had a couple of little 1×2 D.A.s with me and they solved that problem.

It then transpired that the nextLAB system could not deal with AVI or Sony XDCAM files or transcode them. It ended up being my job to deal with any files that weren’t R3D files (which became known as Altmedia) and to transcode them to DNxHD115 using my Avid Media Composer and XDCAM transcoding software.
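
In practice this was a triage job: sort each day's files into R3D originals and everything else, then queue the "Altmedia" for a DNxHD 115 transcode. The Python sketch below shows the idea; the extensions, folder name and ffmpeg command line are illustrative assumptions on my part, since the actual transcodes were done in Avid Media Composer with XDCAM transcoding software.

```python
from pathlib import Path

# Extensions treated as "Altmedia" here are an assumption for illustration.
ALTMEDIA_EXTS = {".avi", ".mxf", ".mp4", ".mov"}

def triage(filenames):
    """Split a camera-card dump into (r3d_originals, altmedia)."""
    r3d, altmedia = [], []
    for name in filenames:
        ext = Path(name).suffix.lower()
        (r3d if ext == ".r3d" else altmedia).append(name)
    return r3d, altmedia

def dnxhd_cmd(src, dst_dir="altmedia_dnxhd"):
    """Build an illustrative ffmpeg command for a DNxHD 115 transcode
    (1080p, 115 Mb/s, 8-bit 4:2:2 -- the flavor Imageworks accepted)."""
    dst = str(Path(dst_dir) / (Path(src).stem + ".mov"))
    return (f"ffmpeg -i {src} -c:v dnxhd -b:v 115M "
            f"-s 1920x1080 -pix_fmt yuv422p -c:a pcm_s16le {dst}")

originals, to_convert = triage(
    ["A001_C002.R3D", "puppetcam_0113.mp4", "witness_07.avi"])
commands = [dnxhd_cmd(f) for f in to_convert]
```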

There is a common misconception that shooting digital means never having to say ‘cut’ as digital doesn’t cost money the way that film does. I had to explain to the producers that it takes 10 hours to archive one hour of material and that if we went over 2.4TB a day, simple arithmetic said that we would need more equipment and personnel or we wouldn’t keep up. That had the effect of concentrating people’s minds so that we returned to shooting an average of 1.5TB a day. All of this was archived on LTO5 tape and kept online on network-attached storage which ultimately reached a total of seven trays of 42TB each.
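
The back-of-the-envelope arithmetic behind that 2.4TB ceiling can be sketched as follows. The throughput figure is my assumption (a hypothetical effective LTO5 rate once verification and duplicate passes are folded in, not a number from the show), but the shape of the argument is the same: daily volume divided by archive rate has to fit inside the hours available.

```python
def archive_hours(tb_per_day: float, effective_tb_per_hour: float) -> float:
    """Hours needed to archive one shooting day's media at a given rate."""
    return tb_per_day / effective_tb_per_hour

# Hypothetical effective rate: ~0.25 TB/hour once verify and duplicate
# passes are included (an assumption, not the production's figure).
RATE = 0.25

peak = archive_hours(2.4, RATE)    # barely fits an overnight shift
steady = archive_hours(1.5, RATE)  # comfortably done by morning

# The network-attached storage eventually held seven 42TB trays:
nas_total_tb = 7 * 42
```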

A further problem arose with our Red Epics. Latency became a big issue on some tightly operated shots. The minimum picture delay was two frames with the on-board Red display. To external displays, the delay could be up to five frames, depending on the monitor. I don’t think there is any way around this with 5K sensors and 2K monitoring with the current design of the camera. Our operators always managed to get the shot, although they had to be creative about anticipating movement.
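
To put those frame counts in time terms, a quick conversion helps; the sketch below assumes a 24fps project rate (my assumption, as the article does not state the frame rate).

```python
def frame_delay_ms(frames: int, fps: float = 24.0) -> float:
    """Convert a monitoring delay in frames into milliseconds."""
    return frames * 1000.0 / fps

onboard = frame_delay_ms(2)   # on-board Red display: ~83 ms
external = frame_delay_ms(5)  # worst-case external monitor: ~208 ms
```

At those figures, a whip-pan operated off an external monitor is being steered a fifth of a second behind reality, which is why the operators had to anticipate movement.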

Conclusions

We wrapped on December 22 after some 108 days of principal and four days of blue screen photography. Would I do it exactly the same way next time? Maybe. As I pointed out at the beginning of this article, there is no one ‘correct’ way to make a movie with digital cameras. Color correction on-set? Not necessarily. On Hugo, all color correction was done off-set as the DP was also the operator. Pre-built LUTs were used in Black Magic HD Link Pro boxes to approximate ‘the look’ that Bob Richardson wanted. And for 3D— should you converge on-set or shoot parallel and converge in post? Both ways are valid.

Technology is moving so rapidly that what was state-of-the-art six months ago isn’t necessarily so now. Techniques that weren’t available then are coming online all the time so ‘keeping up’ is essential. But not at the expense of risk. As I said to one of the producers— making a movie is like a swan sailing across a lake. You should be able to admire the artistry of the swan gliding along on the surface without any awareness of its little legs paddling away like crazy underneath. That’s our job, to make it look easy.

Beginnings of Local 695, Part 2

by Scott D. Smith, CAS

Author’s note:

This piece is a continuation of the article from the fall 2010 issue of the 695 Quarterly, which examined the early beginnings of the Local. While there is a wide range of historical events pertaining to the Local, I have chosen to focus on the events of 1933. Not only was this year crucial to the survival of the Local (and the IATSE as a whole), but it also closely mirrors our current economic situation. All of the caveats contained in the foreword of the previous article apply here.

1933

While it is safe to say that 1932 was not a year that would be recalled fondly by most rank-and-file workers in Hollywood, few would have predicted the events that were about to be unleashed in the first quarter of 1933.

Although the general unemployment rate for the nation had risen to nearly 25%, many who toiled on the Hollywood lots were still fortunate to be working, in some cases making more than their counterparts elsewhere in the country. However, taken as a whole, the annual income for the average worker in the film industry was nothing to be excited about. While daily or hourly salaries may have looked attractive, earnings were frequently offset by long periods of unemployment with no income at all. (This was well before the advent of Social Security and Unemployment Compensation.)

Some studios, like United Artists, made efforts early on to keep their sound crews employed when off production. This might mean that someone who worked as a First Soundman during production would end up spending time as an Assistant Re-recording Mixer on the dub stage. Second or Third Soundmen, if they had technical skills, might be put to work in the maintenance shop between pictures.

This arrangement generally worked well for both the studio and employee. It provided steady employment for sound crews, which were still in rather short supply in the early 1930s, and allowed the studio to maintain a core staff of technicians to service their productions. This meant less training of new hires, which could be a headache for the sound department heads, as they sought to integrate fresh talent into their recording operations.

Not all studios subscribed to this practice, which meant that as soon as a show was finished, the sound crew would be idled at no pay until they were hired for the next project. While the studio system of the 1930s and ’40s may have had the appearance of offering a more stable income for some crafts, the reality for many workers was similar to that of today, with employment lasting only for the length of a show.

Photo from Vitaphone set for a George Jessel short on Manhattan Opera House set ca.1926. The CTA microphone rigging is typical of the work that would be performed by the “sound grips.” (George Groves Collection)

Sometimes employment did not even last that long: being fired during production was a not uncommon event!

Given all these factors, it is understandable that most crew members during this period would strive to remain on good terms with both the directors and department heads, despite production schedules that called for six- or seven-day workweeks, 12 to 14 hours a day, and no overtime. While no one, from directors and actors on down, was happy with these conditions, the alternative was equally unattractive. The studio bosses knew this and the terms were made abundantly clear to all.

Those who didn’t play along would quickly find themselves unceremoniously escorted to the studio gate and thrown into the street along with their belongings. Not even department heads were exempt from such humiliation. Should you be deemed a “troublemaker,” your name would end up on the studio “blacklist” (which was rumored to have been exchanged freely among studios). Depending on circumstances, if you were fired at one studio, you might never find employment in Hollywood again.

The New Deal

On March 4th of 1933, President Franklin Delano Roosevelt, having just narrowly escaped an attempt on his life the previous month in Miami (the bullet intended for him instead took the life of Chicago Mayor Anton J. Cermak), was sworn into office. It was during his inaugural address that he famously proclaimed: “the only thing we have to fear is fear itself.” As it turns out, there was plenty to fear…

With Democrats firmly in control of both the House and Senate, FDR wasted no time enacting a series of legislative changes deemed necessary to restore confidence in the U.S. financial system. The first of these measures was a national “Bank Holiday,” proclaimed on March 6 and formalized by emergency legislation passed on March 9.

This act, known as the “Emergency Banking Act,” sought to restore confidence in the solvency of U.S. banks. Similar in many ways to the action taken by our current administration, this act called for tough new reserve requirements on banks, as well as providing federal bailout money for those banks deemed crucial to the functioning of the U.S. financial system. It also removed the U.S. from the gold standard.

On Monday, March 6, all banks in the United States were ordered closed while federal examiners pored over their balance sheets to determine their solvency. After four days of non-stop grilling by the Feds, just one-third of U.S. banks were deemed sufficiently solvent to be reopened. Although just a fraction of the banks were left standing at the end of the week, the effort was largely judged a success. The effect on Hollywood, however, was disastrous.

More Pain Ahead…

On Monday, March 6, two days after Roosevelt’s inauguration (and the same day the Bank Holiday began), Will Hays (architect of the much-despised “Hays Code”) called an emergency meeting of the MPPDA Board of Directors. This private meeting, which lasted long into the night, was attended by studio heads from most of the majors, including Sam Goldwyn (Samuel Goldwyn Studio), Nick Schenck (M-G-M), S.R. Kent (Fox), Carl Laemmle and R.H. Cochrane (Universal), Jack Cohn (Columbia), Albert and Harry Warner (Warner Bros.), Adolph Zukor (Paramount) and M.H. Aylesworth (RKO). Oddly enough, Hays was more focused on furthering his agenda regarding the “immoral” content of current films than on addressing the dire economic straits facing the industry.

Despite this, initial plans for industry-wide salary cuts were hammered out among the attendees, and later presented to other studios. These called for studio employees to take a substantial reduction in salary for a period of eight weeks. Workers who made $50 or more a week were to have their salaries cut by 50%, while those making less than $50 per week would receive a smaller reduction of 25%. A minimum salary floor of $37.50 was proposed for those making more than $100/week and a $15 floor for those making less than $100/week.

However, not everyone subscribed to the “party line.” Having implemented its fourth wage reduction just three weeks previously, Universal was initially against the cut. United Artists (led by Mary Pickford and her partners) was flat-out against it. And there was further turmoil as some studios later proposed a permanent reduction in wages.

The reaction from labor, including writers, musicians and actors, was swift and decisive. On March 9, at least four IATSE locals (including 695) announced that, if the studios went ahead with their plan, they would strike. Adding to this already-tense atmosphere, an earthquake registering 6.3 on the Richter scale occurred in Long Beach late Friday afternoon of the same week. The quake caused 120 deaths and $60M in damage to areas in Long Beach and Los Angeles. People’s nerves were really frayed.

While the actions of the Roosevelt Administration to restore confidence in the banking system were mostly laudable, the speed at which the legislation had been enacted left little time to analyze its effect on various sectors of the economy. Studios, already in dire straits due to falling theater attendance, relied heavily on the flow of cash from daily box-office revenue to sustain their operations. With cash flow completely shut off due to the bank holiday, even relatively solvent studios began to founder.

United Artists sound crew ca. 1928. Ed Bernds in middle row, second from left. From “Mr. Bernds Goes to Hollywood” (Photo courtesy of Scarecrow Press)

The March 14, 1933, issue of Variety succinctly summed up the total effect of labor cuts on the industry: the reductions of March 1933, combined with those of the two previous years, totaled more than $106M. All told, film payrolls at the majors were reduced from $156M in 1931, to just $50M in 1933, a staggering cut in the workforce. It was estimated that at least 90,000 employees were eliminated from studio payrolls during this period. Keep in mind, this was just for the major studios. Cuts at the independents varied, with some reporting similar salary reductions. Some, however, held out as long as possible before following suit.

(In a bit of irony that could only happen in Hollywood, a number of production companies attempted to cash in on the story of economic crisis, with more than one studio announcing plans to mount a production while it was still timely. However, on March 12, Monogram Pictures announced that their production start for Bank Holiday would be delayed for a month. The reason cited: the bank holiday!)

Many issues arose as a result of the initial meeting on March 6th. Which employees would the cuts apply to? Would film exchange, distribution and theater employees be included in the wage cuts, or just studio workers? Clearly, some aspects of the plan had not been well thought through.

Confusion abounded as studios, actors, directors and crews attempted to sort out the terms of the salary cut. Irene Dunne, working on the RKO picture The Silver Cord, refused to sign the 50% reduction without consulting her attorney, effectively shutting down production. Other actors made similar demands.

Despite these events, the film exhibition business continued to thrive for certain pictures. David O. Selznick’s production of King Kong opened in New York on March 3rd to great fanfare. Shows ran simultaneously at both the 6200-seat Radio City Music Hall and the 3700-seat Roxy Theater across the street. Crowds lined up around the block and all 10 shows were sold out for four days running, setting a box-office record. Eventually grossing $2M in its initial run, King Kong was the first film in RKO’s five-year history to turn a profit. Clearly, there were still a few bright spots left in the picture business.

Labor Guilds

As the battle over wages and working conditions raged on, groups representing various studio workers became even more fractured. Actors, who had made a previous push for unionization in 1929 under the banner of Actors Equity, were anxious to establish their own bargaining group. Following that earlier foray, six disgruntled actors met to form the Screen Actors Guild (SAG). By November of that year, they had 1,000 members, including the likes of Gary Cooper, James Cagney and George Raft.

Similarly, the writers, who felt that the Academy was nothing but a “company union,” broke away to form the Writers Guild. With rudimentary offices shared with SAG in a four-story Art Deco building on Hollywood Boulevard, both guilds made a push for recognition by the studios. Although SAG had already been recognized by the A.F. of L., the studios, still furious over their withdrawal from the Academy, refused to bargain with either entity. Another four years would pass before they would finally gain acceptance.

The Strike of July 22

Although Local 695 had been successful in signing agreements with many of the independents, as well as making some inroads into Warner Bros., most of the majors refused to budge. An ultimatum had been issued previously to Columbia Studios that if they did not ink an agreement with Local 695 by 10 a.m. on July 8, the soundmen would walk. In an effort to accommodate the studio, this deadline was then moved to the next day (July 9). The deadline was then further extended to 2 p.m., and then 3 p.m. Later, Columbia general manager Sam Briskin called to request a further extension. He maintained that the studio was signatory to the Basic Agreement, so there should be no strike. For their part, the Executive Board of 695 considered this simply a stalling tactic, as 695 was not included among the four IATSE locals represented under the Studio Basic Agreement. In fact, there were nineteen other unions or guilds working on the lots, none of whom were covered under the agreement.

In further discussion over the next ten days, Columbia (through Pat Casey, acting on Columbia’s behalf as the rep for the producers), took the position that this was a jurisdictional dispute between IBEW (International Brotherhood of Electrical Workers) and Local 695 and refused to budge. On Thursday evening, July 20, Local 695 Business Agent Harold Smith presented a formal contract proposal to the majors (as well as to independent producer Bryan Foy of Eagle Lion Studios), giving them until midnight Saturday, July 22, for an answer. The gauntlet had been thrown down. As midnight Saturday approached, with no answer from Pat Casey or the studios, a strike was called.

On Monday the 24th, Variety reported that many of the IATSE crafts would honor the action, including the Camera Local (659), Film Technicians (683), Studio Projectionists (150), Studio Mechanics (37), as well as all film lab technicians and cutters. However, the Carpenters, Studio Electricians (working under IBEW) and Musicians unions claimed they were not part of the action, and did not plan to honor it.

The major producers remained firm in their position, still claiming that it was a jurisdictional issue between Local 695 and IBEW, and planned to appeal to Washington to settle the dispute.

The studios, in a grand effort to prove that they did not need Local 695, continued production on the following Monday using replacement workers. Paramount started early that morning, working without sound, rehearsing cast members while they trained new technicians. United Artists brought in replacements from ERPI, and Metro brought over men from its own recording operations, backfilling those positions with telephone and radio technicians (although it admitted it “had no idea the kind of sound it will get”). Men working under the auspices of IBEW were also transferred from other departments, some of whom already had related training in recording and broadcast operations.

The studios also placed ads in the Sunday and Monday papers urging men with broadcast and telephone experience to apply for positions. Radio appeals were broadcast and, by the end of the day Monday, more than 300 positions had been filled.

For their part, the members of Local 695 were exemplary in their behavior during the strike. Despite increased police presence at some of the lots, there were no reports of any trouble. When soundmen picketing the gate at RKO were asked to disperse by the studio police, they did so quickly. In addition, there were no reports of problems from productions working to the midnight deadline on Saturday.

The studios appeared ready to deal with the action as long as needed. They offered bunks on the lot for replacement workers, so they wouldn’t need to cross picket lines. They also culled additional staff from research operations.

Difficult Decisions

Despite the havoc unleashed by the threat of salary cuts, production did manage to continue at most studios, albeit at a much slower pace and with varying degrees of success regarding the quality of their output. While most crafts supported IATSE’s stand against the cuts, the reality of trying to survive during extremely difficult times meant that many workers were ready to cross the line to gain entry to jobs that, even in good times, might be unavailable to them. It also gave studio heads the opportunity to try to break the hold of IATSE over many of the crafts.

Two years would pass before the National Labor Relations Act would be enacted. There was nothing to stop various organizations from mounting a campaign to represent workers. It was “open season” for labor, with a variety of splinter groups claiming representation for the various crafts. Local 695, previously part of Studio Mechanics Local 37, had IBEW to contend with. As the union claiming jurisdiction over sound technicians involved in installation and, at some studios, sound maintenance, IBEW was in a prime position to launch an effort to raid the soundmen of 695.

With the breakdown in negotiations between the studios and Local 695, IBEW launched a bold effort to force the studios (and Local 695) to bow to their demands. Because IBEW controlled all electrical operations such as powerhouses, generators and power distribution (although not set electric, which was under IA jurisdiction), they were in a unique position to gain control. According to an article in the July 25, 1933, issue of Variety, Harry Briggerts, the national vice president of IBEW (and the man in charge of all IBEW locals), stated that if the producers negotiated with the soundmen (Local 695), he would “pull all his men from the studios.” He also claimed that the American Federation of Labor had granted IBEW jurisdiction over sound operations, further weakening Local 695’s position.

Studios could perhaps function without qualified sound crews but they certainly could not do without electricity. Therefore, the IBEW had both Local 695 and the producers exactly where they wanted them.

Doomsday for the IATSE

As a result, Local 695 was effectively shut out of its bargaining position with the studios. This impacted not just the soundmen, but all of the IA locals that the producers were intent on breaking. Chaos reigned. By August 14, 1933, the number of workers who had split from the IATSE ran into the thousands. The membership of Local 37 alone, numbering about 3,000 before the strike, was reduced to just a few hundred members.

A similar scenario was taking place within Camera Local 659. As studios backed away from direct negotiations with the IATSE, many of the cinematographers (but not operators or assistants) pushed for recognition under the auspices of the American Society of Cinematographers (ASC). As nearly every member of the ASC was also a member of the IATSE, it is unclear what the advantage may have been to the members by switching bargaining to the ASC guild, other than having their wages reduced. It also caused a significant rift within the membership, which further weakened their position with the studios.

At this point, studio owners boldly proclaimed the strike had been a bust, and that production had returned to normal. (Later on, however, some reps privately admitted the strike had cost the studios about $2M in lost production time, not to mention the problems caused by poorly executed work.)

The National Recovery Act

Simultaneously with the events taking place in Hollywood, the Roosevelt Administration in Washington was busy passing New Deal legislation intended to speed up the economic recovery. On June 16th of 1933, Roosevelt signed a bill creating the National Recovery Administration (NRA), charged with putting in place a set of controls for labor and industrial production. This act, which grew to affect between 4000 and 5000 businesses and 23 million workers, had a significant impact on the film industry.

In an attempt to rebuild their faltering reputation, the Academy, by now really just a representative of the studios, took an active interest in framing the rules contained in the NRA code. Over the next few months, numerous proposals were put forward by reps within the industry. Many of these related to caps on salaries as well as the ability of employers to “raid” the talent pool of other studios by offering more money. Another provision called for the creation of an “industry board” which would limit the salaries of the highest paid talent. This was not the way things got done in Hollywood and, when the code was finally released in late November, there were howls of protest. After a Thanksgiving meeting between Eddie Cantor (the head of the new SAG organization) and Franklin Roosevelt, the offending provisions were suspended. By 1935, the act itself was struck down as unconstitutional and many of its provisions were carried over into the new Wagner Act.

Local 695 and the IBEW

With their position significantly weakened by the threat of a work stoppage by IBEW Local 40, the soundmen found themselves in an extremely difficult position. Although they had managed to sign contracts with a number of the independents, the majors had (so far) successfully argued that this remained a jurisdictional dispute, and that they were simply obeying the mandate of the A.F. of L., which had given the IBEW jurisdiction over studio sound operations prior to the formation of Local 695.

With the studios allied to his cause, Harry Briggerts of the IBEW proclaimed that they had at least 2,500 men who had experience in sound (a highly debatable figure). While IBEW men had significant involvement in the installation and testing of sound equipment during the rush to equip studios for sound operations, it is doubtful that many of them had experience in actual recording operations (especially since, up until 1927, there had been no recording operations!).

As a result of their influence over studio electrical operations, by early 1934, with the A.F. of L. backing them up, IBEW was able to secure contracts with many of the studios. However, the terms of the contract were less than favorable for those working under them. To entice the studios to sign, IBEW had offered a significant reduction in the wage structure as compared to that of Local 695. The studios, elated at the prospect of being able to rid themselves of Local 695, quickly signed on. This did not bode well for the men of Local 695, who, still without a contract that stipulated working conditions, would now see their wages further reduced. A dark cloud hung over the Local.

1934

By February of 1934, the IBEW had managed to make significant inroads into the studios, signing contracts for the staffing of sound operations. The First Soundmen (mixers) were beginning to see the writing on the wall, and at that point, began to distance themselves from both the IBEW and Local 695.

In early March, a group of about 125 mixers issued a statement that they were forming their own guild, along the lines of the cinematographers, who had spearheaded the formation of the ASC as a bargaining unit separate from the camera local. This new entity, the Society of Sound Engineers, Inc., would become the new bargaining unit for the mixers, separate from either Local 695 or the IBEW. By May, yet another organization was formed, under the moniker of the American Society of Sound Engineers, with Harold Smith (who had resigned as business agent of Local 695 in April) at the helm.

And so it went, with a new salvo in the dispute being launched almost weekly. IATSE was not going to quit without a fight. In early March, the President of the International, William C. Elliott, made a trip to Hollywood to assess the situation. According to reports at that time, Elliott’s goal was to reestablish the IATSE’s control over laboratory workers, prop men and projectionists, bringing them in under the Studio Basic Agreement. As the situation with cameramen, carpenters, soundmen and studio electricians was still in flux, he no doubt felt the best hope for reestablishing the ranks of IATSE workers in Hollywood was to focus on the crafts that were not open to jurisdictional battles yet to be sorted out by the A.F. of L.

Local 695 Rises Again

For reasons that are not immediately clear, by the end of June 1934, plans for the formation of both the American Society of Sound Engineers and the Society of Sound Engineers appear to have faltered. As of June 30, it was announced in Variety that Harold Smith was once again helming Local 695, having been recalled by its members. Since the Local still had a few contracts to service with some of the independents, the members and Board may have felt that the soundmen had a better chance of survival if they stuck together, rather than risk further fracturing their position by splitting some members into a guild (a plan which was not going well for the cinematographers).

It is illustrative to note the wage scale that was negotiated by the IBEW as of February 26, 1934. Note that the structure for mixers and technicians working on the lot was different for those on location. The six-hour basic rate was to satisfy a requirement imposed by the new NRA labor legislation, which called for a 35-hour week in an attempt to provide more jobs.

There was no limitation on the hours for those working on location, although the contract did stipulate that crews would be fed and housed at the studio’s expense. As a quick comparison, in 2010 dollars, this would equal about $400/day for a mixer, $299/day for recordists, and $225/day for boom and utility. Even with the recession raging, this was hardly anything to get excited about.

The Future

The battle between Local 695 and the IBEW would continue to rage late into 1935. Despite the hardships of the era, many of the members of 695 steadfastly refused to work under the wages and conditions as outlined by the IBEW contract. Both Local 695 and IBEW continued to petition Washington and the A.F. of L. to make a decision regarding jurisdiction, with no clear-cut mandate.

However, the International still had a few weapons they could wield in the fight, and by the end of 1935, they would put them to use.

To be continued…

Kit Cool: New Device for Boom Operators

by Tim Song Jones
Photos by Soli Jones

New devices for boom operators don’t come along very often. So when I saw the post from Marty Atias about the Kit Cool, I had to investigate.

I had started a discussion on the Movingmicrophones website (created by Don Coufal) asking about injuries folks had received from working as boom operators. Fourteen years ago, I had a herniated disc in my neck that pinched a nerve between my fourth and fifth cervical vertebrae, causing an excruciating stabbing pain. Physical therapy made a world of difference. Having to hold up that darn pole is just an ergonomic nightmare. And with film cameras slowly disappearing, so are the reload breaks that came with them. We boom folk, being near the bottom of that other pole, will get no sympathy or break on this.

The Kit Cool is a telescoping vertical metal pole (23” to 48”) with a belt clip on the bottom and a cradle with four foam rollers on the top. Your boom pole rests on the rollers, which allows for cueing left and right.

The weight is taken off your supporting arm and you steer with the other arm as normal. For long static takes (or if you are the one-man-band mixer/boomer), you can boom with one hand. And since the boom is just resting on the rollers, you can take it off in mid-take to boom normally with both hands and just as easily put it back on the rollers. A strap that goes around your neck and the vertical pole keeps the pole upright. It is like a stripped-down Fisher boom or Cuemaster; you wear it, but it has no articulation or extension/retraction abilities. You can use any boom pole.

The metal belt clip has a knob that pops into a hole on the bottom of the vertical pole. The knob fits just snug enough to be easily removed, but it won’t come out accidentally if the vertical pole is pulled up. The strap that keeps the vertical pole up has a padded section that goes around your neck, an elastic section so you can cue forward and back and a “chest cushion” that goes between the vertical pole and your chest.

In actual use I found it works very well. It is a good idea to keep your “free” hand on the boom pole when you can as a precaution. It can be easily shifted to boom from the left or right side. The foam rollers create no extraneous noises when cueing. I found I could even sit down with the thing on (minus the boom pole). I purchased mine for $575 plus shipping from the East Coast (no, not a freebie for review). I know many of you might be thinking, “Heck, I could make that,” and I’m sure it could easily be done. I suppose other items could be attached as well: a flag for shade, video monitor, clip to hold the “sides,” drink cup, flashlight, iPhone, harmonica… So far the Kit Cool has been a real relief and it just might save your neck or whatever ails you.

I just wish they could come up with a better name. So far, reaction on the set has been good (I expected some ribbing but it’s worth it).

It is manufactured in France (hence the odd name) by Boom Audio & Video (www.boomaudiovideo.com) and is distributed in the USA by ATS Communications (www.coolcam.us). There is a good video demonstration on the website as well.

Antenna Tests Revisited

by David Waelder

Reaction to the antenna test article has generally been favorable. Several members wrote to say they appreciated the specific information to help guide choices in antenna selection and deployment.

Some readers with extensive knowledge of the theory and practice of radio transmission had questions about the methodology of the test procedure and the validity of the conclusions. The purpose of the test was always to examine a variety of antennas to determine how they compared in actual use in the field and to what extent variations in deployment translate into advantages or liabilities. To be valuable, a superior system should produce observable results in line-of-sight testing. An advantage that is reliably apparent only in laboratory tests may be genuine but of limited consequence for the user. Additional testing with devices like a spectrum analyzer is an effort to quantify and confirm the results from the field and is really tangential to the effort. Still, measurements and results should be consistent.

Wolf Seeberg and Henry Cohen raised some issues that relate to design advantages that may not be revealed in simple walk tests. Wolf is the proprietor of a video rental company and was, for a long time, a member of Local 695. Henry Cohen is the proprietor of a radio rental and service facility in New York and publishes an online journal on radio performance. They both pointed out that antennas with a circular polarized design, like helicals, are desirable not because they have additional gain, but because they receive out-of-phase signals with minimal attenuation. In ordinary operation, we strive to maintain consistent antenna orientation for best results. With a belt pack transmitter, the antenna is typically vertical, so we align the antenna on the receiver vertically. As the signal bounces off buildings, however, the phase of the signal can be altered, just as the spin on a cue ball is shifted as it strikes the cushion. The consequences of phase shifting are usually minor but may have real consequence in a situation where the transmitter and receiver are moving. Doing car-to-car work in city streets, with signals reflected by buildings as the cars pass, is a prime example. Drop-outs may occur as multiple signals, some in phase and some out of phase, arrive at the receiver simultaneously. The circular polarized antenna copes with these phase reversals more effectively than log-periodics, which are designed for use in a particular orientation.

This is a valid point; a simple walk test does not reveal a characteristic that might be a significant advantage in a scene with moving cars. And, insert-car scenes are relatively commonplace in our work. Still, this is an advantage that applies only in limited circumstances. Both the Sennheiser CP antenna and the PWS helical design are unwieldy devices to rig and deploy, at least compared with sharkfins, and are conspicuously more expensive. Since they seem to offer little advantage in an ordinary walk-and-talk, I would recommend against purchasing them as part of the regular kit unless you are employed on a cop show where insert-car work is a weekly event. But it would be well worth renting a pair for those days when moving-car work in an urban setting is scheduled.

Wolf also raised the issue of consistency of performance over a range of frequencies. The original tests were performed at 561.800 MHz in Block 21, a popular choice in the Los Angeles area. However, antennas are typically tuned to a particular frequency and may not perform optimally at other frequencies. It was Wolf’s contention that some particular antenna designs offered more consistent performance over a range of frequencies. I took several antennas down to LSC to check performance over their operational range using their spectrum analyzer. We compared signal loss over a range of 450 MHz to 700 MHz at 10 MHz intervals. Some of these designs are rated for performance up to about 900 MHz but the FCC prohibits radio-mike operation above 698 MHz.

Results were interesting. Virtually all of the designs I tested were strong at 450 MHz and exhibited a drop around 500 MHz or 550 MHz. Then they tended to recover and stay nearly flat until 700 MHz. The drop around 500 MHz was typically about 5 dB; none of the previously tested designs exhibited the larger losses that Wolf predicted. I did notice that there was some performance variation from example to example, not just from one design to another. I tested different examples of both PSC and Ramsey LPDAs and found some differences even between two examples of the same design. Observed differences may indicate some variance in manufacturing runs, or they may just be a consequence of slightly different hook-up hardware. While there were measurable differences, nothing I observed would alter the basic conclusions of the earlier tests.

In the interest of full disclosure, it should be noted that this set of spectrum analyzer tests was conducted inside a building where reflections would certainly compromise results. But our observations were generally consistent with a previous round of testing done outside at a distance of 500 feet. For tests at multiple frequencies, we needed to use a bench analyzer that can simultaneously transmit and receive radio signals. The portable device used in the previous testing can only be configured to perform one task at a time.

While we were taking measurements, I also took some readings using ordinary whips to investigate the question of how much signal is lost with a mismatched antenna. This is an issue that comes up from time to time as users, with an antenna from an alternate block ready-to-hand, question the importance of an exact match. The answer is that it seems to depend on the frequency of the signal. At Block 21, the use of a Block 27 antenna resulted in a signal impairment of only 2 dB or 3 dB when compared with a properly matched antenna. It didn’t seem to make any difference whether the mismatch was at the transmitter or the receiver end; the loss was the same. But, with a Block 27 signal, the use of a Block 21 whip at either end resulted in a 10 dB loss compared with a properly matched antenna.
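
To put those measurements in perspective, decibel figures convert to power ratios as 10^(dB/10). A minimal sketch (the helper name is mine, not from the article):

```python
# Convert a measured dB loss into the fraction of signal power remaining.
# A hypothetical helper to illustrate the mismatch penalties described above.

def db_to_power_fraction(loss_db: float) -> float:
    """Fraction of signal power that survives a given dB loss."""
    return 10 ** (-loss_db / 10)

# The 2-3 dB penalty seen using a Block 27 whip on a Block 21 signal:
print(db_to_power_fraction(3))   # ~0.5 -> about half the power remains

# The 10 dB penalty seen using a Block 21 whip on a Block 27 signal:
print(db_to_power_fraction(10))  # 0.1 -> only a tenth of the power remains
```

So the 10 dB mismatch is not merely "a bit worse" than the 2 dB or 3 dB case; it leaves roughly a fifth as much received power.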

These challenges raised some interesting issues but I stand by the broad conclusions of the earlier article with one modification. The circular polarized antennas do seem to offer a genuine performance advantage while moving in an urban environment.

A brief reprise of conclusions is in order:

1. Higher gain antennas offer improved performance but the range advantage is only about 20% or 30%. No antenna doubled the effective range.

2. All of the log-periodic designs seemed to offer a similar performance advantage relative to 1/4 wavelength whips.

3. In an open environment, a good dipole antenna, like the Lectrosonics SNA600, yielded very nearly the performance of the directional sharkfins. But the directional antennas may offer an advantage in a crowded RF environment by restricting unwanted signals.

4. There was a small performance benefit to wide diversity spacing.

5. Performance was improved when the receiving antennas were raised for clear line-of-sight to the transmitter. However, once line-of-sight was achieved, we saw no further benefit from additional altitude.

6. In normal usage, helical and circular polarized antennas offered no identifiable advantage over log-periodic designs. However, circular polarized designs offer an advantage when moving in an environment that reflects radio signals.
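
The free-space arithmetic behind conclusion No. 1 can be sketched in a few lines. Since received power falls off with the square of distance, a gain advantage of G dB stretches usable line-of-sight range by a factor of 10^(G/20). This is an idealized model, not a claim about any particular antenna in the test:

```python
# Idealized free-space model: received power ~ 1/r^2, so G dB of extra
# antenna gain multiplies the usable range by 10 ** (G / 20).

def range_multiplier(gain_db: float) -> float:
    """Range extension factor for a given gain advantage in dB."""
    return 10 ** (gain_db / 20)

for g in (2, 3, 4, 6):
    print(f"{g} dB gain -> {range_multiplier(g):.2f}x range")

# A 20-30% range improvement corresponds to roughly 1.6-2.3 dB of net
# gain advantage; doubling the range would require a full 6 dB.
```

By this model, the observed 20% to 30% advantage is what a couple of dB of net gain should buy, which is consistent with no antenna in the test doubling the effective range.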

Acknowledgments

As always, I am indebted to Coffey Sound, Professional Sound Corp. and Location Sound for the loan of equipment to test. And special thanks are owed Location Sound for the use of the test bench and to Victor Solis for his operational skills. I should also note that Henry Cohen’s criticisms were in response to my request on the Lectrosonics User Group. Errors and omissions are mine alone.

Rango

by Lee Orloff, CAS

I’ve heard it said that the two happiest days in a film person’s life are the day we hear that we got the job and the day we hear the final “wrap” called. Rango was the exception; it was deeply satisfying to be working within a medium that had been familiar to me only from afar, yet applying a novel and unique approach to the process. While we were making it, we were all hoping to hear that a sequel was in the works.

I remember the first day the filmmakers invited me and a few other department heads over to their office to discuss the project and toss around some ideas. Gore Verbinski, the director with whom I had collaborated on six previous films, among them the first three in the Pirates of the Caribbean series, had been working on an animated feature for quite some time and was now ready to go into production. He wanted me to mix it. I immediately thought of cracking out a vintage big-diaphragm Neumann, as in the past for Nic Cage’s inner monolog on The Weather Man or other voiceovers. My instincts told me to double track the actors on a trusty U87 (or the like) along with my favorite choice of boom mike. However, as the meeting progressed, the project was gradually brought into sharper focus. Wait a second … they said this was an animated feature. What exactly did going into production mean?

We were to comprise an Emotion Capture Unit, a relatively lean unit of 85 or so of us, brought together for 22 days on Universal’s Stage 42 to record all of the cast performances as an ensemble in a comfortable and flexible setting. This would facilitate more spontaneous and natural performances than would have otherwise been possible if the production had been done using more traditional methods. It was to be shot as live action, though more akin to a motion-capture production. However, since nearly all the characters were to be rendered in non-human form, there was no need to utilize all of the motion-capture technology that we had used on past productions. One of the unique benefits of covering the action with three high-def cameras on an expansive soundstage was that it provided the animators not only with the actors’ facial expressions as reference, but all of the spatial relationship and blocking as well. The sets were largely “virtual” in nature, but there were significant elements such as a long oak bar, saloon doors, a wagon, or the mayor’s desk and wheelchair that the cast needed to play the scene. The set was neutral and evenly lit to provide consistent visual reference. The production’s expectation from the Sound Department was a tad more stringent, but also crystal clear: deliver the highest quality production track as the basis of the entire dialog recording for the film.

The shooting area was contained within a space delineated by neutral gray solids suspended most of the way from the perms to the floor. To achieve flat, even lighting throughout, coops were evenly spaced above. To eliminate footfalls, we carpeted the entire area. We installed baffles in the perms between the lighting instruments to dampen sound reflections that might otherwise spill over the gray solids. The construction department built us portable baffles, and boxed out and vented the transformers on the rear end of the stage to lower the ambient noise floor.

Finally, the last project was to create a smaller, more intimate space where we could shoot additional “coverage” of larger scenes or shoot smaller scenes in their entirety. We wanted the smaller space to be equally flexible but a more controllable environment. I found that Universal had portable sound walls in storage that they had utilized to block the sound of outdoor events. These heavy baffles, covered by “Insul-Quilt”-type material, were suspended from the perms to create three sides of the space, with the fourth being left open for versatility. Normally, we would simply draw the rest of the solids across the opening and close off the space. For wider shots, the open side could be used to place cameras. In these instances, we would place our baffles behind the cameras as a fourth wall.

Boom microphones were used exclusively to capture the ensemble cast performance with the fullest, richest quality. You heard that right. Leave the wireless mikes back at the shop; an entire production accomplished without tagging a single actor. Due to the size of the cast and the impromptu nature of the blocking, three boom operators were needed. Jeffrey Humphreys, Brian Robinson and Mike Anderson handled the responsibilities with creativity, the utmost professionalism, and most importantly, great attitudes. Initially, I considered bringing the Sennheiser boom mikes directly into the Aaton Cantar and mixing with the Cantarem remote faders. The added flexibility of a Sonosax SX cart-based setup won out. I paired this with a Sound Devices unit doing backup duties onto CF cards that we shuttled to editorial. We used Sony F900s, synched to Lockit boxes and quieted down with barneys. The Prop Department worked with us to silence their objects as required. Video Village contained a specially constructed desk on wheels, more along the lines of a portable lectern, which held an iMac, a small Yamaha mixer, a headphone distro amp, a passel of Sony MDRs and my Dynaudio near-field monitors for referencing playback as well as the video storyboard and other edited material. Our supervising sound editor, Peter Miller, made available some elements Gore wanted for playback to enhance the on-set experience for the cast. On The Ring, there was an effect called “whisper keening” that we used repeatedly as a cue with the TV set. This time around we had church bells and assorted other sounds, as well as music cues, which were handled with my Steinberg Cubase–equipped laptop through a Sound Devices USBPre into a Crown amp and a pair of Technomad cabinets.

Now, it was time to assemble the concerned parties on the dub stage to listen to tracks and see whether we were all on the same page. With the gracious participation of Jon Taylor at Universal who opened up the stage to the director, producers, editor, and the studio post-production folks, we auditioned material from both the shooting areas. All agreed that we had acceptably knocked down the reflections of the empty stage, and that the quality of the recorded vocal tracks would nicely do the trick.

Imagine going to work on Day One and finding that customary walkie chatter was not about getting the cast off the stage after rehearsal and through the works, final touches and so on, but rather when they’d arrive at stage. Period. When they arrived they might don a hat, slip on an article of clothing, or prop themselves up to further get into character, but that was about it before we’d hit the lights and bell and off we’d go. Then doing far more 20-minute takes than not. Five weeks with the cross hairs aimed precisely at the same point. There were no little breaks like leisurely trips to graze at crafty or walking off stage for better cell reception during re-sets. One day I’m across in the men’s room and the PA comes running in, “They’re all waiting for you to roll.” I was mixing a show where, once the director had the take he was looking for, his only technical consideration was, “Was everything good for Sound?” One of the great things about working with Gore Verbinski, who happens to be a gifted musician among his other strengths, is that he happens to have an excellent “ear.” He is tuned in to the slightest details. Conversations about enunciation, separation, head turns and chin downs, a level that might have gotten “spicy” in his words, have been commonplace over the years. This time, that focus was probably multiplied five or tenfold which was a good thing for all of us in the department. It brought out the best in all of us, and it was a nice boost to the department’s pride, knowing that the attention to detail we all strive to provide on any show was truly understood and appreciated on Rango.

The Road to Tapeless Production

by Eric Pierce, CAS

Remember the scene in the 1987 film Broadcast News, with Joan Cusack in a mad panic, risking life and limb as she runs frantically through the broadcast studio with a videotape in hand? With seconds ticking, that was the only way to get a videotape clip to the tape room in time for the live broadcast. Thankfully, “sneaker net” has now been replaced with high-speed networks, capable of delivering huge quantities of data where it’s needed and when it’s needed. In the fast-paced world of live and live-to-tape television production, speed is the key to success. Producers have turned to high-definition media servers, such as EVS and the Grass Valley K2, to maintain the pace.

Local 695 video engineers and operators have been at the forefront of this technology, developing software and hardware interfaces to meet this need. Rick Edwards developed the ProQue system to control the import and export of clips, and to organize playbacks, including multiple playbacks in sync. Another media management system is Pi, www.playbackinnovations.com, developed by Local 695 member Jon Aroesty. The Pi system boasts that it can display an alert to the operator when files have been pushed, and generate multiple email updates to keep the post supervisor informed, or even alert an on-set producer that a clip has been delivered and is ready to roll.

Network servers are typically configured with multiple channels of high-definition video, each with four channels of 24-bit, 48 kHz audio. An operator has immediate access to multiple sources that can be sent to a variety of destinations, such as multiple on-set monitors, video set dressing and switcher feeds for integrating playbacks into the show. Large-scale awards productions like the Oscars, Grammys and Emmys rely heavily on servers for set dressing and on-set monitors, while also utilizing them for the instantaneous playback of multiple cued clips upon announcement of the winners. Before server playback, this was done with banks of individual tape machines, all lined up and sitting at a cue. Originally, the correct tape machine would have to be rolled; later, when tape machines had the RS-422 protocol, operators would roll all of the machines at once, allowing enough time for them to get up to speed. The correct machine would then be selected at the switcher and pulled up at the sound board for mixing into the program. Imagine that with 10 nominees!

Data servers have the significant benefit of being able to connect to a high-speed Ethernet or fiber network for file transfers. This allows the operators to push elements from the stage to post-production, where Avid and Final Cut workstations have instant access to the files. And then post-production can push files right back to the stage from picture editorial or final mix. This is a clear advantage for live television, where very often packages are being edited right up to air time. Topical shows like Conan, Dr. Phil, The Doctors, Lopez Tonight, The Talk, etc., take advantage of networked servers to keep up with the pace these shows run. Clips and packages can be shipped from editorial and placed into a playback cue at the same time the show is being taped. In the case of Dr. Phil, the operator can record the program and save clips and, when Dr. Phil catches one of his subjects changing their story, he can say: “But earlier in the show, this is what you said!” and the clip is instantly played for all to see.

Using servers to ingest live feeds from the stage, then exporting them via network to the post-production servers, yields huge time and cost savings. This gives editorial instant access to all the show elements, which allows them to start cutting right away, making it possible to work on extraordinarily tight schedules that can sometimes allow only a matter of a few days from acquisition to delivery. Local 695 member Al Adams, one of the recordists working on Dr. Phil and The Doctors, tells us that once they complete the transition to full server acquisition, the show will save $20,000 to $30,000 each month in tape stock alone, not to mention the time saved and the creative choices made possible by using these advanced technologies.

And Joan Cusack? With high-speed servers delivering the content for her, that video clip would have made it to air and she’d still probably have time for a latte.

When Sound Was Reel – 7: Dolby comes to the movies

by Scott D. Smith, CAS

Author’s note: After a brief detour to examine some of the early history of Local 695, we now return to our regularly scheduled program. We will revisit the Local’s continuing history in future issues.

Introduction

In the previous installment of “When Sound Was Reel,” we examined the proliferation of the “Widescreen Epic,” a format developed by the major studios to counteract the rise of broadcast television in the early 1950s. Although widescreen films would continue to be produced through the late 1960s, studio bean counters were becoming increasingly critical of these films, which typically involved significant costs for 65mm camera negative, processing, sound mixing and magnetic release prints, not to mention the large and expensive casts. As the novelty of a widescreen presentation with stereophonic sound began to wear off, studios were rethinking the costs associated with such productions.

Despite the success of a few 70mm releases during the 1960s, notably Lawrence of Arabia, 2001: A Space Odyssey and Woodstock, by 1970 the format had largely run its course. With the development of liquid gate printers, it was now possible to achieve a good blowup from original 35mm negative. When figuring the efficiencies of working with standard 35mm camera gear, no studio would consider the cost of shooting in 65mm worth the effort. Even Dr. Zhivago, released by MGM in 1965, was blown up from a 35mm camera negative.

However, the epic films established a new benchmark in quality and audiences came to expect something more than a 1.85 (aspect ratio) picture with mono sound, presented with projection equipment originally developed in the 1930s and ’40s. Moreover, by this time, quality home stereo equipment was becoming widely available and many consumers owned reel-to-reel tape decks that surpassed the fidelity of even the best 35mm optical track. If studios expected to provide a premium entertainment experience that justified the expense of the ticket, they would need to improve the overall quality of their films.

Raising the quality standard of 35mm prints to meet the expectations of road show pictures shot in 65mm (and released in 70mm with magnetic soundtracks) was a considerable challenge. While 70mm releases, even those produced from 35mm negatives, continued to be considered the “gold standard” for big-budget releases, by the early 1970s the advances made in print stocks, laboratory procedures and optics were beginning to shrink the gap between the picture quality that could be derived from a good quality 35mm print and that of a 70mm blowup. However, the issues with sound remained. Although 4-track magnetic Cinemascope prints were still in fairly wide use during the 1960s, there were considerable costs for striping and sounding onto the special “Foxhole perf” stock (with smaller sprocket holes to allow space for the mag tracks). In addition, theater owners were balking at the costs associated with replacing the 4-track magnetic heads, which wore out quickly (this was before the advent of ferrite heads).

Advances in the quality of 35mm mono optical tracks had been mostly in the area of negative and print stocks. These yielded slight improvements in frequency response and distortion but were still a long way from the quality that the average consumer could derive from a halfway decent home stereo system of the era. For the most part, significant development in optical tracks had stalled out in the 1950s, at the time most studios were directing their efforts at widescreen processes. Most of the optical recorders still in use by the early ’70s were derived from designs dating back to the 1940s, and had seen little innovation, except in the area of electronics, which had been upgraded to solid state.

Multi-Track Magnetic Recording

While the film industry was wrestling with issues of how to improve sound reproduction in a theatrical environment, other innovations were taking place in the music industry. Chief among these was the advent of multi-track analog recording, a process largely attributed to guitarist Les Paul. Since the 1930s, Les Paul had experimented with multi-layered recording techniques, whereby he could record multiple instances of his own performances, playing one part and then adding subsequent layers. His first attempts used acetate disks which, predictably, resulted in poor audio quality. Later on, Les Paul worked with Jack Mullin at Ampex, who had been commissioned by Bing Crosby to develop the Ampex 200 recorder. Recognizing the obvious advantages of working with magnetic tape as opposed to acetate disks, Les Paul took the Ampex 200 recorder, added an additional reproduce head in advance of the erase and record head, and developed the first “sound-on-sound process.” The downside of this, of course, was that each previous recording would be destroyed as a layer was added. Seeking a better solution, in 1954 Les Paul commissioned Ampex to build the first 8-track one-inch recorder, with a feature called “Sel-Sync®,” which allowed any track to be reproduced through the record head, maintaining perfect sync with any newly recorded material and not destroying the previous recording. This technique would go on to become the mainstay of multi-channel recording for both music and film well into the 1980s.

In this regard, Les Paul and the engineers at Ampex were true pioneers, developing techniques that would forever change the way that music was recorded. There was, however, one small problem. Noise.

Enter Ray Dolby

About the time that Jack Mullin and his team were improving audio recording, another Ampex engineer, Ray Dolby, a bright fellow from Portland, Oregon, was working on the early stages of video recording. Armed with a BS degree from Stanford University (1957), Dolby soon left Ampex to pursue further studies at Cambridge University in England upon being awarded a Marshall Scholarship and NSF Fellowship. After a brief stint as a United Nations advisor in India, he returned to London in 1965 and established Dolby Laboratories. His first project was the development of a noise reduction system for analog recording, which was christened as “Dolby A.” This multi-band encode/decode process allowed program material to be compressed into a smaller dynamic range during recording, with a matching expansion during playback.
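
The compress-on-record, expand-on-playback idea can be illustrated with a toy single-band compander. This is only a conceptual sketch; the real Dolby A system split the signal into four frequency bands and applied level-dependent processing, none of which is modeled here:

```python
# Toy single-band compander illustrating the encode/decode principle.
# Encode boosts low-level signals so they sit further above the tape
# noise floor; the matching decode restores original levels on playback
# and pushes the tape hiss down along with them.

def encode(sample: float, ratio: float = 2.0) -> float:
    """Compress dynamic range before recording."""
    sign = 1.0 if sample >= 0 else -1.0
    return sign * abs(sample) ** (1.0 / ratio)

def decode(sample: float, ratio: float = 2.0) -> float:
    """Matching expansion on playback."""
    sign = 1.0 if sample >= 0 else -1.0
    return sign * abs(sample) ** ratio

quiet = 0.01                 # a low-level signal, vulnerable to hiss
recorded = encode(quiet)     # ~0.1 -- ten times further above the noise
restored = decode(recorded)  # back to ~0.01 on playback
```

Because the expander attenuates everything by the same law the compressor used, any noise the tape adds between encode and decode is pushed down along with the restored signal.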

The first processor that Dolby designed was a bit of a monster. The Model A301 handled a single channel of processing and took up five units (8.75 inches) of rack space! Needless to say, it didn’t catch on in a huge way for multi-channel work. However, it made inroads into the classical recording market, especially in the UK. Understanding that the real market would be in multi-channel recording, Dolby quickly followed up with the release of the model 360/361 processors, which used a single processing card (the Cat 22), and took only one unit of rack space per channel. While this cut down the amount of rack space required for eight or 16 channels of noise reduction, it was still a bit unwieldy.

In 1972, Dolby took the development a bit further, with the release of the Dolby M system, which combined 16 channels of Cat 22 processing cards in a frame only eight rack units high. By utilizing a common power supply and smaller input/output cards, this system provided a much more cost-effective solution to multi-track recording.

Dolby Consumer Products

About two years after the release of the Dolby A301 processor, Henry Kloss (of KLH fame) persuaded Dolby to develop a simplified version of the Dolby A system for consumer use. In response, Dolby developed what is now known as the “Dolby B” system, which has found its way into millions of consumer products over the years. Unlike the Dolby A system, it utilized a single band of high-frequency companding, designed to overcome the most conspicuous defects of consumer recorders, and required a minimal number of components.

Having firmly established itself in both the professional and consumer music recording market, Dolby turned to the next challenge: film sound recording.

The “Academy Curve”

As the enthusiasm for releasing films in 35mm 4-track and 70mm 6-track magnetic waned, producers and engineers in Hollywood began to search for other solutions to improve film sound. During his initial evaluation of film sound recording, Dolby determined that many of the ills associated with mono optical soundtracks were related to the limited response (about 12.5 kHz on a good day), as well as the effects of the “Academy Curve,” which had been established in the late 1930s. To understand how this impacted film sound, one needs to look at the development of early cinema sound systems, many of which were still in use up through the 1970s. Most early cinema sound systems (developed by RCA and Western Electric) had paltry amplification by today’s standards. In the early 1930s, it was not unusual for a 2000-seat house to be powered by a single amplifier of 25 watts or less! Obtaining a reasonable sound pressure level required speaker systems of very high efficiency, which meant that horn-based systems were the order of the day. Although quite efficient, most of these early systems had severely limited HF response. This was OK, though, as it helped to compensate for the problems of noise from the optical tracks.

However, compensating for the combined effects of high-frequency roll-off and noise from optical tracks meant that high frequencies needed to be boosted during the re-recording process. Typically, this involved some boost during re-recording, with further equalization applied when the optical negative was struck. While this helped to solve the problems of noise and HF roll-off during reproduction in the theater, it also introduced significant HF distortion, which was already problematic in the recording of optical tracks. Excessive sibilance was usually the most glaring artifact.

While the development of new cinema speaker systems in the mid-1930s (most notably the introduction of the Shearer two-way horn system in 1936) improved upon earlier designs, HF response remained limited. This was due primarily to the combined effects of optical reproducer slit loss, high-frequency losses from speakers located behind perforated screens, and amplification that was still anemic by today’s standards.

Engineers of the Academy, working cooperatively with the major studios, put into place a program to standardize sound reproduction in cinemas. Recognizing that many theaters still employed earlier sound systems with limited bandwidth, the Academy settled on a compromise: a playback curve, measured at the amplifier output, that rolled off severely above about 7 kHz. This is roughly the quality of AM broadcast radio. Thus was born the “Academy Curve,” which would remain the standard for about 40 years.

The Academy Curve Gets a Makeover

In 1969, an engineer by the name of Ioan Allen joined Dolby Laboratories and quickly began a systematic examination of the entire cinema reproduction chain. Working with a team of four engineers, Allen examined each step in the sound recording process, including location production recording, re-recording, optical recording and subsequent theatrical release. Some of what he found was surprising. Although magnetic recording had been introduced to the production and re-recording stages of film sound in the early 1950s, its superior response and S/N ratio were largely negated by optical soundtracks and theater sound systems. While Allen and Dolby ultimately determined that optical tracks could be improved upon, trying to change the standards of the industry overnight was a tall order. Allen and Dolby engineers decided first to test some theories by addressing the production and re-recording part of the chain, beginning with work on the music for the film Oliver! in 1969.

Following those tests, Dolby A noise reduction was employed for some of the music recording on Ryan’s Daughter, which was to be released in 70mm magnetic. However, Allen and the engineering team at Dolby were frustrated that very few of these improvements actually translated into improved sound reproduction in the theater. Seeking to address the limited quality of the optical tracks of the day, Dolby Labs arranged a test using one reel from the film Jane Eyre with Dolby A applied to the optical soundtrack. The results were rather disappointing; the noise reduction did nothing to compensate for the limited HF response and audible distortion.

During the mix of Stanley Kubrick’s film A Clockwork Orange, Allen and Dolby convinced Kubrick and composer Walter Carlos to use the Dolby A system for premixes. However, the final release of the film was still in Academy mono. It was during these tests at Elstree Studios that Allen and Dolby determined that the limitations of the Academy curve were to blame for many of the problems associated with mono optical tracks. Allen found that the measured room response at Elstree’s main dub stage (using Vitavox speakers, a two-way system comparable to the Altec A4) was down more than 20 dB at 8 kHz! This was in line with earlier findings by others. To compensate, post mixers would severely roll off the low end of most tracks and typically boost the dialog tracks at least 6 dB in the region from about 3 kHz to 8 kHz. Predictably, this severely exacerbated the distortion in the optical track, which typically had a low-pass filter in the system around 10–12 kHz to control distortion and sibilance. Looking at the chain in its entirety, it was obvious that a huge amount of equalization was taking place at each of the various stages.
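The problem of stacked equalization is easy to see once the chain is written down: gains expressed in dB simply add stage by stage, so every corrective boost or cut compounds. The stage values below are illustrative, loosely based on the figures in the text (room down more than 20 dB at 8 kHz, roughly 6 dB of dialog boost, an optical low-pass); they are not an exact model of the Elstree chain.

```python
# Net chain response at one frequency: per-stage dB gains simply add.
# Stage values are illustrative assumptions, not measured Elstree data.

def net_response_db(stage_gains_db):
    """Sum the dB gain of each stage in the chain at one frequency."""
    return sum(stage_gains_db)

at_8khz = {
    "dialog pre-emphasis":  +6.0,   # mixer boost in the 3-8 kHz region
    "optical transfer EQ":  +4.0,   # further boost striking the negative
    "optical low-pass":     -3.0,   # distortion/sibilance control
    "screen/room response": -20.0,  # measured dub-stage roll-off
}
print(net_response_db(at_8khz.values()))  # net response still well down
```

Even with roughly 10 dB of corrective boost piled on ahead of the theater, the chain in this sketch is still 13 dB down at 8 kHz, while all of that boost is driving up distortion in the optical track.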

Around this time (late 1971), one-third octave equalizers and higher-power amplifiers were becoming available and being used in music recording. Notable among these were the Altec model 9860A one-third octave equalizer and 8050A real-time analyzer. Along with improvements in crossover design, these developments in room equalization and measurement technologies brought a new level of sophistication to auditorium sound systems and studio monitors alike.

With the advent of one-third octave equalization and good measuring tools, Allen, along with Dolby engineers, conducted further tests at Elstree Studios in late 1971 through 1972. The first thing they did was to position three KEF monitors in a near-field arrangement (L/C/R) about six feet in front of the dubbing console. Based on a series of measurements and listening tests, it was determined that these were essentially “flat,” requiring no equalization.

They then inserted one-third octave equalizers into the monitor chain of the standard Vitavox behind-the-screen theater monitors and adjusted the equalization for the best subjective match to the KEF monitors in the near-field position. While they were not surprised to find that the low-frequency response and crossover region needed correction, they were puzzled to find that the best subjective match between the near-field monitors and the behind-the-screen system indicated that listeners preferred a slight roll-off in the high frequencies of the larger screen system. This was attributed to the psycho-acoustic effect of having both picture and sound emanating from a faraway source (about 40 feet, in the case of the Elstree stage), as well as to the effects of room reverberation, coupled with HF distortion artifacts. In the end, they determined that a flat response for the screen system was not a desirable goal. This was a good thing, as it was virtually impossible to achieve in the real world.

Theater Sound in the Real World

About two years prior to the work that Allen and Dolby engineers conducted at Elstree, some significant research on sound reproduction in the cinema was published in three papers in the December 1969 issue of the SMPTE Journal. First among these was a paper somewhat dryly entitled “Standardized Sound Reproduction in Cinemas and Control Rooms” by Lennart Ljungberg. This was most notable for its introduction of the concept of “A-chain” and “B-chain” designations for cinema sound systems, with the “A-chain” representing all parts of the reproducer system up to the source switch point (i.e., magnetic sound reproducer, optical reproducer, non-sync sources, etc.) and the “B-chain” comprising everything from the main fader to the auditorium system.

In the same issue were two other papers, one from Denmark, titled “A Report on Listening Characteristics in 25 Danish Cinemas” by Erik Rasmussen, and another from the UK, “The Evaluation and Standardization of the Loudspeaker-Acoustics Link in Motion Picture Theatres” by A.W. Lumkin and C.C. Buckle. Using both pink noise and white noise measurements, these papers presented some of the first modern evaluations of cinema acoustics and loudspeaker systems, and defined the challenge of mixing a soundtrack that could be presented uniformly in varying theaters. They also provided the basis for what would later become known as the “X Curve,” which would define a standardized equalization curve for cinemas worldwide (or, at least, that was the intent).

Standards? Who Makes These Things Up?

The origin of the “X Curve” dates back to May of 1969, when engineers associated with the SMPTE Standards Committee held a meeting at the Moscow convention in an attempt to codify international standards related to film sound reproduction. The first draft standard produced by this committee called for a response that was down 14 dB at 8 kHz, using either pink noise or white noise inserted at the fader point in the chain (thus removing the “A-chain” characteristics from the final results). While this was a good start toward standardizing theater reproduction characteristics, it was still a long way from the “wide range” playback standard that Dolby engineers envisioned. Work continued for another three years, during which some significant wrangling occurred within the various standards committees. It would take until 1977 for an International Standard to be approved, which subsequently became the basis for SMPTE Standard 202M, defining the characteristics for dubbing rooms, review rooms and indoor theaters.

In 1982, the standard was modified to include a different EQ characteristic based on room size and reverberation time, taking into account some of the research that Allen and the Dolby engineering team had originally conducted 11 years earlier at Elstree Studios. This standards stuff takes time…

At Last, Some Improvements

Largely as a result of the early work related to the improvement of theater sound reproduction standards, Dolby was able to showcase material that showed off the capabilities of an improved cinema sound system. The first of these was a demo film titled A Quiet Revolution, which Dolby produced in 1972 as a presentation piece aimed primarily at film industry execs. This was one of the first films released with a Dolby A encoded mono optical track, and was intended to be played back on systems with a modified optical sound reproducer whose narrower slit extended the HF response. The first Dolby-encoded feature film, Callan, premiered at the Cannes Film Festival in 1974. Although the film received only a limited release, it served as a good demo for EMI and Dolby in their efforts to improve the quality of standard mono optical soundtracks. However, these efforts would soon be overshadowed by the next development in optical sound recording.

Next installment:
Dolby Stereo Optical Sound
