By Will Waters, Principal Product Manager, Audinate.
If you’ve been around sports broadcasting for more than a few years, you’ve probably noticed a shift in how audio moves around. The snake cables and patch bays are still in some venues, but increasingly, the real work happens over the network. Audio is becoming a shared resource rather than a collection of point-to-point connections.
That change isn’t just about swapping copper for Cat6. It’s fundamentally reshaping how productions are staffed, how shows get turned around between games, and how multiple teams tell different stories from the same event.
Production hardware configurations
Sports production setups vary wildly depending on scale and requirements. A regional college basketball game might use a small flypack with a compact mixing console, a handful of wireless mics, and basic intercom. A major league playoff game brings full production trucks with large-format consoles, dozens of RF channels, extensive intercom matrices, and simultaneous feeds to multiple destinations.
What ties these different scales together is the underlying infrastructure. Whether you’re working with a Calrec console in a large truck or a compact Yamaha mixer in a small venue, the trend is toward networked connectivity. Mixing surfaces, I/O boxes, intercom stations, commentary systems, and effects processors increasingly speak the same network language.
The hardware ranges from traditional broadcast consoles with network cards to software mixers running on commercial servers. Commentary kits that used to be standalone analog boxes now connect via network ports. Wireless mic receivers output directly to the network. Even graphics and replay systems subscribe to audio feeds over IP rather than requiring dedicated embedders and de-embedders.
Choosing the right equipment
The question isn’t just “does it sound good” anymore. It’s whether the gear can integrate with the rest of your workflow without creating bottlenecks. Can it scale when needed? Can it support remote talent? Can it share resources with another simultaneous production?
These operational questions push teams toward equipment that supports standard networking protocols. An audio console might be excellent on its own, but if it can’t participate in a networked workflow, it becomes an island that requires special handling.
Software is playing a bigger role as well. Cloud-based mixing applications, virtual intercom clients, and software routers are replacing dedicated hardware. The key is choosing tools that can operate both on-premises and in cloud environments when needed.
For most large-scale deployments, the deciding factor is predictable behavior across different gear from different vendors. That’s why platforms like Dante have become common: they provide a consistent layer that different manufacturers build into their products, so a Calrec console, an RTS intercom, and a Focusrite interface all discover each other automatically and share a common clock.
Why AV-over-IP is the clear choice
Traditional point-to-point audio connections have limitations that become obvious in modern productions. Every new source needs a physical cable run. Rerouting means physically repatching. Sharing a feed with multiple destinations requires distribution amplifiers or splitters. Scaling up means more copper, more connectors, and more potential points of failure.
Audio over IP solves these problems by treating audio as network data. A single network cable can carry hundreds of channels. Routing is software-based, so changes happen in seconds. Audio feeds can be subscribed to by many destinations without additional hardware.
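The "hundreds of channels on one cable" claim is easy to sanity-check with back-of-envelope arithmetic. The figures below are assumptions for illustration (48 kHz/24-bit uncompressed audio, with a rough 30% allowance for packet overhead), not a vendor specification:

```python
# Back-of-envelope: uncompressed audio channels over a single gigabit link.
SAMPLE_RATE = 48_000        # samples per second (broadcast standard rate)
BITS_PER_SAMPLE = 24        # typical PCM word length
OVERHEAD = 1.30             # rough allowance for IP/UDP/RTP headers (assumption)
LINK_BPS = 1_000_000_000    # gigabit Ethernet

bits_per_channel = SAMPLE_RATE * BITS_PER_SAMPLE * OVERHEAD
max_channels = int(LINK_BPS // bits_per_channel)

print(f"One channel ~ {bits_per_channel / 1e6:.2f} Mbit/s with overhead")
print(f"A gigabit link carries roughly {max_channels} channels")  # ~667
```

Even with generous overhead, a single Cat6 run comfortably replaces several multicore snakes.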
The practical benefits show up immediately. When NBC Sports needed to support multiple simultaneous productions from its Stamford facility, it built around networked audio; the system currently manages 10,000 sources and 12,000 destinations across 500 devices. That scale would be physically impossible with traditional wiring.
More importantly, IP audio enables workflows that weren’t feasible before: remote production where mixing happens hundreds of miles from the venue, cloud-based production where the entire control room is virtualized, and alternate broadcasts that reuse the same venue feeds without requiring duplicate infrastructure.
Demanding live events
Live sports events present some of the most demanding scenarios for networked audio. Crews arrive at a venue with cases full of gear, connect to existing infrastructure, and need everything working immediately. There’s no time for troubleshooting on game day.
This is where standardized audio networking proves its value. Mobile production units increasingly come equipped with network audio infrastructure that can quickly patch into venue systems. When a venue’s permanent installation already has networked audio endpoints, mobile units can tie in through a few network connections rather than running hundreds of feet of copper.
A typical setup might include wireless mic receivers scattered around the venue, all feeding into the network. Commentary positions with local announce consoles connect via network drops. Intercom stations for camera operators, stage managers, and truck crews all share the same infrastructure. Effects mics feed in as network streams. The production truck subscribes to the feeds it needs, and the house PA system can simultaneously subscribe to multiple sources to enhance the in-venue experience.
The advantage is speed and flexibility. If a commentator needs to move between booths at the last minute, or if a new crowd mic is required, both can be accommodated without running new cable.
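As a toy sketch of why those last-minute changes are cheap, subscription routing is essentially a mapping from sources to sets of listeners, changed in software rather than at a patch bay. The device and channel names here are invented for illustration, and this is a conceptual model, not any vendor's actual routing API:

```python
from collections import defaultdict

class AudioRouter:
    """Toy model of subscription-based routing: any number of
    destinations can subscribe to the same network source."""

    def __init__(self):
        self.subscriptions = defaultdict(set)  # source -> set of destinations

    def subscribe(self, destination, source):
        self.subscriptions[source].add(destination)

    def unsubscribe(self, destination, source):
        self.subscriptions[source].discard(destination)

    def destinations_for(self, source):
        return sorted(self.subscriptions[source])

router = AudioRouter()
# The truck and the house PA both take the same crowd mic -- no splitter needed.
router.subscribe("truck-mix-ch12", "crowd-mic-3")
router.subscribe("house-pa-fx", "crowd-mic-3")
router.subscribe("intercom-pgm", "commentary-booth-1")
print(router.destinations_for("crowd-mic-3"))  # ['house-pa-fx', 'truck-mix-ch12']
```

Moving a commentator or adding a crowd mic becomes a `subscribe` call instead of a cable pull.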
Redundancy matters in live events. Networked systems typically run primary and secondary network paths. If a switch fails or a cable gets damaged, audio automatically fails over to the backup path without interruption.
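One common way to get that glitch-free failover is to transmit identical packet streams on both networks and let the receiver take each sequence number from whichever path delivers it first. The sketch below is a simplified illustration of that idea, assuming numbered packets; real implementations also handle buffering and reordering:

```python
def receive(packets):
    """Merge duplicate packet streams from primary and secondary networks.
    Each packet is (sequence_number, payload, path); each sequence number
    is kept once, from whichever path arrives first."""
    out, seen = [], set()
    for seq, payload, path in packets:
        if seq in seen:
            continue  # already got this audio from the other path
        seen.add(seq)
        out.append((seq, payload))
    return out

# Primary drops packet 2; the copy on the secondary path fills the gap.
arrivals = [(1, "a", "pri"), (1, "a", "sec"), (2, "b", "sec"),
            (3, "c", "pri"), (3, "c", "sec")]
print(receive(arrivals))  # [(1, 'a'), (2, 'b'), (3, 'c')]
```

Because both paths carry audio at all times, a switch or cable failure costs nothing but the duplicate copies.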
How studios benefit
Studio environments benefit from different aspects of audio networking. The challenge isn’t quick setup at a new venue but rather managing multiple shows turning over throughout the day, often with resources shared between control rooms.
In a networked studio facility, audio becomes a shared resource. Any control room can access any booth through the network. This flexibility means that shows can swap control rooms when scheduling conflicts arise, without technical limitations getting in the way.
The NBC Sports Stamford Operations Center demonstrates this at scale, with 12 production control rooms, 15 audio control rooms, and multiple submix rooms all sharing networked audio infrastructure. Its Field Acquisition Unit model relies on this architecture: mobile units at venues handle only signal acquisition, while core production tasks like audio mixing occur back at the centralized facility.
Day-to-day operations in a networked studio mean faster turnarounds. The morning show wraps, and the afternoon program can reconfigure audio routes in minutes rather than requiring physical repatching.
Connecting legacy equipment
No broadcaster starts from scratch. Every facility has consoles, wireless systems, intercom gear, and playback devices that predate the IP era. Much of that equipment remains reliable and familiar, and there’s no business case to replace it to adopt a new connectivity standard.
Audio networking handles this reality through converters that bridge legacy gear and the IP network. These interface boxes come in various forms: rack-mount units for fixed installations, small portable adapters for mobile productions, and even PCIe cards. The A1 mixing the show doesn’t need to know or care that a particular source started as analog.
Format conversions also happen transparently. Properly configured, audio networks can translate between formats like AES67, SMPTE ST 2110, MADI, and AES3, presenting everything as a unified set of sources.
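At the sample level, much of that bridging comes down to word-length and representation changes, for example between the signed 24-bit PCM carried by AES3 or MADI and the floating-point range many software mixers use internally. This is a simplified sketch of that one conversion step, not the full network-level format translation:

```python
def pcm24_to_float(sample: int) -> float:
    """Convert a signed 24-bit PCM sample (as carried by AES3/MADI)
    to the -1.0..1.0 float range common in software mixers."""
    if not -(1 << 23) <= sample < (1 << 23):
        raise ValueError("not a signed 24-bit value")
    return sample / float(1 << 23)

def float_to_pcm24(x: float) -> int:
    """Clamp and quantize a float sample back to signed 24-bit PCM."""
    x = max(-1.0, min(x, 1.0 - 1.0 / (1 << 23)))
    return int(round(x * (1 << 23)))

print(pcm24_to_float(1 << 22))   # 0.5: half of positive full scale
print(float_to_pcm24(-1.0))      # -8388608: negative full scale
```

The converters do this (plus clocking and packetization) continuously, which is why the sources all appear as one unified pool.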
The goal isn’t purity – it’s practical operation. Broadcasters can keep using trusted legacy equipment while gradually expanding networked infrastructure around it. As old gear eventually needs replacement, the new equipment slots into the existing network without requiring a forklift upgrade of the entire facility.