Riot Games streamlines production of Valorant Champions Paris with ST 2110 flypack

As the Valorant Champions Paris tournament drew to a close on Sunday 5 October at the Accor Arena, organiser Riot Games was able to look back on a successful event that not only saw thousands of fans enjoy high-energy, high-quality gaming, but also marked the successful debut of a brand new, fully ST 2110 flypack.

Production of the tournament was handled out of Riot’s Dublin Remote Broadcast Center Powered by AWS (RBC), one of two state-of-the-art facilities the company operates for producing and distributing esports content. A second RBC, in Seattle, came online in 2023.

In many ways, the Paris tournament followed Riot’s usual workflow, with content captured on site, cut in Dublin and returned to the arena with a latency of just 80ms. In addition, around 40 different feeds were distributed to global language partners from the RBC.

While the process is undoubtedly slick, the transition to 2110 is having an immediate and impressive impact.

“What we’re trying to do is speed up load-in. We wanted to unify our network, see if we could eliminate or reduce the need for a truck on site, and then increase flexibility in terms of having operators in different places,” said James Wyld, principal infrastructure engineer, Riot Games.

“Since we launched the RBC, our objective as a tech team is to get the technology out of the way of where the humans can be. The theory is, you put an entire gallery in Dublin and produce the show from there, but sometimes there’s reasons for the producer to be on site, and you can do that now. You can have one producer on site, and you can have the director and the TD somewhere else, and you can start to mix and match where people are according to the needs of the show. That’s one of the real things that I think this infrastructure enables.”

Technical load-in itself has been reduced from two days to around four hours, and further improvements are expected as the setup is used more frequently.

Alex Rybalko, senior manager, broadcast engineering, Riot Games, explains: “We’re only starting to get more efficiencies with the features that we’re adding. Before we would have a whole truck just for screen switching. This new setup has a switcher in it; it has audio in it so we can do local mixing; we can put a panel somewhere else and, in fact, that can be controlled from anywhere else as well, but we can save that, and we can quickly change anything and be up and running. When you consider our schedule and the logistics, that’s super important, not even just from a simple cost savings perspective, but in terms of getting the show done.”

The flypack itself features a Cisco Nexus 9K core network running hybrid ST 2110 and IT infrastructure workloads, an Evertz NEXX hybrid core router, a Sony MLS-X1 ST 2110 video switcher and a Calrec Type R ST 2110 audio console with modular I/O. It supports 130+ contribution paths using JPEG XS (Nevion Virtuoso) and low-latency HEVC (Sony NXL-ME80), and is fully integrated with command and control from Riot facilities worldwide, including the two RBCs.
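To give a rough sense of why lightly compressed JPEG XS contribution matters at this scale, the sketch below estimates per-stream and aggregate bandwidth for 130 paths, comparing uncompressed ST 2110-20 video with JPEG XS. The 1080p59.94 4:2:2 10-bit frame format and the 10:1 compression ratio are illustrative assumptions for this back-of-the-envelope calculation, not figures confirmed by Riot.

```python
# Illustrative bandwidth estimate only (assumed parameters, not Riot's actual figures).
# Compares uncompressed ST 2110-20 video with JPEG XS contribution for 130 paths.

def uncompressed_gbps(width: int, height: int, bits_per_pixel: float, fps: float) -> float:
    """Active-video bitrate in Gbit/s for an uncompressed ST 2110-20 stream."""
    return width * height * bits_per_pixel * fps / 1e9

def jpegxs_gbps(uncompressed: float, ratio: float) -> float:
    """Approximate JPEG XS bitrate given an assumed compression ratio."""
    return uncompressed / ratio

# Assumptions: 1080p59.94, 4:2:2 10-bit (20 bits per pixel on average), 10:1 JPEG XS ratio.
raw = uncompressed_gbps(1920, 1080, 20, 59.94)   # ~2.5 Gbps per stream
xs = jpegxs_gbps(raw, 10)                        # ~250 Mbps per stream
paths = 130

print(f"Per stream:  uncompressed ~{raw:.2f} Gbps, JPEG XS ~{xs * 1000:.0f} Mbps")
print(f"{paths} paths: uncompressed ~{raw * paths:.0f} Gbps, JPEG XS ~{xs * paths:.1f} Gbps")
```

Under these assumptions, 130 uncompressed HD streams would demand hundreds of gigabits of capacity, while JPEG XS brings the aggregate down to a few tens of gigabits, which is far more practical for moving contribution feeds between an arena and a remote facility.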

A modular rack design has contributed to the reduced load-in time, while MTP multi-strand connectors between the racks mean setups can be saved, further streamlining the process.

For a company that prides itself on being at the forefront of innovation, these time savings could also bring longer-term benefits.

Max Trauss, global broadcast engineering lead, Riot Games, says: “For the production team all that time is for their own iteration, making the show better and better. You can reach a limit of what your capacity for innovation is because you have to load in and get right to the show. If we buy ourselves more time, we can increase that.”

Rybalko expands: “The more we do it, the more we can shave time off, and that gives room to other teams. After all, it’s all about the fans. We want to have a better show and if we don’t have to spend money on extra days in the arena, we can spend it on fan-facing features.”

Wyld agrees: “We’ve got this pressure – we’ve got to show up every time and deliver the show, but at the same time we need to stay ahead of the game and deliver innovation too. By making these time savings in the middle, it allows us to work on both even while we’re at shows.”

The sustainability benefits of the new approach also shouldn’t be underestimated.

Rybalko says: “In terms of the footprint on site, the energy savings are considerable. For example, we now only have to worry about cooling in one place, instead of having multiple trucks with their own overhead or cooling other equipment. If you have one truck, there’s people looking at the same camera on the same monitor as in another truck. We’re deduplicating all of that and sharing resources with people that we now don’t have to have on site.

“I think it’s even more noticeable in the RBCs, because in the RBCs, it’s a data centre model. So the production control rooms are very lightweight. We have Amazon WorkSpaces for workstations, so that’s a zero client into the cloud. We have all of our heavy compute, heavy lifting equipment in the data centre, where it’s designed to be very efficient with cooling and power, and we don’t have to take on that overhead in our facilities. It’s a similar principle here. We’re centralising but being able to distribute the control and erase that boundary of physical location.”

Designing and building the RBCs was, of course, a substantial undertaking, but it’s the learnings from that process that are being implemented here. Rybalko adds: “All of this technology is what we found building the RBCs. We worked out the bugs, we worked out the workflows, and now we’re just iterating on those workflows and deploying them in the field. It’s a phased approach, with Dublin, Seattle, and now in the field with the global kits. It’s also by design, because we want to make sure that we take on those learnings and we become better with every iteration. And then some of the learnings that we found with the deployment here at Champs, we will go back and we’ll implement in our other deployments.”
