Bringing a seamless live production workflow to Valve’s Dota 2 TI esports tournament
When one of the world’s largest game developers approached Myreze, a specialist Norwegian virtual production agency, the studio was tasked with a project of mammoth scale: producing a virtual set and live streaming workflow to bring the formerly physical Dota 2 esports stadium tournament to a global audience.
With a prize pool of $40 million, expectations for the quality of delivery were high for gaming giant Valve, which runs the annual tournament. The collaboration produced more than 125 hours of live content across the ten-day event, with over 110 million hours of content watched (excluding China). The tournament also yielded Twitch’s third most-watched video of all time and was nominated for Best Esports Event at the 2021 Game Awards.
Myreze president and partner Jørgen Steinheim explained how his team delivered such an ambitious project to a unique online audience of more than 100 million under such challenging circumstances.
“We were very excited to collaborate with Valve for the first time. With the physical tournament cancelled due to the escalating COVID-19 situation, we were tasked with supporting delivery of the first Dota 2 tournament as a virtual production.”
The team at Myreze had the technical expertise to execute a virtual production of this magnitude; the challenge lay in implementing the workflow seamlessly. To do this, they relied on a variety of tools, including Unreal Engine, Pixotope, 3ds Max and Blender.
Crucially, they needed to track the pathway and movement of 15 Grass Valley cameras during the production. The biggest challenge came during rehearsals, owing to time constraints. “Rehearsals are the only time where it is possible to adjust the alignment for the tracking,” says Jørgen.
“To achieve this, we developed a recording and playback solution based around multiple Blackmagic Design HyperDeck Studio 4K Pro broadcast decks and Teranex Mini Audio to SDI 12G converters.”
The system captured timecoded video during rehearsals, with the camera tracking data embedded as audio into the SDI signal via a series of Teranex Mini Audio to SDI 12G converters.
“We could then play back those recordings offline using the Blackmagic HyperDecks, playing back the tracking data inside Pixotope, where the virtual studio and on-air graphics were created,” notes Jørgen.
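Conceptually, the workflow pairs each video frame’s timecode with the camera tracking sample captured at that instant, so the same camera motion can be replayed later with no operator behind the camera. The sketch below illustrates that record-and-playback idea in simplified form; all names are hypothetical, and it does not represent Pixotope’s or Blackmagic Design’s actual APIs.

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    # Simplified per-frame tracking sample: position plus pan/tilt angles.
    x: float
    y: float
    z: float
    pan: float
    tilt: float

class TrackingRecorder:
    """Record camera tracking samples keyed by SMPTE-style timecode,
    then replay them offline without the physical camera present."""

    def __init__(self):
        self._samples: dict[str, CameraPose] = {}

    def record(self, timecode: str, pose: CameraPose) -> None:
        # During rehearsal: store the live tracking sample for this frame.
        self._samples[timecode] = pose

    def playback(self, timecode: str) -> CameraPose:
        # Offline: look up the pose captured at this timecode so the
        # virtual set can be rendered against the same camera motion.
        return self._samples[timecode]

# Rehearsal: capture live tracking data frame by frame.
rec = TrackingRecorder()
rec.record("01:00:00:00", CameraPose(0.0, 1.5, -3.0, 12.5, -4.0))
rec.record("01:00:00:01", CameraPose(0.1, 1.5, -3.0, 12.7, -4.0))

# Later, offline: replay the recorded pose to align broadcast graphics.
pose = rec.playback("01:00:00:01")
print(pose.pan)  # → 12.7
```

Because each pose is keyed to the frame’s timecode, graphics alignment can happen long after the cameras and their operators have gone home, which is the crew and rehearsal-time saving Jørgen describes.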
“During rehearsals, we’d adjust the graphics to fit the scene. This was time-consuming and required each camera to be operated at all times. However, with the ability to work offline using the tracking data we’d recorded, the production made significant savings in both crew costs and on rehearsal time.
“For example, delays elsewhere during rehearsals meant that we had to align all of the broadcast graphics within the virtual set after Valve’s production team had gone home, something that was only possible because of the newly developed workflow process,” he goes on.
Reflecting on the pandemic, Jørgen adds that the Myreze team has had to compress almost ten years’ worth of learning into a two-year timeframe.
He says: “We’ve seen unprecedented growth in demand for virtual event production. While this has been a challenge, we are thrilled to see clients accelerating their use of virtual production techniques to reach audiences they couldn’t previously.”
Myreze has worked with international clients across the broadcast, events and virtual production industries, including Valve, The Weather Channel, Epic Games, Eurosport and Bloomberg.