Guest Analysis: AE’s Stuart Coles on Cloudview and AR innovations for Indian Super League football
The 2015 Hero Indian Super League is into its second season, writes Stuart Coles, Sales Director, Alston Elliot. Match Day 1 kicked off on October 3 and the eight teams will play a regular season until December 6, followed by semi-finals and a final on December 20. The League is co-promoted by the IMG-Reliance joint venture and Star India, and is supported by the All India Football Federation. Star is the official broadcaster for India. International rightsholders include Fox Sports Australia, Next Generation Sports Network for Canada, Japan, Korea and the USA, Turner for Brazil and Abu Dhabi Media Corporation for MENA.
At Alston Elliot (AE) we’re providing our usual range of match graphics, delivered using Vizrt systems driven by AE’s bespoke software applications. Augmenting this, we are delivering presentation graphics for Star’s wraparound coverage of each match using Viz Trio.
The key differences between this setup and our typical service are twofold. Firstly, the graphics hardware being used belongs to Star Sports. Traditionally AE’s service has been based around providing a one-stop solution in terms of hardware, software and operational staff. Whilst this remains the case for the vast majority of work, especially at outside broadcasts, we are increasingly being asked to provide solutions for our clients at their studio locations using in-house hardware.
The second key difference in the case of the ISL is that each match is being beamed back clean, with commentary, to Star’s studios in Mumbai, where the graphics are added downstream. Whilst basing the graphics operation at the studio creates obvious cost savings in terms of travel and accommodation, and allows better utilisation of operational staff and equipment, it does nevertheless present some workflow issues. The major sticking point is that the production team onsite at the OB is unable to see the graphics being offered and added to the pictures.
However, our team has developed a new application, which we’ve called ‘Cloudview’, enabling us to solve some of these issues. Our Web Developer Andrew Turner, creator of Cloudview, explains its functionality:
“Cloudview is a cross-platform application (iOS, Android and desktop for Windows and Mac) written to take advantage of the Adobe AIR runtime. It displays information about an event by polling one of our four regional web servers, situated in the UK, India, South Africa and Australia, for data uploaded directly from the event or another location. The server to be used is chosen with regard to the location of the event and end users, to minimise latency between the two.
“Cloudview currently runs in three modes (Editorial, Live and Information Channel) and can run all three simultaneously, as well as being able to switch between multiple events. The app contacts its designated server at pre-defined intervals, which can be as frequent as every three seconds, and downloads the latest available data.
“Editorial mode displays pre-prepared graphics and text, and uses an intelligent caching system to minimise server traffic when updating content. Because of this feature, it can continue working even if the connection to the server is interrupted. Additionally, updates only occur when new content is actually available, thereby minimising processor load. Live mode displays, in real time, the current on-air image and the next image (preview) to be aired; two windows zoom in and out to display the prioritised graphic. Information Channel mode displays current match stats in tabular form and is also written to update only the data that has changed, again minimising processor load.”
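The polling and change-detection behaviour described above can be sketched roughly as follows. This is an illustrative Python sketch only, not AE’s actual implementation: the server URLs, the default interval and the content-hash comparison are all assumptions.

```python
import hashlib

# Hypothetical regional endpoints; Cloudview picks the one nearest
# to the event and its end users to minimise latency.
REGIONAL_SERVERS = {
    "uk": "https://uk.example.com",
    "india": "https://in.example.com",
    "south_africa": "https://za.example.com",
    "australia": "https://au.example.com",
}

class CloudviewPoller:
    """Polls a server at a fixed interval, re-rendering only on change."""

    def __init__(self, fetch, interval=3.0):
        self.fetch = fetch        # callable returning the latest payload bytes
        self.interval = interval  # seconds between polls (three at minimum)
        self._last_hash = None
        self.content = None       # last good payload, kept if the link drops
        self.updates = 0          # how many times genuinely new content arrived

    def poll_once(self):
        try:
            payload = self.fetch()
        except ConnectionError:
            return False          # connection interrupted: keep cached content
        digest = hashlib.sha256(payload).hexdigest()
        if digest == self._last_hash:
            return False          # unchanged: skip the update, saving CPU
        self._last_hash = digest
        self.content = payload
        self.updates += 1
        return True
```

Comparing a cheap hash of each download against the previous one is one simple way to get the behaviour described in the quote: unchanged payloads trigger no processing, and a failed fetch leaves the cached content on screen.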
So far Cloudview has been launched for a number of our football and rugby clients. The desire for remote production is increasing as connectivity continues to improve, and we see this app as key to enabling us to offer our clients remote solutions without negatively impacting the quality of our graphics service.
Augmented Reality content – OB
In contrast to the offsite solution for core graphics, we do have some onsite presence at each match OB, delivering a one-stop telestration and AR solution. Using AE hardware and software applications, our operator is able to offer a mixture of telestration tools and AR content, including advertising, using the Viz Arena system.
We opted to use Viz Arena for this project because, as a company with significant Vizrt experience and knowhow, it affords us a blank canvas in terms of the aesthetic enhancements we can bring to the tools on offer.
Touchscreen and AR content
In addition to the AR content generated at the OB, to further augment Star’s presentation coverage of ISL, we have been producing AR content at the Mumbai studio. AE has for some time delivered AR graphics in an OB and studio environment using camera data from a variety of systems such as NCAM, Stype and Shotoku. The point of difference in this instance is that we have integrated AR with our touchscreen. Below, an explanation from the designer responsible for the project, Chris Izatt:
“We have been delivering high-end touchscreen solutions for a wide variety of broadcasters all over the world since 2011. Every time we are asked to provide a new touchscreen solution, we revisit the methods and solutions we have used in the past to see if we can make the process more streamlined and efficient, allowing us to spend more time finessing the overall look, feel and functionality of the final product.
“This approach has seen us develop a number of custom plugins for use within the Vizrt Artist/Engine environment. These plugins have made it possible for our team of designers to exert a great deal of control over the vast number of possible transitions you face when creating a touchscreen with such a wide variety of information graphics available to the presenter.
“We were asked to pitch a number of ideas to Star for their 2015 coverage of the Hero Indian Super League football tournament. Our existing relationship with them meant that we already knew about the Augmented Reality capabilities they had recently acquired for Studios A and B, which they were going to be using for the ISL coverage. We immediately started looking into the possibility of creating a touchscreen solution that would also control the graphics being produced via the AR Vizrt Engine.
“We wanted to mirror the presenter’s interaction with the touchscreen into the physical studio set in front of them so that the director would be able to show their analysis from a different perspective. We knew that we needed to come up with a solution that was extremely fast, otherwise the viewer would notice delays and the end result would not look very responsive.
“We ended up utilising the scripting capabilities within Viz Artist/Engine to engineer a system of raising events from the master touchscreen scene whenever the presenter interacted with the touchscreen. We then created the augmented reality graphics and set them up to listen for the events being raised from the master touchscreen scene.
“Initially we encountered performance issues that resulted in some events taking almost 10 seconds to get through to the AR engine. We realised that too many events were being raised via the master touchscreen scene, so we refined our scripts until we had a more intelligent system that was far more stringent about when certain events should be raised.
“Whenever a counter is touched, dragged, released, highlighted, etc., an event is raised, publishing its status to a centralised point on the network. The graphics running on the AR engine have pre-registered with this central system and, as a result, are notified whenever anything changes. This means that the solution is easily scalable: we can have any number of different Vizrt engines listening for the events raised by the master touchscreen scene.
“We needed to ensure that the final solution was extremely easy to operate in a live environment. The graphics operator simply needed to load the specific graphics into the AR engine and the master touchscreen did the rest, ensuring that the AR output synchronised itself with whatever was being controlled via the master touchscreen being operated by the presenters.”
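The event flow Chris describes, with a master scene publishing to a central point and AR engines pre-registered as listeners, maps onto a familiar publish/subscribe pattern, with throttling of high-frequency events like drags to keep latency down. Below is a minimal Python sketch of that pattern; the class names, event types and throttling window are illustrative assumptions, not Viz Artist script.

```python
import time
from collections import defaultdict

class EventHub:
    """Central point on the network: engines pre-register for event types."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def register(self, event_type, callback):
        # Any number of engines can register, so the solution scales.
        self._subscribers[event_type].append(callback)

    def publish(self, event_type, payload):
        for callback in self._subscribers[event_type]:
            callback(payload)

class Touchscreen:
    """Master scene: raises events, but throttles rapid repeats so the
    AR engines are not flooded (the cause of the 10-second lag)."""

    def __init__(self, hub, min_gap=0.04):
        self.hub = hub
        self.min_gap = min_gap  # suppress drag events closer together than this
        self._last_sent = {}

    def raise_event(self, event_type, payload, now=None):
        now = time.monotonic() if now is None else now
        last = self._last_sent.get(event_type)
        # Drag events fire continuously; only forward them periodically.
        if event_type == "drag" and last is not None and now - last < self.min_gap:
            return False
        self._last_sent[event_type] = now
        self.hub.publish(event_type, payload)
        return True
```

In use, each AR engine would register a callback against the hub and mirror whatever the presenter does on the touchscreen; discrete events such as a release always go through, while continuous drag updates are rate-limited.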