SVGE FutureSport 2014: Fast forward into the future of slow-motion replay technology


Ultra Slo-Mo Replay panel (L/R): Laurent Renard, I-MOVIX; Martijn Swart, Broadcast Rental; Laurent Petit, EVS; and Sandro Glanzer, Broadcast Solutions

Slow-motion pictures have come a long way since an Austrian priest first obtained them in 1904 by using a revolving mirror system to slow down film, although it was 1963 before instant replay (off tape) at normal speed was available. However, when talking about where we are going with slow motion, super slo-mo, and ultra slo-mo, we can’t even agree on what each of those terms means, said moderator and SVG Europe contributor Philip Stevens in his introduction to Ultra Slow-Motion Replay Update: Cameras and Servers, at FutureSport 2014.

EVS has been central to the development of slow motion for sports, and its Market Solution Manager for live sport, Laurent Petit, sees that as double speed and triple speed cameras increase in frame rate, going to six times or ten times, “then comes the confusion of how we should name those type of cameras – are they still super motion, or hyper motion, or ultra motion?”

Martijn Swart, CTO, Broadcast Rental in the Netherlands, which is “a heavy user of all slow-motion equipment, from EVS servers to ultra-slow-motion cameras,” added that there is also the confusion that “some things are recorded at 500 frames per second and get played back at 50, and something else records at 250 and plays back at 25. It gets a bit confusing — which is hyper motion or ultra motion? It’s all marketing,” he said. “I prefer to talk about the number of speeds a camera does. That makes a lot more sense.”
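Swart's "number of speeds" framing boils down to a simple ratio of capture frame rate to playback frame rate. A minimal sketch, using the example figures from the discussion rather than any product's specification:

```python
# Illustrative only: the "speed" of a slow-motion camera is the ratio of
# the rate it captures at to the rate the replay plays back at, whatever
# the marketing label (super/hyper/ultra motion) says.
def speed_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower than real time the replay appears."""
    return capture_fps / playback_fps

print(speed_factor(500, 50))   # 10x: Swart's 500 fps played back at 50
print(speed_factor(250, 25))   # also 10x: 250 fps played back at 25
print(speed_factor(150, 50))   # 3x: a typical triple-speed camera
```

On this view a "500 fps" camera and a "250 fps" camera can be the same product in marketing terms while both being simply 10x cameras.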

It also matters whether all the images from a camera are recorded or not, responded Petit. “In the normal super motion mode, we record everything, all the content is there on the server. In the hyper motion mode, you only record what you get from the memory on the camera.”

State of the art

I-Movix released a new 4K camera at NAB, and its CEO Laurent Renard believes that being able to capture in 4K, but digitally extract HD, is already becoming a major trend. “Most of our customers do a still image through the 4K framing, then they make an HD selection (in that 4K framing), then they zoom out in live conditions,” which he believes “is very interesting content for multi-platform use,” particularly as you can do it in real time.

“One of the quests I’m on is to make ultra slow motion mainstream,” said Swart. “The current slow motion technology is fine, but obviously the focus that you have and the sharpness of the pictures in ultra slow motion is completely different, especially since my iPhone does 240 frames a second. Everybody has this in their pocket.” He wants to get smaller cameras, higher frame rates, and wireless operation into more productions.

Comparing the World Cup in 1994 and 2014, it wasn’t just the quality of the pictures and the speed of the slow motion that was different, said Petit. “The number of cameras has increased, [as have] the number of replays that are put on air. Not just to show more angles of the game, but also to show more emotion. That was one of the editorial aspects that was very important at the last World Cup.”

For EVS, it wasn’t just a matter of providing the technology for the live broadcast, but also for all the second screen applications. One of the most important aspects for this was to keep the metadata associated with the camera. “The content goes through the complete production chain. It keeps these aspects, and it can be played as slo-mo at all the different places, even on the second screen.”

What, if any, allowances does the broadcaster have to make for the second screen within the slow-motion workflow, asked Stevens.

“We rely on the EVS servers that are present for the live broadcast television, and then we extract the content from our server, based on selection by an operator dedicated to that,” responded Petit. “The content is then sent to the cloud and distributed to the end user. You don’t need additional equipment at the venue. The infrastructure in the cloud that we have developed allows us to get the content directly on our server.”

While EVS has become the standard system for replay, there are competitors, including a server that Broadcast Solutions has recently brought to market. Its product manager, Sandro Glanzer, stated: “It’s not just another replay server.” Given that so many companies have tried to enter the market that EVS has dominated for 20 years, “it doesn’t make sense [to] bring just another copy to the market,” so the company is trying to offer something different.

“All our servers are based on SSD drives,” which he believes is unique in the market, “and it brings us a lot of advantages” in terms of “the incredible speed SSD provides,” how robust and reliable they are thanks to no moving parts (making it more likely they will survive OB life), and its small form factor (it has an eight-channel 1RU server with 66 hours of recording on an 8TB SSD), which makes it ideal for smaller OB vans or even for ENG. “Even if they don’t use it as slow motion, they can use it as VTRs,” said Glanzer.
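As a rough cross-check on those capacity figures, assuming the quoted 66 hours means total stored material on the 8TB drive (however it is divided among the eight channels), the implied average stored bitrate works out to roughly 270 Mbps, a plausible contribution-quality compressed rate. A back-of-the-envelope sketch:

```python
# Sanity check on the quoted figures: 8 TB of SSD holding 66 hours of
# material. Assumes "66 hours" is total recording time across channels;
# this interpretation is ours, not the panel's.
tb = 8           # SSD capacity in terabytes (decimal, 1 TB = 1e12 bytes)
hours = 66       # quoted total recording time

avg_mbps = tb * 1e12 * 8 / (hours * 3600) / 1e6
print(f"~{avg_mbps:.0f} Mbps average stored bitrate")
```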

Having analysed the market, “we see you always need very well trained and experienced operators,” so the company has tried to “make the interface as easy as possible. Operators don’t like that when I say that, but we think operators should focus on the sports and not handling the machine. That gives you more flexibility, and it saves you costs of learning the interface.”

The server is based on Windows, which can easily connect to standard storage systems, and offers remote software updates and upgrades. The latest version of its biggest server, Black Jack, allows up to 12 channels of 3G and three controllers on one system. “That means even a big OB van can work with just one four-rack-unit server.”

How will IP affect slo-mo?


Ultra Slow-Motion Replay Update session at FutureSport, Lord’s Cricket Ground, 2 December

“We already use a SMPTE fibre, which is a 10Gbps connection, so effectively we have data transmission going to the base station,” said Swart. However, IP could improve the workflow, if we could record straight from the camera to the server, instead of storing on the camera first. Broadcasters could also benefit from remote connectivity to “simply have the cameras on site and do all the other stuff in the studio.”

Petit believes that IP is very important for interfacing, particularly for super motion or 4K cameras, which still need a lot of cables to the server. However, 4K creates 12Gbps of data to transport, so “one single 10Gbps link would not be enough for that [unless you remove some unnecessary parts of the stream], but what if you’re going to double or triple speed that?”
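Petit's arithmetic can be checked on the back of an envelope: an uncompressed 4K feed at roughly 12 Gbps (the 12G-SDI rate) already exceeds a single 10 Gbps link, and multi-speed capture multiplies the shortfall. The rates below are illustrative assumptions, not product figures:

```python
import math

# Illustrative: how many 10 Gbps links an uncompressed multi-speed 4K
# feed would need. Assumes ~12 Gbps per real-time 4K feed (the 12G-SDI
# rate cited in the discussion) and no compression.
def links_needed(speed_multiple: int,
                 feed_gbps: float = 12.0,
                 link_gbps: float = 10.0) -> int:
    """Minimum number of whole links to carry the feed uncompressed."""
    return math.ceil(feed_gbps * speed_multiple / link_gbps)

for speed in (1, 2, 3):
    print(f"{speed}x 4K: {links_needed(speed)} x 10 Gbps link(s)")
```

Even at normal speed the feed spills onto a second link, which is why Petit points to light compression (and the standards it still lacks) as the practical way forward.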

One answer might be to use light compression, but this requires standards first. “The technology is there, but the standardisation is not,” he said.

“We guess that 4K would have arrived much faster if there was standardisation,” added Glanzer. The company was ready to go 4K three years ago. “We have the servers ready, but we don’t have the interfaces for it. What sense does it make if we have an island solution and no interconnectivity to the mixers, or crosspoint switchers? As long as there is not one real standard, everyone will cook his own soup and 4K will not have its breakthrough.”

Renard sees the main benefit in IP as being able to get the content of the capture instantly, without having to wait until the end of the replay. This would be particularly useful for 4K, which (at 1,000fps) could easily produce raw images of 64GB. It would also mean it wouldn’t be necessary to have a dedicated server for each camera.
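Renard's 64GB figure fills up remarkably quickly at those rates. Assuming a raw Bayer 4K frame at 12 bits per photosite (our assumption for scale, not I-Movix's actual format), 1,000 fps generates on the order of 12 GB every second:

```python
# Rough illustration of why a 64 GB raw capture buffer is only seconds of
# material at 4K/1,000 fps. Frame size assumes a 3840x2160 Bayer sensor
# at 12 bits per photosite; all figures are assumptions for scale.
FRAME_BYTES = 3840 * 2160 * 12 // 8    # ~12.4 MB per raw frame
FPS = 1000
buffer_gb = 64

bytes_per_sec = FRAME_BYTES * FPS
seconds = buffer_gb * 1e9 / bytes_per_sec
print(f"{bytes_per_sec / 1e9:.1f} GB/s -> a {buffer_gb} GB buffer holds "
      f"~{seconds:.1f} s of capture")
```

At roughly five seconds of capture per buffer, being able to stream the content out over IP as it is captured, rather than waiting for the replay to finish, is a material workflow change.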

Going wireless into the future

During a pro-tour cycling race last year, Broadcast Rental put a NAC Hi-Motion camera on the back of a motorcycle. “It was a big challenge, obviously, the way the camera works, recording all of the content inside the head, but we managed to do it and gained some really great pictures, and some great insight on the inner workings of the cycling race,” said Swart. But with a 3x or 6x speed camera, “you’re looking at sending multiple signals via an RF link, which is already a challenge, especially if they become 1080p.”

Having an ultra slow-motion camera behind a goal can give great, detailed pictures, “but you always want to go a step further,” such as putting it on a Steadicam or finding new places for a high-speed camera, “and it is definitely a challenge to get that working,” he added.

Wireless is just one aspect of the slo-mo workflow that has to improve in future, and Stevens asked the panel what other developments they are looking at.

I-Movix currently uses Vision Research cameras with a 4K sensor “that is at least good enough,” but when 8K or 16K cameras come out, Renard wants to be the first to work with the raw images. It isn’t just the frame rate that is important; there is also the lack of motion blur, and if you have 4K or 8K framing, there is 200% or 400% zooming, “which will give you plenty of creativity, plenty of new framing, new features, to get new images for your multi-platform content.”

Coming sooner than 4K or 8K, Swart believes that just getting existing ultra slow-motion technology “mainstream in all sports production” will have a bigger immediate impact, something Petit also wants.

EVS is also working on the longer-term future with camera manufacturers, “looking at the next generation interface. Looking also at what Ultra HD is after, increasing not only the resolution, but also the dynamic range, and even the frame rate for the standard camera.”

“It’s always the same in technology: larger, wider, higher, faster,” proffered Glanzer. “Here we have the same, we need more and more channels. If we think about 4K, it means we need, in general, four channels instead of one, so bandwidth needs to be increased and capacity needs to be increased. […] If we really think about 8K, and what that needs, and 16K, and what that needs, it is a never ending development.”
