SMPTE 2015 draws record crowd, tackles future direction for UHD, HDR and IP

The 2015 SMPTE Annual Technical Conference and Exhibition in Hollywood last week drew a record crowd and set an all-time high for exhibits

SMPTE’s Annual Technical Conference and Exhibition in Los Angeles last week provided plenty of food for thought about the future of the industry. And it appears that many in the industry are hungry for information: attendance reached all-time highs, and demand for exhibits required additional space at the hotel.

Among the fascinating presentations, one by ARRIS Group Fellow Dr. Sean McCarthy cast new light on a future where high dynamic range, wide colour gamut and high frame rates are part of the viewing experience. His overview of how the brain processes those elements into a cohesive experience challenged some notions on the benefits of bigger, more, better.

To begin with, he said that, while it may be tempting to consider resolution, luminance, colour gamut and frame rate as separate and distinct, the human visual system doesn’t and, therefore, they need to be considered as integrated features.

“Everything is integrated in the human experience, and it all comes together,” he said. “We need more research on luminance and how the brain invents colour, because that is what our brains do.”

One of the most basic beliefs is that bigger is better. But McCarthy noted that, although adding more pixels allows a larger screen, once the viewing angle exceeds 30 degrees, the eye’s blind spots fall within the picture rather than outside its edges. That means that a UHD viewing experience on a massive screen can be very different from that on a tablet, where the blind spots sit outside the image.

The upside of the bigger screen, however, is that it can require the viewer to move his or her head to take in all the information. That head movement is an important part of creating a sense of immersion. A tablet, for example, cannot offer that sense of immersion because moving the eyes alone is all that is required to take in the visual information.

McCarthy pointed out that higher frame rates help visual acuity: at 60 fps, viewers tracking an object or athlete moving faster than 0.03-0.07 degrees per frame will begin to lose acuity. “The higher frame rates could help the viewer’s gaze maintain a sub-threshold retinal slip and have an experience that is closer to real life as they can track motion better.”
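A rough sketch of that arithmetic, using only the per-frame thresholds quoted above (the frame rates chosen below are illustrative, not figures from the talk):

```python
# Rough conversion of the quoted thresholds (0.03-0.07 degrees of visual angle
# per frame) into angular speeds; the frame rates used are illustrative.

def max_trackable_speed(deg_per_frame: float, fps: float) -> float:
    """Angular speed (degrees/second) above which retinal slip exceeds the threshold."""
    return deg_per_frame * fps

for fps in (60, 120):
    lo, hi = max_trackable_speed(0.03, fps), max_trackable_speed(0.07, fps)
    print(f"{fps} fps: acuity starts to suffer above roughly {lo:.1f}-{hi:.1f} deg/s")

# At 60 fps that is only about 1.8-4.2 degrees per second of on-screen motion;
# doubling the frame rate doubles the range of motion the eye can track while
# keeping retinal slip below the threshold.
```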

One interesting drawback to the move to HDR, McCarthy noted, was the bleaching adaptation that can occur when, for example, an SDR and an HDR commercial are run back to back.

“After watching an SDR commercial or content,” he explained, “it will take a minute or two for the eye to adjust, and, during that time, the viewer will lose the ability to capture HDR information.”

Luminance can also have an interesting impact on the viewer’s experience. As a picture brightens, the brain’s neural responses get faster, there is greater sensitivity to flicker, and stroboscopic effects become more noticeable.

“You can use brightness to evoke a sense of judder or flicker,” McCarthy said. “And luminance can also affect the perceived colour hue as, above 500 nm, things will appear more yellow and, below 500 nm, things will look more blue.”

Another phenomenon is the Stevens Effect of Adaptation, in which the perceived brightness of a stimulus of constant luminance decreases over time.

Regardless of the challenges ahead, McCarthy is bullish on a future where viewers can have a more lifelike viewing experience.

UHD in a hybrid SDI/IP world
It’s no surprise that UHD requires a lot of data to be moved, and the current standard workflow is to tie four 3-Gbps SDI links together to move the UHD payload. And, while transport of UHD over four 3-Gbps SDI circuits (typically called ‘quad-link 3G SDI’) was standardized in 2014, SMPTE is hard at work on improving the situation to make IP-based UHD transport a reality as well.
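As a rough, back-of-the-envelope illustration of why four 3-Gbps links are needed (the sampling and frame-rate assumptions below are ours, not figures from the conference):

```python
# Illustrative payload calculation for first-generation UHD (2160p60),
# assuming 10-bit 4:2:2 sampling; approximations, not SMPTE figures.

width, height = 3840, 2160
bits_per_pixel = 20          # 10-bit 4:2:2 = 10 bits luma + 10 bits chroma per pixel
fps = 60

payload_gbps = width * height * bits_per_pixel * fps / 1e9
print(f"Active-picture payload: ~{payload_gbps:.1f} Gbps")      # ~10.0 Gbps

# A single 3G-SDI link carries a nominal 2.970 Gbps, so the picture plus
# blanking and ancillary overhead is spread across four links (~11.9 Gbps),
# or carried on a single 12-Gbps SDI interface.
quad_link_gbps = 4 * 2.970
print(f"Quad-link 3G-SDI capacity: ~{quad_link_gbps:.2f} Gbps")
```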

On the opening morning, Randy Conrod, senior product manager, digital products, Imagine Communications, and Nigel Seth-Smith, Signal Integrity Products Group, Semtech, offered an overview of UHD in a hybrid SDI/IP world.

“The standards are very flexible and interoperable,” said Conrod, “and it is easy to translate signals from quad 3G to 12-Gbps SDI. That is important because the infrastructure is not limited to the formats it can carry and interface with.”

There are currently three ways to move UHD signals: use SDI signals (like the quad-link 3-Gbps SDI approach mentioned earlier), move to an Ethernet-based system, or adopt compression at every interface. “Time-critical switching using IP is still a niche,” Conrod noted. “A hybrid approach where time is critical can use SDI while non-realtime uses can use IP.”

With respect to UHD over IP, 10-Gbps Ethernet is the dominant high-end technology, and work is being done to allow first-generation UHD (where just the resolution has increased) to be transported over a single 10-Gbps circuit. The use of multiple Ethernet links for a single UHD image is currently not defined, and, more important, the next single-link Ethernet speed has not been decided.

But the addition of higher frame rates or higher dynamic range will require a move to 40-Gbps Ethernet. And, if the desire is to move signals with UHD resolution, twice the frames, and expanded dynamic range, the demands require 100-Gbps Ethernet. The good news is that, this year, the concept of using four 25-Gbps Ethernet connections to move 100 Gbps was proved out.
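Extending the same back-of-the-envelope arithmetic (the formats below are illustrative assumptions, not figures from the presentation) shows why the Ethernet tiers stack up the way Conrod described:

```python
# Illustrative uncompressed payload rates for successive UHD flavours,
# assuming 4:2:2 sampling; the formats chosen are assumptions for illustration.

def payload_gbps(width, height, fps, bit_depth, samples_per_pixel=2):
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

configs = [
    ("UHD 2160p60, 10-bit",           3840, 2160,  60, 10),
    ("UHD 2160p120 HFR, 10-bit",      3840, 2160, 120, 10),
    ("UHD 2160p120 HFR/HDR, 12-bit",  3840, 2160, 120, 12),
]

for name, w, h, fps, depth in configs:
    print(f"{name}: ~{payload_gbps(w, h, fps, depth):.1f} Gbps")

# A single 10GbE link is already marginal for 2160p60, higher frame rates push
# past it towards 40GbE, and extra bit depth, multiple streams and protocol
# overhead are what drive the interest in 100GbE and 4x25-Gbps links.
```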

“Plus, you can take one of those four 25-Gbps Ethernet connections and have a 25-Gbps connection that is suitable for broadcast uses,” said Conrod. “We are still in the early days of standardisation, but there is some action.”

The use of mezzanine compression is another option, with ratios in the range of 2:1 to 20:1, depending on the image format and the interface bandwidth. The problem is that compression will result in latency of at least one frame, an issue for real-time production environments.
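To put the latency point in context, one frame of delay is small on its own but accumulates across every compression hop in a live chain (simple arithmetic; the frame rates are illustrative):

```python
# One frame of codec delay expressed in milliseconds; frame rates illustrative.

def frame_latency_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (50, 60):
    print(f"{fps} fps: one frame = {frame_latency_ms(fps):.1f} ms per compression hop")

# Roughly 17-20 ms per hop adds up quickly when a signal passes through several
# encode/decode stages in a live production chain, which is why latency is the
# main drawback raised against mezzanine compression.
```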

“It is possible today to go all IP for UHD transport, but a hybrid of SDI and IP may be the best,” added Conrod. “It’s a little bit more of a challenge to go all IP. But, whatever the choice, let’s not have analysis paralysis.”

Thomas Edwards, VP, engineering and operations, Fox Networks, described how Fox Sports is already putting IP-based production systems to use for live HD sports production.

“IP enhances the flexibility and agility of the video,” he said. “It is denser than SDI, bidirectional, and agnostic to resolution, bit depth, and frame rate.”

Other benefits include compatibility with Ethernet-network switches and commodity devices and the ability to associate IP streams with groups of media. In addition, network-based registration and discovery of devices means that the system can recognise which devices, such as cameras, are plugged in and ready to go.

The key is the use of VSF TR-03, a proposal for a common protocol that Edwards believes will eventually become a SMPTE standard. It allows the video, audio, and ancillary data to be separated and then forwarded through the IP network and recombined by the production team as needed.

The use of VSF TR-03 also means more signal efficiency. Today’s SDI workflows require as much as 38% of the bandwidth to be dedicated to ancillary data. With VSF TR-03, as many as six 720p or 1080i HD-SDI streams can fit into a 10-Gbps Ethernet circuit; if only the active video area is used, that jumps to seven streams in 1080i and eight in 720p.
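Those stream counts can be sanity-checked with some simple arithmetic (a rough sketch assuming 10-bit 4:2:2 video and roughly 10% of the link reserved for IP/RTP overhead; both assumptions are ours, not Edwards’):

```python
# Rough check of how many HD streams fit in a 10GbE pipe; the 10-bit 4:2:2
# assumption and the 10% overhead margin are illustrative, not from the talk.

LINK_GBPS = 10.0
OVERHEAD_MARGIN = 0.9     # keep ~10% of the link for IP/RTP/packet headers

def stream_gbps(width, height, fps, bits_per_pixel=20):
    return width * height * fps * bits_per_pixel / 1e9

hd_sdi       = 1.485                               # full HD-SDI, including blanking
active_1080i = stream_gbps(1920, 1080, 29.97)      # interlaced: ~30 full frames/s
active_720p  = stream_gbps(1280, 720, 59.94)

budget = LINK_GBPS * OVERHEAD_MARGIN
for label, rate in [("full HD-SDI", hd_sdi),
                    ("active-only 1080i", active_1080i),
                    ("active-only 720p", active_720p)]:
    print(f"{label}: {rate:.2f} Gbps -> {int(budget // rate)} streams per 10GbE link")

# This lands close to the figures quoted above: about six full HD-SDI streams,
# rising to seven (1080i) or eight (720p) once only the active picture is carried.
```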

“It’s time to do away with the horizontal and vertical blanking area,” Edwards said to applause from the audience.
