IBC 2024: Qvest Teams Up with NVIDIA and Arvato
Qvest (10.C24), a consulting and systems integration firm focused on media, has announced a collaboration with NVIDIA to drive AI adoption in the media and entertainment industry. The resulting solutions, powered by generative AI (GenAI), are intended to create new opportunities for revenue generation, customer engagement, and operational efficiency.
In addition, Qvest has introduced a new version of clipbox, its studio server for advanced ingest and playout. With this latest release, clipbox becomes the first studio server to fully support the new renderless workflow from Arvato Systems’ Vidispine team, powered by the Web Render Engine (WRE).
Qvest to accelerate generative AI for M&E businesses in collaboration with NVIDIA
Qvest is expanding its alliance program in artificial intelligence to further position the company as a leader in AI for media businesses. By fostering collaboration across industries and leveraging shared expertise, Qvest delivers tangible GenAI applications that drive ROI.
The collaboration with NVIDIA combines Qvest’s deep expertise in media and entertainment with NVIDIA’s leading AI capabilities and technology platform. This will deliver advanced AI-driven solutions that empower companies to harness the full potential of GenAI.
Using NVIDIA’s AI technologies, Qvest is further developing GenAI offerings in its Applied AI practice that address the key business challenges of media companies and help increase their productivity, promote creativity and optimize processes.
“The collaboration with NVIDIA enables us to quickly deliver powerful use cases for media and entertainment companies. Together, we are helping our customers generate value from AI in areas most impactful for their business,” says Christophe Ponsart, EVP and Applied AI Practice Co-Lead, Qvest US.
A GenAI solution will be unveiled at the IBC Show in Amsterdam that streamlines content management and creation within existing media workflows through:
- Automated Metadata Tagging: Drastically reduces time and costs associated with manual cataloging by automating structured metadata tagging across multiple modalities (text, audio, vision).
- Accelerated Content Discovery: Enables users to quickly find relevant content across and within video assets.
- Empowered Content Generation: Generates and iterates on contextually relevant headlines, scripts, and narratives in seconds to accelerate time to market.
“Media and entertainment companies are looking to accelerate AI adoption to deliver enhanced experiences. By combining NVIDIA’s AI platform with Qvest’s industry expertise, Qvest is empowering customers to quickly move from idea to scalable implementation using the latest developments in AI,” said Richard Kerris, Vice President of Media and Entertainment at NVIDIA.
NVIDIA NIM inference microservices play a critical role in this collaboration, offering seamless hosting of state-of-the-art GenAI models for computer vision, content generation, audio transcription, and more. Companies can choose where to run their models, whether in the cloud or hosted locally, helping ensure that data remains secure while optimizing performance.
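As a minimal, illustrative sketch only: the snippet below shows how an application might call a generative model served by a locally deployed NIM microservice through its OpenAI-compatible API. The endpoint URL, model name, and prompt are hypothetical placeholders and are not part of Qvest's announced solution; the same client code would work against a cloud-hosted endpoint by changing the base URL and API key.

```python
# Sketch: calling a self-hosted NVIDIA NIM microservice via its
# OpenAI-compatible API. Endpoint, model, and prompt are illustrative only.
from openai import OpenAI

# A locally deployed NIM typically exposes an OpenAI-compatible endpoint;
# swap base_url/api_key to target a cloud-hosted service instead.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",  # example model served by the NIM
    messages=[
        {"role": "system", "content": "You are a metadata tagging assistant."},
        {"role": "user", "content": "Suggest structured tags for this transcript: ..."},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)
```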
Qvest is committed to continuing its collaboration with NVIDIA, with plans to integrate NVIDIA Holoscan for Media, an AI-enabled, software-defined platform for live media, for real-time streaming ingestion and content discovery. This integration will further enhance the capabilities of GenAI infrastructure, increasing security, reducing costs, boosting performance, and minimizing operational overhead.
Qvest and Arvato Systems’ Vidispine team partner to enhance newsroom collaboration
Qvest has introduced the new version of clipbox, its studio server for advanced ingest and playout. With this latest release, clipbox becomes the first studio server to fully support Vidispine’s new renderless workflow powered by the Web Render Engine (WRE). For editors and operators in the newsroom environment, this means better collaboration and fewer production delays.
This latest version of clipbox gives users even greater customization of their ingest and studio playout workflows, especially in fast-turnaround news environments where every second counts. It also provides full support for real-time rendering with transition effects, so users can play out sequences directly, without requiring a vision mixer.
Additional benefits include clipbox’s ability to create proxy files in parallel with HD or UHD recording; these proxies can be opened and edited during ingest and remain fully synchronized with the full-resolution assets. This saves editors the considerable time otherwise spent waiting for a recording to be finalized before editing can begin.
“Customers all over the world are already using clipbox for their studio ingest and playout. It’s exciting to think about how much more efficient we’ll be able to help make their workflows with this latest version, while of course continuing to be the most cost-efficient newsroom studio server available,” commented Frank Mistol, Managing Director Qvest Stream GmbH.
“Vidispine’s Web Render Engine revolutionizes production by allowing content to be stored and managed efficiently without creating new files for every sequence. Only metadata is generated when users edit a sequence, eliminating the need for file movement or rendering. This enables multiple editors to work simultaneously on the same content from anywhere,” said Karsten Schragmann, Head of Product Management at Vidispine.
The renderless workflow created by the Web Render Engine, developed by Arvato Systems’ Vidispine team, is designed to save editors significant time, wherever they are located. With it in place, multiple editors can work on the same content at the same time without waiting for any assets to render. Because clipbox supports WRE sequences natively, the result is a uniquely fast turnaround workflow in which no video needs to be rendered in advance; even transitions are executed in real time. Video clips play out from shared memory, removing the need to copy them and the associated time and cost. The workflow also supports playout of sequences while content is still being recorded, enabling time-delayed playback.
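To illustrate the idea behind a renderless workflow, the toy data model below shows a sequence represented purely as metadata referencing source clips, so editing creates no new media files. The class and field names are hypothetical and do not reflect Vidispine’s actual WRE format or clipbox’s internals.

```python
# Toy sketch of a metadata-only ("renderless") sequence. Editing appends
# references into existing source clips; no media is copied or rendered.
# Structure and names are illustrative, not Vidispine's actual format.
from dataclasses import dataclass, field


@dataclass
class Segment:
    clip_id: str          # reference to a source clip already on shared storage
    in_point: float       # seconds into the source clip
    out_point: float
    transition: str = ""  # e.g. "crossfade", applied in real time at playout


@dataclass
class Sequence:
    name: str
    segments: list[Segment] = field(default_factory=list)

    def add(self, segment: Segment) -> None:
        # Editing only appends metadata; the underlying media is untouched.
        self.segments.append(segment)


# Two editors can build sequences against the same source clips concurrently,
# because nothing here duplicates or modifies the underlying media.
rundown = Sequence("evening-news-lead")
rundown.add(Segment("CLIP_0042", 12.0, 27.5))
rundown.add(Segment("CLIP_0043", 3.0, 18.0, transition="crossfade"))
```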
“We’ve always had a great relationship with the Vidispine team and we’re really happy for that to continue as we head to this year’s IBC Show knowing that clipbox is the first studio server to fully support its Web Render Engine,” Frank Mistol adds.
A demo of the partner integration will be available on both the Qvest booth (10.C24) and Vidispine booth (7.A15) at IBC 2024.