Artificial Intelligence (AI) has the potential to revolutionize video, from acquisition through production and distribution to deeper engagement and a more personalized viewer experience. Perhaps the true game-changer for video is the hyper-personalized user experience that wouldn’t exist without AI. IBM’s latest media report highlights how AI lets broadcasters and other media companies engage with viewers like never before.
Imagine the manual workload broadcasters undertake to compile 10-minute recaps of the dozens of live sporting events taking place daily all over the world. Then, factor in what the human eye misses. People can’t possibly catch every play, pitch, goal, fumble or flag.
Highlights, whether in sports, film or television, are the fastest-growing segment of video, with the enterprise video market expected to grow to nearly USD 20 billion by 2023. Aiming to grab more of that burgeoning market, content creators can use AI to analyze massive amounts of video and data.
Breakthrough fan and market engagement
A recent example of how AI is changing the viewer experience is the capture of highlight moments at the 2019 Masters tournament, the first of golf’s four major championships of the year. IBM Watson’s AI solution pulls together highlight packages for fans based on the players they’ve identified as favorites. Clips are assessed and analyzed by sound, player gestures, and emotion; natural language processing seeks out excited verbal cues in broadcast commentary. Highlights are scored and indexed, giving fans quick access to the custom content they want to see.
In addition, a new experience lets viewers watch the best highlights of a competitor’s ‘Round in Three Minutes’. These video packages are enriched with metadata – such as the player’s name and the hole on which the stroke took place – and run against a set of rules that help ensure the best clips are included in the round summary.
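To make the idea concrete, the pipeline described above – score each clip on multimodal excitement signals, then apply selection rules to fill a fixed-length summary – can be sketched as follows. This is a minimal illustration, not IBM’s actual implementation; all field names, weights, and thresholds here are assumptions.

```python
# Illustrative sketch of rule-based highlight selection over
# metadata-enriched clips. Signal names and weights are hypothetical.

def excitement_score(clip):
    """Blend multimodal signals into one score (weights are assumptions)."""
    return (0.4 * clip["crowd_noise"]             # normalized audio level, 0-1
            + 0.3 * clip["gesture_score"]         # e.g. detected fist pump
            + 0.3 * clip["commentary_excitement"])  # NLP cue score from commentary

def round_in_three_minutes(clips, max_seconds=180):
    """Pick the best-scoring clips per hole until the time budget is spent."""
    # Rule: keep at most one clip per hole, the highest-scoring one.
    best_per_hole = {}
    for clip in clips:
        hole = clip["hole"]
        if (hole not in best_per_hole
                or excitement_score(clip) > excitement_score(best_per_hole[hole])):
            best_per_hole[hole] = clip

    # Rule: fill the summary highest score first, respecting the time budget.
    summary, total = [], 0
    for clip in sorted(best_per_hole.values(), key=excitement_score, reverse=True):
        if total + clip["duration"] <= max_seconds:
            summary.append(clip)
            total += clip["duration"]

    # Present the chosen clips in the order they were played on the course.
    return sorted(summary, key=lambda c: c["hole"])
```

A production system would of course compute these signals with trained audio, vision, and language models; the point is that the final assembly step can stay a transparent, rule-driven selection.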
This breakthrough in fan engagement was applied earlier by Fox Sports at the 2018 FIFA World Cup, enabling fans to create and share their own customized soccer highlight videos. Another tournament using IBM Watson AI technology to enrich the fan experience is Wimbledon. In 2017, the Watson AI platform helped create highlights for the first time, and Wimbledon 2018 took it a step further. Watson analyzes a number of signals to spot the key action points – including the noise level of the crowd and even celebrations or gestures from the players – to identify the real highlights of a match.
With the ability to have new conversations with their audiences, broadcasters become a platform for fan and market engagement. Advertisers will take notice, as will telecom companies investing in sports rights to differentiate “quad play” bundled packages of broadband, landline, mobile phone, and TV contracts that increase customer loyalty and average revenue per user (ARPU).
Expanding the value and performance of video
Beyond sports, AI can help drive consumption of digital content by making it easier to classify and find. Almost everyone wrestles with endless menu options when looking for something to watch on streaming video services. The choices offered across platforms are overwhelming. AI and personalization have the potential to help viewers find content they want regardless of platform by pulling content from all sources, be they Netflix, Amazon Prime, HBO, or iTunes. For traditional multichannel video programming distributors (MVPDs) and cable operators, using analytics and AI to understand cloud DVR patterns can create personalized offers and help reduce churn.
Intelligent systems can learn about viewing behavior to understand what the user likes to watch. When combined with technologies like facial recognition—such as using a front-facing camera on a laptop—the system can understand the emotional state of the viewer based on physical reaction and other cues. It would then recommend videos the viewer would most likely enjoy watching, leading to highly personalized media recommendations.
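The recommendation logic described above can be sketched in a few lines: build a taste profile from viewing behavior, then rank unseen titles against it, optionally boosting titles that match a detected mood. This is a hypothetical illustration – the genre tags, completion weighting, and mood boost are assumptions, not any specific product’s API.

```python
# Illustrative sketch of behavior-based recommendation.
# Field names, weights, and the "mood" signal are hypothetical.
from collections import Counter

def build_profile(watch_history):
    """Accumulate genre affinity, weighted by how much of each title was watched."""
    profile = Counter()
    for item in watch_history:
        for genre in item["genres"]:
            profile[genre] += item.get("completion", 1.0)
    return profile

def recommend(catalog, profile, mood=None, top_n=3):
    """Rank catalog titles by profile affinity, boosted by a matching mood."""
    def score(title):
        s = sum(profile.get(g, 0) for g in title["genres"])
        if mood and mood in title["genres"]:
            s *= 1.5  # hypothetical boost when facial cues suggest this mood
        return s
    return sorted(catalog, key=score, reverse=True)[:top_n]
```

Real systems replace the hand-set weights with learned models, but the shape is the same: behavioral signals in, a ranked, personalized list out.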
When Watson went to film school
IBM scientists collaborated with American film studio 20th Century Fox to create the first-ever AI movie trailer for the 2016 sci-fi thriller Morgan. Using IBM Watson APIs and machine learning techniques, the system analyzed the trailers of 100 movies in the horror and thriller genre.
Watson was then fed the full-length film and programmed to make a trailer based on what perceived fear looks and sounds like; it provided the filmmaker with a total of six minutes of footage. The traditional process of creating a movie trailer can take days, but the AI-built trailer required only about 24 hours from the moment the system first watched Morgan to final editing.
While AI can’t fully replicate the human touch creatively, it can optimize workflows and media processes and hyper-personalize the viewer experience to extract more value from video content. Which brings us to some questions for you to consider. What key repetitive media content processes could you replace with AI? And what is your plan to use AI to increase consumer engagement with your video content?
To learn more, download the full report from: ibm.co/aisportsvideo