
Analysing Sports in a Virtual Reality Environment Using Conversational Interfaces


Tools in the sports industry today regularly represent 3-dimensional data, e.g. player or ball positions, on a 2-dimensional plane. Whilst this is accessible to most, and can provide an overview of the information recorded for analysis, it loses a sense of reality and context. It’s easy for an analyst to assess a match from a bird’s-eye view, but that view strips out the speed of play and the pressure a player would have felt in the scenario; it’s difficult to stand in the player’s shoes.

Here, we present our thoughts and a proof of concept that demonstrates the value of using Virtual Reality to explore positional and performance data in sports.

This idea isn’t brand-new. Other companies are exploring the space, and this year Sky Sports debuted the Beyond Sports VR offering on-air.

Our project, though, includes more detailed analysis, creates a more seamless, immersive experience through conversational interfaces and, fundamentally, was built by one developer in just 7 days. This development time included building the 3D model of the stadium, integrating data sources from Opta and TRACAB, and finally interfacing this data with the IBM Cloud services IBM Watson Assistant, Speech-to-Text and Text-to-Speech.
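To give a flavour of the data integration involved, here is a minimal sketch of parsing a TRACAB-style positional frame into per-player coordinates that a 3D scene could consume. The frame layout shown is a simplified illustration, not the exact TRACAB specification, and the field conventions noted in the comments are assumptions.

```python
# Minimal sketch: parse a simplified, TRACAB-style tracking frame.
# The layout below is illustrative only, not the exact TRACAB spec.
from dataclasses import dataclass

@dataclass
class PlayerPosition:
    team: int        # assumed convention: 0 = away, 1 = home
    jersey_no: int
    x_cm: int        # pitch coordinates, assumed to be centimetres
    y_cm: int

def parse_frame(line: str) -> tuple[int, list[PlayerPosition]]:
    """Parse 'frame_id:team,id,jersey,x,y,speed;...' into positions."""
    frame_part, players_part = line.split(":", 1)
    positions = []
    for chunk in players_part.strip(";").split(";"):
        team, _target_id, jersey, x, y, _speed = chunk.split(",")
        positions.append(PlayerPosition(int(team), int(jersey), int(x), int(y)))
    return int(frame_part), positions

frame_id, players = parse_frame("100023:1,25,10,-1234,560,412;0,11,7,875,-90,388")
print(frame_id, players[0])  # 100023 PlayerPosition(team=1, jersey_no=10, ...)
```

At 25 frames per second, frames like these can be stepped through in lockstep with the broadcast footage, which is what keeps the virtual players in sync with the video.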

 

An Additional Sense of Immersion

The core of this demonstration, and a lot of the Virtual Reality work our team develops, is a “Conversational Interface”. Rather than picking from menu options, dropdowns and filters, you converse with your environment, data and analysis.

As such, this project builds on the Conversational UI work we have conducted over the past year with the likes of Leatherhead FC, the Women’s World Cup and the NFL.

Virtual Reality generally uses handheld controllers to navigate around the virtual world. Alternatively, depending on the space available, users can physically move around a real space, with that movement then reflected in the digital one. This is generally very effective at immersing the user in the virtual world.

A screenshot of the Watson Assistant tooling where, in this case, Watson has been trained to recognise a range of entities for the Women’s World Cup.

 

One thing we’ve found, though, is that the immersion of interacting with digital characters breaks down when you have to converse by selecting from pre-defined responses. The alternative we use is IBM’s Watson Unity SDK together with the IBM Cloud services IBM Watson Assistant, Speech-to-Text and Text-to-Speech.
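For illustration, the sketch below runs that voice round trip against the underlying IBM Cloud services using the Python SDK; in the VR app itself the equivalent calls go through the Watson Unity SDK. The API keys, file names and assistant ID are placeholders.

```python
# Sketch of the voice round trip: speech -> text -> intent -> speech.
# Requires the ibm-watson package; keys and IDs below are placeholders.
# Depending on region you may also need <service>.set_service_url(...).
from ibm_watson import SpeechToTextV1, AssistantV2, TextToSpeechV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

stt = SpeechToTextV1(authenticator=IAMAuthenticator("STT_APIKEY"))
assistant = AssistantV2(version="2019-02-28",
                        authenticator=IAMAuthenticator("ASSISTANT_APIKEY"))
tts = TextToSpeechV1(authenticator=IAMAuthenticator("TTS_APIKEY"))

# 1. Transcribe the user's spoken request.
with open("request.wav", "rb") as audio:
    stt_result = stt.recognize(audio=audio, content_type="audio/wav").get_result()
text = stt_result["results"][0]["alternatives"][0]["transcript"]

# 2. Send the transcript to Watson Assistant for intent/entity detection.
session = assistant.create_session(assistant_id="ASSISTANT_ID").get_result()
response = assistant.message(
    assistant_id="ASSISTANT_ID",
    session_id=session["session_id"],
    input={"message_type": "text", "text": text},
).get_result()

# 3. Speak the assistant's reply back to the user.
reply = response["output"]["generic"][0]["text"]
with open("reply.wav", "wb") as out:
    out.write(tts.synthesize(reply, accept="audio/wav",
                             voice="en-US_AllisonV3Voice").get_result().content)
```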

For users new to VR, or in scenarios where you’re exploring very large data sets, current Virtual Reality user interfaces don’t offer intuitive exploration methods. In our demonstration, you just need to ask IBM Watson what you’d like to analyse and see.

Analytics War Room

The experience we’ve developed consists of two key action areas: the football pitch and an executive box/suite in the stadium. The user starts in the executive suite. This room is loosely modelled on the analytics war room that IBM built for the Toronto Raptors a few years ago, which has been a contributing factor in their recent success in winning the franchise’s first ever NBA Championship.

IBM & Toronto Raptors War Room – Used by recruiters, scouts and coaching staff for the NBA’s Raptors.

In our virtual war room, the user has a “god” view of the pitch. Here, they track the players’ movement in sync with the video footage being played on a large wall-sized screen, as well as on a smaller screen on the user’s wrist. They have traditional rewind/fast-forward controls accessible through their controllers, but their primary way of navigating the footage and data is their voice.

If the user wants to watch back and analyse the game’s free kicks, they just need to say, “Show me the free kicks”. Within a second, they have a fully customised playlist ready to go.
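Behind the scenes, a request like this boils down to filtering the event feed and turning the matches into video seek points. A minimal sketch, using a simplified stand-in for an Opta-style event list (the real feed is richer and typed):

```python
# Sketch: turn a recognised "show me the free kicks" intent into a playlist.
# The events list is a simplified stand-in for an Opta-style event feed.
events = [
    {"type": "free_kick", "minute": 23, "second": 14, "player": "Henderson"},
    {"type": "corner",    "minute": 31, "second": 2,  "player": "Alexander-Arnold"},
    {"type": "free_kick", "minute": 67, "second": 45, "player": "Milner"},
]

def build_playlist(events, event_type, lead_in_s=5):
    """Return video seek points (in seconds) for every matching event,
    starting a few seconds early so the user sees the build-up."""
    return [max(0, e["minute"] * 60 + e["second"] - lead_in_s)
            for e in events if e["type"] == event_type]

print(build_playlist(events, "free_kick"))  # [1389, 4060]
```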

“Beam me up, Scotty”

The ability to converse with the virtual environment makes for a frictionless user experience. There are no endless menus or screens to interact with; the user can simply say what they want to see, and it happens. With the quantity and complexity of data on hand in the experience, utilising Speech-to-Text in this way means that the user can jump straight to the point of action that they care about.

Within a Virtual Reality environment, immersion is paramount. Existing sports VR experiences require a third party to control which segments a user is watching. This results in the VR user conversing with that third party to request changes, who in turn has to control the VR experience via an external source like a laptop or PC.

In contrast, in our experience the user simply has to ask, for example, “Show me every corner by Jordan Henderson in the second half”.

Not only does our experience allow for command and control of the data; users can also control their movement. In addition to the standard VR teleportation mechanism for movement using the controllers, the user can move themselves using the conversational interface with something like “Take me down to the pitch”, or “Show me this from Jamie Vardy’s perspective”.
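Conceptually, each movement utterance resolves to a camera position in the scene. The sketch below shows that mapping in the abstract; the intent names, coordinates and helper function are hypothetical, and in Unity the result would drive the camera rig.

```python
# Sketch: map Watson Assistant intents to vantage points in the scene.
# Intent names, coordinates and the helper below are all hypothetical.
VIEWPOINTS = {
    "goto_war_room": (0.0, 40.0, -55.0),  # executive suite, "god" view
    "goto_pitch":    (0.0, 1.8, 0.0),     # pitch level, eye height
}

def resolve_viewpoint(intent, entities, player_positions):
    """Pick a camera position: a named viewpoint, or a tracked player's
    current position for 'show me this from X's perspective'."""
    if intent == "goto_player_view":
        name = entities["player"]            # e.g. "Jamie Vardy"
        x, y = player_positions[name]        # from the tracking feed
        return (x, 1.8, y)                   # stand where the player stood
    return VIEWPOINTS[intent]

print(resolve_viewpoint("goto_player_view", {"player": "Jamie Vardy"},
                        {"Jamie Vardy": (12.5, -30.0)}))  # (12.5, 1.8, -30.0)
```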

Now, the user has the ability to explore the pitch from the player’s perspective. This gives analysts, or even fans in the context of a more fan-tailored experience, the opportunity to experience the action as perceived through the players’ eyes.

The opportunity to witness a player’s decision-making from the god view (3D), the video, the bird’s-eye view (2D) and the player’s own view on the pitch will give analysts a whole new insight into the feedback they provide to players. Players will respect that feedback more, knowing the analyst has greater empathy for the player in that particular game state. It also provides an interface for the players to review clips and in-game footage themselves, as they experienced it on the day.

The conversational interface makes this all incredibly seamless to explore and navigate, opening the door to not just the analysts using tools like this, but also the players, coaches and even fans.

