IoT ruins movies: The Mummy 


Setting the scene: The Mummy

I wouldn’t fancy a job as one of The Mummy’s archaeologists. Or even your run-of-the-mill treasure-seeking opportunist along for the ride, for that matter. The working environment is all wrong, for one thing. Enclosed tombs with shaky ceilings, booby traps and an atmosphere of questionable chemical composition make for an off-putting office space. Even the library is beset with unsecured lofty and heavy stacks which can (and do) topple with only a slight push.

I think these folks need some help from the IoT to provide a nice, safe working environment. That still won’t save them from the murderous, preternaturally all-powerful, reanimated hell-beast that is the film’s antagonist, but hey, you can’t have everything.

The movie: The Mummy

The Mummy is a pleasingly gruesome 1999 action-adventure with just a smidge of comedy to take the edge off the reanimated corpses. Loosely based on the 1932 film of the same title, this newer version was written and directed by Stephen Sommers and stars Rachel Weisz as Egyptology expert Evie, John Hannah as her bumbling elder brother Jonathan and Brendan Fraser as pistol-toting all-American hero Rick O’Connell.

The action begins in Thebes, Egypt in 1290 BC, when High Priest Imhotep has an ill-advised affair with the Pharaoh’s mistress Anck-su-Namun, kills the Pharaoh and tries to bring his now-dead lover back to life. As punishment, Imhotep is buried alive at Hamunaptra, city of the dead, with only flesh-eating scarabs to keep him company. His tomb is sealed with a curse, and if awoken from his undead state, he will arise ‘a living disease, wielding the power of ages’ to bring an everlasting plague on humanity.

Killing him might have been a better plan.

Of course, thousands of years later, Imhotep’s corpse is disturbed and accidentally reanimated. Carnage ensues, and only Evie’s mastery of Egyptian mythology can bind him once again.

Digital archives and the power of Watson

Our first IoT-enabled health and safety challenge is in the library archive of Cairo’s Museum of Antiquities. In an early scene, Evie overbalances while replacing a book and knocks over every shelf stack in the room, narrowly escaping being squashed and burying valuable records beneath mountains of others.

A secondary digital archive could have stored a copy of these precious records to protect them from physical mishap. It would have been easy to cross-reference too, thanks to Watson’s awesome analytical power, which is capable of ingesting and analysing not just gigabytes and terabytes of data, but exabytes – even zettabytes.

If a ‘zettabyte’ means nothing to you, try this on for size: if an 11oz coffee equals one gigabyte, a zettabyte would have the same volume as the Great Wall of China. This is crazy big data, and it’s the job of cognitive computing systems like the Watson IoT platform to make sense of it and mine it for specific information or queries.
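To put that coffee analogy in numbers, here’s a quick back-of-the-envelope sum using decimal (SI) prefixes:

```python
# Rough scale comparison using decimal (SI) prefixes.
GIGABYTE = 10**9    # bytes
ZETTABYTE = 10**21  # bytes

# If one gigabyte is a single 11oz coffee, how many coffees is a zettabyte?
coffees_per_zettabyte = ZETTABYTE // GIGABYTE
print(f"A zettabyte is {coffees_per_zettabyte:,} 'coffees' of data")
# A zettabyte is 1,000,000,000,000 'coffees' of data
```

A trillion coffees: plenty, by volume, to stack up against the Great Wall of China.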

So Evie would have her historical records, tomes, and texts at her fingertips, with very little likelihood of being crushed beneath them.

Image recognition tech and sensors: or, ‘how to spot a booby trap’

Let’s move past the library and into the newly rediscovered tombs beneath Hamunaptra.

Some of the minor (and therefore expendable) characters in The Mummy have the misfortune to open an ancient case in their search for treasure. Naturally, it’s booby trapped, and they encounter a mysterious substance which promptly melts them, skin, hair and bones, into mush. The casket was adorned with hieroglyphs, which none of them could read. Shame really, as they probably spelled out a dire warning.

A few months ago, Google released a significant update to its Word Lens tool that enables real-time translations of symbols and images into the language of your choice. This means that it can recognise pictorial or calligraphic languages like Japanese, and dare I say it, Ancient Egyptian, through image recognition techniques.

Had our luckless friends scanned the casket’s hieroglyphs with Word Lens, a handy translation would have popped up: “Do not open. Bad things will happen.” Or something to that effect.
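Word Lens’s real pipeline is proprietary and far more sophisticated, but the core idea – recognised symbols mapped to phrases in your language – can be sketched with a simple lookup. Everything below (the glyph names, the warnings, the `translate_glyphs` helper) is invented purely for illustration:

```python
# Toy sketch of symbol-to-phrase translation. The glyph sequences and
# phrases here are invented; a real system would use image recognition
# to identify glyphs before any lookup or translation happens.
GLYPH_PHRASES = {
    ("ankh", "seated-man", "viper"): "Do not open.",
    ("owl", "reed", "scarab"): "Bad things will happen.",
}

def translate_glyphs(recognised):
    """Return the phrase for a recognised glyph sequence, if known."""
    return GLYPH_PHRASES.get(tuple(recognised), "[untranslated inscription]")

print(translate_glyphs(["ankh", "seated-man", "viper"]))  # Do not open.
```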

They might have benefited from a quick sensor kit-up of the working environment, too. An unhealthy or unbreathable atmosphere would be detected and its source pinpointed before the inevitable mush.
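As a rough sketch of what that atmosphere check might look like in code – with sensor names and safe limits that are illustrative assumptions, not real occupational thresholds:

```python
# Minimal atmosphere-monitoring sketch. Sensor names and 'safe' ranges
# below are illustrative assumptions only.
SAFE_LIMITS = {
    "oxygen_pct": (19.5, 23.5),  # assumed breathable range
    "co2_ppm": (0, 5000),
    "h2s_ppm": (0, 10),
}

def check_atmosphere(readings):
    """Return an alert for every reading outside its safe range."""
    alerts = []
    for sensor, value in readings.items():
        low, high = SAFE_LIMITS[sensor]
        if not (low <= value <= high):
            alerts.append(f"ALERT: {sensor} at {value}, safe range {low}-{high}")
    return alerts

print(check_atmosphere({"oxygen_pct": 16.0, "co2_ppm": 1200, "h2s_ppm": 40}))
```

A real deployment would poll physical sensors on an interval and push alerts to a platform like Watson IoT, but the threshold logic is the heart of it.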

Look, no hands! Augmented reality glasses for real-time data

What we need now is an information system or dashboard that can collect and interpret data from Word Lens and the various sensors and deliver it to the diggers in real time. Hands-free would be pretty nifty, since they’re probably going to be busy with torches, picks, hammers and other equipment.

So let’s give our diggers some AR glasses to help them ‘see’ hieroglyph translations and sensor data via a Heads Up Display (HUD). Vital information would be continually delivered to the periphery of the display, enabling them to keep an eye on sensor data and the conditions of their working environment while still being able to see what they are doing.
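One way to sketch that HUD feed is a small priority queue that surfaces the most urgent event first. The `HudFeed` class, the event messages and the priority numbering are all assumptions for illustration, not a real AR API:

```python
# Sketch of a HUD feed that merges translation and sensor events,
# showing the most urgent first. Priorities are an assumption:
# lower number = more urgent (0 = danger, 2 = informational).
from heapq import heappush, heappop

class HudFeed:
    def __init__(self):
        self._queue = []
        self._counter = 0  # tie-breaker keeps same-priority order stable

    def push(self, priority, message):
        heappush(self._queue, (priority, self._counter, message))
        self._counter += 1

    def next_message(self):
        """Pop and return the most urgent message, or None if empty."""
        return heappop(self._queue)[2] if self._queue else None

feed = HudFeed()
feed.push(2, "Translation: 'Do not open.'")
feed.push(0, "DANGER: toxic gas detected near casket")
print(feed.next_message())  # the danger alert surfaces first
```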

Safe as houses. Now to tackle the impending apocalypse.

IoT ruins other movies

What flicks would you like to give the IoT ruins movies treatment? We’ve written a few others including Thelma and Louise, and Batman: The Dark Knight Rises.

Let us know your suggestions in the comments below!

