August 24, 2021 By Pearl Chen 3 min read

Humans have been exploring the ocean for thousands of years, but now the power of AI can help unlock its mysteries more than ever. To commemorate the 400th anniversary of the Mayflower’s trans-Atlantic voyage, the Mayflower Autonomous Ship (MAS) will repeat the same journey—this time without any people onboard. The world’s first full-size autonomous ship will study uncharted regions of the ocean, and an AI Captain will be at the helm.

From Plymouth, England to Plymouth, Massachusetts, the crewless vessel will use explainable AI models to make accurate navigation decisions. The ship will collect live ocean data, delivering valuable research that can inform policies for climate change and marine conservation. Through IBM technologies, MAS makes all of this possible by advancing three areas vital to a successful mission: talent, trust, and data.

The future of the ocean is at stake

More than 3.5 billion people depend on the ocean as a primary source of food, and ocean shipping carries about 90% of global trade. Since the 1980s, however, the ocean has absorbed 90% of the excess heat from global warming, endangering life both below and above the seas.

Protecting the ocean starts with understanding more data about its ecosystem, but this undertaking requires massive investment. MAS reduces the need for enormous resources in ocean research by using data and AI to augment human work (talent), navigate safely while meeting maritime regulations (trust), and foster collaboration that develops actionable insights (data).

Talent: Saving time and costs for scientists

A typical ocean research expedition can take six weeks with as many as 100 scientists onboard, yet often only one week is spent on actual research. The rest of the time goes to traveling to and from destinations and, at times, managing bad weather and rough seas.

“Traditional research missions can be very expensive, limited in where they can explore and take a long time to collect data,” says IBM researcher Rosie Lickorish, who spent time on the RRS James Cook as part of her Master’s in oceanography.

MAS significantly cuts down time and costs for scientists. A solar-powered vessel, it travels independently to collect data in remote and dangerous regions of the ocean. Researchers back on land can download live data and images synced to the cloud, such as whale songs or ocean chemistry detected by an “electronic tongue” called HyperTaste.

“With AI-powered sensors onboard that can analyze data as it’s collected, scientists can access more meaningful insights at greater speed,” says Lickorish. “The cost of data for our experts is low, in time as well as money.”

Trust: Navigating accurately with explainable AI

A combination of technologies helps MAS travel with precision: a vision and radar system scans the ocean and delivers data at the edge; an operational decision manager (ODM) enforces collision regulations; a decision optimization engine recommends next best actions; and a “watch dog” system detects and fixes problems.

This entire system makes the AI Captain intelligent, allowing it to make trusted navigational decisions driven by explainable AI. Rules-based decision logic in ODM validates and corrects the AI Captain’s actions. A log tracks exactly which initial conditions were fed into ODM, which path it took through the decision forest, and which outcome was reached. This makes debugging and analyzing the AI Captain’s behavior vastly easier than with the “black box” AI systems that are common today.
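The idea of rule-based validation plus a decision log can be illustrated with a minimal sketch. This is not the MAS codebase; the rule, field names and the `validate_action`/`DecisionLog` helpers are all hypothetical, chosen only to show how each fired rule and outcome can be recorded for later audit:

```python
from dataclasses import dataclass, field

@dataclass
class DecisionLog:
    """Records the inputs, the rules that fired, and the final outcome,
    so every navigational decision can be audited after the fact."""
    inputs: dict
    rules_fired: list = field(default_factory=list)
    outcome: str = ""

def validate_action(proposed_heading, contacts, log):
    """Apply a collision-avoidance rule to the AI Captain's proposed heading
    and return a possibly corrected heading, logging each rule that fires."""
    heading = proposed_heading
    for contact in contacts:
        # Illustrative COLREGs-style rule: give way to a close vessel
        # crossing from starboard by turning 30 degrees to starboard.
        if contact["bearing"] == "starboard" and contact["range_nm"] < 1.0:
            log.rules_fired.append("give_way_starboard")
            heading = (heading + 30) % 360
    log.outcome = f"heading {heading}"
    return heading

log = DecisionLog(inputs={"proposed_heading": 90, "contacts": 1})
new_heading = validate_action(90, [{"bearing": "starboard", "range_nm": 0.5}], log)
# log now shows which rule fired and why the heading changed from 90 to 120
```

Because the log captures inputs, rule path and outcome together, a single entry is enough to reconstruct why the ship turned, which is the essence of the explainability described above.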

Safety and compliance are key. For example, decision optimization through CPLEX on IBM Cloud Pak for Data, a unified data and AI platform, helps the ship decide what to do next. CPLEX considers constraints such as obstacles (their size, speed, and direction), weather, and remaining battery charge. It then suggests routes to ODM, which validates them or advises another course.
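The shape of that optimization step can be sketched as choosing the best candidate route subject to hard constraints. This toy stand-in does not use the CPLEX API; the thresholds, field names and the weighting in the objective are illustrative assumptions, meant only to show the pattern of hard constraints plus a soft objective:

```python
def pick_route(candidates, battery_kwh):
    """Return the lowest-cost candidate route that satisfies all hard
    constraints, or None if no route is feasible (deferring to a fallback)."""
    feasible = [
        r for r in candidates
        if r["min_obstacle_clearance_nm"] >= 0.5   # hard safety margin around obstacles
        and r["energy_kwh"] <= battery_kwh         # must fit the remaining battery budget
        and r["max_wave_height_m"] <= 4.0          # weather limit for safe operation
    ]
    if not feasible:
        return None
    # Soft objective: minimize a weighted blend of travel time and energy use.
    return min(feasible, key=lambda r: r["hours"] + 0.1 * r["energy_kwh"])
```

In the real system the solver proposes such a route and ODM then validates it against the collision regulations, accepting it or advising another course.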

“ODM keeps the AI Captain honest and obeying the ‘rules of the road,’” says Andy Stanford-Clark, IBM Distinguished Engineer and IBM Technical Lead for MAS.

Data: Fostering collaboration for better insights

Once the mission is complete, researchers will use IBM Cloud Pak for Data to store data, apply governance rules to enhance data quality, manage user access and analyze data for actionable insights.

Having all data managed by a unified platform can enable greater collaboration for various project teams across ten countries. In addition, organizations and universities around the world can partner with the research teams, forming a grassroots coalition to advance measures that curb pollution and climate change.

Ready to be a Mayflower?

The challenges MAS tackles are not unique: saving time and costs, making trustworthy predictions, and solving complex data problems. Organizations in industries like banking, healthcare, transportation and more pursue these same goals every day.

With the help of data and AI innovations from IBM Cloud Pak for Data, they might just become a “Mayflower” too. Learn more about the platform or schedule a consultation.
