Real-time data analytics using IBM Predictive Maintenance and Quality

For capital-intensive, asset-based industries such as oil and gas exploration and production, you need access to real-time production figures and accurate predictions of future production. Learn how to use IBM Predictive Maintenance and Quality to load production data in real time, aggregate data, predict production, and populate the data store to refresh dashboards.


Saurabh Gupta (sgupta18@in.ibm.com), Solution Architect, IBM

Saurabh Gupta is a solution architect at IBM with more than a decade of experience creating architecture for and evangelizing solutions. He has played key roles in several projects — from project planning to implementation for various customers in growth markets. He is currently associated with the Predictive and Business Analytics Industry Solution and Services group at IBM India. He plays an important role in creating business analytics solutions for predictive asset optimization.



Seba Kauser (seba_kauser@in.ibm.com), Author, IBM

Seba Kauser works with Global Business Services at IBM India and is currently part of the Predictive and Business Analytics Industry Solution and Services group. Working with the chemical and petroleum community, she uses the IBM Predictive Maintenance and Quality solution to meet the requirements of chemical and petroleum customers. Her interests include application integration and middleware using WebSphere products in various industry domains.



13 May 2014


Overview

To apply business analytics to any situation, a current snapshot of the data in the source systems is recorded and analyzed. The current analytics are correlated with historical data to study trends and build predictive models. To correlate current and historical data in this way, the analytical solution must integrate with the source system in real time and keep the analytical data store and the historical data up to date.

For example, in the upstream (exploration and production) segment of the oil and gas industry, a production supervisor wants to analyze the latest figures for the oil and gas being produced at a remote site and measure the performance of an offshore well against the latest reported production figures.

The IBM Predictive Maintenance and Quality solution includes software components that perform the advanced analytics required for this kind of scenario. One of the components, the integration bus, is integrated with the source system to populate the Predictive Maintenance and Quality analytical data store.

The event trigger mechanism of the integration bus is used to load the production data and raw events from the Production Data Management System (PDMS) database. An event trigger is a technique for receiving updates to one or more tables within a database in real time. When an event is triggered, the resulting changes become visible in the enterprise dashboard and in reports accessible through the GUI.
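
As an illustration of this trigger-and-poll pattern, the following Python sketch polls a DATA_UPDATE_EVENT table for rows that have not yet been synchronized, loads the corresponding production data, and then marks the event rows as read. The Timestamp and Synchronized columns follow the description in this article; the SQLite connection and the PRODUCTION_DATA table are assumptions made for the sketch, since the actual solution performs this work with the DatabaseInput node of the integration bus.

    # Sketch: poll the DATA_UPDATE_EVENT table and load updated production data.
    # Column names (Timestamp, Synchronized) follow the article; the database
    # connection and the PRODUCTION_DATA query are illustrative assumptions.
    import sqlite3

    def poll_update_events(conn):
        cur = conn.cursor()
        # Find event rows that have not yet been synchronized.
        cur.execute(
            "SELECT Timestamp FROM DATA_UPDATE_EVENT WHERE Synchronized = 'False'"
        )
        for (event_ts,) in cur.fetchall():
            # Load the production data that was updated for this timestamp.
            rows = conn.execute(
                "SELECT Well_Id, Timestamp, Production_m3 "
                "FROM PRODUCTION_DATA WHERE Timestamp = ?", (event_ts,)
            ).fetchall()
            process_production_rows(rows)   # hand off for transformation
            # Mark the event as read so it is not processed again.
            conn.execute(
                "UPDATE DATA_UPDATE_EVENT SET Synchronized = 'True' "
                "WHERE Timestamp = ?", (event_ts,)
            )
        conn.commit()

    def process_production_rows(rows):
        for row in rows:
            print("fetched production record:", row)

    # Example usage (assumes the PDMS tables exist in the connected database):
    # conn = sqlite3.connect("pdms.db")
    # poll_update_events(conn)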


Architecture of the solution

The sample solution described in this article is built on the IBM Predictive Maintenance and Quality solution, a packaged, preconfigured, cross-industry business analytics solution. As shown below, the Predictive Maintenance and Quality solution includes different software components.

Figure 1. Architecture of the IBM Predictive Maintenance and Quality solution

Solution workflow

The real-time data analytics solution adds upstream oil and gas industry content by interfacing with various software components of the Predictive Maintenance and Quality solution. The sequence diagram shown below illustrates the interfaces and the sequence of activities.

Figure 2. Workflow and interfaces between the solution components


The solution includes the following components:

  • Production Data Management System (PDMS)— Populated using a web application designed to distribute petroleum-related information.
  • DATA_UPDATE_EVENT table— Used to capture the updates made in PDMS by the web application. A new event is written to this table each time the database is updated.
  • Database adapter— The DatabaseInput node responds to events in the database. It is triggered when updates are made to the database view, based on the timestamp. The node checks the Timestamp column (foreign key) to determine which production date was updated and loads the required production data from PDMS. After the data is completely fetched and persisted, the node sets the Synchronized column (primary key) to True to indicate that the data has been read successfully.
  • Custom mediation flow— Fetches updates from PDMS for the required application table by matching the Timestamp field in the event and application tables, and sets the Synchronized flag from False to Part. The data object is then converted to XML, and the XML is written to a queue. Based on the Timestamp field, a select fetches updates from the other application tables in the same data store, and all such messages are processed and persisted in the same queue. The Synchronized flag is then set from Part to True. Finally, the production data is transformed into Predictive Maintenance and Quality event files and stored in the Predictive Maintenance and Quality event directory by stored procedures that hold the transformation logic for each type of production data (a sketch of this transformation appears after this list).
  • Queue— The IBM WebSphere® queue, which holds all production data updates received from PDMS in the form of XML messages.
  • Predictive Maintenance and Quality event directory— The default directory, which is the eventdatain folder that holds the events to be processed by the Predictive Maintenance and Quality mediation flow.
  • Predictive Maintenance and Quality mediation flow— Stores or records the events in the Predictive Maintenance and Quality database by correlating them with information present in the master data and the metadata tables of Predictive Maintenance and Quality.
  • Predictive Maintenance and Quality database— Database server that contains the analytic data store. The data store acts as an event store and holds calculated key performance indicators (KPIs) in the form of aggregated events and profiles. It also contains supporting master data and metadata information for the solution.
  • IBM SPSS® Modeler— Creates predictive and forecasting models. The models are then deployed to SPSS Collaboration and Deployment Services, where they are available to be called as scoring web services. Listing 1 shows a sample of the scoring web service.
    Listing 1. Sample scoring web service
    <getScore>
        <scoreRequest id="GasProdForecast">
                  <input name="Well_Id" value="Schola-F-3"/>
                  <input name="TimeStamp" value="2012-08-01T15:59:00"/>
                  <input name="Production_m3" value="166137"/>
        </scoreRequest>
    </getScore>

    These calls are made from the Predictive Maintenance and Quality mediation flow to produce the forecast value and the deviation value. The same flow handles the forecast value and records it in the Predictive Maintenance and Quality database.
  • SPSS Decision Management— Used to develop the decision-making process and the thresholds for taking action using rules. With SPSS Decision Management, rules can be authored, tested, optimized, and deployed to SPSS Collaboration and Deployment Services, where they become available to be called as decision management web services. Listing 2 shows a sample of the decision management web service.
    Listing 2. Sample of a decision management web service
    <getScore>
             <scoreRequest id="GasWellRecommendation">
                <input name="Well_Id" value="Schola-F-3"/>
                <input name="TimeStamp" value="2012-07-15"/>
                <input name="POP_Flag" value="0"/>
             </scoreRequest>
     </getScore>
  • IBM Maximo®— A maintenance application that supports the creation of work orders through a self-generated web service. The Predictive Maintenance and Quality mediation flow calls the Maximo work order web service to create a work order when a recommendation for this action is received. Listing 3 shows a sample of the Maximo web service.
    Listing 3. Sample of a Maximo web service
    <WORKORDER action="AddChange">
        <SITEID>Schola</SITEID>
        <ASSETNUM>Schola-F</ASSETNUM>
        <DESCRIPTION>Urgent Inspection</DESCRIPTION>
    </WORKORDER>

    After the work order is created in Maximo, it becomes visible to authorized individuals.
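
To make the hand-off into the event directory more concrete, the following Python sketch converts a fetched production record into an event-style XML document and writes it to the eventdatain folder. The element and attribute names are illustrative assumptions only; the actual event format expected by the Predictive Maintenance and Quality mediation flow is defined by the product, and in the solution this transformation is performed by the stored procedures and the integration bus rather than by a script.

    # Sketch: write a production record as an XML event file into the
    # Predictive Maintenance and Quality event directory (eventdatain).
    # The element names used here are illustrative assumptions, not the
    # product's actual event schema.
    import os
    import xml.etree.ElementTree as ET

    EVENT_DIR = "/var/PMQ/eventdatain"   # assumed location of the event directory

    def write_production_event(well_id, timestamp, production_m3):
        event = ET.Element("event", {"type": "production"})
        ET.SubElement(event, "asset_id").text = well_id
        ET.SubElement(event, "timestamp").text = timestamp
        observation = ET.SubElement(event, "observation",
                                    {"measurement_type": "Production_m3"})
        observation.text = str(production_m3)

        filename = "production_%s_%s.xml" % (well_id, timestamp.replace(":", "-"))
        path = os.path.join(EVENT_DIR, filename)
        ET.ElementTree(event).write(path, encoding="utf-8", xml_declaration=True)
        return path

    # Example with the values shown in Listing 1:
    # write_production_event("Schola-F-3", "2012-08-01T15:59:00", 166137)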

Advanced analytics

SPSS Modeler is used to calculate the production forecast based on the historical production data of oil and gas wells. After the model is trained with one year of historical data, actual production values are processed by SPSS Modeler to forecast production for the following week. The forecast gives the production supervisor a view of future production values and of the predicted production loss with respect to the target production values.

An autoregressive integrated moving average (ARIMA) model is used to estimate the forecast value. The modeler also evaluates the deviation of the forecast value from the actual production value and sends a flag to the decision-management component. Based on the value of the flag, the decision-management component provides a recommendation as to whether an urgent inspection of the well or platform is required.
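
In the solution, the forecasting is implemented in SPSS Modeler, but the underlying technique can be sketched in Python with the statsmodels library. The sketch below fits an ARIMA model to a daily production series, forecasts the next seven days, and raises a flag when the actual value deviates from the forecast by more than an assumed threshold. The ARIMA order, the threshold, and the flag logic are assumptions for illustration, not the configuration used in the product.

    # Sketch: ARIMA forecast of daily production and a simple deviation flag.
    # The ARIMA order (1,1,1) and the 10% deviation threshold are assumptions.
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    def forecast_production(history: pd.Series, steps: int = 7) -> pd.Series:
        """Fit an ARIMA model to historical production and forecast ahead."""
        model = ARIMA(history, order=(1, 1, 1))
        fitted = model.fit()
        return fitted.forecast(steps=steps)

    def deviation_flag(actual: float, forecast: float, threshold: float = 0.10) -> int:
        """Return 1 when actual production deviates from the forecast by more
        than the threshold fraction, otherwise 0."""
        if forecast == 0:
            return 0
        return int(abs(actual - forecast) / abs(forecast) > threshold)

    if __name__ == "__main__":
        # Synthetic example: one year of daily production values.
        index = pd.date_range("2011-08-01", periods=365, freq="D")
        history = pd.Series([160000 + (i % 30) * 100 for i in range(365)], index=index)
        next_week = forecast_production(history)
        print(next_week)
        print("flag:", deviation_flag(actual=166137, forecast=float(next_week.iloc[0])))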

The Actual vs. Forecast key performance indicator is available through an IBM Cognos® Business Intelligence dashboard. Cognos Framework Manager uses the Predictive Maintenance and Quality analytical data store to construct the key performance indicator. The actual production volume is displayed as it is loaded from the PDMS and is shown alongside the forecast value provided by the SPSS Modeler scoring service. This comparison helps the production supervisor gain insight into whether the oil or gas well is performing according to the trend.
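
For completeness, the following Python sketch shows how a client could invoke a deployed scoring service with the request payload from Listing 1, using only the standard library. The endpoint URL is a placeholder, and in the solution this call is made by the Predictive Maintenance and Quality mediation flow against SPSS Collaboration and Deployment Services, so the transport and authentication details are assumptions.

    # Sketch: post the Listing 1 scoring request to a deployed scoring service.
    # The endpoint URL is a placeholder; authentication and the exact transport
    # used by SPSS Collaboration and Deployment Services are not shown.
    import urllib.request

    SCORING_URL = "http://cnds-host:9080/scoring/GasProdForecast"  # placeholder

    REQUEST_XML = """<getScore>
        <scoreRequest id="GasProdForecast">
            <input name="Well_Id" value="Schola-F-3"/>
            <input name="TimeStamp" value="2012-08-01T15:59:00"/>
            <input name="Production_m3" value="166137"/>
        </scoreRequest>
    </getScore>"""

    def call_scoring_service() -> str:
        request = urllib.request.Request(
            SCORING_URL,
            data=REQUEST_XML.encode("utf-8"),
            headers={"Content-Type": "text/xml"},
        )
        with urllib.request.urlopen(request) as response:
            return response.read().decode("utf-8")

    # print(call_scoring_service())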


Conclusion

This article gives an overview of the components involved in performing predictive analytics with the IBM Predictive Maintenance and Quality solution. Using the example of an upstream oil and gas scenario, it shows how to use IBM Predictive Maintenance and Quality to monitor, analyze, and report on information gathered from capital assets and operational processes and to recommend optimization activities for those assets and processes. The same approach can be applied in other industries, such as mining and steel production.

To implement a similar solution in your own environment, include the following aspects in your planning process:

  • Review your existing processes and assets to determine whether you have the infrastructure required to support business analytics.
  • Study the data source system used in the remote production database. Understand the data structure of the application and events.
  • Identify the interfaces and protocols that connect to the source system.
  • Freeze the visualization layer and the business rules before implementing the solution.

Resources

Discuss

  • Get involved in the PMQ Practitioners Community to share knowledge, ideas, solutions and experiences around IBM Predictive Maintenance and Quality.
