To apply business analytics to any situation, a current snapshot of data is recorded in the source systems and analyzed. The current analytics are correlated with the historical data to study trends and build predictive models. To correlate analytics based on current and historical data, you must integrate an analytical solution with the data in the source system in real time and update the analytical data store and the historical data.
For example, in the upstream (exploration and production) segment of the oil and gas industry, a production supervisor wants to analyze the latest production figures for the oil and gas being produced at a remote site, and measure the performance of an offshore well with respect to the latest reported production figures.
The IBM Predictive Maintenance and Quality solution includes software components that perform the advanced analytics required for this kind of scenario. One of the components, the integration bus, is integrated with the source system to populate the Predictive Maintenance and Quality analytical data store.
The event trigger mechanism of the integration bus is used to load the production data and raw events from the Production Data Management System (PDMS) database. The event trigger mechanism is a technique that enables users to receive updates to a single table or to multiple tables within a database in real time. Users can view the changes in the enterprise dashboard and in reports (accessible through the GUI) as actions or events are triggered.
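The event-trigger pattern described above can be sketched with a small in-memory database: updates to a PDMS table are mirrored into a DATA_UPDATE_EVENT table, which a consumer polls for rows it has not yet synchronized. The table and column names (DATA_UPDATE_EVENT, Timestamp, Synchronised) follow the article; the exact schema and the use of sqlite3 are assumptions for illustration only.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE DATA_UPDATE_EVENT (
    id INTEGER PRIMARY KEY,
    source_table TEXT,
    Timestamp TEXT,
    Synchronised INTEGER DEFAULT 0)""")

# A database update writes a new event row (in PDMS this would be
# done by a database trigger rather than by hand).
conn.execute("INSERT INTO DATA_UPDATE_EVENT (source_table, Timestamp) "
             "VALUES ('PRODUCTION_DATA', '2012-08-01T15:59:00')")

def fetch_new_events(db):
    """Return unsynchronised events and mark them as read."""
    rows = db.execute("SELECT id, source_table, Timestamp "
                      "FROM DATA_UPDATE_EVENT "
                      "WHERE Synchronised = 0").fetchall()
    for event_id, _, _ in rows:
        db.execute("UPDATE DATA_UPDATE_EVENT SET Synchronised = 1 "
                   "WHERE id = ?", (event_id,))
    return rows

events = fetch_new_events(conn)
print(events)   # one pending event, now marked as synchronised
```

A second call to fetch_new_events returns nothing, because the consumer only ever sees each update once: the same behavior the integration bus relies on to keep the analytical data store in step with PDMS.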
Architecture of the solution
The sample solution described in this article is built on the IBM Predictive Maintenance and Quality solution, a packaged, preconfigured, cross-industry business analytics solution. As shown below, the Predictive Maintenance and Quality solution includes different software components.
The real-time data analytics solution adds upstream oil and gas industry content by interfacing with various software components of the Predictive Maintenance and Quality solution. The sequence diagram shown below illustrates the interfaces and the sequence of activities.
The solution includes the following components:
- Production Data Management System (PDMS) — Populated using a web application designed to distribute petroleum-related information.
- DATA_UPDATE_EVENT table — Used to capture the updates made in PDMS by the web application. A new event is written to this table each time the database is updated.
- Database adapter — The DatabaseInput node is used to respond to events in a database. It is triggered when updates are made to the database view, based on the timestamp. This node checks the Timestamp column (foreign key) to determine which production date was updated and loads the required production data from PDMS. After this data is completely fetched and persisted, the node sets the Synchronised column (primary key) to True to indicate that the data has been successfully read.
- Custom mediation flow — Fetches updates from PDMS for the required application table based on the Timestamp field match in both the event and application tables. This sets the Synchronized flag to Part. Next, the data object is converted to XML, and the XML is written to a queue. Based on the Timestamp field, a select is done to fetch updates from the other application tables in the same data store, and all such messages are processed and persisted in the same queue. Next, the state of the Synchronized flag is set to Part. After that, the production data is transformed into Predictive Maintenance and Quality event files and stored in the Predictive Maintenance and Quality event directory using the respective stored procedures, which hold the transformation logic for each type of production data.
- Queue — The IBM WebSphere® queue, which holds all production data updates received from PDMS in the form of XML messages.
- Predictive Maintenance and Quality event directory — The default directory, the eventdatain folder, which holds the events to be processed by the Predictive Maintenance and Quality mediation flow.
- Predictive Maintenance and Quality mediation flow — Stores or records the events in the Predictive Maintenance and Quality database by correlating them with information present in the master data and the metadata tables of Predictive Maintenance and Quality.
- Predictive Maintenance and Quality database — Database server that contains the analytical data store. The data store acts as an event store and holds calculated key performance indicators (KPIs) in the form of aggregated events and profiles. It also contains supporting master data and metadata information for the solution.
- IBM SPSS® Modeler — Creates predictive and forecasting models. The models are then deployed to SPSS Collaboration and Deployment Services, where they are available to be called as scoring web services. Listing 1 shows a sample of the scoring web service.

Listing 1. Sample scoring web service

<getScore>
  <scoreRequest id="GasProdForecast">
    <input name="Well_Id" value="Schola-F-3"/>
    <input name="TimeStamp" value="2012-08-01T15:59:00"/>
    <input name="Production_m3" value="166137"/>
  </scoreRequest>
</getScore>
These calls are made from the Predictive Maintenance and Quality mediation flow to produce the forecasted value and the deviation value. The same flow handles the forecasted value and records it in the Predictive Maintenance and Quality database.
- SPSS Decision Management — Used to develop the decision-making process and the thresholds for taking action using rules. With SPSS Decision Management, rules can be authored, tested, optimized, and deployed to SPSS Collaboration and Deployment Services, where they become available to be called as decision management web services. Listing 2 shows a sample of the decision management web service.

Listing 2. Sample of a decision management web service

<getScore>
  <scoreRequest id="GasWellRecommendation">
    <input name="Well_Id" value="Schola-F-3"/>
    <input name="TimeStamp" value="2012-07-15"/>
    <input name="POP_Flag" value="0"/>
  </scoreRequest>
</getScore>
- IBM Maximo® — A maintenance application that supports the creation of work orders through a self-generated web service. The Predictive Maintenance and Quality mediation flow calls the Maximo work order web service to create a work order when a recommendation for this action is received. Listing 3 shows a sample of the Maximo web service.

Listing 3. Sample of a Maximo web service

<WORKORDER action="AddChange">
  <SITEID>Schola</SITEID>
  <ASSETNUM>Schola-F</ASSETNUM>
  <DESCRIPTION>Urgent Inspection</DESCRIPTION>
</WORKORDER>
After the work order is created in Maximo, it becomes visible to authorized individuals.
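A caller of the Maximo work order web service has to assemble a payload in the shape shown in Listing 3. The sketch below builds that payload with Python's standard xml.etree module, using the values from the article; the real integration posts it to Maximo's self-generated web service, whose endpoint and SOAP envelope details are not shown here, so this sketch stops at constructing the XML.

```python
import xml.etree.ElementTree as ET

def build_work_order(site_id, asset_num, description, action="AddChange"):
    """Build a WORKORDER payload in the shape of Listing 3."""
    wo = ET.Element("WORKORDER", attrib={"action": action})
    ET.SubElement(wo, "SITEID").text = site_id
    ET.SubElement(wo, "ASSETNUM").text = asset_num
    ET.SubElement(wo, "DESCRIPTION").text = description
    return ET.tostring(wo, encoding="unicode")

payload = build_work_order("Schola", "Schola-F", "Urgent Inspection")
print(payload)
```

Generating the XML programmatically, rather than by string concatenation, guarantees the payload stays well-formed even when descriptions contain characters that need escaping.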
SPSS Modeler is used to calculate the production forecast based on the historical production data of oil and gas wells. After the model is trained with one year of historical data, actual production values are processed by SPSS Modeler to predict the forecast value for the following week. The forecast gives the production supervisor a view of future production values and the predicted production loss relative to the target production values.
An autoregressive integrated moving average (ARIMA) model is used to estimate the forecast value. The modeler also evaluates the deviation of the forecasted value from the actual production value and sends a flag to the decision-management component. Based on the value of the flag, the decision-management component provides a recommendation as to whether an urgent inspection of the well or platform is required.
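The deviation check feeding the decision-management component can be sketched as follows. The real solution fits an ARIMA model in SPSS Modeler; here a simple moving average stands in for the forecast, and the 10% tolerance threshold is an assumption chosen purely for illustration.

```python
def forecast_next(history, window=4):
    """Naive stand-in for the ARIMA forecast: mean of the last `window` values."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def deviation_flag(actual, forecast, tolerance=0.10):
    """Return 1 when actual production deviates from the forecast by more
    than the tolerance, signalling the decision-management component."""
    deviation = abs(actual - forecast) / forecast
    return 1 if deviation > tolerance else 0

history = [160000, 162500, 165000, 166137]   # weekly production, m3
forecast = forecast_next(history)
print(deviation_flag(140000, forecast))   # prints 1: shortfall, flag raised
print(deviation_flag(165000, forecast))   # prints 0: on trend, no flag
```

The binary flag mirrors the POP_Flag input in Listing 2: the scoring side reduces a continuous deviation to a simple signal that the decision-management rules can act on.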
The Actual vs. Forecast key performance indicator is available through an IBM Cognos® Business Intelligence dashboard. Cognos Framework Manager uses the Predictive Maintenance and Quality analytical data store to construct the key performance indicator. The actual production volume is displayed when it is loaded from PDMS and is shown in comparison to the forecasted value, which is provided by the SPSS Modeler scoring service. This comparison helps the production supervisor gain insight into whether the oil or gas well is performing according to the trend.
This article gives an overview of the components involved in performing predictive analytics using the IBM Predictive Maintenance and Quality solution. Using the example of an upstream oil and gas scenario, it shows how to use IBM Predictive Maintenance and Quality to monitor, analyze, and report on information gathered from capital assets and operational processes, and how to recommend optimization activities for those assets and processes. The same approach can be applied to other industries, such as mining and steel production.
To implement a similar solution in your own environment, include the following aspects in your planning process:
- Review your existing processes and assets in terms of whether you have the infrastructure required to support business analytics.
- Study the data source system used in the remote production database. Understand the data structure of the application and events.
- Identify the interfaces and protocols that connect to the source system.
- Freeze the visualization layer and the business rules before implementing the solution.
- Learn more about the solution in the IBM Predictive Maintenance and Quality information center.
Get products and technologies
- View a demo on how IBM Predictive Maintenance and Quality helps spot problems before they happen so you can plan for, rather than react to, asset failure.
- Explore the IBM Predictive Maintenance and Quality solution, which helps you maximize asset productivity and operational performance.
- Get involved in the PMQ Practitioners Community to share knowledge, ideas, solutions and experiences around IBM Predictive Maintenance and Quality.