
The struggle for data quality is real for asset and facilities management leaders


Today’s facilities are swiftly moving towards a smarter ecosystem, with thousands of physical asset attributes to maintain and manage. To support this complex infrastructure, data management is crucial, yet it has historically been a major challenge for higher education facilities. Without quality data management, organizations struggle to define mission-critical information and make well-supported decisions. This blog explores common data quality challenges and how to approach them.

Common data quality challenges

A facility can only reach optimal performance if its data management plan supports alignment and transparency. Let’s take a look at a common example of the impact of poor data management:

A maintenance organization defines a primary mission objective as ‘maintaining assets to provide their designed intent.’ The mission is to deliver service and support to customers, whether manufacturing groups or end users. The team identifies asset uptime as one measure closely aligned with this objective. Uptime information is captured through a long-standing business process, and past initiatives have attempted to ensure the data is available in the Enterprise Asset Management (EAM) system.

Asset uptime information is captured by entering data into work order records when maintainers or operators report or address failures. The uptime numbers are not consistent, and customers report the opposite of what is shown in the EAM. Maintenance personnel verbally report higher numbers than the EAM data shows and are frustrated by criticism based on the recorded figures. Uptime reports are generated by multiple people in the organization, but it is not clear to everyone how the measures are calculated or where the presented data comes from.

A quick look at the EAM data shows that some work orders where uptime (or downtime) is reported do not have the correct asset identified. Many work order fields are left unpopulated even though the records are marked complete, and no description of the failures or how they were addressed is present on the records. In this case, data quality and confidence are low, even though the measure is identified as a Key Performance Indicator (KPI) for the organization.
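Record-quality gaps like these can be caught with a simple automated audit. Here is a minimal sketch in Python, assuming a hypothetical work-order structure; the field names (status, asset_id, failure_description, downtime_hours) are illustrative, not an actual EAM schema:

```python
# Sketch: flag completed work orders that are missing the fields
# needed to trust an uptime KPI. The schema used here is
# illustrative only, not a real EAM data model.

REQUIRED_FIELDS = ("asset_id", "failure_description", "downtime_hours")

def flag_incomplete(work_orders):
    """Return (id, missing_fields) for completed records with gaps."""
    flagged = []
    for wo in work_orders:
        if wo.get("status") != "COMPLETED":
            continue  # only audit records reported as complete
        missing = [f for f in REQUIRED_FIELDS if wo.get(f) in (None, "")]
        if missing:
            flagged.append((wo["id"], missing))
    return flagged

work_orders = [
    {"id": 1, "status": "COMPLETED", "asset_id": "PUMP-01",
     "failure_description": "Seal leak", "downtime_hours": 4.5},
    {"id": 2, "status": "COMPLETED", "asset_id": None,
     "failure_description": "", "downtime_hours": 2.0},
]

print(flag_incomplete(work_orders))
# [(2, ['asset_id', 'failure_description'])]
```

Running a check like this regularly is one way to surface, in one list, exactly which “completed” records cannot support the uptime KPI.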

The organization in the above example is struggling to meet its strategic objectives because of poor data management. It is hampered by four common challenges:

  • Lack of guidance from leadership on Master Data Management
  • Increasing amounts of data with no objective or quality-driven purpose
  • Lack of confidence in the data available in enterprise systems
  • Lack of alignment of data requirements to organizational goals and business processes

So, how can asset and facilities managers tackle challenges like these? First, and most importantly, process data needs to be made easily accessible. Once the data are available in clear, concise views, they can better support decision making by offering a clear insight into performance issues and trends.

4 steps to achieving quality data management

Below are four recommendations for quality data management:

  1. Create a Data Management Plan
  2. Define which objectives and processes are critical to your strategy, and the data needed to measure and monitor them
  3. Locate process gaps by defining what a “good” record is in your enterprise systems
  4. Identify improvement initiatives that are aligned to your processes and objectives, and measure their ROI

Let’s address each of these in turn.

Step 1: define your objectives

Data management includes defining which data are critical to the mission and operations of your organization. It means identifying sources, consumers, permissible values, and analysis capability to support decision making. Managing the data includes rules or business processes for capture or entry, removing duplication, validating values or presence before committing to the system, and standardizing data types and values to simplify these tasks.
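As a rough illustration of capture-time rules like these, the sketch below covers presence checks, permissible values, duplicate detection, and value standardization. The fields, statuses, and function names are hypothetical, not tied to any specific EAM product:

```python
# Sketch of validate-before-commit rules: presence, permissible
# values, duplicate detection, and standardization. All names here
# are illustrative assumptions, not a real system's API.

PERMISSIBLE_STATUSES = {"OPEN", "IN_PROGRESS", "COMPLETED"}

def standardize(record):
    """Normalize string values so later checks are simpler."""
    record = dict(record)  # do not mutate the caller's copy
    for field in ("asset_id", "status"):
        value = record.get(field)
        if isinstance(value, str):
            record[field] = value.strip().upper()
    return record

def validate_record(record, existing_ids):
    """Return a list of rule violations; an empty list means OK to commit."""
    errors = []
    if not record.get("asset_id"):
        errors.append("asset_id is required")
    if record.get("status") not in PERMISSIBLE_STATUSES:
        errors.append("status must be one of %s" % sorted(PERMISSIBLE_STATUSES))
    if record.get("id") in existing_ids:
        errors.append("duplicate record id")
    return errors

record = standardize({"id": 7, "asset_id": " pump-01 ", "status": "open"})
print(record["asset_id"])                  # PUMP-01
print(validate_record(record, {1, 2, 3}))  # []
```

The design point is that standardization runs before validation, so rules can be written against one canonical form instead of every variant users might type.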

Organizations need to know their mission and objectives first; only then can they define what data is needed, when, and by whom. Setting a mission objective defines why you are doing what you are doing. Many data, software, or business initiatives do not closely couple the ‘why’ with their objectives, and this misalignment is a source of pain and dissatisfaction not only for employees, but for customers and consumers too.

Step 2: validate, monitor and improve

Once objectives are known, data requirements are defined, and expectations are set, the process of validation, continuous monitoring, and improvement can begin. This is best done by providing everyone with a single source of clear, concise performance information. Good performance happens when trust and transparency are present.

Once performance data is clearly available, exceptions and anomalies become easy to identify. The journey to data quality and confidence is made in small steps, so do not expect performance and quality out of the box. Simple measures, such as whether attribution is available on work records for assets, will immediately start to show whether data quality and confidence meet expected levels. You may be completing work on time, but if only 40% of the work records contain failure data or maintenance logs, the value of the work record and the system is diminished.
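A completeness measure like that 40% figure takes only a few lines to compute. A sketch, again assuming a hypothetical failure_description field on work records:

```python
# Sketch: share of work records with a populated field, as a quick
# data-quality measure. The failure_description field is an
# illustrative assumption, not a known schema.

def completeness_pct(records, field):
    """Percentage of records where `field` has a non-empty value."""
    if not records:
        return 0.0
    populated = sum(1 for r in records if r.get(field) not in (None, ""))
    return 100.0 * populated / len(records)

work_records = [
    {"id": 1, "failure_description": "Bearing wear"},
    {"id": 2, "failure_description": ""},
    {"id": 3},
    {"id": 4, "failure_description": "Motor overheated"},
    {"id": 5, "failure_description": None},
]

print(completeness_pct(work_records, "failure_description"))  # 40.0
```

Tracking a handful of such percentages over time turns “we don’t trust the data” into a concrete trend you can act on.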

Step 3: document tribal knowledge to see future rewards

Tribal (or undocumented) knowledge within an organization can be a source of value – but it needs to be retained and used effectively. By defining what data should be present, and logging historical trends around performance, we can embed that knowledge so that it stays available for future decision making.

Any system tracking performance must be able to harness and provide access to insights that people create while using the data in their day-to-day work.

Continuing the journey to data quality

In my next post, we will work through the pain points and issues identified here as the journey to data quality and confidence continues. The desired destinations? Engaged employees and customers, transparent expectations and results, continuous improvement, and performance and mission success.

Watch a demo on IBM Maximo.

Learn more about Cohesive Solutions.

 

By Joe Lonjin on October 30, 2018