September 11, 2019 By Dan Sutherland 3 min read

Let’s be clear — we do not intend to bury data governance, but rather to place it within the context of a more comprehensive approach: data enablement.

The goal of data governance — ensuring the quality of an organization’s data across the data lifecycle — is noble enough. But the processes involved often end up stymieing transformative progress and success. At their worst, these processes are black holes that drain resources from the organization and yield no tangible benefits.

Data governance still holds value, but it is only one piece of the puzzle. Enterprises today should shift their focus to the larger picture: data enablement. By building a program around data enablement, enterprises can ensure that the right data is delivered to the right resource at the right time. Data enablement requires innovative thinking, vision, people, processes and technologies.

Most importantly, clean data delivered in real time — the underlying promise of data enablement — is essential to the cognitive enterprise, which must be able to leverage AI and analytics within critical workflows to drive processes and decision-making. And the benefits of the cognitive enterprise make the transformative impact of data enablement worth any initial discomfort.

A data quagmire

Organizations have been told for years that within the vast amounts of data generated by users, apps, processes and devices lies the key to understanding customers and markets, identifying opportunities and streamlining operations. So in that regard, traditional governance programs begin with good intentions.

But things quickly get bogged down by committee-bred chaos, lack of accountability and murky data lakes containing every bit of data the organization has collected, all of which diminish data quality and usefulness. Research firm Gartner estimates that poor data quality costs organizations an average of $15 million per year.

When everyone in the organization simply dumps application-specific data into a data lake with the vague notion that it will someday be of some unspecified value, they’re virtually assuring the data will be of little use and even less value. Mostly, it will be a waste of money. It’s no surprise, then, that data governance has “a negative connotation for many executives,” as the Harvard Business Review writes.

Enter data enablement

Data enablement is an active means of data management that relies on defined and enforced data policies. It provides the real-time ability to deploy automated and active data validation, classification and management, which comes with complete and visible data lineage and associated metadata. It incorporates a comprehensive and searchable data catalog (updated in real time), which enables self-service data access and data discovery.
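As a rough sketch of what such a real-time, searchable catalog might look like in practice (every class, field and name below is hypothetical, invented for illustration, and not any specific product's API):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: a catalog entry carries lineage and metadata alongside
# the data asset itself, so discovery, classification and validation stay in sync.

@dataclass
class CatalogEntry:
    name: str
    owner: str                                   # "you create it, you own it"
    classification: str                          # e.g. "public", "internal", "restricted"
    lineage: list = field(default_factory=list)  # upstream sources this asset derives from
    registered_at: str = ""

class DataCatalog:
    """A minimal, searchable in-memory catalog, updated as assets are registered."""

    def __init__(self):
        self._entries = {}

    def register(self, entry: CatalogEntry) -> None:
        # Updated in real time: the entry is visible to search immediately.
        entry.registered_at = datetime.now(timezone.utc).isoformat()
        self._entries[entry.name] = entry

    def search(self, term: str) -> list:
        """Self-service discovery: match on asset name or classification."""
        term = term.lower()
        return [e for e in self._entries.values()
                if term in e.name.lower() or term in e.classification.lower()]

catalog = DataCatalog()
catalog.register(CatalogEntry(
    name="orders_daily",
    owner="order-service",
    classification="internal",
    lineage=["orders_raw", "customers_master"],
))
print([e.name for e in catalog.search("internal")])  # ['orders_daily']
```

The point of the sketch is the shape of the metadata, not the storage: because owner, classification and lineage travel with every registered asset, downstream users can discover data on their own and trace where it came from.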

None of this works, however, without assigning ownership and responsibility for the data to whoever creates it. Under this “you create it, you own it” principle, the responsibility for data quality and cleanliness falls squarely on the initiating entity — whether an application, a person or a business unit — and not on any downstream department or application.

All data enablement policies and associated actions are driven by that principle, resulting in a level of accountability rarely seen in data governance initiatives. The architecture relies on real-time monitoring and SLA-driven KPIs to enforce real-time data validation and to efficiently integrate, curate and optimize enterprise data. That means no more data swamps and no more costly, unsuccessful post-processing data cleanup initiatives.

If a data enablement validation fails, suspect data is put into an exception queue and ruled out of play until whoever owns the data takes corrective action. And because everything is reported in real time, data validation problems can draw immediate attention and concern from the C-suite. That’s a strong incentive for data owners to stay on top of things.
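The flow described above can be pictured with a minimal sketch (the rule names, record fields and queue structure here are invented for illustration, under the assumption of a simple per-record policy check):

```python
# Hypothetical sketch of the validation flow: records that fail a policy check
# are quarantined in an exception queue, tagged with their owning entity, and
# stay out of play until the owner takes corrective action.

def validate(record: dict) -> list:
    """Return a list of policy violations; an empty list means the record passes."""
    errors = []
    if not record.get("customer_id"):
        errors.append("missing customer_id")
    if record.get("amount", 0) < 0:
        errors.append("negative amount")
    return errors

def ingest(records: list, owner: str):
    """Route each record to the clean store or to the owner's exception queue."""
    clean, exceptions = [], []
    for record in records:
        errors = validate(record)
        if errors:
            # Suspect data is ruled out of play and attributed to its owner.
            exceptions.append({"record": record, "owner": owner, "errors": errors})
        else:
            clean.append(record)
    return clean, exceptions

clean, exceptions = ingest(
    [{"customer_id": "C1", "amount": 100},
     {"customer_id": "", "amount": -5}],
    owner="billing-app",
)
print(len(clean), len(exceptions))  # 1 1
```

Because each quarantined record carries its owner, the real-time reporting the article describes falls out naturally: a dashboard need only count exception-queue entries per owner to show the C-suite who is behind on cleanup.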

Implementing data enablement

Enterprises can take the following steps to establish a successful and lasting data enablement program:

  • Identify and prioritize data enablement initiatives by greatest business value
  • Make data enablement a part of the culture by entrusting enforcement to the data owner (with appropriate incentives from the responsible executive)
  • Define the roadmap, framework, guardrails, patterns and tools needed to empower delivery
  • Make post-processing data cleansing a publicized exception with no preallocated budget

Data enablement is proactive and dynamic. It emphasizes empowerment, innovation and the immediate consumption of business value. As such, it is the clear path forward to the cognitive enterprise.

Learn more about IBM’s big data and data platform services
