Every day, tens of thousands of patients in the US seek care for new or existing conditions. Behind the scenes, a complex web of information about health records, benefits, coverage, eligibility, authorization, and other aspects plays a crucial role in the type of medical treatment patients receive and how much they spend on prescription drugs. This means large amounts of data are produced, stored, and exchanged every second, and that data is subject to inefficiencies and gaps in access between patients, providers, and payers, given the inconsistencies in how healthcare data interoperability standards are implemented. In the US, these inefficiencies contribute to growing healthcare system waste and to challenges in delivering cost-effective, quality care.

For over 20 years, the discussion of how to address this challenge has permeated the industry without a clear resolution. In 2020, the Centers for Medicare and Medicaid Services (CMS) published a rule requiring that patients, providers, and payers be able to easily exchange information. The rule laid out an interoperability journey that supports seamless data exchange between payers and providers alike, enabling future functionalities and technically incremental use cases. Since 2021, health insurance companies, also known as payers, which set service rates, collect payments, process claims, and pay healthcare provider claims, have been obligated to comply with the interoperability requirements set in 2020. These requirements enable the exchange of important data between healthcare payers and providers.

Establishing a clear interoperability framework is foundational to enabling administrative simplification, one of the five provisions of the Health Insurance Portability and Accountability Act of 1996 (HIPAA). This provision intends to reduce paperwork and streamline business processes across the health care system, leveraging technology to save time and money. With 63% of physicians reporting signs of burnout, and 47% of clinicians planning to leave their jobs in the next two to three years, this provision could not be more timely and relevant than it is right now.

When combined with artificial intelligence (AI), an interoperable healthcare data platform has the potential to bring about one of the most transformational changes in history to US healthcare, moving from a system in which events are currently understood and measured in days, weeks, or months into a real-time inter-connected ecosystem.

Why is data interoperability an imperative?

Simply put, a healthcare ecosystem where all stakeholders can easily exchange information enables payers and providers to partner more effectively to deliver high-quality, cost-effective care. The return on investment (ROI) from efficiencies gained, reduced unnecessary medical spend, and improved member experience scores can reach hundreds of millions of dollars for a mid-sized payer with 3 million members.

Realizing the benefits of the business case, however, can be a daunting task for stakeholders in the healthcare ecosystem, especially considering the number of requirements and standards that need to be assessed and complied with, including the implementation of the Fast Healthcare Interoperability Resources (FHIR) standard for exchanging health care information. CMS recognizes the importance of FHIR in advancing interoperability and national standards to reduce administrative burden.

As healthcare providers and payers independently assess the capabilities, maturity, and architectural patterns necessary for FHIR adoption, along with the cost of implementation and the impact of adoption on current business processes and analytics, IBM is witnessing different rates of adoption and vastly different enterprise architecture implementation patterns across the industry.

Four levels of maturity in the interoperability implementation

In our view, achieving the goals put forward by CMS and other entities requires a flexible, modular framework of capabilities that supports the ability to first integrate data from disparate healthcare sources, then conform, standardize, and link this information in a common canonical format. Once persisted in a common canonical format, the data is made available to downstream consumers in a standardized format through APIs. This is shown in the graphic below, where each layer or “ring” supports a new range of use cases, expansion of data, and new technologies.
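The integrate-then-conform-then-serve pattern can be sketched in a few lines. This is an illustrative example only, using hypothetical record shapes and source names ("claims", "ehr"); a production platform would use a real FHIR server and mapping engine rather than in-memory dictionaries.

```python
# Sketch of the layered pattern: records from two disparate sources are
# conformed into one canonical shape, then served through a simple
# API-style lookup function. Field names are hypothetical.

def conform(record: dict, source: str) -> dict:
    """Map a source-specific record into a common canonical shape."""
    if source == "claims":
        return {"patient_id": record["member_id"], "code": record["proc_code"]}
    if source == "ehr":
        return {"patient_id": record["mrn"], "code": record["cpt"]}
    raise ValueError(f"unknown source: {source}")

# Persist conformed records in one canonical store.
canonical_store = [
    conform({"member_id": "P1", "proc_code": "99213"}, "claims"),
    conform({"mrn": "P1", "cpt": "83036"}, "ehr"),
]

def get_patient_records(patient_id: str) -> list[dict]:
    """Stand-in for a standardized API over the canonical store."""
    return [r for r in canonical_store if r["patient_id"] == patient_id]

print(get_patient_records("P1"))
```

The key design point is that downstream consumers only ever see the canonical shape, so adding a new source system means writing one new mapping, not changing every consumer.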

Ring 1 is the base of the interoperability platform and provides the capabilities necessary to ingest, standardize and integrate data from disparate sources to create the initial Longitudinal Patient Record (LPR). This “ring” of the solution includes key components for data acquisition, terminology standardization, patient matching (master data management), and persistence of the data in FHIR format.
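A drastically simplified sketch of the Ring 1 capabilities follows. The local-to-standard code map and the deterministic name-plus-birth-date match key are both assumptions for illustration; real terminology services and master data management use curated code systems and probabilistic matching. The output shape follows the FHIR R4 Patient resource.

```python
# Ring 1 sketch: terminology standardization, naive patient matching,
# and persistence of demographics as a minimal FHIR Patient resource.

# Hypothetical map from a local lab code to a LOINC code.
LOCAL_TO_LOINC = {"GLU-LOCAL": "2345-7"}

def standardize_code(local_code: str) -> str:
    """Translate a source-system code to a standard code where known."""
    return LOCAL_TO_LOINC.get(local_code, local_code)

def match_key(family: str, given: str, birth_date: str) -> tuple:
    """Naive deterministic match key; real MDM is probabilistic."""
    return (family.lower(), given.lower(), birth_date)

def to_fhir_patient(family: str, given: str, birth_date: str) -> dict:
    """Persist demographics as a minimal FHIR R4 Patient resource."""
    return {
        "resourceType": "Patient",
        "name": [{"family": family, "given": [given]}],
        "birthDate": birth_date,
    }

# Records from two sources resolve to the same person under this key:
same = match_key("Doe", "Jane", "1980-04-02") == match_key("DOE", "jane", "1980-04-02")
print(same, to_fhir_patient("Doe", "Jane", "1980-04-02"))
```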

Ring 2 expands the capabilities of the FHIR data platform to perform calculation of Data Exchange for Quality Measures (DEQM). These capabilities are needed to establish patient attribution, identify individual patients with gaps in care, and update the patient care plan with the actions necessary to address patient risks and care gaps. This ring also supports inserting actionable insights and care plan updates directly into the provider care flow within the Electronic Medical Record (EMR).
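A toy sketch of a gap-in-care check in the spirit of DEQM is below, under one assumed, simplified measure: attributed diabetic patients should have an HbA1c result within the measurement period. Real measure calculation follows published measure specifications rather than a single hard-coded rule.

```python
# Ring 2 sketch: flag attributed patients whose record lacks an
# in-period HbA1c observation (LOINC 4548-4). Measure logic is a
# simplified stand-in for a real DEQM quality measure.

MEASUREMENT_PERIOD = ("2024-01-01", "2024-12-31")
HBA1C = "4548-4"

def has_gap(observations: list[dict]) -> bool:
    """True if no HbA1c observation falls within the measurement period."""
    start, end = MEASUREMENT_PERIOD
    return not any(
        o["code"] == HBA1C and start <= o["date"] <= end for o in observations
    )

# Hypothetical attributed population with their observations.
attributed = {
    "P1": [{"code": HBA1C, "date": "2024-06-15"}],
    "P2": [{"code": HBA1C, "date": "2022-03-01"}],  # stale result
}
gaps = [pid for pid, obs in attributed.items() if has_gap(obs)]
print(gaps)
```

In a full implementation, the flagged gap would drive an update to the patient's care plan and surface as an actionable insight inside the provider's EMR workflow.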

Ring 3 uses the capabilities of Ring 1 and Ring 2, including the platform's data integration capabilities for terminology standardization and person matching, to break down two long-standing silos in the US healthcare system: physical health and behavioral health. FHIR provides a single standard that promotes combining the two and understanding a person's health status, goals, care needs, and socioeconomic conditions. The result is the ability to create a care plan that addresses “whole person” needs.

Ring 4 supports the five key provisions to improve health information exchange and achieve appropriate, necessary access to complete health records for patients, healthcare providers, and payers, including the automation of currently manual processes, which would greatly benefit from new technologies like AI. These provisions are set forth in the proposed CMS rule Advancing Interoperability and Improving Prior Authorization Processes (CMS-0057-P).

Realizing the benefits of interoperability in prior authorization

The next, and one of the more important, steps in the interoperability journey is leveraging the data to deliver more cost-effective, high-quality patient care without creating unnecessary administrative complexity.

This is why interoperability is crucial to transforming prior authorization, a process implemented by healthcare payers in utilization management programs that address high-cost medical procedures and medications. Under prior authorization, healthcare providers must demonstrate that the care being provided to patients is both medically necessary and compliant with the latest evidence-based clinical quality guidelines. To achieve this without impacting patient care, payers and providers need to exchange information in real time.
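The core of an automated prior-authorization check can be sketched as rules-based triage. Everything in this example is hypothetical: the procedure code "MRI-LUMBAR", the criteria table, and the requirement of prior physical therapy are illustrative stand-ins for a payer's actual medical-necessity criteria, which in practice are far richer and often require clinical review.

```python
# Sketch of prior-authorization triage: if the submitted diagnoses and
# prior-therapy evidence satisfy the (hypothetical) criteria for the
# requested procedure, auto-approve; otherwise pend for clinical review.

CRITERIA = {
    # procedure -> required ICD-10 diagnoses and conservative-therapy rule
    "MRI-LUMBAR": {"required_dx": {"M54.5"}, "requires_prior_pt": True},
}

def triage(request: dict) -> str:
    rule = CRITERIA.get(request["procedure"])
    if rule is None:
        return "pend"  # no automated criteria; route to a reviewer
    dx_ok = bool(rule["required_dx"] & set(request["diagnoses"]))
    pt_ok = request["completed_physical_therapy"] or not rule["requires_prior_pt"]
    return "auto-approve" if dx_ok and pt_ok else "pend"

print(triage({"procedure": "MRI-LUMBAR", "diagnoses": ["M54.5"],
              "completed_physical_therapy": True}))
```

The value of real-time interoperability is that the inputs to a function like this can be pulled directly from the provider's EHR rather than extracted from faxed documents, turning a days-long review into a sub-second response for clear-cut cases.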

However, the inconsistent adoption of interoperability standards across the healthcare industry, combined with physician burnout and incidence of adverse outcomes because of delays in obtaining approvals to provide needed care, is causing friction among patients, payers, providers, and regulators.

This has also led to a proliferation of point solutions in the market, pushing the boundaries of innovation. Many of these solutions leverage AI, specifically machine learning (ML) and natural language processing (NLP), to enable intelligent workflows that can automate the process of validating medical necessity and compliance with clinical quality guidelines, based on patient clinical data either extracted from documents submitted by healthcare providers or obtained through interoperability with electronic health record (EHR) systems. Generative AI promises to take this solution pattern a step further, particularly with its ability to better handle unstructured data.

Ultimately, while the technology and interoperability standards are there to enable real-time information exchange to automate prior authorization, value remains trapped by fundamental challenges in how clinical data is captured and stored, as well as in how medical necessity criteria and clinical quality guidelines are created and stored.

How IBM can help

Transforming interoperability and prior authorization from end to end is easier said than done. Payers and providers need the right combination of people, processes, and technology to execute it. In an environment where resources are limited and the stakes are high, the value of partnering with a systems and process integrator with the breadth and depth of capabilities IBM offers is indispensable.

That is why IBM developed a comprehensive strategy and approach to guide our healthcare clients in driving value through real end-to-end digital transformation, bringing the best of what the market has to offer together with our differentiated technology and consulting capabilities.

One aspect that makes IBM unique is our ability to leverage our clients' existing investments in IBM technologies and our world-class software development capabilities to fill gaps that are otherwise not addressed by off-the-shelf solutions. This enables our clients to benefit from the power of one IBM, bringing Technology and Consulting together in service of their needs, all the way from advisory to execution to operationalization.
