
Interoperability in Healthcare: Data Pipeline Automation to Achieve FHIR Standards

Learn to achieve healthcare interoperability in a hybrid IT environment by supporting data transfers between your FHIR server and applications, both in the cloud and on-prem.


The words interoperability and healthcare seem synonymous these days, thanks to the 21st Century Cures Act. For those who are not familiar with it, the Cures Act was passed by the US federal government in 2016. The goal of the legislation was to standardize the accessibility and availability of patient data and to drive positive patient outcomes within the value-based care model.

As of 2021, healthcare providers, payers, health information networks, and health information exchanges are all required to participate in interoperable patient information sharing.

In short, the interoperability rule dictates what data must be shared and how that data must be shared. Most importantly, the rule puts patients in control of their own data, enabling them to decide who it is shared with.

This blog takes a deep dive into how IT teams address interoperability in healthcare. In addition, we offer insights into leveraging IT automation to achieve compliance with this increasingly important regulatory requirement.

FHIR Standards in Healthcare 

The Fast Healthcare Interoperability Resources (FHIR) standard was developed to support the electronic transfer of healthcare information and protect patient data privacy and security. 

In its simplest form, FHIR defines a standard set of RESTful APIs that enable applications to communicate with one another regardless of the backend system. These communication standards are implemented in a FHIR server, which is designed to easily share data across the complex ecosystem of providers and payers.
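
To make that concrete, here is a minimal sketch of a FHIR API interaction in Python. The base URL and patient ID are placeholders; the request shape is the same for any FHIR R4-compliant server.

    # Fetch a single Patient resource over FHIR's RESTful API.
    # The base URL below is a placeholder for your FHIR server.
    import requests

    FHIR_BASE = "https://fhir.example.org/r4"

    # Every FHIR resource is addressable as <base>/<ResourceType>/<id>
    response = requests.get(
        f"{FHIR_BASE}/Patient/123",
        headers={"Accept": "application/fhir+json"},
    )
    response.raise_for_status()

    patient = response.json()
    print(patient["resourceType"])  # prints "Patient"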

Let’s summarize: Healthcare-related companies have a nice set of agreed-upon standards in place to pass data back and forth. Seems simple enough. What could go wrong? 

The Reality of the Situation 

Most healthcare IT environments are deeply rooted in legacy mainframe and on-premises distributed server operations. These legacy technologies do not play nice with the API-based systems that the FHIR standards require. The overall complexity of building a secure and reliable data pipeline is also amplified by the adoption of cloud technologies within healthcare.  

As a result, IT teams are searching for a solution that can ingest data from on-premises mainframes, legacy servers, and cloud service providers, and then share the transformed data with a FHIR server.

Why Not Just Create an API for Your On-Premises Systems?

Some have tried solving this issue by building APIs for their mainframes or distributed servers, but writing APIs for legacy systems is a daunting task. The effort is further complicated by the number of individual facilities that often need to make their data available, quickly becoming unmanageable and unscalable.

From a technical standpoint, hand-built APIs typically lack visibility and analytics. When something breaks in the API code, identifying and troubleshooting the issue promptly is time-consuming and extremely difficult. This approach inadvertently puts the flow of patient data, and therefore patient care, at risk.

Because many healthcare payers and providers operate in on-premises environments, data transferability has been a valid interoperability concern from the start. The data recency requirement, known as the "24-hour rule," makes the situation even more complex. On-premises automation and modernization tools often cannot respond to API calls in a federated capacity. As a result, many healthcare entities struggle to meet one of the core interoperability requirements: IT teams simply lack the resources to get data from legacy systems (with no robust APIs and no real-time capabilities) into a FHIR repository.


How To Solve Interoperability in Healthcare with a SOAP

Service Orchestration and Automation Platforms (SOAPs), a category coined by Gartner in April 2020, evolved from traditional workload automation solutions. Today, SOAPs help Infrastructure and Operations (I&O) teams by providing automation as a service. They rose to prominence with the general move to the cloud, as organizations needed automation orchestrated across both on-premises and cloud environments. Data pipeline orchestration is one of the primary solutions built into SOAPs; it includes the ability to integrate with any third-party tool, built-in managed file transfer (MFT), and an intuitive graphical workflow designer.

Healthcare organizations have turned to SOAPs to bridge the gap between their on-prem legacy systems and the various cloud solutions they employ. A modern SOAP helps organizations comply with the FHIR standards requirements. SOAPs connect automation across siloed applications and platforms by ensuring the following:

  • Data moves to the right place
  • Data is synchronized between environments
  • Data is moved at the right time, whether in batch or in real-time
  • Data is secure during its transfer 

It is important to note that SOAPs do not replace existing big data tools like ETL platforms and data warehouses. Instead, they integrate with the tools used at each stage of the data pipeline. The SOAP then serves as the master orchestrator of the automated IT processes that push data through the pipeline in real time.
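
As a rough illustration of that division of labor, consider the following Python sketch. Every function here is a hypothetical stand-in for an existing tool in the pipeline, not a real product API; the SOAP's contribution is the sequencing, retry, and alerting logic wrapped around them.

    # Illustrative sketch only: each function stands in for an existing
    # tool in the pipeline; the orchestrator adds sequencing and retries.
    import time

    def trigger_mainframe_extract() -> str:
        # Stand-in for kicking off an existing mainframe batch extract.
        return "/staging/members_extract.csv"

    def run_etl_transform(extract_path: str) -> list[dict]:
        # Stand-in for an existing ETL tool mapping legacy records to FHIR.
        return [{"resourceType": "Patient", "id": "example"}]

    def load_to_fhir_server(resources: list[dict]) -> None:
        # Stand-in for pushing the transformed resources to a FHIR server.
        print(f"Loaded {len(resources)} FHIR resources")

    def run_pipeline(max_retries: int = 3) -> None:
        # The orchestration layer: sequence the steps, back off and retry
        # on failure, and raise an alert when retries are exhausted.
        for attempt in range(1, max_retries + 1):
            try:
                load_to_fhir_server(run_etl_transform(trigger_mainframe_extract()))
                return
            except Exception:
                time.sleep(2 ** attempt)
        raise RuntimeError("Pipeline failed after retries; alert the on-call team")

    run_pipeline()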

Data Pipeline Orchestration with Containers

Modern SOAPs include managed file transfer (MFT) capabilities that integrate with container platforms (like Kubernetes), bringing an application-agnostic approach to connecting various healthcare data sources. Combined with the workload scheduling technology inherent in a SOAP, this solves the problem of connecting applications deployed in containers to data sources that reside outside of them.

This is accomplished by deploying small "agents" within a containerized application, allowing the cloud-based application to communicate with another agent deployed alongside an on-premises database. To bring this back to FHIR and interoperability, imagine a setup with one agent sitting on a mainframe and another on a FHIR server. Data can now move back and forth between the two environments in real time or on a set schedule, with no API development required. Plus, this setup offers far more in terms of observability, management, and control.
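
Below is a hedged Python sketch of what such a transfer might look like on the FHIR side. The server URL and the legacy record's field names are placeholders; a real agent would also handle authentication, batching, and error routing.

    # Sketch: map one legacy extract record to a FHIR Patient resource and
    # create it on the server. URL and field names are placeholders.
    import requests

    FHIR_BASE = "https://fhir.example.org/r4"

    def publish_patient(record: dict) -> str:
        patient = {
            "resourceType": "Patient",
            "identifier": [{"value": record["member_id"]}],
            "name": [{
                "family": record["last_name"],
                "given": [record["first_name"]],
            }],
        }
        resp = requests.post(
            f"{FHIR_BASE}/Patient",
            json=patient,
            headers={"Content-Type": "application/fhir+json"},
        )
        resp.raise_for_status()
        # Most FHIR servers echo the created resource back with its new id.
        return resp.json()["id"]

    new_id = publish_patient(
        {"member_id": "A1001", "last_name": "Doe", "first_name": "Jane"}
    )
    print(f"Created Patient/{new_id}")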

What to look for in a solution:

  • End-to-end orchestration and automation of data pipelines
  • Solution designed to integrate legacy on-prem systems (including the mainframe) with one or more FHIR servers
  • DataOps orchestration that lets end users leverage standard lifecycle methodologies (dev/test/prod), including versioning for end-to-end orchestration
  • Observability into all data transfer processes, including audit trails and logs
  • Real-time monitoring of services and workloads, with alerting
  • Ability to move data back and forth between containers and legacy on-prem systems
  • User-friendly, low-code/no-code workflow designer that offers a unified view to design and orchestrate event-driven workflows across a hybrid IT environment

Summary

In response to the 21st Century Cures Act, healthcare organizations are building toward exchanging FHIR resources. As a result, the value proposition of a centralized data pipeline orchestration solution grows exponentially. A proper solution will solve the problems associated with passing data back and forth between legacy and modern tools. Are you interested in learning more? Check out the Stonebranch Big Data Orchestration solution.

Start Your Automation Initiative Now

Schedule a Live Demo with a Stonebranch Solution Expert


Further Reading

Introducing UAC 7.7: New Capabilities for Enhanced Automation
Discover the latest updates in Universal Automation Center 7.7, designed to enhance workflow, security, and self-service capabilities in the ever-evolving world…

SOAP Insights from the 2024 Gartner® Magic Quadrant™ Report
Understand the evolving SOAP market, the role of AI and cloud-native architectures, and why Stonebranch is named as a Leader.

Beyond Job Scheduling: The Modern Role of Mainframes in IT Automation
Successful automation in many large businesses requires the IBM mainframe and hybrid IT to work seamlessly together in order to achieve business outcomes.

Overcome Automation Tool Sprawl and Unlock Efficiency with a SOAP
Streamline automation tool sprawl to reduce costs and improve agility: unify siloed tools into a single, integrated solution for your business processes.