
    The Future of Healthcare Interoperability: Navigating the Challenges and Opportunities


    I have been fascinated with the possibilities of interoperability in healthcare since my days running a private practice in my small town in Colorado. I remember the days when Health Information Exchanges (HIEs) were in their infancy. The technology was basic, and the data was not standardized. But I was hooked and marveled at the possibilities that interoperability offered. Since then, I have joined the world of healthcare informatics and am trying to do my part to bring this dream to reality.

    So, what are the possibilities that interoperability offers? Why do we care so much about getting this right? My why always takes me back to two important publications, To Err is Human and Crossing the Quality Chasm.

    The first of the two publications, To Err is Human, was published in 1999. It describes the volume and consequences of medical errors, calls for change, and identifies the standardization of technology as a critical step in improving the quality of care. Notably, it makes several specific recommendations on the use of technology to prevent error. A few highlights:

    Implementing systems that alert clinicians to medication interactions and abnormal laboratory results

    Implementing uniform standards for connectivity, terminology, and data sharing

    Widespread adoption of computer-based patient records and other systems allowing access to patient data without delay

    The ability to aggregate data from large numbers of patients for research and public health imperatives

    The second important publication, Crossing the Quality Chasm, published in 2001, underscored the critical role of data sharing in transforming the healthcare system. It highlighted the need for robust information technology infrastructure, interoperability, patient access to information, and the secure and effective use of data to improve healthcare quality and outcomes. The report emphasized that patients need access to their data to make informed decisions about their care, and that data captured in the care-delivery system can help improve the entire system, including important public health programs.

    The path to interoperability – a historical perspective

    Before we talk about the current state of interoperability, let’s take a step back in time and review a little history. Follow this link to learn more: https://www.healthit.gov/20years/.

    Following the publication of To Err is Human and Crossing the Quality Chasm, a series of regulations established the foundation for healthcare interoperability, starting with the 2003 Medicare Modernization Act. Then, in 2004, President George W. Bush created the position of National Coordinator for Health Information Technology, charged with building a nationwide health IT infrastructure to improve care quality, reduce costs and errors, and protect patient data.

    Congress passed the HITECH Act in 2009, which sought to accelerate interoperability by introducing the Meaningful Use program, incentivizing the use of certified EHR technology.

    In 2016, the 21st Century Cures Act marked a significant legislative milestone, modernizing the Health IT and data-sharing infrastructure established through Meaningful Use. Additionally, it mandated the creation of a Nationwide Health Information Network via the Trusted Exchange Framework and Common Agreement (TEFCA), which would set technical standards and data-sharing agreements necessary for nationwide interoperability.

    A series of rules has flowed from the 21st Century Cures Act. Each rule advances the technical standards, minimum data set, and information blocking provisions that are propelling us forward.

    TEFCA is now live, and data is flowing.

    The current state of interoperability – three points of view

    The government view

    According to HealthIT.gov, the statistics look really encouraging. As a country, we have moved from just over 20% of non-federal acute care hospitals sending, receiving, and integrating summary of care records and searching and querying for any health information in 2014 to 70% in 2023. Pretty specific, right? The same statistic for office-based physicians is lower, starting at just under 10% in 2015 and reaching only 16% in 2021. The good news is that in 2021, 53% of office-based physicians were receiving information and 49% could find the information they needed when they asked for it. More detail can be found in the ASTP data briefs for 2024: https://www.healthit.gov/data/databriefs.

    The physician view

    Well, from the physician point of view, those statistics from 2021 haven’t improved all that much by 2024. In fact, according to an article published in JAMA Network Open, only about 70% of primary care physicians are satisfied with their ability to access outside information, and sadly, only 28% of those physicians said it was easy to access that information. In some cases, physicians said they had too much information and it was difficult to find what they really needed; in other cases, the information they needed simply was not accessible. My own personal experience, and likely yours, tells me that those statistics are real. In a recent conversation with a friend, I learned of a situation in which her family member was receiving care in a local, rural emergency room. Because the emergency room could not access previous imaging records that were important for her family member’s care, the decision was made to transfer the patient to the location of the previous records. I was truly flabbergasted that in 2024, the best way to connect the patient to their data was to move the patient to the data rather than moving the data to the patient.

    The patient view

    According to the experts, more and more patients are accessing their data through patient and member portals. According to an article posted on the ASTP website, 79% of patients have been offered a patient or member portal, and 57% have accessed information through one. The experience is generally good. Speaking from personal experience and the experience of my friends and colleagues, not only do these portals make it easier to access information, schedule appointments, and refill prescriptions, they have been immensely helpful in assisting family members with their medical care and decision making. This all sounds great. Not so great is the lived experience of having multiple portals, each with different login credentials, levels of information availability, and user experiences. Some are available via an app on a smartphone or other device, and some only via a website. Third-party apps are available that can connect the information from all (or most) of these portals, but they have not yet been widely adopted, and most do not fall under HIPAA protection. All of this doesn’t even consider health equity and the disparities in access to information that individuals face based on sociodemographic factors.

    Challenges to achieving true semantic interoperability

    As indicated above, while we have made great progress, we cannot yet declare victory and proclaim that data is flowing seamlessly throughout the healthcare system, accessible and easily understood by every party that needs it, and protected from those that do not.

    Data quality and privacy remain significant barriers to achieving the dream of semantic interoperability. Until we can all agree on the parameters that define data quality, and agree to capture and transmit data in a standard, agreed-upon way, we will not achieve the goals of the Triple Aim: higher-quality healthcare, at lower cost, with less burden on clinicians.

    Data quality can be defined in many ways. I define it as data that is complete, accurate, consistent, valid, and relevant. There are, of course, other characteristics of high-quality data, but if we can focus on these five areas, we will have achieved an impressive leap in the quality and value of the data being shared across the healthcare system today. One important step toward this level of data quality is agreeing on and using valid, up-to-date codes to standardize the data. Ideally, the data is codified to standards such as RxNORM for medications, LOINC for observations, and SNOMED CT for problems. The earlier in the data life cycle the data is codified, the more accurate, valid, consistent, and relevant it becomes. In other words, data should be initially captured using the correct terminology standard for the type of information it represents. Since that is not always possible, it is important to map data to these standards before using it in downstream applications and processes.
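    To make that concrete, here is a minimal sketch in Python of what mapping to standard terminologies can look like. The local codes, the mapping table, and the helper function are all hypothetical; a production system would query a terminology service rather than a hard-coded dictionary, but the principle of codifying early and flagging unmapped data is the same.

        # Hypothetical example: normalizing locally coded data to standard
        # terminologies before it flows downstream. Local codes and the
        # mapping table are illustrative only.
        LOCAL_TO_STANDARD = {
            ("lab", "GLU-01"): ("http://loinc.org", "2345-7",
                                "Glucose [Mass/volume] in Serum or Plasma"),
            ("med", "LISIN10"): ("http://www.nlm.nih.gov/research/umls/rxnorm",
                                 "314076", "lisinopril 10 MG Oral Tablet"),
            ("problem", "HTN"): ("http://snomed.info/sct", "38341003",
                                 "Hypertensive disorder, systemic arterial"),
        }

        def codify(domain: str, local_code: str) -> dict:
            """Return a FHIR-style Coding for a local code, or flag it for review."""
            entry = LOCAL_TO_STANDARD.get((domain, local_code))
            if entry is None:
                # Unmapped codes are surfaced rather than silently passed
                # through, preserving completeness and validity downstream.
                return {"system": None, "code": local_code, "needsReview": True}
            system, code, display = entry
            return {"system": system, "code": code, "display": display}

        print(codify("lab", "GLU-01"))   # mapped to LOINC
        print(codify("med", "XYZ-99"))   # unmapped, flagged for review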

    Now let’s talk about data privacy. As the industry struggles with sharing relevant and meaningful data between HIPAA-covered entities, with the patient’s consent and in accordance with applicable regulations and organizational policies, there is a tug of war between appropriate sharing and the application of necessary safeguards. In the U.S., we have a patchwork of state laws governing consent and data privacy that are more restrictive than HIPAA. These laws cover important questions: When can a minor consent to care? What type of care can they consent to? Who has access to information, and under what circumstances? Some states, such as California, have placed specific restrictions on the sharing of certain reproductive healthcare procedures in the context of legal action by another state where the care rendered in California might not be legal in the state bringing the action. Contrast this with the need to stay compliant with information blocking rules as codified in regulatory documents such as the HTI-1 rule. These are real challenges that every healthcare organization and data aggregator needs to be aware of and compliant with.

    The future of interoperability

    Yes, there are challenges, but there is also much to be excited about with all that is happening today to prepare the US for a truly interoperable future.

    Three things I am particularly excited about, and that I believe will bring us closer to true interoperability, are: the expansion of the role of the ONC (Office of the National Coordinator for Health IT) to become the Assistant Secretary for Technology Policy (ASTP), the increasing ability to segment data for privacy and security, and the federal government’s focus on increasing interoperability throughout the Department of Health and Human Services, especially as it pertains to public health.

    Let’s look at the expanding role of the ONC, which brings data, technology, and artificial intelligence under the purview of the National Coordinator’s office, now known as the ASTP. By bringing data, artificial intelligence, and technology into the same organization, HHS is making a bold statement about its intention to coordinate and streamline data and technology across the various Health and Human Services (HHS) organizations. This move ties back to priority 2 of the HHS data strategy released in 2023: foster data sharing. Consider this quote from the 2023 data strategy: “However, more and easier data sharing could be unlocked through streamlined processes and the support to navigate complex requirements. Investment in foundational data sharing infrastructure will enable the Department to fully realize the potential of data sharing to further HHS’ mission while observing critical protections.”

    As we continue to move the cause of interoperability forward through rulemaking, the launch of the Trusted Exchange Framework and Common Agreement (TEFCA), and a more focused approach within HHS, it is essential to ensure that data can be protected. Recent advancements in data segmentation and the adoption of the FHIR standard mean that stakeholders can segment and tag sensitive data so that it is shared only when necessary for improved care, and not when it might cause harm.
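    As an illustration, here is a hedged Python sketch of how FHIR’s meta.security labels can carry segmentation decisions. The HL7 confidentiality, sensitivity, and purpose-of-use vocabularies are real; the resource content and the sharing policy are invented for the example.

        # Illustrative only: a FHIR Observation tagged as restricted using
        # standard HL7 security labels, plus a simple policy filter that
        # withholds restricted resources outside of direct treatment.
        observation = {
            "resourceType": "Observation",
            "id": "example-bh-obs",  # hypothetical resource
            "meta": {
                "security": [
                    # Confidentiality label: R = restricted
                    {"system": "http://terminology.hl7.org/CodeSystem/v3-Confidentiality",
                     "code": "R"},
                    # Sensitivity label: PSY = psychiatry-related information
                    {"system": "http://terminology.hl7.org/CodeSystem/v3-ActCode",
                     "code": "PSY"},
                ]
            },
            "status": "final",
            "code": {"text": "Behavioral health assessment"},
        }

        def shareable(resource: dict, purpose_of_use: str) -> bool:
            """Example policy: share restricted resources only for treatment."""
            labels = {tag.get("code")
                      for tag in resource.get("meta", {}).get("security", [])}
            if "R" in labels:
                return purpose_of_use == "TREAT"  # HL7 purpose-of-use code
            return True

        print(shareable(observation, "TREAT"))   # True: flows for treatment
        print(shareable(observation, "HMARKT"))  # False: withheld for marketing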

    Finally, the latest proposed rule from the ONC, now ASTP, is a clear statement that the agency intends to support the CDC in its data modernization efforts, with significant advancements in certification criteria for public health. It goes so far as to propose that health IT used in the public health space be certified to accept the data being sent to it. The seven new criteria for public health IT mirror the certification requirements (in place and proposed) for certified vendors of health IT: public health systems would need to be able to receive, validate, parse, and filter the data sent by certified health IT systems. Combine these requirements with the bombshell release of the draft amendment to the Health and Human Services Acquisition Regulation, which would require any health IT acquired by any HHS agency to be certified to ONC standards or, where certification criteria are not available, to meet the latest standards for data sharing, and you have a clear message: the federal government is modernizing its infrastructure and bolstering its ability to respond to public health emergencies, Use Case B in the Data Strategy.
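    For the technically curious, here is a minimal Python sketch of what a receive, validate, parse, and filter pipeline might look like. The structural checks and the reportable-condition list are invented placeholders, not the actual certification tests.

        # Hypothetical sketch of a receive -> validate -> parse -> filter
        # pipeline for incoming public health data. Checks are placeholders.
        import json

        REPORTABLE_CODES = {"840539006"}  # SNOMED CT code for COVID-19

        def receive(raw: bytes) -> dict:
            """Parse an incoming JSON payload; malformed input raises an error."""
            return json.loads(raw)

        def validate(message: dict) -> bool:
            """Minimal structural validation: a FHIR Bundle with entries."""
            return message.get("resourceType") == "Bundle" and "entry" in message

        def filter_reportable(message: dict) -> list:
            """Keep only Condition resources coded as reportable."""
            kept = []
            for entry in message.get("entry", []):
                resource = entry.get("resource", {})
                if resource.get("resourceType") != "Condition":
                    continue
                codes = {c.get("code") for c in
                         resource.get("code", {}).get("coding", [])}
                if codes & REPORTABLE_CODES:
                    kept.append(resource)
            return kept

        payload = json.dumps({
            "resourceType": "Bundle",
            "entry": [{"resource": {
                "resourceType": "Condition",
                "code": {"coding": [{"system": "http://snomed.info/sct",
                                     "code": "840539006"}]}}}],
        }).encode()

        message = receive(payload)
        if validate(message):
            print(filter_reportable(message))  # prints the reportable Condition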

    Truly, there is much to be excited about in the world of healthcare interoperability. We have come a long way since the days of Meaningful Use, and I see a bright future ahead, even if that future is a few years out. The four recommendations from To Err is Human are now implemented to some degree in most healthcare systems in the US. One question remains: is all this progress helping to reduce errors in the healthcare system overall? Stay tuned; I just might write another article exploring that question.

    References
    Institute of Medicine (US) Committee on Quality of Health Care in America. To Err is Human: Building a Safer Health System. Kohn LT, Corrigan JM, Donaldson MS, editors. Washington (DC): National Academies Press (US); 2000. PMID: 25077248.
    Institute of Medicine (US) Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington (DC): National Academies Press (US); 2001. PMID: 25057539.



    Cheryl Mason
    Cheryl L. Mason is a distinguished leader in healthcare informatics, with over 30 years of experience in both clinical and vendor settings. Since joining Health Language in 2013, she has held pivotal roles, including leading the informatics team for six years and currently serving as Product Manager of Terminology. Cheryl’s deep expertise in healthcare terminologies and interoperability standards, such as LOINC, RxNORM, SNOMED, and ICD-10, has been instrumental in driving the adoption of these standards to enhance data accuracy and accessibility across diverse healthcare systems. Her strategic leadership has led to significant advancements in clinical content development for Natural Language Processing (NLP) technologies, supporting healthcare organizations in managing complex data with precision. Cheryl is recognized not only for her technical acumen but also for her collaborative leadership style, which fosters innovation and continuous improvement within her teams. A prolific author and speaker, Cheryl contributes extensively to industry knowledge through presentations at national conferences such as AHIMA and AMIA, and through publications on data normalization and quality metrics. She holds a Master of Science in Health Informatics (MSHI) from Walden University and a Bachelor of Science in Biology from Adams State College, furthering her expertise with specialized courses in SNOMED CT. Actively involved in professional communities like HIMSS and AMIA, Cheryl continues to influence the future of healthcare informatics while currently writing a continuing education course on semantic interoperability for the ONC GETPHIT program. https://leadafi.com/executive-biography/cheryl-l-mason-mshi-leading-healthcare-informatics-innovator/