ONC’s proposed rule promotes interoperability while compounding already critical patient matching problem


The country is boldly moving toward a new era of nationwide interoperability. Earlier this week, the Office of the National Coordinator for Health IT (ONC) proposed new regulations that aim to require health data sharing among care organizations, reduce information blocking, and simplify patient access to their own health data. ONC is expected to further bolster these goals with the release of the Trusted Exchange Framework and Common Agreement (TEFCA), which will provide a technical framework to make nationwide health data sharing a reality.

The new regulations and the eagerly anticipated TEFCA come with a catch. Simply connecting health systems without appropriately addressing the underlying patient matching challenges means there will be an explosion of available data without an appropriate way to access the specific records a provider or patient needs. This will soon be a painful reality for healthcare systems across the country, adding to the strain they already feel from spending significant time and budget addressing duplicate health records in their systems.

The country can no longer afford to overlook one of the most critical issues that can hamper nationwide interoperability: patient matching. Today, enterprises struggle to accurately match patients to their records within the four walls of their hospitals. Regional organizations that facilitate medical record exchange across providers within a geography have even lower patient match rates across enterprises. This challenge will be exacerbated exponentially when more data from more enterprises is exchanged on a national scale.

While the healthcare industry and policy makers alike have brought attention to the challenges around patient matching, the dialogue has been focused on describing the problems and making recommendations around outdated technology and legacy processes to achieve minor improvements. They are ignoring not only the significance of patient matching for enabling nationwide interoperability, but also the innovative technologies that have already been developed, implemented and proven.

Congress, recognizing the need for accurate patient matching, asked the Government Accountability Office (GAO) to study patient matching and identify additional things ONC could do to improve matching. Recently, the GAO released its findings, which, not surprisingly, indicated that accurate patient matching is quite difficult to achieve, even in small geographic areas. The report found that matching efforts remain focused on using algorithms to compare demographic data from each record (name, address, phone, date of birth).

Consequently, most of the work by ONC and the industry to improve matching has focused on incorporating additional data elements into the matching algorithms and on improving the quality of the data itself. These algorithms might be sufficient within a small geography. However, as the country moves toward nationwide exchange, such tuning will become significantly more difficult, since organizations will not be able to adjust for their own geographic and cultural variations.
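To make the limitation concrete, a minimal sketch of the traditional approach follows: a weighted comparison of demographic fields with a match threshold. The records, weights, and threshold are all hypothetical, and real systems tune these parameters per population, which is exactly the local tuning that breaks down at national scale.

```python
from difflib import SequenceMatcher

# Hypothetical patient records; field names and values are illustrative only.
record_a = {"name": "Katherine Smith", "dob": "1985-03-12",
            "phone": "555-0142", "address": "12 Oak St"}
record_b = {"name": "Kathryn Smith", "dob": "1985-03-12",
            "phone": "555-0142", "address": "12 Oak Street"}

# Weights reflect how strongly each field indicates a match; real systems
# tune these (and the threshold below) for their local population.
WEIGHTS = {"name": 0.35, "dob": 0.35, "phone": 0.15, "address": 0.15}

def field_similarity(a, b):
    """Fuzzy string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(r1, r2):
    """Weighted sum of per-field similarities."""
    return sum(w * field_similarity(r1[f], r2[f]) for f, w in WEIGHTS.items())

score = match_score(record_a, record_b)
is_match = score >= 0.85  # the threshold is another locally tuned parameter
```

Note that spelling variants ("Katherine" vs. "Kathryn") and formatting differences ("St" vs. "Street") lower the score, which is why these algorithms need local tuning and degrade when populations and naming conventions vary nationwide.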

These recommendations and guidelines around data standardization and quality represent incremental solutions that might result in short-term, small scale benefits. Incremental progress in response to the monumental shift in data sharing is going to cause massive challenges for the entire industry. We need solutions that represent a disruptive innovation, and we should be looking to proven identity matching technologies used in other industries.

For example, biometrics show promise, but given the growing number of different biometric modalities and the billions of existing health records still identified by demographic data, they will not single-handedly solve patient matching on a national scale. Unique patient identifiers have also been discussed as a potential solution. However, like a Social Security number, these can be stolen and are susceptible to data quality errors.

Referential matching should be viewed as a solid answer, as it’s the only new technology developed around demographic matching. Referential matching uses the power of big data and machine learning to assemble a longitudinal history of an individual’s demographic data attributes, then references this additional data during matching as an “answer key” to match records, even if they contain out-of-date addresses, phone numbers, maiden names, missing Social Security numbers or birthdates, and errors or typos.

Referential matching is not limited by geographical and cultural boundaries like traditional matching algorithms since it doesn’t require algorithms to be tuned at a local level. Each patient is simply matched to her reference identity contained in the highly tuned, national reference database. This is what allows organizations to maintain match rates upwards of 98 percent whether they are matching within the four walls of their enterprise, down the road with a partnered facility, or across a region with another health enterprise.
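The "answer key" idea described above can be sketched as a lookup against a reference database that remembers every demographic variant ever observed for a person. The database, identifiers, and two-attribute corroboration rule below are all simplifying assumptions for illustration, not a vendor's actual implementation.

```python
# A toy reference database: each reference identity aggregates the
# longitudinal history of one person's demographic attributes,
# including out-of-date addresses, old phone numbers, and maiden names.
# All IDs and values are hypothetical.
REFERENCE_DB = {
    "REF-001": {
        "names": {"katherine smith", "kathryn smith", "katherine jones"},
        "addresses": {"12 oak st", "98 elm ave"},   # old and current
        "phones": {"555-0142", "555-0199"},
        "dobs": {"1985-03-12"},
    },
}

def resolve_identity(record):
    """Match an incoming record against reference identities by counting
    how many attributes appear anywhere in each identity's history."""
    best_id, best_hits = None, 0
    for ref_id, identity in REFERENCE_DB.items():
        hits = sum([
            record["name"].lower() in identity["names"],
            record["address"].lower() in identity["addresses"],
            record["phone"] in identity["phones"],
            record["dob"] in identity["dobs"],
        ])
        if hits > best_hits:
            best_id, best_hits = ref_id, hits
    # Require at least two corroborating attributes before asserting a match.
    return best_id if best_hits >= 2 else None

# An old record with a maiden name and an out-of-date address still
# resolves, because the reference identity remembers historical values.
old_record = {"name": "Katherine Jones", "dob": "1985-03-12",
              "phone": "555-0199", "address": "12 Oak St"}
```

Because the record is matched against a person's full history rather than against another single snapshot, there is no local threshold to tune, which is the property that lets match rates hold up across enterprises and regions.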

Referential matching can also act as an identity clearinghouse, securely storing not only the demographic data about a patient used for matching purposes, but also local, unique identifiers including medical record numbers or state-issued unique patient identifiers, biometric markers, and/or other “real identifiers” used across health providers.
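The clearinghouse role amounts to a cross-reference table: one stable reference identity linked to each facility's local identifiers. The facility names and identifiers below are made up to show the shape of the structure.

```python
# A sketch of the identity-clearinghouse idea: one reference identity
# keyed by a stable reference ID, cross-linked to each organization's
# local identifiers. All identifiers are hypothetical.
CROSS_REFERENCE = {
    "REF-001": {
        "hospital_a_mrn": "A-482913",    # medical record number at site A
        "hospital_b_mrn": "B-77120",     # medical record number at site B
        "state_upi": "TX-0001-9385",     # a state-issued unique identifier
    },
}

def local_id_for(ref_id, facility):
    """Translate a reference identity into a facility's local identifier,
    so records can be fetched from that facility's own system."""
    return CROSS_REFERENCE.get(ref_id, {}).get(facility)
```

In this arrangement, resolving a patient once against the reference database yields the keys needed to retrieve that patient's records from every participating provider.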

Referential matching works – at the local, state and national level. Organizations using this technology are already reaping benefits, and it’s time to test this technology on a truly national level, preferably before the onslaught of data sharing the new ONC proposed rule will spawn.

One avenue is to test referential matching at scale, working with national networks such as CommonWell and the eHealth Exchange. As an alternative, the technology could be piloted with the Department of Defense (DoD) and Veterans Affairs (VA) as part of the Cerner implementation, demonstrating the ability to resolve identities at a massive scale. Regardless of the specific test model, we need to move quickly, because a tsunami of incorrect data is heading our way unless we can collectively address this problem.

We must stop talking about the problem and congratulating ourselves for minor improvements that won’t scale. Instead, we should invest in new and innovative solutions to address patient matching right now. Referential matching should be included as one of those innovations that can support nationwide interoperability. The future of the healthcare industry depends on it.

Photo Credit: DrAfter123, Getty Images

