Dr. Don Rucker, the National Coordinator for Health IT from 2017 to 2021, said he sees the current proposed structure of the Trusted Exchange Framework and Common Agreement – with its reliance on brokered protocols and a 1990s-style, page-view document architecture – as a costly impediment to modern healthcare computing.
Under the proposed interoperability rule, electronic health record vendors and others offering healthcare IT will be required to support Fast Healthcare Interoperability Resources (FHIR).
“If we want to be in the business of the country getting value in healthcare, somehow we have to combine the nature of the care with what we pay for that care,” Rucker said during a HIMSS23 session on Bulk FHIR that he co-presented.
“And pre-FHIR, that has not been possible.”
Rucker sat down with Healthcare IT News to discuss what needs to happen in terms of a national healthcare interoperability strategy. Read part one of that interview below and look for part two tomorrow.
Q. What have you been doing in the two years since you left the Office of the National Coordinator for Health IT? Talk a bit about your role at 1upHealth and the work the company is doing to advance interoperability.
A. It’s really this modern vision of computing.
And when you think about healthcare computing, we have a wonderful conjunction now of really I think three things.
So first, through a lot of work, we have lots of electronic healthcare data. Pretty much everybody has an electronic medical record, and I like to think ONC and my predecessors had a role in that.
And second, we have a really elegant underlying networking approach to thinking about data in the cloud.
And then finally, we have something that’s maybe a little bit more subtle, but you certainly notice when it’s missing. We have an elegant data representation standard in FHIR, which the industry has spent 40 years coming to.
I think that combination of lots of data, highly networked platforms that not just share the data but allow you to compute – so cloud – and then having a data standard that you can compute with is dynamite.
And at 1upHealth, that’s sort of what we do – helping payers, who are under all kinds of pressure to combine claims and clinical data, as well as providing a platform for digital health startups.
Q. During your tenure at ONC, you mentioned the challenge of policymakers solving for multiple interoperability challenges at once – the tech stack, data exchange standards, consensus around semantics and meaning, and the business case, all with privacy and security built in. How do you think we’ve done meeting these complex and overlapping challenges in the five years since you said that?
A. I think the Cures Act, the original Cures Act concept of APIs without special effort – the modern standards-based, not vendor-driven APIs – I think is spot on.
As a country, we were extremely lucky that this was not something we had to invent.
It was sitting there in RESTful APIs, OAuth2 for privacy and FHIR. It was there for the taking.
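To make the stack Rucker names concrete, here is a minimal sketch of what "RESTful APIs plus OAuth2 plus FHIR" looks like in practice: building an authorized FHIR Patient read request. The server URL and token are hypothetical placeholders, not a real endpoint or credential, and the request is only constructed, not sent.

```python
# Sketch of a standards-based FHIR read over a RESTful API, authorized with
# an OAuth2 bearer token. FHIR_BASE and ACCESS_TOKEN are made-up examples.
import urllib.request

FHIR_BASE = "https://fhir.example.org/r4"  # hypothetical FHIR R4 server
ACCESS_TOKEN = "example-oauth2-token"      # would come from an OAuth2 flow

def build_patient_read(patient_id):
    """Build (but do not send) an authorized FHIR Patient read request."""
    return urllib.request.Request(
        url=f"{FHIR_BASE}/Patient/{patient_id}",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/fhir+json",
        },
    )

req = build_patient_read("123")
print(req.full_url)  # https://fhir.example.org/r4/Patient/123
```

The point of "without special effort" is that every piece here – the HTTP verb, the bearer token, the resource URL shape, the JSON media type – is an open standard rather than a vendor-specific interface.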
So that combination of a law that went with a technical solution that was precisely what was needed, that’s pretty rare, right? That’s a little bit like a solar eclipse to my mind.
That doesn’t come along all that frequently, and it’s 100% bipartisan.
Q. What needs to happen in terms of national strategy going forward? What are some policies that maybe aren’t in place that you’d like to see – and perhaps vice versa?
A. Well, I think one thing that is very hard for government is maybe to let things go as opposed to putting more fixes on. Patience and government – they don’t always go together.
I think with FHIR, OAuth and RESTful [APIs], we actually have the tech stack we need.
So, what would be helpful? I think there’s two classes of things. One is, what additional technical things would be helpful? And then, what additional policy things – which in healthcare is largely payment – would be helpful?
On the technical side, I think one thing that is also in the Cures Act that I guess CMS is working on, is a national provider directory. We need discoverable endpoints for everybody who’s getting money from CMS. They owe it to the country; we owe it to ourselves.
That should not be run through brokers – [Qualified Health Information Networks] – that should be readily discoverable.
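As an illustration of what "readily discoverable" could mean, here is a hypothetical sketch of a national provider directory: a public mapping from provider identifiers (NPIs) to FHIR endpoints that anyone can query directly, with no broker in the path. The NPIs, names and URLs are invented for illustration.

```python
# Hypothetical national provider directory: NPI -> discoverable FHIR endpoint.
# All identifiers and URLs below are made up for illustration.
PROVIDER_DIRECTORY = {
    "1234567890": {"name": "Example Clinic",
                   "fhir_base": "https://fhir.example-clinic.org/r4"},
    "9876543210": {"name": "Sample Hospital",
                   "fhir_base": "https://fhir.sample-hospital.org/r4"},
}

def discover_endpoint(npi):
    """Look up a provider's FHIR base URL directly; None if not listed."""
    entry = PROVIDER_DIRECTORY.get(npi)
    return entry["fhir_base"] if entry else None

print(discover_endpoint("1234567890"))  # https://fhir.example-clinic.org/r4
```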
That’s probably the single biggest technical thing. ONC has done some good work on endpoint directories in the certification program.
From a near-term policy point of view, obviously we have the Cures Act’s rulemaking on APIs.
That’s very powerful. That’s in place.
CMS has extended it to the payer and provider APIs in the proposed interoperability rule that it posted for comment a couple of months ago. That, I think, is going to be Ground Zero for discussions between payers and providers on things like quality measurement, prior authorization and network design.
It’ll be probably the most used part initially of the CMS family of APIs.
We have to think about prevention. We have to put in place long-term incentives in healthcare that go over years, if not decades. And we need to really figure out as a country: How do we really get prevention into the game?
Q. What are your thoughts on TEFCA and where it’s headed? What benefits could it have for interoperability – and what are some of its limitations, in your view? Is it a net positive? Or could it actually impede wider data exchange?
A. We need lots of computing. Figuring out appropriate care and prior auth involves vast computing challenges. So you want platforms that can do that. You need to compute on individual data fields.
That’s the advance of the Internet over the last 20 years. I think the concern with TEFCA is it focuses on maybe the classic interoperability case, which is, well back in the day it was:
“I can’t get my paper chart.”
Now of course we solve that with a fax. So TEFCA is sort of the update to faxes when you get right down to it. But it’s a document-centric type of approach.
It’s a real use case, and there’s no doubt that TEFCA addresses it. It is just so narrowly focused – when you compare what we could do today to what we could do 20-25 years ago – that I think we need to think more broadly.
We need to be able to not just get the record that you got some procedure for your heart, we need to be able to monitor your blood pressure, your blood sugar, your cholesterol, your lipids and microadjust your medicines on a real-time basis.
That’s what we need. Not that there was a document generated three years ago on the last heroic treatment of your heart disease.
We need computing to prevent the next heart attack, and frankly, the next expensive procedure.
So, how do we get that computing into healthcare? It’s not going to happen with documents.
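The field-level computing Rucker contrasts with document exchange can be sketched as follows: scanning FHIR-style Observation resources for recent systolic blood pressure readings and flagging when a medication review might be warranted. The resources, the use of the LOINC code for systolic pressure, and the threshold are illustrative assumptions, not clinical guidance.

```python
# Sketch: computing on individual FHIR data fields rather than documents.
# Observations, LOINC usage and the 140 mm[Hg] threshold are illustrative.
SYSTOLIC_LOINC = "8480-6"  # LOINC code for systolic blood pressure

observations = [  # simplified FHIR Observation resources (hypothetical data)
    {"resourceType": "Observation",
     "code": {"coding": [{"system": "http://loinc.org", "code": "8480-6"}]},
     "effectiveDateTime": "2023-04-01",
     "valueQuantity": {"value": 148, "unit": "mm[Hg]"}},
    {"resourceType": "Observation",
     "code": {"coding": [{"system": "http://loinc.org", "code": "8480-6"}]},
     "effectiveDateTime": "2023-04-08",
     "valueQuantity": {"value": 152, "unit": "mm[Hg]"}},
]

def latest_systolic(obs_list):
    """Return the most recent systolic value, ordered by effectiveDateTime."""
    systolic = [o for o in obs_list
                if any(c["code"] == SYSTOLIC_LOINC
                       for c in o["code"]["coding"])]
    systolic.sort(key=lambda o: o["effectiveDateTime"])
    return systolic[-1]["valueQuantity"]["value"] if systolic else None

reading = latest_systolic(observations)
needs_review = reading is not None and reading >= 140  # illustrative cutoff
print(reading, needs_review)  # 152 True
```

A document-centric exchange would hand a clinician a three-year-old summary; field-level queries like this are what make real-time monitoring and micro-adjustment of medicines computable at all.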
Andrea Fox is senior editor of Healthcare IT News.
Email: [email protected]
Healthcare IT News is a HIMSS Media publication.