There is an excellent discussion on the DHN at the moment, kicked off by a rumour that England might go and buy Epic nationally. This has flowed into contemplation of the fundamental flaws of the whole EHR approach, and where we actually need to go in future. Fascinating reading.
We are in grave danger of having a stagnant secondary care EHR market
In the primary care arena in the UK, there has been little progress, development, or innovation in EPRs over the last decade or two. This follows a period of rapid digitalisation, in which primary care really led the field.
The same looks to be happening in the US with secondary care EHRs. Read this Forbes article (and part 2 of 2) for more info:
A platform approach is a clear way forward
Much like the astronomical success of iOS and Android, a platform enables third-party innovation to add extensive value to the product.
There is an upcoming battle between open platforms vs closed platforms in the EHR space.
openEHR is the obvious open contender. The closed platforms are likely to be evolutions of the current large EPR incumbents (Epic, Cerner, Allscripts, InterSystems).
There will always be friction at the boundaries of platforms. But that is preferable to boundaries between every single application.
Databases might not be the best way forward
Enter the file-based clinical record. I'm not sure exactly what that is, but it represents a major shift from an organisation-centric approach to a person-centric approach at the data level. This video by NHSX outlines it (over an hour):
It's really been a battle for control all along. FHIR turns out to be virtually unapproachable except through one of the big companies bearing gifts. Arguably the standards effort has had a cynical core from the start... look at the nonsense of v3 and how that crippled the CDA... openEHR itself is questionably open by some accounts...
Ignoring the merits of any particular EHR - the change management challenge inherent in any nationwide implementation of a single system in a country of England's size renders it a practical impossibility.
Have to disagree about the "battle for control" in relation to standards. HL7 FHIR is an open, free-to-use standard - with an associated large worldwide community - that implementers understand and can work with, thus lowering the bar of entry and making interoperability easier and cheaper to achieve. It's not perfect - but, from an implementation perspective, to say that it's unapproachable couldn't be further from the truth - hence its widespread adoption.
The model-first approach of both HL7 v3 and openEHR hasn't proved popular with implementers - those who actually produce the software in use. Those standards certainly have some strengths - but FHIR has proved far more successful as it's based on generic web standards.
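To illustrate why "based on generic web standards" lowers the bar for implementers: a FHIR resource is just JSON (or XML), so ordinary standard-library tooling is enough to consume one. A minimal sketch - the resource values below are invented for illustration, though `resourceType`, `name`, and `birthDate` are standard FHIR Patient elements:

```python
import json

# A minimal FHIR R4 Patient resource, expressed as plain JSON.
# The values (id, names, date) are invented for illustration.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Smith", "given": ["Jane"]}],
  "birthDate": "1970-01-01"
}
"""

patient = json.loads(patient_json)

# No special middleware or code generation needed - ordinary JSON
# handling is enough to read a FHIR resource.
full_name = f'{patient["name"][0]["given"][0]} {patient["name"][0]["family"]}'
print(full_name)                # Jane Smith
print(patient["resourceType"])  # Patient
```

This is arguably the crux of the adoption story: any developer who can handle JSON can start working with FHIR payloads immediately, with no model-driven toolchain in the way.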
I fully agree. openEHR is seen as a great repository of clinically validated record types and contents - but even when used as a basis for specifying actual health records, it's often just at the design stage - using the Archetype and Template editing tools to define the contents of records. The implementations are then not directly generated from the definitions. e.g. in Australia, NEHTA defined all the MyHealthRecord types using openEHR tools, but, as this was in the pre-FHIR era, they mapped them all onto HL7 CDA & V3 for secure transmission (and only Accenture knows what they did to define storage formats for them - I'll bet there's a bunch of hand-edited SQL tables underneath).
openEHR's problem was that it was ahead of its time (or within its time - the 1990s), and envisaged CORBA middleware to do records distribution and mapping to programming languages and data formats... but when IBM, Microsoft and Sun/Oracle chose to disrupt CORBA with that (never actually interoperable) WS-* nonsense, there was no obvious way of transporting openEHR records around in a secure and interoperable way. Only when the Web Services crowd came back to basics and adopted HTTPS REST + [insert authentication protocol here] did we have a chance of non-middleware-based, Web-interoperable communication again.
FHIR launched off that basis, and because the primaries behind it insisted to HL7 that the standards be published openly and validated by implementation, we got a chance at mass implementation and popular adoption. You should have seen how hard it was to hire a Computer Science graduate who had any idea of HL7 standards implementation in the 1990s & 2000s... Closed standards make for closed shops and few choices of locked-in vendors. These days graduates know what FHIR is, and if not, can learn fast thanks to its use of REST, XML & JSON, which they're all competent in.
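The "HTTPS REST plus a standard authentication header" pattern described above can be sketched with nothing but the standard library - a FHIR read is an HTTP GET on a typed resource path, with auth carried in an ordinary header. The endpoint URL and token here are placeholders, not a real server:

```python
import urllib.request

# Hypothetical FHIR endpoint and bearer token - placeholders only,
# not a real server or credential.
base_url = "https://example.org/fhir"
token = "PLACEHOLDER_TOKEN"

# A FHIR read is just an HTTP GET on a typed resource path; the
# authentication protocol plugs in via a standard HTTP header.
req = urllib.request.Request(
    f"{base_url}/Patient/example",
    headers={
        "Accept": "application/fhir+json",
        "Authorization": f"Bearer {token}",
    },
)

# The request is only constructed here, not sent.
print(req.full_url)                     # https://example.org/fhir/Patient/example
print(req.get_header("Authorization"))  # Bearer PLACEHOLDER_TOKEN
```

Contrast this with CORBA or WS-*: there is no IDL compiler, no generated stubs, no SOAP envelope - just a URL, a verb, and a header, which is exactly why graduates can pick it up quickly.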
Hi Keith, I was involved in some of the first HL7 v2 implementations in NZ back in 1993. A case of joining HL7 International to get the specs - which were only available to members in those days - and learning on the job! I had plenty of IT experience, but none in healthcare when I came to NZ in 1992. Luckily, the EHR vendor that I worked for did a lot of co-design with clinicians which compensated for my complete lack of domain knowledge at the time. That certainly taught me the benefits of clinical engagement. I still think that openEHR is a great way of capturing clinical requirements, but am not quite so convinced by some of the stack (e.g. using it at the persistence layer).
Agreed. Persistence is hard to get right... and efficient. I spent some time researching which linked (object-oriented) data to group together when generating an OO-RDBMS translation layer. We used domain specialists (in the construction and telecoms industries in my case) to help us annotate the OO models to indicate what was most likely to be needed all at once, or in quick succession, and generated a DB layer that cached "semantic objects" - groups of data that we could expect to be needed at the same time - to avoid too many fine-grained queries one after the other (especially if the data was remote).
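The "semantic object" idea above - load the group of data likely to be needed together in one coarse fetch, then serve later accesses from cache - can be roughly sketched as follows. All names here (the patient group, its fields, the loader) are invented for illustration, not from any real system:

```python
# Sketch of a "semantic object" cache: related records that are
# usually needed together are fetched as one group and cached as a
# unit, instead of issuing many fine-grained queries.
# All entity names are illustrative only.

query_count = 0

def load_patient_group(patient_id):
    """Stand-in for one coarse-grained DB/remote fetch returning a
    patient together with the data usually needed alongside it."""
    global query_count
    query_count += 1
    return {
        "patient": {"id": patient_id, "name": "Jane Smith"},
        "allergies": ["penicillin"],
        "recent_observations": [{"code": "BP", "value": "120/80"}],
    }

class SemanticObjectCache:
    def __init__(self, loader):
        self._loader = loader
        self._cache = {}

    def get(self, key):
        # One loader call per group; subsequent accesses to any part
        # of the group (patient, allergies, observations) hit the cache.
        if key not in self._cache:
            self._cache[key] = self._loader(key)
        return self._cache[key]

cache = SemanticObjectCache(load_patient_group)
group = cache.get("p1")
allergies = cache.get("p1")["allergies"]  # served from cache, no second fetch
print(query_count)  # 1
```

The design choice being illustrated is granularity: the annotations from domain specialists decide what goes *into* a group, and the cache then turns N fine-grained round trips into one, which matters most when the data is remote.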