Tuesday, December 30, 2014
MatchstickTV and FirefoxOS
MatchstickTV is a recent Kickstarter project that takes FirefoxOS and uses it to run an HDMI dongle very similar to Chromecast, but based on open source apps, an open source development platform, and an open source OS. It's a little cheaper as well, but I suspect that is not an important selling feature. On the other hand... because there is nothing in particular to license here, this sort of thing could become a great conference giveaway over time, just like USB sticks used to be.
Honestly, I think a tablet using FirefoxOS with a bunch of onboard educational applications - similar to the original OLPC program, perhaps - would be a really good thing. There is a niche for browser-based mobile, and Mozilla is doing a lot of smart, good things to capture it.
Monday, December 29, 2014
Atomized Integration, IBM Worklight and AngularJS
In general, my guidance has been to use open source mobility frameworks - PhoneGap for cross-platform packaging, Bootstrap for Responsive Web Design, Angular for templating, and some form of OAuth2 for security - at least until the vendor solutions from IBM, Oracle, et al. reach a higher level of maturity. These frameworks are stepping stones.
If you look at this past year's Gartner quadrants for enterprise mobility and cloud, you will see IBM maturing in the MADP space and Oracle maturing in the cloud space... but maturity in both areas is necessary for enterprise mobility to fire on all cylinders.
Worklight does three things really well:
- Simple adaptation on the server-side, using Rhino-based JavaScript adapters.
- Integrating with existing WebSphere and SAM infrastructure.
- Increasing productivity through modularization and emulation.
I have mentioned previously how much I like Worklight's Rhino-based adapters. They are intentionally lightweight, eschewing any sort of SOA reusability. A Worklight adapter does one thing, and it does it well. This can be initially quite pleasant, then very frustrating, and then liberating, as you sort out how much integration you need to do in your client applications. My experience has been that a well designed piece of XSL can convert an XML data source into standards-compliant JSON, and a client-side library service can take it from there.
For instance, consider that I have an XML data source containing a number of patient records. Let's say it is NIEM compliant XML. I could build a client application that can consume NIEM compliant JSON, and then all I would need to do in Worklight is create a very simple boilerplate adapter that transforms the XML into JSON. This is assuming that my server-side data source doesn't already support JSON-flavoured NIEM, which would be even simpler. In other words, if my intent is to take a NIEM compliant data source and build a NIEM compliant mobile application, this is quite straightforward. Server-side Worklight adaptation transforms XML into JSON; client-side Angular data-binding injects JSON into the HTML-based presentation layer, and presto, you have an application.
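As a minimal sketch, here is what that boilerplate adapter procedure might look like. The backend path and XSL file name are hypothetical, but the XSL transformation mechanism is standard Worklight:

```javascript
// Hypothetical Worklight adapter procedure: fetch NIEM XML from a backend
// service and let an XSL transform turn it into NIEM-flavoured JSON.
function getPatients() {
    var input = {
        method : 'get',
        returnedContentType : 'xml',
        path : 'records/patients',            // hypothetical backend path
        transformation : {
            type : 'xslFile',
            xslFile : 'patients-to-json.xsl'  // hypothetical XSL transform
        }
    };
    return WL.Server.invokeHttp(input);
}
```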
Granted, the development process is not that easy, and let's consider now that we have a number of data sources, some of which are NIEM compliant, some of which are HL7 compliant, some of which are based on direct SQL access, and some of which are ad hoc.
When you look at the various Worklight adaptation examples, you might get the idea that RSS is treated preferentially; it isn't, but thinking of these adapters in terms of syndication is still a useful approach.
Throughout the past year, I have been working with HL7 FHIR, a draft standard from HL7 that, among other things, introduces a JSON-based pattern for aggregation and composition - essentially Atom syndication in JSON instead of XML. It turns out that if all of my Worklight adapters create Atom-compliant JSON on the server-side, then I can use a JavaScript Atom library in the client, and it really doesn't matter what format my data sources use. By the time they reach my client application, they are all Atom-based.
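To make that concrete, here is roughly the shape of an Atom-style JSON bundle as FHIR (DSTU1) serializes it; the field values and server URL are illustrative:

```javascript
// Illustrative Atom-style JSON bundle, roughly as FHIR DSTU1 renders it.
var bundle = {
    resourceType : 'Bundle',
    title : 'Patient search results',
    updated : '2014-12-29T10:00:00Z',
    entry : [{
        title : 'Patient Jane Doe',
        id : 'http://example.org/fhir/Patient/1',   // hypothetical server
        updated : '2014-12-28T09:30:00Z',
        content : { resourceType : 'Patient', name : [{ text : 'Jane Doe' }] }
    }]
};
```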
The client-side service that I have written - using Angular for modularization - is responsible for merging multiple Atom streams. Once I have a single Atom stream, data-binding takes place, so that information can be presented. In practice, this can be frustrating because Atom is intended for serialization of information, but an Atom bundle can also contain relative links between entries. This is fundamental to the way HL7 FHIR works, but not NIEM, so I have ended up creating synthetic and essentially schemaless resources as necessary. Ideally, all information could be mapped into Atom-syndicated FHIR resources. Maybe that's a good project for this year.
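In the meantime, a sketch of that merging service, assuming bundles shaped like the one above; the module and service names are my own:

```javascript
// Hypothetical Angular service that merges several Atom-style bundles
// into a single feed, sorted by entry update time.
angular.module('rhizome.services', [])
    .factory('atomMerge', function () {
        return function () {
            var entries = [];
            // arguments is array-like; each argument is one bundle
            angular.forEach(arguments, function (bundle) {
                entries = entries.concat(bundle.entry || []);
            });
            entries.sort(function (a, b) {
                return new Date(b.updated) - new Date(a.updated);
            });
            return { resourceType : 'Bundle', entry : entries };
        };
    });
```

A controller can then inject atomMerge and bind the merged entries directly to the view.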
Adaptation frameworks always run into a problem based around the decision to go lightweight or go modular. I like that Worklight has gone lightweight, but I am frustrated that I can't reuse just a little bit more code between adapters. In particular, I would really like to use a single set of XSL transforms to support multiple adapters. Perhaps there is a way to do this, but for now, I am still forcing myself to prune my adaptation code as much as possible to keep it easy to maintain. If I find myself using the full set of DocBook or DITA transforms in an adapter, it's probably time to rethink my approach.
On the whole, I have enjoyed working with Worklight adapters immensely; I don't think this would be the case if I were not also using Angular or some other JavaScript framework to support development of client-side services. I haven't particularly used the built-in Worklight support for Dojo or jQuery, but I'd go so far as to say that without some sort of hybrid framework support, you will lose much of the productivity that Worklight gives you. After a year, I have reached an understanding that I would not enjoy using a framework like Angular without a platform like Worklight, and I would not enjoy using a platform like Worklight without a framework like Angular.
Unless, of course, the platform was also a framework, which is what approaches like Meteor promise.
Saturday, December 27, 2014
Some Canadian Context for HL7 FHIR
Ongoing CDA projects in Canada are bound to continue as CDA projects, and they will be worth paying attention to as CDA projects in the States start shifting to HL7 FHIR as an implementation standard. The recent message from Infoway is to use the appropriate standard for the work at hand, and I expect this message to percolate on both sides of the border; but what does this really imply? How do you decide? HL7 FHIR is going to be compelling for new business cases that would previously have required a document standard like CDA, as well as for low-risk, local, and greenfield projects.
Worth noting are the four ways that FHIR can be used. As previously discussed, FHIR supports both Messaging and Document use cases; perhaps more importantly, it also supports both REST and Service use cases. In addition, FHIR is in many ways custom-built for the security and transport requirements of mobile use cases, and contains resource definitions that will enable social use cases like circle of care and information provenance. For existing health information systems and applications, as well as new ones, FHIR creates new ways to expose, access, and share information, providing not only tools but also challenges.
Tuesday, December 23, 2014
Yosemite Project and other Chimera
Chimera was also the subject of a presentation by Jeni Tennison, OBE, of the Open Data Institute and W3C TAG, at XMLPrague 2012, entitled "Collisions, Chimera and Consonance in Web Content." In that presentation, she makes a compelling argument that on the web today we are dealing with four different formats: HTML, XML, JSON, and RDF.
In many ways, these formats complement one another. Sometimes, they clash, creating impedance and dissonance, and sometimes they merge, forming weird and wonderful hybrids. Tennison's presentation is really quite remarkable, and well worth watching as each of these formats evolves.
As I have previously mentioned, another set of presentations, from Dataversity and SemanticWeb.com, is also worth watching and paying attention to. These deal with the Yosemite Project, ongoing work which intends to position RDF as a Universal Healthcare Exchange Language. This work is important in part because it directly addresses how to migrate and transform between formats once a common representation is established using RDF. In many ways, this is a mythical undertaking, but also a very promising one.
For instance, with the work underway with Project Argonaut and HL7 FHIR, you are looking at a standard for healthcare that comes in two flavours, XML and JSON; however, like its predecessor HL7 CDA, FHIR also relies on a human-readable portion, which in this case means HTML5. Add to that the work underway with Yosemite - go watch the presentations! - and you have an ecosystem that supports appropriate use of HTML, XML, JSON, and RDF - the subject of Dr. Tennison's XMLPrague presentation - now in the context of healthcare. This is really what John Halamka has referred to as the "HTTP and HTML for healthcare".
If you broaden your horizons just a little, you will see the work also being carried out by Health & Human Services and the NIEM Health Domain, as a counterpart to the work of HL7 International. NIEM is primarily an XML-based standard, but in the last couple of years its underlying tooling has been expanding into UML-, JSON-, and HTML-based representations. With the support of some underlying ontology work, perhaps in concert with Yosemite, NIEM too could be used to create linked health data. These are all very exciting, very important things happening very quickly, and it is a great time to get involved with some of these projects and initiatives.
Monday, December 15, 2014
Project Yosemite, SMART on FHIR, and the Argonauts
- Security
- CCDA to FHIR Mapping
- FHIR Implementation Testing
Josh Mandel, the lead architect behind SMART on FHIR®, also spoke recently as part of a series of five presentations on Project Yosemite, held by SemanticWeb.com and DataVersity. Project Yosemite began a year or so ago with the Yosemite Manifesto, which establishes RDF (the Resource Description Framework that underlies the Semantic Web and Linked Data) as the best candidate for a universal healthcare exchange language. Project Yosemite follows two paths, "Standards" and "Translation", based on the premise that standards adoption is of primary importance, but that there will always be a need to translate between standards, and even between versions of the same standard.
The idea here is that once you build ontological mappings of various healthcare standards into RDF representations, then Semantic mapping tools like SPINMap and TopQuadrant's TopBraid can be used to construct robust migration/translation layers. This is the first step in producing a distributed network of Linked Health providers, similar to the work currently taking place with Linked Data. At this point, the presentation recordings from DataVersity are not yet all available, but they are definitely worth watching.
HL7 FHIR provides a potential successor to several HL7 standards currently in use internationally. Migration is a critical success factor here, and Project Yosemite presents a different way to approach migration. Perhaps coincidentally, RDF and FHIR are both resource-based approaches; RSS is a syndication format that emerged from work with RDF, and FHIR uses a similar syndication format, Atom, to aggregate and compose health resources, like Patient and Observation.
Project Yosemite benefits FHIR and Project Argonaut; Argonaut accelerates the first phase of the ONC Data Access Framework (DAF) project; and Project Yosemite is involved with ICD-11. This seems like a lot of convergence, and the next six months will really show how much. It's a great time to get involved.
Wednesday, December 10, 2014
HL7 FHIR and Argonaut in Canada
The Argonaut Project has the backing of a number of American EHR vendors, including Epic, Cerner, MEDITECH, McKesson, and athenahealth, with additional support from Partners HealthCare, Intermountain Healthcare, Beth Israel Deaconess, and Mayo Clinic. The project extends the involvement these organizations already have with HL7 International, and promises to deliver implementation guides related to an emerging HL7 standard, HL7 FHIR, in the May 2015 timeframe. The full list of collaborators:
- athenahealth
- Beth Israel Deaconess Medical Center
- Cerner
- Epic
- Intermountain Healthcare
- Mayo Clinic
- MEDITECH
- McKesson
- Partners HealthCare System
- SMART at the Boston Children’s Hospital Informatics Program
- The Advisory Board Company
This is a diverse group of collaborators and an aggressive timeline, but what does this mean for Health IT projects here in Canada?
Migration and Transformation
Whereas HL7 v2 uses "pipe and caret" notation, and HL7 v3 supports any wire format as long as it is XML, HL7 FHIR comes in two flavours, XML and JSON (which makes it particularly useful for mobile use cases). By design, FHIR is intended to provide a migration path for v2, v3, and CDA. This reminds me of the intentions behind the development of XML as a sort of lingua franca for the web, and in that sense XML has been very successful. For mobile and social use cases, a JSON-based standard for health information will be hugely beneficial as well.
In Canada, we have built a foundation of healthcare registries and repositories based on HL7 v3 Messaging, although the applications in place in hospitals and other health information sources typically come from U.S. vendors, including many of those mentioned above, which requires a transformation layer from v2 to v3 and back again. I'd like to imagine a world where both the foundation and the hospital information systems can communicate using the same standard, or through an integration layer that uses a common standard. Argonaut is at the very least a step in that direction.
Documents and Messages
Here in Canada, we have built our information access layer for health around Messaging; in the U.S., Document-centric health prevails. Canadian projects may involve the HL7 Clinical Document Architecture (CDA), but these are more limited in scope than the foundational work which has been carried out with HL7 v3 Messaging. Recent guidance from Canada Health Infoway is to use the most appropriate standard for the job at hand; in many cases, that will be v3 Messaging, simply because the work is already underway.
FHIR is quite clever in that it is based around healthcare resources (Patients, Providers, Observations and so forth), a more granular approach than either CDA or v3 Messaging, and this granularity is how FHIR supports both Message- and Document-based flows of information. This is crucial if your requirements are a hybrid, or if you currently support one approach but know you will need to support the other. Simply put, FHIR dispels the holy war between Health Messaging and Health Documents. ("Unleash the KRAKEN!!!")
Example: Questionnaires
It goes something like this: you are tasked with creating a set of health questionnaires for a Canadian healthcare organization. Most likely, you will create PDF documents, but you might consider using CDA for a moment, because CDA provides an architecture for Clinical Documents. But that moment would pass. Now, consider this: the FHIR community has already held several connectathons involving questionnaires, and one of its members, David Hay, has already written a series of articles about extending the Questionnaire resource based on his experience.
So that's useful.
In particular, IHE (Integrating the Healthcare Enterprise) is currently developing multiple profiles using FHIR as a basis for mobile access (MHD, PDQm, RESTful PIX). With Canada Health Infoway as the home of IHE in Canada, I am hoping that we can find uses for these profiles here as well. These profiles are still under development, but if the consortium behind the Argonaut Project really wants to make a difference, it can throw its support behind IHE as well.
References
HL7 International - Press Release
HealthLeaders Media - Argonaut Project is a Sprint toward EHR Interoperability
OnHealthCareTechnology - JASON: The Great American Experiment
HealthcareITNews - Epic, Cerner, others join HL7 project
John Halamka - Life as a Healthcare CIO - Kindling FHIR
Thursday, August 14, 2014
Given the anger, doubt and frustration prevalent in the public discourse around government IT, the only way to restore public trust in the federal government's ability to use technology well for something other than surveillance and warfare is through the deployment of beautiful, modern Web services that work. Jen Pahlka has explicitly connected government's technical competency to public trust in this young century.
"If government is to regain the trust and faith of the public, we have to make services that work for users the norm, not the exception," she told to Government Technology, after leaving the White House. Mayors, governors and presidents are experiencing the truth of her statement around the country, from small towns to 1600 Pennsylvania Avenue.The challenge here is to move beyond secure, mission-critical systems that work in insulated environments - but fail to provide high value - to focus on measurable outcomes, quick(er) wins, higher value services for citizens. This is the holy grail of digitization.
AngularJS and Durandal
I am hoping that the next evolution of AngularJS (3.0) will align much more closely with Durandal, the Web Components API, and Polymer. There is really no reason why custom Web Components cannot become standard practice on the web. That is not what Angular custom directives offer today; I find them confusing because, whereas in most cases Angular balances flexibility and prescription nicely, custom directives are incredibly flexible: transclude or not? allow a directive through attribute, class, or element? and so forth. Custom directives are just too flexible, and they need to be a simple API for doing one thing well, not a combinatorics problem.
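To illustrate the combinatorics, here is a sketch of a directive definition object; the directive name and template are illustrative, but the options are standard Angular 1.x:

```javascript
// A directive definition object exposes many orthogonal switches.
angular.module('app', []).directive('patientCard', function () {
    return {
        restrict : 'EAC',          // element? attribute? class? any subset
        transclude : true,         // wrap caller-supplied content... or not
        replace : true,            // swap out the original element... or not
        scope : { patient : '=' }, // isolate scope? inherit it? share it?
        template : '<div class="patient-card" ng-transclude></div>',
        link : function (scope, element, attrs) {
            // ...and imperative DOM work can happen here too
        }
    };
});
```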
On the other hand, what Angular provides that Durandal does not is exactly that balance of prescription and flexibility. Angular tells you how to do things like module structure and model-view-star, and as a development project lead, I appreciate that, because it makes establishing best practices and code reviews manageable. That is why I am expecting great things from Angular, especially if the next version also results in an update to the angular-ui.bootstrap project, providing a ready-to-use library of web components.
Tuesday, July 22, 2014
Single Page Applications and AngularJS
Mature JavaScript frameworks like Backbone, Angular and Ember have changed this, embodying the notion that you don't find a sweet spot between pure server push and pure client: you either load an application page by page, or you load a single page and construct the application from client-side templates, routing and model-binding. jQuery can support an SPA approach but doesn't enforce it, and Adobe Flex enforces an SPA approach but requires Flash to do so.
Of course, Angular is more than just an SPA framework. Amongst the features Angular provides (a sketch tying several of these together follows the list):
Dependency Injection - a core value of the Angular framework. DI clearly defines what a class consumes through its constructor, rather than hiding those requirements inside the class. This makes Angular JavaScript more readable, easier to maintain, and easier to test, since it is clear how connections between services are made within your application code base, and it results in fewer lines of more maintainable code.
Templating - Angular templates are partial HTML views that contain Angular-specific elements and attributes (known as directives). Angular combines the template with information from the model and controller to create the dynamic view that a user sees in the browser. The result is more natural templates: well-formed HTML extended through attribution.
Two-way data-binding - allows you to work with JSON easily, particularly when that JSON is generated from standardized schemas. For example, an application might receive a JSON payload constrained by an XML Schema in the server-side API (the API supports both XML and JSON, and the XML complies with an industry standard). In this case, the Angular view could also be generated from the underlying XML Schema.
Modular JavaScript - nothing special here; Angular allows you to separate the concerns of controllers, services, filters, and directives into separate modules. Encapsulation makes these components easier to maintain and easier to test, particularly for a team with multiple members.
Controllers and URL Routing - aside from Dependency Injection, Angular's MVVM pattern is the big win here, and routing is just something you need to get used to. Originally, JavaScript was the glue code for the web, and once your application is sufficiently modularized, you will find that your Angular controllers retain this stickiness; but as you build reusable services, your controllers remain lightweight. If you have any business or maintenance logic in your controllers, it is time to refactor it into services. Controllers and routing may not be reusable; services and views will be.
Multi-level data scoping - scope is confusing in JavaScript because of the way global scope and declaration hoisting work. Angular simplifies passing scope into a controller or service, and offers a $rootScope object that replaces the global namespace. Further, events can be associated with scope at various levels. Data binding, the event model, and service invocation all use the same scope mechanism.
Responsive Design - Bootstrap is a Responsive Web Design library built in CSS and JavaScript. The JavaScript portion of Bootstrap has been ported to Angular directives as part of the AngularUI extension, which fits nicely within the Angular directive paradigm. Directives are fundamental to how Angular works. Using the Bootstrap directives removes some of the need to develop custom directives (custom behaviors and HTML tags). [http://angular-ui.github.io/bootstrap/]
Web Components - with the upcoming partial merging of the Angular framework with the Durandal presentation framework, Angular should move one step closer to supporting the Web Component API, which aligns with the intent behind Angular custom directives, and will bring these more in line with projects like Polymer. By using a common API, these UI libraries become more transportable.
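Here is the promised sketch tying several of these features together - a module, an injected service, and a controller feeding a two-way-bound template. All names are illustrative:

```javascript
// Module + dependency injection + controller; the template in the
// comment below shows two-way binding against this controller's scope.
angular.module('demo', [])
    .factory('greeter', function () {
        return { greet : function (name) { return 'Hello, ' + name; } };
    })
    // 'greeter' is injected by name: the constructor declares what it consumes.
    .controller('DemoCtrl', ['$scope', 'greeter', function ($scope, greeter) {
        $scope.name = 'world';
        $scope.greeting = function () { return greeter.greet($scope.name); };
    }]);

// Corresponding template (natural, attribute-decorated HTML):
//   <div ng-app="demo" ng-controller="DemoCtrl">
//       <input ng-model="name">      <!-- two-way binding -->
//       <p>{{ greeting() }}</p>      <!-- updates as you type -->
//   </div>
```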
Monday, July 21, 2014
Back to Basics: Rhizome
I started the Rhizome reference implementation a year ago as a way of demonstrating how a combination of client-side services, constructed using Angular and Cordova, and server-side adaptation and integration, constructed using Worklight, could be used to build a mobile health app for the enterprise. The pieces are there, but I have come to the conclusion that the server-side integration, while important, should really just be built into the application server that hosts the server-side API. If the server-side API is built to an industry standard like NIEM or HL7, then the burden of integration is lightened, and maybe it can take place within a resource-based suite of client-side services.
The greatest illumination for me came when I stopped trying to build the server back end first, with a client app extending it, and instead focused on a client app with an HL7 FHIR standardized interface. Do I have to do a lot of adaptation on the server? That depends on the data source; but in an ideal world, the data source has low impedance and is already FHIR JSON. In that case, an Angular app built around the core FHIR resources just works.
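"Just works" looks something like this: a controller fetches a FHIR JSON bundle and binds its entries straight into the view. The server URL is hypothetical:

```javascript
// Hypothetical FHIR endpoint; Atom-style entries bind straight to the view.
angular.module('rhizome', [])
    .controller('PatientListCtrl', ['$scope', '$http', function ($scope, $http) {
        $http.get('https://fhir.example.org/Patient?_format=json')
            .success(function (bundle) {
                $scope.patients = bundle.entry || [];
            });
    }]);
```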
So I'm taking my reference implementation in a slightly different direction: less coupled to an enterprise mobility platform, more reliant on a strong client-side architecture that is resource-based and standardized for the health industry, leveraging profiles from organizations like IHE and HL7 where possible, and probably with a more specific focus on care plans and questionnaires, without losing focus on prescription medications.
I'm also going to try posting more frequently, for a variety of reasons, so please feel free to comment. I have really enjoyed working with AngularJS over the last year, and I know I'm not alone in this.
Saturday, July 19, 2014
Tracking the convergence of NIEM and HL7
The two communities could really benefit from a shared understanding: to save money on implementation and stakeholder engagement, they need tools which make it easy to visually review and alter exchange packages (IEPDs, FHIR conformance profiles) in order to reach consensus, and then to generate terse, completely accurate validation packages and conformance suites, so as to increase ongoing information safety. We need to be able to put all of the important details on one page.
NIEM and HL7 are both messaging models based on an underlying information model, and whereas HL7 is moving away from design by constraint towards design by extension, NIEM has always relied upon an extension mechanism. The difference comes down to the size of the problem space: NIEM's is "everything", while HL7's is "healthcare", for which you might just be able to imagine a totalizing framework that encompasses all workflow in all contexts. Even for HL7, though, a workable extension mechanism is proving essential to success, and this is a change from the paradigm established with HL7 v3.
NIEM and HL7 are both moving towards support for multiple wire formats. In domestic U.S. markets, HL7 means either "pipe and caret" v2 or the quasi-XML-HTML hybrid CDA, but internationally, HL7 is an XML standard that is outgrowing the business cases for XML, much like NIEM. For both of these standards to grow and implement future business cases, they will need to embrace and support JSON, HTML, and RDF, and given time, they will.
HL7 is moving away from a proprietary tooling set towards tooling which is readily accessible, like Excel, Java, and XML editors. NIEM already uses a similar toolset, and has several initiatives in play to support open tooling like the CAM Editor and UML tooling. One of the difficulties we have run into with HL7 v3 is sharing visual models, since these are captured in proprietary tooling, and it is here that the NIEM and HL7 communities would both benefit from demanding better. Put simply, shouldn't these two standards support, and be supported by, a common toolset that extends beyond XMLSpy or Oxygen? Given time, I'm sure they will.
This is something I feel strongly about. At their core, NIEM and the HL7 RIM rely on XML Schemas, and yet XML Schemas are not sufficient to the task. In the HL7 world, as far as v3 Messaging and CDA are concerned, ISO Schematron fills this gap. For NIEM, OASIS CAM performs a similar task; but it is a disservice to both CAM and Schematron that they are treated only as validation tools, when in fact they contain key pieces of the business. The same is true of UML: these should be the tools we use to visually communicate the business to the business.
Some of the tools will be open source, some of them will come from the product world. If the NIEM and HL7 communities articulate their needs, the tool vendors will follow. In short, HL7 and NIEM are both going to need to converge on a set of XML-based tooling that goes beyond XML Schemas and Visio diagrams. The CAM tooling provides some of this. The Excel-based Resource Profiling in FHIR provides some of this. UML tooling provides some of this.
To reduce the burden of approval for stakeholders, both messaging standards need to allow modelers, implementers, and business stakeholders to meet in a room and review the details of a proposed information exchange on a single page; this is where the high value lies. When this happens, information safety increases, because the XML Schemas and documentation produced after such a meeting will be simpler, more accurate representations of the business.
Thursday, July 10, 2014
Converting NIEM XML to HTML5
There are four main information formats used on the World Wide Web - HTML, JSON, RDF, and XML - along with some notable XML vocabularies:
- HTML is ideal for documentation, tables, and open data, because it is easy to publish and forgiving. HTML is fundamental to REST as a way of exposing endpoint documentation.
- JSON (JavaScript Object Notation) is a lightweight data-interchange format. It is easy for humans to read and write. It is easy for machines to parse and generate.
- The Resource Description Framework (RDF) is a language for representing information about resources in the World Wide Web.
- Extensible Markup Language (XML) is a simple, flexible text format derived from SGML (ISO 8879). Originally designed to meet the challenges of large-scale electronic publishing, XML plays an increasingly important role in the exchange of a wide variety of data throughout the Web.
- DITA (Darwin Information Typing Architecture) and DocBook are used to assemble documentation out of markup. Both will probably be supplanted by HTML5 eventually.
- Atom and RSS are XML-based syndication formats. JSON-based syndication formats have also been described, although these are less mature.
NIEM currently supports XML-based and JSON-based business cases as a way of quickly and rigorously exposing data for exchange and migration. In addition, the NIEM JSON flavour also supports web and mobile web applications, using the previously mentioned 4GL frameworks and the like. The quickest way to expose NIEM information, however, is the HTML information format (most likely HTML5, which is more semantically rich than previous versions).
Basic rules for converting NIEM XML into NIEM HTML (a sketch in JavaScript follows the list):
- Create one HTML element per XML element, with the exception of lists.
- For node elements, use div.
- For leaf elements, use span.
- Where makeRepeatable, use ol and li, containing either div or span elements as per above.
- For any element, class attribution represents the datatype (like "string" or "date").
- For any element, id attribution represents the XML element name, including the namespace prefix (like "ncPersonName").
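A minimal sketch of these rules in JavaScript, using the browser DOM; the sample fragment and datatype lookup are hypothetical, and the makeRepeatable (ol/li) rule is omitted for brevity:

```javascript
// Convert a NIEM XML element into HTML5 per the rules above.
function niemToHtml(xmlElement, typeOf) {
    var isLeaf = xmlElement.children.length === 0;
    // node elements become div, leaf elements become span
    var html = document.createElement(isLeaf ? 'span' : 'div');
    // id: element name with namespace prefix, e.g. "nc:PersonName" -> "ncPersonName"
    html.id = xmlElement.tagName.replace(':', '');
    // class: datatype, e.g. "string" or "date"
    html.className = typeOf(xmlElement.tagName) || 'string';
    if (isLeaf) {
        html.textContent = xmlElement.textContent;
    } else {
        for (var i = 0; i < xmlElement.children.length; i++) {
            html.appendChild(niemToHtml(xmlElement.children[i], typeOf));
        }
    }
    return html;
}

// Usage with a hypothetical NIEM fragment and datatype lookup:
var xml = new DOMParser().parseFromString(
    '<nc:Person xmlns:nc="http://niem.gov/niem/niem-core/2.0">' +
    '<nc:PersonName>Jane Doe</nc:PersonName></nc:Person>', 'text/xml');
var typeOf = function (name) { return { 'nc:PersonName' : 'string' }[name]; };
document.body.appendChild(niemToHtml(xml.documentElement, typeOf));
```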
In the same way that a full HTML page can be created from NIEM information, it should also be possible to generate partial or natural templates - in essence, just fragments of HTML. This may be required to support platforms like Java-Spring-Thymeleaf, Oracle ADF, or Meteor, which all rely on some sort of direction through attribution. The simplest way to expose information is still to create the entire HTML page instead of a partial. I note this here because whenever NIEM JSON is used, there will likely also be a requirement to generate a template from the NIEM CAM.
Note that NIEM is not currently resource-based; there is no inbuilt facility to support REST by exposing resource identifiers. However, one of the requirements of REST is to expose documentation at the endpoint, and it should be possible to generate this documentation directly from the IEPD (I think Datypic generates something like this for the NIEM Core). In this case, the IEPD may be sufficient.
Working with multiple standards for Health
Obviously, this creates a space of impedance mismatch where continuity of service is put at risk. As a way of mitigating this risk, v3 Messaging is augmented with a companion specification: CDA, the Clinical Document Architecture, which promises to support health documents like Continuity of Care, Health Questionnaires, and Care Plans, as well as business cases using CDA to handle data in migration. Again, in the U.S., HL7 CDA has been used as an alternative to v3 Messaging to support exchange of health information, and in Canada we may benefit from following that path; but if we do, we should be aware that this path is probably morphing as we speak into a thing called "C-CDA using HL7 FHIR XML".
As discussed here and elsewhere, FHIR is a successor standard to all three HL7 standards, providing support for JSON and REST that has not previously been available, as well as the ability to essentially re-implement CDA using a similar XML standard. FHIR has a lot of potential, in Canada and abroad, to enable mobile health applications; but in order to design and build these applications, we need to reconsider the iEHR architecture on which we are currently building.
To that end, I have a number of suggestions:
- Foster communication between systems using like standards: for instance, we have invested substantially in communicating clinical information between clinical systems in hospitals and the foundation layer of Labs, Pharmacies and Diagnostic Imaging; but can we find quick wins through improved intercommunication amongst the domains in the foundation, or between the enterprise systems that use v2 natively?
- Create an adaptation layer supporting lightweight secure access: this is where FHIR may play a part, used to expose high value information across the enterprise. The danger in providing an incomplete picture is that people will take it for a complete picture; because FHIR is rooted in extension, composition and aggregation, it may provide a way to build a fuller picture of longitudinal patient information.
- Registries like Provider, Client and Location should provide more comprehensive Identity Assurance; again, this really means removing continuity gaps within the services available to a patient, thus providing the history of interactions which is a necessary part of guaranteeing identity.
- Create an application layer that supports developing mobile and web applications that can connect directly to the resources exposed by the adaptation layer in the second suggestion above.