The Complexities of Creating Real Electronic Health Records
If you have been paying attention (even if you do not live in the United States), you know that the U.S. Congress passed a broad, sweeping piece of legislation purporting to reform health care in the United States. What you probably do not know (even if you do live in the United States) is that this is not the first piece of legislation to target the issue of health care. In fact, a number of pieces of legislation, and a fair amount of money, have been thrown at the issue of health care in the United States in the last few years, specifically in the area of electronic health records. But with all of this focus on streamlining and digitizing health records, I began to wonder: where does the Open Source community stand, and where is its input? There is certainly a lot of money sitting out there for someone who wants to try to build the better mousetrap.
What got me thinking about this was an article in the Spring 2010 issue of 2600 that reminded me that record keeping, one of the so-called linchpins of reducing the costs of health care, is not as simple an issue as you might expect. The article cites the break-in and ransom of the Commonwealth of Virginia's Prescription Monitoring Program. This is not the first time I have written about that incident, but it highlighted for me one of the many issues that need to be considered when we talk about an eHealth record system. To me, this and a couple of other issues need to be examined to better understand the full scope of the problem.
Electronic health records need to be about more than just cost recovery. You could argue that electronic health records are already widely used, and I would not argue with you. I have records in more than my share of medical facilities. But the core purpose of these records is not patient care; it is financial recovery. Nothing made this clearer to me than when I went in for a routine procedure last year. I sat in front of a gentleman who typed in all my important data (name, insurance company, credit card number), had me sign, electronically, a "Yes, I will pay" form (oh, and a living will), strapped a human-readable and bar-code band around my wrist, and sent me on my way. While I was lying on the gurney and a procession of medical professionals came by for my physical signature on a ream of paper, each page of which went into a three-ring binder, I began to wonder just how automated this all was. The answer, of course, is not very...until the bill arrived.
The health insurance payment process is a cottage industry: a very large, poorly organized cottage industry. In the course of my day job, I had the opportunity to talk with a gentleman from one of the giant health care insurance agencies in the United States. We started talking about the process of handling insurance bills and the complexity of it all. The process is byzantine. A large amount of the physical number crunching is done on what can only be described as legacy systems, with input and output bolted on to them. Old-fashioned EDI rules the roost, not XML or similar forms of data transformation, although they are gaining a foothold in some areas. Many of the processes, both input and output, are highly manual, prone to error and, as you might expect, expensive.
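To see why the old delimited formats persist, and what a translation layer has to do, consider a minimal sketch that turns a pipe-delimited, EDI-style claim segment into structured XML. The field layout here is invented for illustration; real interchange formats (such as X12 or HL7 messages) carry far more fields and far more rules.

```python
# Sketch: converting a flat, delimiter-separated claim segment into
# XML. The field names and layout are hypothetical, not a real
# EDI specification.
import xml.etree.ElementTree as ET

def segment_to_xml(segment):
    """Split a delimited claim line into named XML elements."""
    fields = ["patient_id", "provider", "procedure_code", "amount"]
    values = segment.strip().split("|")
    claim = ET.Element("claim")
    for name, value in zip(fields, values):
        ET.SubElement(claim, name).text = value
    return ET.tostring(claim, encoding="unicode")

print(segment_to_xml("12345|Dr. Smith|99213|125.00"))
```

The hard part in practice is not the mechanical conversion shown here but agreeing on what each position means, which is exactly where the manual, error-prone work creeps in.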
Electronic health records are not just for hospitals. Like most of you, I have my fair share of doctors that I have to visit on a semi-regular basis, and I seem to add more each year. I have a stable of about eight medical professionals who look after my family, and of these eight, only one, my ophthalmologist, is completely paperless, except for the actual signature on the charge sheet. The rest have racks and racks and racks of files, filled with inches and feet of medical records. And in most of these offices, there is a dedicated person (or team of people) whose only job is dealing with insurance billing and paperwork, but no one is dedicated to the files per se. What this says to me is that any new system has to be easy to use, or medical offices will need to hire more people to do less medical work. One thing surprised me in reading the article in 2600. When you are injured and taken to the hospital, the EMTs also have to fill in paperwork, and as illustrated by the author, sometimes multiple copies to feed multiple, disparate and disconnected systems, in order to recover their funds. That means remote devices will have to be a part of the solution as well.
So if these are some of the issues, are there, as we look around, existing models or solutions that might give us a nudge in one direction or the other? One example of how it could work is the Veterans Health Administration's VistA system, which has won numerous awards and accolades for technology. The VistA system automates everything from patient care to benefits management to remote prescription dispensing. Doctors can share diagnostic information, images and decisions locally or through consultation across the nation. And the code is Open Source.
But as good as VistA is, it is not the ultimate solution. For starters, the implemented system uses legacy equipment and languages most people no longer program in. Further, it is a closed system in terms of who it talks to and who it receives data from, which means the data are secure, but getting data to and from the system still requires a degree of manual sneakernet process.
This leads us to another, different set of issues.
How do you get data into and out of the system? And do it securely? I would argue that the banking system probably has a good model, but I cannot speak to it completely or whether it would actually be a good model. Clearly, encryption, shared and public keys, and dedicated pipes would be required. And that infrastructure costs money.
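Whatever the transport, one piece of the secure-exchange problem can be shown concretely: detecting tampering in transit. The sketch below uses a keyed hash (HMAC) over a record with a shared key; it is illustrative only, a real exchange would layer this under TLS with proper key management, and the record format here is invented.

```python
# Illustrative only: authenticating a health record in transit with a
# pre-shared key and an HMAC. This shows tamper detection, not a
# complete secure-exchange design.
import hashlib
import hmac
import secrets

shared_key = secrets.token_bytes(32)        # pre-shared between endpoints
record = b"patient=12345;procedure=99213"

# Sender computes a tag over the record.
tag = hmac.new(shared_key, record, hashlib.sha256).hexdigest()

# Receiver recomputes the tag and compares in constant time.
check = hmac.new(shared_key, record, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, check)

# Any change to the record in transit produces a different tag.
tampered = hmac.new(shared_key, b"patient=12345;procedure=99999",
                    hashlib.sha256).hexdigest()
assert not hmac.compare_digest(tag, tampered)
```

The expensive part is everything around this snippet: distributing and rotating the keys, which is one reason the banking-style dedicated infrastructure costs what it does.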
How do you set about equipping what are essentially mom-and-pop shops, in the form of doctors' offices? And how do you keep the software running, up to date, and repaired when it breaks? A whole cottage industry could evolve around this, but with very thin margins. Most doctors' offices are run on a shoestring, if they make money at all, so the added expense of hardware, software and the other moving parts that make up the system has to be economical. That makes cloud computing, or hosted applications, or whatever you want to call it, make sense in certain cases (and yes, I can think of hundreds of reasons why it should not be hosted in the cloud).
Then we have to wrestle with the standards for data interchange. That simple term, data interchange, is loaded with so many complications, permutations, combinations, and issues that the term just is not sufficient to cover the mess that needs to be addressed. And coming up with acceptable standards is going to be a long, tortured process.
In all cases, though, all the roads seem to lead to modernization. No real improvements, in terms of cost savings or speed of processing, will occur without a great deal of modernization. You will not be able to implement some of the necessary standards if the underlying OS is incapable of utilizing them (or the hardware cannot absorb the protocol). I realize it is hard to imagine, in 2010, that there are any systems out there that do not speak TCP/IP, but there are, and a staggering number of them are fronted by gateways of questionable stability and capability. And of course, any modernization costs money.
And have I mentioned security? In the United States, there is a law known as the Health Insurance Portability and Accountability Act (HIPAA), which defines a number of (sometimes expensive) requirements to protect the data of the individual. These include, but are not limited to, encryption, access controls, audit controls and authentication controls. Entire monitoring systems could be developed just to ensure that HIPAA requirements are being met.
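The audit-control requirement in particular lends itself to a small sketch: a hash-chained access log, in which each entry commits to the one before it, so editing or deleting a past entry breaks the chain. This is one possible approach, not what HIPAA mandates, and the field names are invented.

```python
# Sketch of a tamper-evident audit log: each entry's digest covers
# the previous entry's digest, so altering history is detectable.
# Field names and values are hypothetical.
import hashlib
import json

def append_entry(log, user, action, record_id):
    """Add a log entry chained to the previous entry's digest."""
    prev = log[-1]["digest"] if log else "0" * 64
    entry = {"user": user, "action": action,
             "record": record_id, "prev": prev}
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["digest"] = hashlib.sha256(payload).hexdigest()
    log.append(entry)

def verify(log):
    """Recompute every digest and check the chain links."""
    prev = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "digest"}
        payload = json.dumps(body, sort_keys=True).encode()
        if entry["prev"] != prev or \
           hashlib.sha256(payload).hexdigest() != entry["digest"]:
            return False
        prev = entry["digest"]
    return True

log = []
append_entry(log, "dr_jones", "read", "chart-8812")
append_entry(log, "billing01", "export", "chart-8812")
assert verify(log)

log[0]["action"] = "delete"   # tamper with history...
assert not verify(log)        # ...and verification fails
```

A production system would also need to protect who can write to the log at all, which loops back to the access and authentication controls above.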
Is any of this insurmountable? Certainly not, and there are a number of areas where the Open Source community can play a valuable role. But on reviewing the challenges, I also see why there is not more involvement. Many of the issues require specialized knowledge of a number of aspects of what is today a black box to many. There are few defined requirements and even fewer good directions. But then, is that not what is best about the Open Source community? We see a problem and try to solve it. Perhaps the right problem has not been presented. Or maybe it is such a niche that I have not seen the strides we have made. But as the big guns of the consulting world are looking at solving this, I hope they will leverage the Open Source model, as well as solutions used elsewhere. And perhaps, just perhaps, a year, or five years, from now, we will be talking about Open Source eHealth software the same way we talk about network management software or VoIP software.