Bloated, sluggish electronic health records are commonplace in healthcare today, and so is the allure of new hardware that promises to speed those systems up.
There are problems, though, with that approach: Hardware is expensive, and IT upgrades are not in many hospitals' budgets. In many cases, new hardware is not even the best solution.
A tsunami of data
Systems have become overwhelmed by a tsunami of data advancing at a rate unmatched in other industries, said James D’Arezzo, CEO of Condusiv Technologies, a tech vendor known for its file-system defragmentation software for Microsoft Windows and OpenVMS. The underlying problems that bog down EHRs are software problems, and software problems can be remedied, he said.
Focusing on how data is processed and structured can improve the out-of-the-box performance of a new system and restore efficiency to an old one. Still, D’Arezzo cautioned that without true interoperability between EHRs, the problem will persist.
“It’s just like a highway system,” said D’Arezzo. “You build as fast as possible, but if cars are loading on faster, you’re going to get traffic jams.”
If a system is being overloaded by data, buying one’s way out of the problem with faster hardware might work for a while. But when a system with a high rate of duplicate entries cannot tell that John A. Smith, J Smith and Smith, John are the same patient, the people who need clinical data are bound to come away with nothing, or worse, with the wrong information. Disparate EHRs that cannot fully communicate with one another only compound the problem.
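To illustrate why name variants like these defeat naive matching, here is a minimal sketch, in Python, of the kind of normalization a record-matching layer might apply. The function and its matching rule are hypothetical, for illustration only; they are not the logic of any actual EHR or master patient index.

```python
# Illustrative sketch: reduce patient-name variants to a crude match key
# so that "John A. Smith", "J Smith" and "Smith, John" compare as equal.
# The normalization rule is an assumption for illustration, not the
# matching logic of any real EHR product.

import re

def normalize_name(raw: str) -> tuple[str, str]:
    """Return a crude (surname, first-initial) match key."""
    raw = raw.strip()
    if "," in raw:                       # "Smith, John": surname comes first
        surname, given = (p.strip() for p in raw.split(",", 1))
    else:                                # "John A. Smith": surname comes last
        parts = raw.split()
        surname, given = parts[-1], " ".join(parts[:-1])
    given = re.sub(r"[^A-Za-z ]", "", given).strip()
    return surname.lower(), (given[:1].lower() if given else "")

records = ["John A. Smith", "J Smith", "Smith, John"]
print({normalize_name(r) for r in records})   # all collapse to {('smith', 'j')}
```

Real patient-matching systems score many fields probabilistically (name, birth date, address and so on); a crude key like this one would merge John Smith with Jane Smith, which is exactly why duplicate detection at hospital scale is hard.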
A double whammy
“You’ve got this double whammy going on,” said D’Arezzo. “You still don’t have interoperability, but so much data coming into your system.”
The problem goes deeper, he explained. Operating systems like Windows are fundamentally inefficient at managing the flow of data to be read and written: they process I/O in a linear fashion that works well enough on a home PC but begins to show signs of strain when handling an entire hospital’s worth of data. Simply switching to another operating system will not really help, since the weakness is not unique to Windows.
However, a software problem, D’Arezzo said, has a software fix. A utility manager that can “pack more payload” by routing the flow of data more efficiently can increase a system’s performance. Taking a closer look at how an OS handles large quantities of data and investing in a real-time batch processing solution can give older systems an edge as well as extend the life of the investment in new hardware.
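To make the “pack more payload” idea concrete, here is a minimal sketch of write coalescing, the general technique of buffering many small writes and flushing them as fewer, larger ones. The `CoalescingWriter` class, its threshold and the file name are illustrative assumptions, not a description of Condusiv’s product or any EHR’s internals.

```python
# Illustrative sketch of write coalescing: many small writes are buffered
# and flushed as one large sequential write.

from typing import BinaryIO

class CoalescingWriter:
    """Buffers small writes and flushes them as fewer, larger writes.

    The 64 KB threshold is a hypothetical tuning knob, not a parameter
    of any real I/O-optimization product.
    """

    def __init__(self, target: BinaryIO, flush_threshold: int = 64 * 1024):
        self.target = target
        self.buffer = bytearray()
        self.flush_threshold = flush_threshold

    def write(self, payload: bytes) -> None:
        self.buffer.extend(payload)
        if len(self.buffer) >= self.flush_threshold:
            self.flush()

    def flush(self) -> None:
        if self.buffer:
            self.target.write(bytes(self.buffer))   # one large sequential write
            self.buffer.clear()

# Ten thousand tiny record updates become a handful of large writes.
with open("ehr_journal.dat", "wb") as f:
    writer = CoalescingWriter(f)
    for i in range(10_000):
        writer.write(b"obs:%d\n" % i)
    writer.flush()   # push out whatever remains in the buffer
```

The payoff is that per-operation overhead is paid once per large write instead of once per record, which is the kind of efficiency gain D’Arezzo attributes to software-level optimization.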
“Maybe in the future, hardware will be so powerful and so cheap” that efficient data processing will not be necessary, D’Arezzo said. “But we’ve been hearing that for 30 years.”
Benjamin Harris is a Maine-based freelance writer and former new media producer for HIMSS Media.
Twitter: @BenzoHarris.