By James D’Arezzo, CEO of Condusiv Technologies.
While advancements in data collection, analytics and electronic health records can lead to better healthcare, they also create challenges for the healthcare sector's IT departments. That's because serious obstacles exist in terms of systems management and capacity. The sheer amount of healthcare data has skyrocketed over the past decade. Through advanced predictive analytics, this data can save lives by fostering the diagnosis, treatment and prevention of disease at a highly personalized level.
To maximize the benefits all of this information can offer, healthcare organizations will need to make significant investments in data storage and infrastructure. But new hardware is not the only answer: with simple software fixes, many healthcare IT departments could free up as much as half their bandwidth, effectively doubling the capacity of the infrastructure already in place without new spending.
The health data tsunami
Healthcare institutions must comply with 629 discrete regulatory requirements across nine domains, at an annual cost of between $7.6 million and $9 million for the average community hospital. Much of that spending is associated with meaningful-use requirements, the government standards for how patient records and data are stored and transmitted. In 2016 alone, the average hospital spent $760,000 on meaningful-use requirements and invested an average of $411,000 in hardware and software upgrades for its records systems.
Because of the demands of healthcare record-keeping and continued advancements in medical technology, IT spending is rising exponentially. Along with that, medical research and development is booming to the point that institutions can’t keep up with the amount of data that needs to be stored and analyzed. Pharmaceutical and healthcare systems developers are also affected by the gap between data acquisition and analysis. Life sciences companies are launching products faster and in a greater number of therapy areas.
This fast-paced technological evolution places even more pressure on healthcare IT departments to deliver both innovation and efficiency.
Performance degradation occurs over time as the input/output (I/O) movement of data between the storage and computer/presentation layers declines. This degradation is particularly prevalent in the Windows environment. Luckily, targeted software solutions do exist that can improve system throughput by up to 50 percent without additional hardware.
If I/O drags, performance across the entire system slows. This primarily impacts computers running Microsoft SQL Server (the most popular database in the world), and the Windows operating system is notoriously inefficient with I/O. In fact, I/O degradation is far more common than most organizations realize: more than a quarter of organizations surveyed last year reported that poor performance from I/O-heavy applications was slowing their systems down.
To handle this escalating volume of data, and to reap the enormous promise of impending medical developments, the healthcare sector's IT chiefs need to stay focused on the basics of what they are being asked to do. While investments in storage and infrastructure will be helpful to a degree, big data is primarily a matter of processing a certain volume of information at a certain speed. The ability to do that is fundamentally dependent on the overall system's I/O capacity, which can be improved only to a limited extent by additional hardware. New hardware can promise more I/O operations per second, but if the workload consists of small, fractured, random I/O, that extra capacity is quickly consumed.
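The cost of small, fractured, random I/O described above is easy to demonstrate with a rough benchmark. The sketch below is purely illustrative (the file size, chunk sizes and use of Python are assumptions for the demonstration, not any vendor's tooling): it reads the same file once in large sequential chunks and once in small, randomly ordered chunks, so the throughput difference reflects access pattern alone, not data volume.

```python
import os
import random
import tempfile
import time

CHUNK_SMALL = 4 * 1024        # 4 KB reads, simulating fragmented, random I/O
CHUNK_LARGE = 1024 * 1024     # 1 MB reads, simulating large sequential I/O
FILE_SIZE = 16 * 1024 * 1024  # 16 MB test file (illustrative size)

def make_test_file(path, size):
    """Fill a file with random bytes so reads cannot be trivially optimized."""
    with open(path, "wb") as f:
        f.write(os.urandom(size))

def read_sequential(path, chunk):
    """Read the whole file front to back in fixed-size chunks; return bytes read."""
    total = 0
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            total += len(data)
    return total

def read_random(path, chunk, size):
    """Read the whole file in small chunks at shuffled offsets; return bytes read."""
    offsets = list(range(0, size, chunk))
    random.shuffle(offsets)  # same total data, but a fragmented access pattern
    total = 0
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            total += len(f.read(chunk))
    return total

if __name__ == "__main__":
    tmp = tempfile.NamedTemporaryFile(delete=False)
    tmp.close()
    make_test_file(tmp.name, FILE_SIZE)
    try:
        t0 = time.perf_counter()
        seq = read_sequential(tmp.name, CHUNK_LARGE)
        t1 = time.perf_counter()
        rnd = read_random(tmp.name, CHUNK_SMALL, FILE_SIZE)
        t2 = time.perf_counter()
        print(f"sequential 1 MB reads: {seq / (t1 - t0) / 1e6:.0f} MB/s")
        print(f"random 4 KB reads:     {rnd / (t2 - t1) / 1e6:.0f} MB/s")
    finally:
        os.remove(tmp.name)
```

The absolute numbers depend on the disk and OS cache, but the gap between the two figures is the point: both passes move the same 16 MB, yet the fragmented pattern typically delivers a fraction of the sequential throughput, which is why adding raw hardware capacity only partially compensates.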
While additional hardware can temporarily mask this degradation, targeted software that restores throughput should be part of the IT toolkit for any large-scale healthcare organization. Appropriate system software is just as important as hardware: data analysis is inherently slower than data acquisition, but it can be made a great deal faster by optimizing the performance of existing servers and storage.
Industry experts agree. A biomedical journal commenting on the recent Precision Medicine World Conference (Jan. 20-23, 2019, in Santa Clara, Calif.) observes, "The results from this type of research will likely offer countless opportunities for future clinical decision-making. In order to implement appropriately, however, the challenges associated with large data sets need to be resolved." A recent report from the California Precision Medicine Advisory Committee echoes this assessment, adding that precision medicine will require significant investments in data storage, infrastructure and security systems in the coming years to achieve its full potential.
The American healthcare industry is already wasting up to $1.2 trillion a year on unnecessary processes, including $88 billion attributable to the ineffective use of existing technology alone.
The first place many healthcare administrators can look to manage unnecessary spending is their current IT performance: I/O fixes can easily double system performance and speeds, creating more breathing room while accommodating increasing data storage needs. Maximizing the capabilities of your existing systems is also far more economical than expensive hardware upgrades.