The medical chart is set to become a thing of the past. Those thick folders containing your medical history are steadily being replaced by electronic health records, or EHR. The Veterans Administration initiated the first large-scale implementation of these computerized files in the 1970s, but the concept was slow to catch on in general practice until the advent of powerful and affordable hardware, fast and secure internet, and reliable, seemingly endless cloud storage. Since then, EHR systems have been shown to make physician visits faster, help coordinate care between multiple offices, and improve health outcomes. Can EHR bring the same success to the fight against hospital-acquired infections?
The unsung benefit of EHR is not just the storage, access, and transmission of health records: it's the analysis of the data contained in those records, and not just by humans. Physicians, of course, can use the data to see how their diabetes, high blood pressure, or elderly patients are doing, or to spot trends in demographics. However, some of the more advanced EHR systems have built-in safeguards that alert doctors to things like drug allergies or dangerous drug combinations. These systems can also remind medical staff to perform certain tests or monitor for specific side effects or symptoms. If a healthcare worker makes a mistake, it is easy to correct, and the correction propagates across multiple records simultaneously.
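To make the idea concrete, here is a minimal sketch of the kind of drug-interaction safeguard described above. The interaction table, function names, and medication lists are hypothetical illustrations, not any real EHR vendor's data model or rule set.

```python
# Minimal sketch of a drug-interaction alert, using a hypothetical
# interaction table (illustrative pairs only, not clinical guidance).
INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
    frozenset({"lisinopril", "spironolactone"}): "risk of elevated potassium",
}

def interaction_alerts(current_meds, new_drug):
    """Return warnings triggered by adding new_drug to a patient's medication list."""
    alerts = []
    for med in current_meds:
        note = INTERACTIONS.get(frozenset({med, new_drug}))
        if note:
            alerts.append(f"{new_drug} + {med}: {note}")
    return alerts

# Example: prescribing aspirin to a patient already taking warfarin.
print(interaction_alerts(["warfarin", "metformin"], "aspirin"))
# -> ['aspirin + warfarin: increased bleeding risk']
```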
EHR systems can also alert a physician to patterns in a patient's file that could be the result of an infection. The system pulls data about the patient, their medications, and their test results, and runs it through a filter of known conditions, alerting the care team if the results could indicate a life-threatening infection such as sepsis. Healthcare workers already assess this data as part of their professional duties, but the alert serves as a back-up in case symptoms are overlooked or hard to detect.
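Under the hood, many such alerts amount to rules applied to structured data. Below is a minimal sketch of a rule-based sepsis screen; the thresholds loosely follow the classic SIRS criteria, and the record layout is an assumption for illustration, far simpler than what production systems actually use.

```python
# Minimal sketch of a rule-based sepsis screen over EHR vitals and labs.
# Thresholds loosely mirror the classic SIRS criteria; hypothetical data layout.
def sepsis_flag(vitals):
    """Count how many screening criteria are met and flag the patient if two or more are."""
    criteria = [
        vitals["temp_c"] > 38.0 or vitals["temp_c"] < 36.0,  # abnormal temperature
        vitals["heart_rate"] > 90,                           # elevated heart rate
        vitals["resp_rate"] > 20,                            # elevated respiratory rate
        vitals["wbc"] > 12.0 or vitals["wbc"] < 4.0,         # abnormal white blood cell count
    ]
    met = sum(criteria)
    return met >= 2, met

flag, score = sepsis_flag({"temp_c": 38.6, "heart_rate": 104, "resp_rate": 18, "wbc": 13.1})
print(flag, score)  # -> True 3
```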
Another way EHR may soon impact healthcare is the use of medical records, rather than insurance claims, to measure and track infection rates. Looking at sepsis records, researchers found that claims data showed a 10% increase in sepsis rates over a five-year period with a 7% decrease in mortality, while EHR data showed no significant change in either incidence or mortality. The study concluded that changes in sepsis numbers may be more a result of changes in coding practices than of changes in disease diagnosis.
Researchers at UC San Francisco shared how their health informatics team used EHR data to improve infection control by examining time- and location-stamps for all procedures involving patients with a known Clostridium difficile infection. By analyzing patients who crossed paths within 24 hours with those infected patients, the informatics team was able to identify the CT scanner as a likely source of transmission (patients were more than twice as likely to acquire C. difficile if they used a scanner within 24 hours of an infected patient). As a direct result of the data revealed through analysis of EHR records, the hospital adjusted its CT scanner cleaning protocols.
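As an illustration of that kind of analysis, here is a minimal sketch that joins time- and location-stamped procedure records to find patients who used the same location within 24 hours after a C. difficile-positive patient. The field names and sample records are hypothetical, not UCSF's actual schema.

```python
from datetime import datetime, timedelta

# Hypothetical time- and location-stamped procedure records.
procedures = [
    {"patient": "A", "location": "CT-1", "time": datetime(2024, 3, 1, 9, 0),  "cdiff": True},
    {"patient": "B", "location": "CT-1", "time": datetime(2024, 3, 1, 14, 0), "cdiff": False},
    {"patient": "C", "location": "CT-1", "time": datetime(2024, 3, 3, 10, 0), "cdiff": False},
]

def exposures(records, window=timedelta(hours=24)):
    """List (exposed patient, infected patient, location) pairs within the time window."""
    out = []
    for later in records:
        if later["cdiff"]:
            continue
        for earlier in records:
            if (earlier["cdiff"]
                    and earlier["location"] == later["location"]
                    and timedelta(0) < later["time"] - earlier["time"] <= window):
                out.append((later["patient"], earlier["patient"], later["location"]))
    return out

print(exposures(procedures))  # -> [('B', 'A', 'CT-1')]
```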
COVID-19 pushed Piedmont Healthcare to increase efficiency by using electronic medical records to automate its HAI drill-down process. As the system was implemented, the software identified more HAI cases that would have been missed with past methods. This led to a recommendation that the NHSN establish new processes for fairly benchmarking hospitals that use EHR-based data reporting against others.
As with all fields feeling the impact of "Big Data," medical care will have to invest time and effort into creating the algorithms that turn mountains of data points into meaningful results. As software companies, entrepreneurs, and healthcare professionals work to create this infrastructure, we consumers are sure to see many changes in the years to come.
Editor's Note: This post was originally published in December 2017 and has been updated for freshness, accuracy and comprehensiveness.