We know many (or, honestly, most) of our posts about infection and hospitals can be terrifying. But here's some good news: At least you don't live in the time before antibiotics and infection control!
The era of modern medicine is but a blink compared to the millennia of human existence. Considering that the great leaps in medical understanding have occurred only within the last two hundred years, thank your lucky stars you live now.
There are some major innovations (beyond even infection control) that have helped medical professionals improve the survival rates of their patients. The context of these innovations may remind us of the Greek story of Cassandra. This beautiful princess caught the attention of the god Apollo, who gave her the gift of prophecy. When she rejected his advances, he cursed her: everything she predicted would come true, but no one would ever believe her. To get a sense of that particular form of agony, let's look at the history of infection control.
#1. This is not a butcher shop.
The first, and possibly most overlooked, "innovation" is the concept of not immediately cutting into your patient to try to find and/or remove the problem. While there is anthropological evidence of surgeries from as long as 12,000 years ago, only a handful of remains suggest that patients survived these invasive procedures. We can only surmise about the Cassandras of this time period and for millennia thereafter, since historical records, where they exist at all, tend to record only the majority view. However, we know from ancient medical texts that health care communities, with their close ties to religion and spirituality, were very rigid and left little room for innovation.
But even after medical procedures became more conservative (i.e., don't just start chopping or bloodletting), there was the inevitable, unavoidable, and sometimes demonically attributed complication: infection. The Sumerians and the ancient Greeks used wine as a cleanser, and Hippocrates (of the Hippocratic Oath) was aware that the formation of pus was neither natural nor to be encouraged, but hardly anyone listened to him. Hippocrates, the Father of Western Medicine, is now recognized for his brilliant and revolutionary ideas. But in his day, he had to fight to have those ideas heard, and was even imprisoned for 20 years for his efforts.
#2. There is something at work here (but we don't know what).
A second major innovation was connecting the spread of disease (and infection) to physical transmission: person-to-person contact, contaminated surfaces and clothing, and airborne droplets from sneezes and coughs. Just this concept, that something unseen was causing infection, was a major breakthrough in a time when infection was often attributed to the Devil (or was thought simply to pop into existence spontaneously). This understanding helped control transmission somewhat, but it was not enough to stop or even slow the massive epidemics that hit the human race (the Plague, the Spanish Flu, etc.). Part of the reason it had so little impact is that people in general did not believe that things they could not see could hurt them. (Sound familiar?)
#3. Danger comes in small packages.
There are two parts to infection control's third innovation. First, scientists discovered the presence of bacteria and other microorganisms. This happened almost immediately after Antonie van Leeuwenhoek's improved microscopes made them visible in the 1670s, but the implications were not understood for roughly 200 years. Which brings us to the second part: connecting these microorganisms to infection. Louis Pasteur and Robert Koch played significant roles in demonstrating that bacteria led to disease, work that ultimately led to Pasteur's development of laboratory vaccines.
So now that medical professionals knew that microorganisms existed, caused illness and disease, and could be spread by contact, how could they be destroyed? How could these microorganisms be stopped if we couldn't even see them?
In the mid-1800s, one doctor, Ignaz Semmelweis, was able to show that women were far more likely to die in childbirth when their doctors failed to wash their hands after performing post-mortems. What a simple solution! Washing hands! Unfortunately, his solution was mocked and ignored, making him infection control's true Cassandra. He had data proving that hand-washing improved patient outcomes; he just couldn't prove why it made a difference in mortality. So hardly anyone listened to him.
Finally, the study of bacteriology combined with the evidence of hygiene's benefits resulted in actual changes to medical practice. Over the course of a few decades in the late 1800s, mortality rates decreased significantly as medical staff began using antiseptics, washed their hands between patients, wore sterile gowns and caps, used heat to sterilize equipment, wore masks, and began teaching hygiene in medical school. (Truly terrifying to realize these are "modernizations.")
Fortunately, modern hospitals have come a long way in infection control, with new inventions joining the fight all the time. The most revolutionary is the discovery of antibiotics, which have saved millions of lives since Alexander Fleming discovered penicillin in 1928. But with mounting bacterial resistance to antibiotics, preventing infection in the first place is a top priority in today's hospitals, and the fight takes place in four main categories.
Editor's Note: This post was originally published in June 2019 and has been updated for freshness, accuracy and comprehensiveness.