I’ve had to spend much of the last couple weeks up at Stanford Hospital to help with a family medical issue, which has given me the opportunity to observe the current state of health care in action. Stanford isn’t just any hospital; it consistently ranks around the top in virtually every form of medicine, so we consider ourselves lucky to have access to that level of care. I used to live about a mile from Palo Alto, so it has also been interesting seeing how the area has changed over the past decade or more.
Although it was a planned visit, we still had to check in via the emergency room, which was standard procedure for a quick overall medical checkup. The entire paperwork process took all of two minutes… a quick swipe of the insurance card and a verification of current information. The process was undoubtedly efficient… so efficient that when I returned home briefly a couple days later I already had the first bill waiting for me! The rest of the admitting process was just as efficient.
However I was soon rather amazed at the manual processes for taking regular vitals… blood pressure, pulse, etc. A nurse would come in wheeling the equipment, take the readings, then scribble them on a scrap of paper that she’d then jam back in her pocket. I’m not kidding… a real, live scrap of paper. Even a soiled napkin in one case. Apparently she’d then go back and transcribe her "notes" into the electronic record-keeping system that Stanford uses. And we wonder why errors create big problems in hospitals?
Later that first week I saw a display in the lobby announcing their "e-health record" initiative, which included mobile stations for taking vitals that were linked via wireless network directly into the records system. I saw it in action a day or so later… two nurses wheeled a computer contraption into the room. With one nurse trying to explain (errr…. "train") the other, they tried to take basic vitals. Either the screen wasn’t very intuitive or the training was imperfect, because there was considerable frustration. One vital did swoop straight from measuring device into the records system, but then they gave up on the blood pressure and wheeled their old unit back in. And scribbled the results on a scrap of paper.
There was obviously a lack of training, and perhaps a lack of intuitive interface design. But the visual guy in me started to wonder about the seamless transfer of data from the measuring device into the mobile station and then to the records system… with no human intervention. The old process, while obviously prone to error, required the nurse to perform at least a cursory data review as he or she transcribed the data onto the napkin and then again into the system. An outlier could be immediately recognized as either bad data or something that should be dealt with quickly. Presumably an algorithm could also be trained to identify such outliers, but could it make a quick qualitative judgment call?
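For the curious, that kind of automated outlier screening is straightforward to sketch. Here’s a minimal, purely illustrative example: it flags a systolic blood pressure reading as either physically implausible or as a statistical outlier against a patient’s recent history. The plausibility bounds, the z-score cutoff, and the function name are all my own assumptions, not anything from a real clinical system… and note that the code can only flag; the qualitative judgment call still belongs to a human.

```python
# Illustrative sketch only: thresholds and cutoffs are assumptions,
# not clinical guidance.
from statistics import mean, stdev

# Hypothetical plausibility bounds for systolic blood pressure (mmHg).
PLAUSIBLE_SYSTOLIC = (50, 250)

def flag_systolic(reading, history, z_cutoff=3.0):
    """Return a reason string if the reading looks suspect, else None."""
    lo, hi = PLAUSIBLE_SYSTOLIC
    if not (lo <= reading <= hi):
        return "implausible"  # likely a measurement or data-entry error
    if len(history) >= 3:
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(reading - mu) / sigma > z_cutoff:
            return "outlier"  # plausible, but far from this patient's baseline
    return None  # nothing suspicious

# A stable patient with one wild reading.
history = [118, 122, 120, 119]
print(flag_systolic(121, history))  # → None
print(flag_systolic(185, history))  # → "outlier"
print(flag_systolic(20, history))   # → "implausible"
```

The catch, of course, is that the algorithm can only say "this number is unusual"; it can’t tell whether an unusual number is a loose cuff or a patient in trouble.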
What is more important… reducing errors from transcription, or missing an opportunity to react to data? Or can both be resolved?
This is similar to what happened when calculators became cheap and readily available: students lost the sense of what a "good" result looks like. A similar situation exists in manufacturing. How many companies trust the electronic MRP system and lose the gut knowledge of what real demand should look like? How many let the system calculate standards and even prices, without a gut feel, driven by experience, for knowing if the results are correct?
Fellow blogger Mark Graban writes quite a bit on the subject of lean healthcare, so you might want to check out his blog for more on the subject.