Policy Before Data
I’ve been thinking about Canadian women’s breasts. Well, not their breasts individually, and not for aesthetic evaluation. I’m talking, if you haven’t heard, about the big study out of Canada that failed to show mortality benefits from mammograms. This post is actually not about the results of the study. For my purposes the results are irrelevant. This is about Data. Evidence. And how we get it.
First of all, that Canadian study could never have been done here in the US. You’d never find 45,000 women to consent. You’d never get it past the institutional review board. Because mammograms are now what we call “standard of care”.
But let’s talk about the greater issue of randomized clinical trials (RCTs) in general. The Innovation Center got criticized recently for not doing enough RCTs in its effort to find more efficient and cost-effective ways to deliver health care. I think I talked about some reasons why that might be. But here’s another reason: in this country, policy comes before data.
A couple of examples. Take the whole Time Out movement. Someone thought we could prevent more wrong-side surgeries or wrong-patient surgeries if we implemented a Time Out, in which every member of the surgical team has to stop before incision and review the patient’s medical record number and correct side. It got national attention and everybody implemented it. No one said “Hey, that’s a great thought. Let’s see if it is actually true that time outs reduce medical error!” We did it backwards. Now we can’t do 100,000 surgeries and randomize them to time out or no time out. The policy is already in place. And even if someone did such a study, people who are invested pro or con in time outs wouldn’t change their minds, or their policy.
So that’s Time Out, but a Time Out just costs a few seconds and a bunch of educated people standing around with an intubated and draped patient with a bone sticking out of his leg intoning medical record numbers and agreeing that, yes, it is the right leg that’s broken. Some policies have environmental implications, if you care about that sort of thing. A few years ago a hospital got criticized by the accrediting organization for not having post-operative anesthesia notes on outpatient, ambulatory surgery patients. Someone decided it would be good for everybody having a knee scope to get a note in the chart from an anesthesia provider that said everything went fine. Now, everything was going fine before, mind you. No one had been sent home inappropriately, no one complained, no one felt ignored. But there was a policy. No one said “Hey, that’s a great thought. Let’s see if it is actually true that anesthesia post-operative notes improve patient outcomes.” Uh uh. Backwards. Now we generate, let’s see, conservative estimate 40 patients per day, 20 days a month, 12 months a year, that’s…(I hate math) like, 10,000 pieces of paper. Per year. Policy before data.
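For anyone who likes the math spelled out: using the rough ballpark figures above (which are estimates, not real hospital data), the tally comes out just under 10,000:

```python
# Back-of-the-envelope count of post-op anesthesia notes per year,
# using the conservative estimates from the paragraph above.
patients_per_day = 40        # ambulatory surgery patients (estimate)
operating_days_per_month = 20  # weekdays, roughly (estimate)
months_per_year = 12

notes_per_year = patients_per_day * operating_days_per_month * months_per_year
print(notes_per_year)  # 9600 -- close enough to "like, 10,000"
```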
I could go on. Let’s take a slightly different approach. On a national scale, CMS (the Centers for Medicare and Medicaid Services) has decided that one of the quality measures for anesthesia is whether or not patients were adequately warmed in the OR. OK, fine. I want to be warm if I’m operated on. CMS feels so strongly about this measure that they won’t pay you for doing the anesthesia for the surgery unless you can prove you addressed the warming issue. To be fair, there is plenty of evidence that warm is better than cold for surgical patients. So anesthesia departments everywhere put a little check mark in their EMRs saying “patient appropriately warmed” or something to that effect. Fine. Except that’s not a measure of how many patients got warmed. It’s a measure of how many people clicked the box. In this case, we have data, we have policy, but the policy is based on flawed or meaningless data.
The problem with health care in this country is not data. We’ve got data. We’re drowning in data. This data has very little impact on what people believe or what they do. More importantly, we are in danger of implementing policy with bad data, or no data at all. A good idea is not proof.