That controversial McKinsey & Company study, the one predicting severe disruptions from health care reform, isn't going away. It may be at odds with what the Congressional Budget Office, the Rand Corporation, and the Urban Institute found. It may raise severe doubts among the firm's own employees, one of whom has said, flatly, that the survey is "not a good tool for prediction." But conservatives are still citing it as proof in their attacks on the Affordable Care Act. The latest is Karl Rove, who uses the story as the peg for a Wall Street Journal column titled "The Obamacare Bad News Continues."
Actually, the article itself isn't as bad as the headline. Rove, to his credit, doesn't say that the McKinsey prediction is correct. He merely suggests that it and a handful of other studies "call into doubt" the official predictions that relatively few employers will drop insurance after the new law takes full effect in 2014. Of course, none of those studies matches the CBO, Rand, or Urban analyses for intellectual rigor. And the McKinsey study itself remains a huge mystery, since the company has declined to divulge details about its methodology.
As writers like Kate Pickert have noted, employer surveys have only limited predictive value, and even that much only if the survey architects designed them correctly. Of particular concern in this case is the way McKinsey "educated" respondents before asking its questions, for reasons that the New America Foundation's Sam Wainright explains:
Without knowing the survey questions, the “educational” script, or the methodology, it’s impossible to know whether or not the design of the survey would itself generate an anti-health reform result. Such a survey is certainly not a sufficient base to support the authors’ prediction of “a radical restructuring of employer-sponsored health benefits.”
Maybe answers will come soon, thanks to a formal request from Senate Finance Chairman Max Baucus. McKinsey representatives have apparently agreed to meet with Finance Committee staff. When they do, Baucus would like them to provide answers to a set of detailed questions about how they conducted their survey.
The questions, which you can see below, are precisely the right ones to ask. Of particular importance are those about firm makeup and current insurance offerings, two variables that could play a significant role in shaping the results. (A short sketch after the list illustrates what the margin-of-error question is getting at.)
To repeat something I said yesterday, it's possible that the CBO is wrong and that more employers will drop coverage once the Affordable Care Act is fully in place. That might even be a positive development, depending on the circumstances. It's a complicated issue, and one I hope to address at length sometime soon.
But until we know more about how McKinsey did its research, the firm's prediction is no more reliable than if Karl Rove or some other conservative operative had invented it out of whole cloth.
Not that they would ever do such a thing.
Update: Greg Sargent makes a really good point:
It seems like some big news orgs were far more eager to write about the study’s initial release -- even though there was no way to evaluate its integrity -- than they are in following up on the Dems’ demand for its methodology, and the company’s refusal to release it. Here’s hoping that these incoming bombs from Dems get their attention and change that.
Questions to McKinsey & Company from Sen. Max Baucus
1. Who funded the survey? If McKinsey & Company sponsored the survey, what account did the funding come from? Who are your biggest clients? Do you expect McKinsey & Company to benefit financially from the results of this survey?
2. Have the results of the study been featured in any presentations, whether written or oral, to potential new clients? Have they been featured in presentations to existing clients considering additional consulting work from McKinsey?
3. What was your sampling design?
4. Were the results statistically significant?
5. What was the breakout of the survey responses by:
- industry,
- geography,
- employer size,
- the type of benefits the employer offers, and
- the percentage of low-wage workers at the company?
6. Were there any oversamples, and if there were, how did you account for this? What was the margin of sampling error?
7. How were participants chosen? What methods were used to recruit the participants? What percentage of participants were McKinsey clients?
8. Did you use eligibility criteria or screening procedures? What was the participation rate? How were interviewees selected?
9. What position did the interviewees hold within their respective companies?
10. How were the interviews conducted? What script was used to “educate respondents” before asking questions? Could the script used to “educate respondents,” the wording of the questions, or the order of the questions have influenced the interviewees’ responses?
11. What questions were asked, including their exact wording? What was the order of the questions?
12. What was your internal review process for the questionnaire? Did any outside experts review the survey and its results? What procedures were used to verify the data?
13. Deciding not to offer health benefits is a major business decision that will be made by more than one person. How did you account for this?
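For readers wondering what the margin-of-sampling-error question (number 6) is asking for: the conventional figure comes from the normal approximation for a survey proportion. Here is a minimal Python sketch; the sample size and response share below are hypothetical, since McKinsey hasn't disclosed its actual numbers:

```python
import math

def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    """Half-width of the normal-approximation confidence interval
    for a survey proportion; z = 1.96 corresponds to ~95% confidence."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Hypothetical illustration: a simple random sample of 1,000 employers
# in which 30% say they would drop coverage.
print(f"+/- {margin_of_error(0.30, 1000):.1%}")  # roughly +/- 2.8 points
```

Note that this formula assumes a simple random sample. If McKinsey oversampled certain industries or recruited respondents from its own client base (questions 6 and 7), the quoted margin of error wouldn't mean much, which is exactly why the senator is asking.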