Benefits Think

It’s time to believe the research: Wellness isn’t working

The highly publicized randomized controlled trial of the wellness program at BJ’s Wholesale Club, published recently in the Journal of the American Medical Association, found virtually no value in the program.

While most pundits applauded the study, wellness vendors — whose livelihoods, of course, depend on believing the opposite — attacked it. Two common threads among the attacks, including one right here in Employee Benefit News, were that the program was bad (“anachronistic” in this case) and that one can’t draw conclusions from one study. The specific argument: “Recent research has disappointingly focused on a single — typically anachronistic — program to make sweeping statements on wellness programs more broadly.”

Far from being anachronistic, the JAMA study was as mainstream as studies get — screenings, risk assessments, coaching, learning modules. This is what wellness has been all about. True, there are a few newer offerings today, such as employee health literacy education (my own line of business), but they are too early in the life cycle to be evaluated.

And far from being a one-off, this study’s finding of zero impact was completely consistent with the previous 11 published studies for which data was provided. Let’s review those studies.

The National Bureau of Economic Research found no change in risk factors in a program at the University of Illinois. The conclusion was that virtually all improvements attributed to wellness in the past were, in fact, artifacts of the ubiquitous participants vs. non-participants study design itself.
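To see how that design can manufacture “savings” all by itself, consider the minimal simulation sketch below. Every number and name in it (the 40% “motivated” share, the dollar figures, the simulate_employee function) is an illustrative assumption of mine, not a figure from any of these studies. The point is simply that when healthier, more motivated employees are the ones who opt in, participants will look cheaper than non-participants even when the program’s effect is set to exactly zero.

```python
import random

random.seed(0)

# Illustrative sketch, not data from any study: motivated employees both
# (a) tend to opt into the program and (b) already have lower, improving costs.
def simulate_employee():
    motivated = random.random() < 0.4           # assume 40% of the workforce is "motivated"
    participates = motivated and random.random() < 0.8
    base_cost = 4000 if motivated else 6000     # healthier people cost less to begin with...
    trend = -200 if motivated else 200          # ...and their costs improve on their own
    program_effect = 0                          # the program itself is assumed to do nothing
    cost = base_cost + trend + program_effect + random.gauss(0, 500)
    return participates, cost

employees = [simulate_employee() for _ in range(100_000)]
participant_costs = [c for p, c in employees if p]
non_participant_costs = [c for p, c in employees if not p]

avg = lambda xs: sum(xs) / len(xs)
print(f"Participants:     ${avg(participant_costs):,.0f}")
print(f"Non-participants: ${avg(non_participant_costs):,.0f}")
# Participants come out roughly $2,000 per year cheaper even though
# program_effect is zero: the entire gap is self-selection.
```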

Speaking of participants vs. non-participants, three other studies showed that 100% of the “impact” of the program was, in fact, attributable to that study design. Accounting for that bias left a 0% improvement attributable to the program in all three cases. In one case, Newtopia, an editorial adviser to the journal that published the study apologized for not noticing the fallacy, and admitted the study should never have been published. (See the comments.)

In another case, the population was split into participants and non-participants two years before the study began. During that period (2004 to 2006 on the graph below), participants nonetheless dramatically outperformed non-participants — even without a program to participate in.

[Graph: participants vs. non-participants, 2004 to 2006, before the program began]

In a third case, a study found that advising diabetic participants to eat more carbohydrates and less fat generated dramatic savings. This, of course, is an impossible result, given that the advice is wrong. In any event, only 2% of participants admitted to following it. Hence the difference was entirely attributable to the study design.

Along with those three were seven other studies not using that methodology. The state of Connecticut found that costs went up after it instituted a wellness program requiring more annual physicals and other preventive care. No surprise there, because preventive care doesn’t save money, and annual physicals may, on balance, create more problems for patients than they solve.

The Health Enhancement Research Organization — the wellness trade association itself — published a study showing that wellness saved only $0.99 per employee per month before taking the cost of the program into account, which by its own admission put the program underwater.

Award-winning programs fare no better. The three most recent winners of the C. Everett Koop Award, presumably given to the best programs, showed similar results. The program run by Wellsteps for the Boise School District generated a dramatic deterioration in health status, measured both objectively and subjectively. As reported right here in EBN, the previous winner, McKesson, achieved no improvement in biometric values, while employee weight increased.

Before that, the Nebraska state program showed that of 20,000 state employees, only 186, on balance, reduced a risk factor. The state also had originally claimed that 514 people were cured of cancer as a result of the program, but was forced to walk back that claim.

Two studies in Health Affairs reached a similar conclusion. PepsiCo lost more than 50 cents for every dollar invested in wellness. Barnes Hospital lost money but did achieve a reduction in hospitalizations. However, that reduction may simply reflect the fact that Barnes employees had 10 times as many hospitalizations for diabetes and heart disease as employees of the average employer.

Finally and ironically, the authors of the April 18 EBN rebuttal of the BJ’s Wholesale Club study both work at Vitality. Vitality conducted a wellness program for its own employees and the employees of its parent company. According to the report Vitality itself published, its employees gained weight and their eating habits deteriorated.

So let us set the record straight: Conventional so-called “pry, poke and prod” programs do not work, and the studies — even those conducted by wellness advocates — are quite consistent on this point. A comprehensive review of all the studies can be found here, but — spoiler alert! — it shows exactly the same thing.

Editor’s note: Read the counterpoint to this article here: “New research says wellness isn’t working. Here’s where it’s wrong.”
