Updated 9/4/21
In 2015, Qliance still towered over all in the Direct Primary Care Bragging World with its claim of 20% overall cost reductions. Even that, of course, was quite a comedown from the extravagant claims previously spewed under the Qliance banner; fond memories still linger of those heady days when the Heritage Foundation drooled over a non-existent British Medical Journal study alleged to have found that Qliance’s patients had 82% fewer surgeries, 66% fewer ED visits, and 65% fewer specialist visits.
Yet, by the middle of 2016, Qliance was toppled; the pages of Forbes proclaimed the attainment of 38% reductions in medical costs by a Qliance rival, Paladina, at its clinic in Union County, NC. By November of 2016, even upstart Nextera Healthcare was bragging about its “DigitalGlobe” study at the 25% level. The following month Nextera reached the brag-summit when Paladina’s Union County brag shrank to a still-competitive 23%, leaving Nextera’s 25% on top.
In early 2017, while DPC World eagerly awaited a counterpunch from its former leader, Qliance instead went bankrupt. But at least the torch had been passed!
For various reasons, including a bit of good luck, Paladina’s Union County clinic emerged in the three ensuing years as the principal poster child of the DPC movement.
In May of 2020, just as the Union County clinic’s iconic status reached its apogee, the game-changing Milliman study came along. The tools of actuarial science, risk adjustment most prominently, were brought to bear in an independent study of the cost-effectiveness of a single clinic, and that one just happened to be Paladina’s Union County clinic. The result was not pretty.
The Milliman study essentially kicked to the curb ALL prior DPC cost-effectiveness studies, including both the Paladina and Nextera studies, rejecting the lot for want of proper risk adjustment. In fact, the Milliman study found that after claims cost risk adjustment to account for the health differences between the studied Union County populations, even Paladina’s most current and more modest savings claim of 23% vanished. The Milliman team made plain as day its estimate that Paladina’s Union County clinic program had not produced any significant cost savings at all.
Worse, the Milliman analysis had been wildly overgenerous to Paladina; its conclusion that Paladina’s program was only slightly worse than a break-even proposition for the county was based on an estimate of $61 PMPM for the average DPC fee paid by the County. In fact, the publicly available contract between Paladina and the county makes clear that the average DPC fee paid was $95 PMPM. While DPC supporters had bragged of over $1,000,000 in net annual savings, what the Paladina contract wrought was a net annual increase in the County’s health care costs in excess of $400,000.
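The size of the fee understatement is easy to see with back-of-envelope arithmetic. In the sketch below, the enrollment figure of 1,000 members is a hypothetical placeholder for illustration only (the contract’s actual member count is not reproduced here); the two PMPM fees are the ones discussed above.

```python
# Back-of-envelope: annual cost gap between Milliman's assumed DPC fee
# and the fee in the publicly available Paladina contract.
MILLIMAN_ASSUMED_PMPM = 61   # $/member/month, Milliman's estimate
CONTRACT_PMPM = 95           # $/member/month, per the actual contract
MEMBERS = 1_000              # hypothetical enrollment, illustration only

# $34 PMPM understatement, times 12 months, times members
annual_gap = (CONTRACT_PMPM - MILLIMAN_ASSUMED_PMPM) * 12 * MEMBERS
print(f"${annual_gap:,} per year")  # $408,000 per year
```

At a hypothetical 1,000 members, the $34 PMPM understatement alone adds roughly $408,000 per year on top of Milliman’s near-break-even result, which is consistent with the “in excess of $400,000” figure above.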
The Milliman team also admonished that:
It is imperative to control for patient selection in DPC studies; otherwise, differences in cost due to underlying patient differences may be erroneously assigned as differences caused by DPC.
Grzeskowiak and Busch, What our study says about Direct Primary Care
This admonition might have had a useful, if sobering, effect on direct primary care, if the DPC community were actually interested in advancing the movement based on the proficiency of DPC medical doctors rather than on the shamelessness of DPC spin doctors.
I can honestly say that the previous champions of DPC cost-effectiveness data, Nextera and Paladina, have met this challenge with more than mere lip service. Instead, they mixed fraud and incompetence with that lip service, and raced anew to the top of Mount Brag Bullshit.
A few months after the Milliman report was published, Nextera made the first move, a bold coupling of a brag at the 27% level with a remarkable stunt:

“KPI Ninja** conducted risk score analysis in partnership with Johns Hopkins’ ACG® research team [.]” KPI Ninja’s Nextera study, page 7.
“KPI Ninja brought in the Johns Hopkins research team that has significant expertise in what is called population risk measurement. . . . We took that extra step and brought on the Johns Hopkins team that has this ability to run analysis. It’s in their wheelhouse and they applied that . . . [The] Johns Hopkins Research Team did the risk analysis[.]” Nextera presentation at 2020 Hint DPC Summit meeting.
Ah, but:
“We were not directly involved in this analysis.” Associate Director, HopkinsACG.org.
Not only was there no academic team involved in Nextera’s deeply flawed study, there was no risk adjustment actually performed. It was a heroic risk-adjustment charade.
When Nextera bragged a 27% cost savings sporting both fake academic robes and fake risk adjustment, imagine how alarmed competitor Paladina – still reeling from Milliman’s conclusion that risk adjustment brought the Union County savings down from 23% to 0% – would have felt, if they took Nextera at its word.
Don’t fret; knowing the truth of Milliman, Paladina understood at once that all they had to do was launch their own charade.
Paladina, on its website, moved in January of 2021. After its own lip service to risk adjustment and even lavish praise of the Milliman study, Paladina’s spin doctor went on to declare that “Paladina Health’s Union County client, the employer case study featured in the Milliman report, also prospered from adopting the direct primary care model. . . . Union County taxpayers saved $1.28 million in employee health care costs . . . 23% . . . .” No matter that the Milliman team had actually exploded that very claim when it concluded that the County had barely broken even.
And, as noted above, the reality is that the Paladina adventure cost the county $400,000 more per annum than the Milliman team had assumed. The net lie is over $1.7 million.
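The net lie is just the distance between the claimed result and the actual one. A minimal sketch, using the $1.28 million savings claim from Paladina’s webpage and an illustrative $430,000 for the annual cost increase described above as “in excess of $400,000”:

```python
# Distance between Paladina's webpage claim and the actual outcome.
claimed_savings = 1_280_000  # webpage: taxpayers "saved $1.28 million"
actual_increase = 430_000    # illustrative; text says "in excess of $400,000"

# The county was told it saved money when it actually lost money,
# so the misrepresentation spans both amounts.
net_lie = claimed_savings + actual_increase
print(f"${net_lie:,}")  # $1,710,000
```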
Consider, too, the sheer misrepresentational brilliance of the Paladina webpage’s careful selection of two raw data apples and two risk-adjusted data oranges drawn from four of each in Milliman’s basket.

Paladina is okay with using risk-adjusted data, but only when it cuts in Paladina’s favor.
But wait. Paladina’s spin team wasn’t quite done. Six months later, they went ahead and matched Nextera’s invocation of academic work from Johns Hopkins that never took place by invoking academic work from Harvard that never took place. With Paladina newly rebranded as Everside, the spin team trotted out a “white paper” that, in addition to repeating their misrepresentations of the Milliman study, relied substantially on additional findings, favorable to the DPC model, from “the Harvard DPC cost study”.
There was no “Harvard DPC cost study”. There was a Harvard study of the effect on insurance-based fee-for-service providers of receiving PMPM payments to supplement fee-for-service payments. That study did not address, or even mention, direct primary care practices that eliminate FFS payment and rely on membership fees.
Technical query: given that Nextera ventriloquized Johns Hopkins and Paladina ventriloquized both Milliman and Harvard, is it still “Charades”?
Instead of racing to the top of Mount Bullshit, why not stick to calm, truthful analysis that reveals direct primary care’s actual ability to reduce the costs of care?
[**] KPI Ninja is an “analyst” that has a special division dedicated to compiling brags for Nextera and other DPC companies. See here for more on KPI Ninja.