An actuarially adjusted study makes clear that Nextera’s DPC clinic was a flop.

Relying on deeply flawed studies and strained interpretations, as set out here and elsewhere, Nextera and Paladina (now part of Everside) still brag that their respective “school district” and “county government” direct primary care option programs for employer health care plans produce huge overall health care cost savings. 2024 saw publication of two university doctoral dissertations centered on rigorous, quantitative studies of direct primary care selection and cost savings — one addressing each of these vendors’ “poster-child” programs. Now-Dr David Schwartzman concluded that savings in claims costs fell short of offsetting the school district’s investment in Nextera’s DPC monthly fees. More sensationally, however, now-Dr Gayle Brekke concluded that total medical service expenditures for Union County employees enrolled in Paladina’s DPC rose by more ($107 PMPM) than the amount paid in monthly DPC fees ($106 PMPM); Union County would have been better off shredding $106 PMPM in cash and leaving its health care program unchanged.

Brekke’s result is not as portentous as it may seem; the seemingly outlandish numbers likely reflect questionable employer decisions more than they reflect hugely cost-ineffective primary care delivery at the county’s direct care clinic. On the other hand, Schwartzman’s findings of more moderate cost-ineffectiveness rest on very firm ground.

In this post, I compare Dr Schwartzman’s work to Nextera’s own study. I will address Brekke’s study and a related study by Milliman actuaries in a separate post.


David Schwartzman’s doctoral dissertation for Washington University in St Louis examined the Nextera DPC program’s results for the St Vrain Valley School District. He addressed the same issues of risk selection and cost-effectiveness discussed in a study of that clinic developed by a Nextera-paid analyst, the now-vanished KPI Ninja. As I discuss in my own 2020 critique of that study, Nextera and KPI Ninja originally misrepresented that the key risk analysis underpinning the bulk of a massive savings brag had actually been performed by “the Johns Hopkins ACG® Research Team”. After Johns Hopkins explicitly denied that, Nextera/KPI Ninja quietly deleted the attribution of the risk analysis to that prestigious university.

Even so, Nextera and KPI Ninja held fast to the same finding they had misattributed to Hopkins, to wit, that there was no significant difference between the health care cost risk of patient-members receiving care at its DPC clinic and that of those who elected traditional fee-for-service (FFS). Accordingly, when comparing claims costs and utilization between DPC and FFS cohorts, they made no relative “risk adjustment” or other adjustment for adverse selection away from Nextera.

There were, however, clear warning signs in DPC plan design that an accurate claims cost comparison would require some adjustment for selection effects. First, the school district provided significantly richer benefits (details below) for downstream care to FFS cohort members, inviting FFS plan selection by members expecting high costs. Second, the DPC plan’s additional premium charge for an employee’s children was $1,500 per year lower than that for the FFS plan, inviting DPC plan selection by the young and healthy. In fact, the DPC population was, on average, about six years younger than the FFS population.

Unsurprisingly, Dr Schwartzman, working from two years of pre-DPC claims records, found that those who later elected to receive primary care in the DPC clinic had previously had 22% lower medical spending than did those who declined DPC.

By deeming “insignificant” and then deliberately ignoring substantial population differences between DPC and non-DPC members, the KPI Ninja report gave Nextera a head start of 22% lower costs of care for DPC members. The Nextera report bragged that “healthcare costs for Nextera members are significantly less than their non-DPC counterparts – about $913 less per member per year”, about 28% lower. Deduct the head start, however, and the purported savings fall to 6%, or $195 PMPY. Three-quarters of the Nextera brag simply evaporates.

To be sure, $195 PMPY is still a respectable savings brag. But three-quarters of even the remaining purported savings rests on KPI Ninja’s failure to include member out-of-pocket costs in its claims cost comparison. “OOPs” for DPC members were vastly higher than those for FFS members because, with the expressed intention of at least partially defraying the cost of the monthly DPC fees, the school district had chosen to increase cost-sharing for downstream care rendered to DPC clinic participants. DPC members were excluded from a $750 HRA program, and their coinsurance rate of 20% was twice that of FFS members. Per Schwartzman’s examination of claims records for the same period covered by the Nextera report, the effect was that DPC members paid an average of $159 per year more than their FFS counterparts.

The differences in claims cost sharing are not mentioned or included in the Nextera/KPI Ninja report. As a result, that report attributes those dollars to the efficiency of the Nextera clinic. Correcting that $159 PMPY misattribution reduces Nextera’s savings brag to $36 PMPY, rather than $913 PMPY. That’s about 1% rather than 28%.
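The arithmetic of those corrections can be sketched in a few lines. The FFS baseline dollar figure below is my own back-calculation from the percentages in the Nextera report, not a number stated in either study:

```python
# Reconciling Nextera's $913 PMPY savings brag with Schwartzman's adjustments.
# All figures are per member per year (PMPY). The FFS baseline is inferred
# from the report's "about $913 less ... about 28% lower" claim.
claimed_savings = 913                    # headline Nextera/KPI Ninja savings
ffs_baseline = claimed_savings / 0.28    # implied FFS cost, roughly $3,260

head_start = 0.22 * ffs_baseline         # pre-DPC spending gap Schwartzman found
after_selection = claimed_savings - head_start   # roughly $195 PMPY

oop_shift = 159                          # extra out-of-pocket paid by DPC members
after_oop = after_selection - oop_shift  # roughly $36 PMPY

print(f"{after_selection:.0f} {after_oop:.0f}")
```

The results differ from the figures in the text by a dollar or two because the baseline is inferred from rounded percentages rather than taken from claims data.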

After applying both difference-in-differences (“DiD”) analyses that properly account for DPC members’ health cost history and instrumental variables (IV) methodology, all with a wide range of controls and checks for robustness, Schwartzman concluded:

“Overall, I do not find evidence that DPC reduces total medical spending or non-primary care spending. I also find mechanical increases in spending at lower levels of the spending distribution and increases in patient out-of-pocket medical spending. The results are driven by the increased preventive health spending incurred from the DPC fee without evidence of an offsetting impact on downstream spending.”

My own conclusions in 2020 were essentially the same as those of Dr Schwartzman in 2024, although they came with heavier caveats because I lacked direct access to claims data, and had used indirect methods that applied rough actuarial estimates to the claims and enrollment data presented in the Nextera report.


Dr Schwartzman also took a stab at assessing the relative quality of care delivered by the clinic, finding “some indications that the value of health consumption may increase” for DPC members. Only one finding in this area of his investigation met the test of statistical significance at the p < .05 level: DPC members had a decrease in frequency of low value cardiac imaging, specifically that given to patients who do not have ischemic heart disease, hypertension, or COPD, and are not over 40 with diabetes mellitus. All well and good.

Schwartzman explains this result by positing that more time with their patients allows DPC physicians to substitute away from low value cardiac imaging. I suggest it might be better explained by the higher cost-sharing burden that the school district packaged with DPC membership. Stress-tests with radioisotopes are expensive.


Employers other than the St Vrain Valley School District have also put forward direct primary care option programs where the DPC and FFS cohorts face different cost-sharing regimes. The foregoing paragraphs highlighted how the so-called analyst hired by Nextera failed two challenges raised by such differences. Foremost, mere shifts of claim costs between the employer and enrollees should not be taken as reflecting one way or the other on a DPC plan’s ability to reduce overall health care costs. That challenge can be most simply mitigated by basing plan cost comparisons on the difference in total allowed claims costs, inclusive of both employer and members’ shares.
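A minimal sketch of that mitigation, with dollar figures invented purely for illustration and not drawn from either study:

```python
# Compare plans on total allowed claims (plan-paid plus member out-of-pocket)
# so that mere cost-shifting between employer and enrollee nets out.
# All PMPY figures below are hypothetical.

def total_allowed(plan_paid: float, member_oop: float) -> float:
    """Total allowed claims cost, inclusive of both shares."""
    return plan_paid + member_oop

dpc = total_allowed(plan_paid=2200, member_oop=600)  # DPC cohort: heavier cost sharing
ffs = total_allowed(plan_paid=2600, member_oop=300)  # FFS cohort: richer benefits

plan_paid_gap = 2600 - 2200   # employer-share-only comparison overstates savings
true_gap = ffs - dpc          # total-allowed comparison nets out the shift

print(plan_paid_gap, true_gap)  # prints: 400 100
```

Looking only at the employer’s share would credit the DPC plan with $400 PMPY of “savings”, when $300 of that is simply cost shifted onto members.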

Cost-sharing differences may influence plan selection, to some degree driving higher health risk members toward the more generous of competing plans. The challenge of assuring that cost-sharing differences, if present, do not distort assessments of DPC performance strengthens an otherwise still very strong case for studying DPC using only methods that control for health status.

“Induced utilization” presents a third challenge. All else equal, members of the more generous of two plans with different cost sharing will more heavily utilize the services to which lower cost-sharing applies. Even when risk adjustment results might perfectly control for adverse plan selection, the effect of “induced utilization” by a more generous plan should be considered in any claims-based study of DPC cost-effectiveness.

A DPC plan married to a relatively parsimonious cost-sharing scheme, like the Nextera plan, may look like a bigger success (or a smaller flop) than it is. A DPC plan married to heavy DPC-specific cost-sharing reductions may look like an abject failure. See what that looks like in this post addressing a program by a Nextera competitor, Paladina (now known as Everside).


In what seems his best insight, Dr Schwartzman suggests that the school district’s direct primary care program was more successful in delivering cost-effective care to members with higher overall costs. Less healthy patients with greater health care needs provide more opportunities for direct primary care to deliver meaningful cost reductions. On the other hand, direct primary care for those in good health incurs added primary care expenditures despite limited opportunity for offsetting savings. Unfortunately, as Schwartzman observes, regulatory restrictions make targeting the medically most needy members of an employer group a challenge.
