KPI Ninja’s Nextera analysis: more than enough problems.

Three major adjustments are needed, even without correcting the IP admit rate problem or arriving at a more reasonable risk adjustment.

Comparing data from Nextera patients and non-Nextera patients in the SVVSD programs requires three major adjustments, none of which KPI Ninja attempted. The computations follow.

  1. Because of the different benefit structures, the district’s claim costs for Nextera members reflect a large measure of savings that is due not to Nextera but to the fact that the district pays less for the exact same claims from Nextera members than from “Choice” plan members, warranting a downward adjustment of the district’s total costs for Choice members by a factor of 0.905.
  2. The much richer overall benefit structure for non-Nextera members also induces utilization, warranting a second downward adjustment of Choice total costs by a factor of 0.950.
  3. The data also need risk adjustment. For this computation we used the 7.5% difference computed by Nextera’s analyst, although the adjustment actually needed is likely north of 21%.

Applying all three adjustments reduces the claimed $913 savings to $255, bringing the percentage savings down from 27% to less than 8%. Even that value assumes that the Nextera report was correct in its astonishing finding that the non-Nextera population of teachers and children had an IP admission rate of 246 per thousand.
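
For readers who want to trace the arithmetic, here is a minimal sketch in Python. The two baseline costs are inferred from the report’s own 27% and $913 figures rather than quoted from it, and applying the 7.5% risk factor by scaling the Nextera cohort upward is our assumption about the mechanics; the point is only that the three factors together shrink the claimed savings to roughly $255.

```python
# Baselines inferred from the report's own headline figures (27% savings, $913 PMPY).
CHOICE_COST_PMPY = 913 / 0.27               # ~ $3,381 employer cost per Choice member
NEXTERA_COST_PMPY = CHOICE_COST_PMPY - 913  # ~ $2,468 employer cost per Nextera member

BENEFIT_DESIGN_FACTOR = 0.905       # district pays less for identical claims from Nextera members
INDUCED_UTILIZATION_FACTOR = 0.950  # richer Choice benefits induce extra utilization
RISK_FACTOR = 1.075                 # the 7.5% risk difference computed by Nextera's own analyst

# Adjust the Choice cohort downward for benefit design and induced utilization,
# and scale the Nextera cohort up for its lower measured risk (our assumed mechanics).
adjusted_choice = CHOICE_COST_PMPY * BENEFIT_DESIGN_FACTOR * INDUCED_UTILIZATION_FACTOR
adjusted_nextera = NEXTERA_COST_PMPY * RISK_FACTOR

adjusted_savings = adjusted_choice - adjusted_nextera
print(f"Adjusted savings: ${adjusted_savings:,.0f} PMPY")                    # ~ $254, i.e. roughly $255
print(f"Adjusted savings rate: {adjusted_savings / CHOICE_COST_PMPY:.1%}")   # ~ 7.5%, under 8%
```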


The weight of external evidence suggests that supplying missing pharmacy data will not rescue any significant part of Nextera’s claim.

After acknowledging the complete absence of pharmacy cost data, KPI Ninja dismissed concern about the omission by repeatedly suggesting that inclusion could only showcase additional savings for Nextera members. The only support KPI Ninja offered for that suggestion was its own trust-me account of unidentified, unpublished “research” performed in the course of paid service to the direct primary care industry.

The opposite conclusion, that pharmacy data might well reveal additional direct primary care costs, is supported by actual evidence. The only independent and well-controlled study of a direct primary care clinic, the landmark Milliman study, found that after risk adjustment the direct primary care cohort members had slightly higher pharmacy usage than their counterparts. And a non-independent study that relied only on propensity-matched controls plainly suggests that one successful DPC strategy is to reduce high-expense downstream costs by increasing medication compliance; the Iora clinic involved saw a 40% increase in pharmacy refills alongside significant reductions in various levels of hospital utilization.


Nextera’s claim to reduce the employer’s cost of chronic conditions suffers from some of the same problems as Nextera’s broadest claims — plus an even bigger one. 

The report’s largest table, found on page 12, ostensibly shows various employer cost differences between Nextera patients and Choice patients associated with a selection of sixteen chronic conditions. For ten of the sixteen, Nextera claims employer cost reductions, while for the remaining six Nextera confesses increased employer costs. Here is a selected, condensed line from that table with two added amending lines. The first line of amendments applies the previously discussed induced-utilization adjustment (0.950) to the employer’s cost.[1] This adjustment cuts the supposed savings by $62, a mere warmup act.


[1] We omit cohort-wide risk adjustment in this table to avoid the risk of over-correction, knowing that people on the same chronic-conditions line have already been partially sorted on the basis of shared diagnoses. We omit the plan benefit adjustment so that, in our second line of amendment, we can introduce the cost of primary care for the chronic conditions of Nextera members without fear of duplicating the portion of primary care cost intrinsic to our global adjustment (0.905) for benefit package design.


The second amending line is added to remove additional skew that arises because, for Choice members, employer claim costs may flow from both primary care payments and downstream care payments, while for Nextera members employer claim costs come only from downstream care.

Nextera members do receive primary care, and some of the most expensive primary care in the world at that. Nextera’s subscription fees average $839 PMPY (per member per year). Fair comparison of employer costs for chronic conditions requires an accounting of Nextera’s fees as part of the employer’s costs for chronic conditions. Including Nextera’s fees turns the chronic conditions table significantly against Nextera’s claims. Nextera has not demonstrated its ability to lower the costs of chronic disease.
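
As a rough illustration of the two amending lines, here is a sketch in Python. The downstream cost figures are hypothetical placeholders, not the actual page-12 entries; only the 0.950 factor and the $839 fee come from the discussion above.

```python
# Hypothetical placeholder figures for a single chronic-conditions line of the table.
choice_downstream_pmpy = 4000.0    # employer downstream claim cost, Choice members (placeholder)
nextera_downstream_pmpy = 3200.0   # employer downstream claim cost, Nextera members (placeholder)

INDUCED_UTILIZATION_FACTOR = 0.950  # first amendment: induced-utilization adjustment to Choice cost
NEXTERA_FEE_PMPY = 839.0            # second amendment: Nextera's average subscription fee

reported_savings = choice_downstream_pmpy - nextera_downstream_pmpy

# First amendment: trim the Choice cost for utilization induced by the richer benefit design.
amended_choice = choice_downstream_pmpy * INDUCED_UTILIZATION_FACTOR

# Second amendment: Choice claim costs already include primary care, so the employer's
# cost for Nextera members must include the primary care bought through the subscription fee.
amended_nextera = nextera_downstream_pmpy + NEXTERA_FEE_PMPY

amended_savings = amended_choice - amended_nextera
print(reported_savings, amended_savings)   # 800.0 vs -239.0: the "savings" flip to a cost
```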

The same issue infects the Nextera report’s computation of savings on E & M visits, on page 10.


The study omitted important information about how outlier claimants were addressed.

While KPI Ninja did address the problem of outlier claimants, it is frustrating to see about 40% of total spend hacked away from consideration without being told either the threshold chosen for outlier claim exclusion or the reasoning behind that particular choice.

KPI Ninja makes clear that it excluded outlier claim amounts from the total spend of each cohort. But it also matters whether members incurring high costs were excluded from the cohort in compiling population utilization rates or risk scores.

The analyst understood that a million-dollar member would heavily skew the cost comparison. If, however, the same million-dollar member had daily infusion therapy, that member could also heavily skew KPI Ninja’s OP visit utilization comparison. And if that same member and a few others had conditions with profoundly high risk coefficients, they could have a significant effect on final risk scores.

The better procedure is to avoid all skew from outliers. The Milliman report excluded members with outlier claims from the cohort for all determinations, whether of cost, utilization, or even risk. In its report, KPI Ninja addressed outlier members only in terms of their claims cost. There is no indication that KPI Ninja appropriately accounted for outlier patients in its determination of utilization rates or population risk.
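
A minimal sketch of the difference in Python; the member records and the $250,000 threshold are invented for illustration, since KPI Ninja disclosed neither its data nor its cutoff.

```python
# Each record: (member_id, total_claims, ip_admits, risk_score). Invented data.
members = [
    ("A", 1_000_000, 12, 9.8),   # outlier member, e.g. on daily infusion therapy
    ("B",     3_500,  0, 1.1),
    ("C",     7_200,  1, 1.4),
]
THRESHOLD = 250_000  # hypothetical outlier threshold

# KPI Ninja's apparent approach: drop outlier dollars from total spend, but keep the
# member when compiling utilization rates and risk scores.
cost_kpi = sum(c for _, c, _, _ in members if c <= THRESHOLD)
admits_kpi = sum(a for _, _, a, _ in members)
risk_kpi = sum(r for _, _, _, r in members) / len(members)

# Milliman-style approach: exclude the outlier member from every determination.
kept = [m for m in members if m[1] <= THRESHOLD]
cost_mil = sum(c for _, c, _, _ in kept)
admits_mil = sum(a for _, _, a, _ in kept)
risk_mil = sum(r for _, _, _, r in kept) / len(kept)

print(cost_kpi, cost_mil)                      # 10700 vs 10700: cost totals match either way
print(admits_kpi, admits_mil)                  # 13 vs 1 admits: utilization diverges sharply
print(round(risk_kpi, 2), round(risk_mil, 2))  # 4.1 vs 1.25: so do average risk scores
```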

A significant aspect of risk measurement might have been clarified by accounting properly for outliers. And a good chunk of that astonishing IP admit figure for Choice patients might have vanished had KPI Ninja done so.

A study design error by KPI Ninja further skews cost data in Nextera’s favor by a hard-to-estimate amount.

“The actuary should consider the effect enrollment practices (for example, the ability of participants to drop in and out of a health plan) have had on health care costs.”

Actuarial Standards Board, Actuarial Standard of Practice No. 6, § 3.7.10(b). See also §§ 2.26 and 3.7.3.

But the actuarial wannabes at KPI Ninja did not do that. The only claims cost data marshaled for this study were claims for which the district made a payment. Necessarily, these were claims for which a deductible had been met. Because KPI Ninja did not follow the guidance from the actuarial board, it ended up with two cohorts that differed significantly in their members’ ability to meet the district’s $2,000 annual deductible and maximum out-of-pocket (MOOP) expense limit.

Specifically, the average time in cohort for a non-Nextera member was 11.1 months; for Nextera members it was only 10.1.[2] On average, Nextera members had twice as many shortened deductible years as non-Nextera members. And shortened deductible years mean more unmet deductibles and MOOPs, and fewer employer-paid claims; in insurance lingo, this is less “exposure.” The upshot is that the reported employer-paid claims for the two cohorts are biased in Nextera’s favor.


[2] Most of the difference is related to KPI Ninja’s choosing a school year for data collection when plan membership and the deductible clock run on a calendar year. Nextera has publicly bragged of seeing a boost in membership for its second plan year. Those new members had accrued only eight months of plan membership when the study period ended.


To largely eliminate this distortion, KPI Ninja need only have restricted the study to members of either cohort who had experienced a complete deductible cycle. Estimating the amount of distortion after the fact is challenging, and the resulting adjustment may be too small to warrant the effort. What would make more sense is for Nextera to send the data where it belonged in the first place: to a real actuary who knows how to design an unbiased study.
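
Here is a sketch in Python of both the distortion and the fix. The steady monthly spend rate and the member list are invented; only the $2,000 deductible and the complete-deductible-cycle restriction come from the discussion above.

```python
DEDUCTIBLE = 2000.0      # district's annual deductible
MONTHLY_ALLOWED = 300.0  # hypothetical, uniform allowed spend per member per month

def employer_paid(months_enrolled: int) -> float:
    """Employer-paid claims for a member with a uniform spend rate: the plan pays
    nothing until the annual deductible is met within the plan year."""
    allowed = MONTHLY_ALLOWED * months_enrolled
    return max(0.0, allowed - DEDUCTIBLE)

# Identical members, identical spend rates, different exposure.
print(employer_paid(12))  # 1600.0 employer-paid over a full deductible year
print(employer_paid(8))   # 400.0 employer-paid for a second-year Nextera joiner

# The second member looks far cheaper to the employer purely because of the
# shortened deductible year, not because of anything the clinic did.

# The fix KPI Ninja could have applied: restrict both cohorts to members who
# experienced a complete deductible cycle before comparing employer-paid claims.
members = [("A", 12), ("B", 8), ("C", 12), ("D", 7)]   # (member_id, months enrolled)
complete_cycle = [m for m in members if m[1] == 12]
print(complete_cycle)   # [('A', 12), ('C', 12)]
```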


A related error may have infected all of KPI Ninja’s utilization calculations, with potentially large consequences. KPI Ninja’s utilization reduction claims on page 10 appear not to have been normalized to correct for the difference in shortened years between the two cohorts. If they have indeed not been so adjusted, then all the service utilization numbers shown for Nextera members on that page need an upward adjustment of about 10%. One devastating effect: this adjustment would completely erase Nextera’s claim of reducing utilization of emergency departments.

There is no evidence that the utilization data were normalized to correct for the one-month shortfall in Nextera members’ in-cohort dwell time.
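
If the page-10 rates were in fact computed per member rather than per member-month, the correction is a simple exposure rescaling, sketched below; the reported ED rate used here is a placeholder, not a figure from the report.

```python
NEXTERA_AVG_MONTHS = 10.1  # average in-cohort dwell time, Nextera members
CHOICE_AVG_MONTHS = 11.1   # average in-cohort dwell time, non-Nextera members

def normalize_nextera_rate(reported_rate_per_1000: float) -> float:
    """Rescale a per-member Nextera utilization rate onto the Choice cohort's
    exposure basis, i.e. put both cohorts on a per-member-month footing."""
    return reported_rate_per_1000 * (CHOICE_AVG_MONTHS / NEXTERA_AVG_MONTHS)

reported_ed_rate = 100.0  # hypothetical ED visits per 1,000 Nextera members
print(round(normalize_nextera_rate(reported_ed_rate), 1))  # 109.9: roughly a 10% upward bump
```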


Summary of all current Nextera posts.

The two astonishing claims of Nextera’s school district advertising blitz are deeply misleading and unsupported. In no meaningful sense does Nextera save 27%, or $913 per year, for every patient served by Nextera’s doctors rather than by the Saint Vrain Valley region’s traditional, fee-for-service primary care physicians. In no meaningful sense do patients served by Nextera doctors have 92.7% fewer inpatient hospital admissions than those served by the Saint Vrain Valley region’s traditional, fee-for-service primary care physicians.

The KPI Ninja report on Nextera is at war with the best available evidence on direct primary care, that from the Milliman study. The KPI Ninja analysis head-faked risk adjustment, an essential feature of plan comparison, but actually performed none at all. The vast bulk of the reported cost savings turn on the dubious finding that a low-risk population had a Medicare-like hospital admission rate that it could have reduced by choosing Nextera’s physicians.

An adequate study required not only risk adjustment but also adjustments for induced utilization and for normalizing the employer’s share of cost based on the benefit plans. Combined, these adjustments cut the purported Nextera savings from $913 to $255, even accepting as given a freakishly high IP admission rate and a freakishly low risk adjustment of 7.5%.

Every single one of the report’s claims that Nextera lowered the cost of various downstream care services is tainted by one or more of these unaddressed factors. 

Credibility requires a well-designed study, careful analysis, transparency, candor, and a firm understanding of the many effects of benefit design. The KPI Ninja report on Nextera had none of it. It is, at best, a spin doctor’s badly concocted puff piece. Promotion of KPI Ninja’s work on behalf of one hundred Nextera physicians, by video, by press release, and by distribution of the report on social media and otherwise, is false advertising that demands correction.
