Nextera’s Next Era in Cherry-Picking Machine Design

Nextera HealthCare has a hot new brag:

These results were not risk adjusted. But they desperately need to be.

The St Vrain Valley School District had this health benefit structure for its employees during the period studied:

The school district’s 10% coinsurance rate for the PPO predates the arrival of the Nextera option. The school district also has a Kaiser Permanente plan that includes 10% coinsurance. The school district created the unique 20% coinsurance rate for Nextera DPC patients to help fund the added primary care investment involved. Here’s how that benefit structure impacts employees expecting various levels of usage in a coming year.

As the Nextera image above shows, $5,000 per year is about an average utilization level for an employee member of the district; an employee expecting average utilization can gain over $900 by rejecting Nextera. Every penny of that advantage for the employee comes out of the employer’s hide — and then it shows up in Nextera’s table as a Nextera win. An employee with moderately heavy utilization — more than twice the average — might even hit the jackpot of shifting $1,787 from his pocket to the employer, simply by rejecting Nextera. Heavier utilizers will do no worse than break even by rejecting Nextera.
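The shift is simple arithmetic. Here is a minimal sketch assuming a pure coinsurance design with no deductible or out-of-pocket maximum; the district’s actual plan includes such features (which is why the heaviest utilizers eventually break even), so every function and dollar figure below is illustrative rather than the district’s actual schedule.

```python
# Toy illustration (NOT the district's actual benefit schedule): how a 20%
# coinsurance rate for DPC members versus 10% for PPO members shifts cost
# from the employer to the employee as annual utilization grows.

def employee_share(total_claims: float, coinsurance: float) -> float:
    """Employee cost-sharing under a pure coinsurance design (no deductible
    or out-of-pocket maximum -- both simplifying assumptions)."""
    return total_claims * coinsurance

def employer_share(total_claims: float, coinsurance: float) -> float:
    """Whatever the employee does not pay, the employer's plan does."""
    return total_claims * (1 - coinsurance)

for claims in (5_000, 10_000, 20_000):
    extra = employee_share(claims, 0.20) - employee_share(claims, 0.10)
    print(f"claims=${claims:>6,}: DPC member pays ${extra:,.0f} more "
          f"out of pocket; the employer pays exactly that much less")
```

At $5,000 of annual claims the ten-point coinsurance gap alone moves $500; premium differences and other plan features, not modeled here, would account for the remainder of the $900-plus swing described above.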

This benefit design pushes a large swath of risky, costly patients away from Nextera.

And notice that even after Nextera harvests the relatively healthy, those Nextera members who end up needing significant downstream care face vastly more cost-sharing discipline under this benefit plan than their PPO counterparts.

If the employer’s claims costs are adjusted for both the health risk difference between Nextera and non-Nextera populations, and the confounding effect of subjecting the Nextera cohort to stronger cost-sharing discipline for downstream costs, Nextera’s cost savings brag will likely be shredded.

Indeed, we have good reason — from the recent Milliman study — to suspect that a population risk-adjustment of about a third is quite likely. Adjust the Nextera brag by that third and the savings will not simply vanish, they will turn into increased costs.

Epiphany. Dr. Gross’s risk adjustment methodology for direct primary care stands contrary to contemporary understandings of how to assess the relative expected costs of differing populations.

Dr. Lee Gross’s Epiphany Healthcare provides DPC services for some of the employees, and some members of their families, at a hospital in Florida. Some hospital employees decline Epiphany; they and some members of their families receive instead traditional insurance-based primary care. Unusually for such arrangements, a recent assessment of the dollar value of the claims experience for the DPC cohort exceeded that of the FFS cohort by about 15%.

There appears to have never been a case in which a DPC provider applied risk-based adjustment to validate raw claims cost differences that seemed to reflect well on DPC. When, on the other hand, raw data went the opposite way for him, Dr. Gross decided the time had come to become a data analytics pioneer and develop a risk adjustment model for direct primary care.

Conveniently, the widely used models developed by professional actuaries in and for CMS (in making patient population risk adjustments for use in Medicare Advantage plans and under the Affordable Care Act) and Dr. Gross’s model share a common feature: each adds up, in certain circumstances, the total number of a patient’s chronic conditions to develop an adjustment factor. This facilitates a comparison.

To assess the precise role this common feature plays in each of the two models, I will summarize the more widely used CMS model and then explain some key differences between it and the Gross model.

In CMS’s method, depending on the case, up to four different components come into play. [FYI, the lettering system is mine and is intended to simplify the explanation.]

A. There is, of course, a significant demographics component scored to reflect the historical utilization by persons specified by age, gender, and other factors. Basic demographics are at the core of the risk adjustments used by CMS for the ACA; over 75% of ACA individual enrollees under 65 have no adjustment-worthy chronic conditions and are risk-adjusted on demographics alone.

B. There is a specific conditions component in which a patient is given a score for each condition she has from a list of over eighty broad health conditions, each of which has a specific score reflecting that condition’s historical correlation with the use of medical services – so much for asthma, so much for porphyria, CHF, etc.

C. There is also an “interactions” component in which the patient can receive additional scores, as many as she may qualify for, from a list of certain specific combinations of conditions from “B” above that are historically correlated with enhanced need for services when those health conditions are combined.

D. A recently added number-of-chronic-conditions component reflects considerations generally similar to those in Dr. Gross’s index.

(D)(1) There is a specific list of about two dozen chronic conditions drawn from the eighty-plus conditions in “B” above.

(D)(2) If a patient has FOUR or more of the conditions in the list in (D)(1), the patient is given an additional score that reflects how having that many of those conditions has historically been correlated with an enhanced need for services.

Component (D) scoring is non-linear: the adjustment for five conditions reflects historical correlation between costs and having five conditions; scoring is computed separately for four, five, six, etc. Eight conditions are not simply given twice the score given for four conditions.
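To make the contrast with a linear count concrete, here is a toy sketch of the two scoring philosophies. The score values are invented for illustration; CMS publishes its own coefficients, and nothing below reproduces either model’s actual numbers.

```python
# Sketch of the difference between CMS-style non-linear condition-count
# scoring (component D) and a linear count. All score values are invented
# for illustration; they are not CMS's or Dr. Gross's actual coefficients.

NONLINEAR_COUNT_SCORES = {4: 0.10, 5: 0.16, 6: 0.24, 7: 0.34, 8: 0.46}

def cms_style_count_score(n_chronic: int) -> float:
    """Additional score keyed to the total count, applied only at 4+
    conditions, with a separate value looked up for each count."""
    if n_chronic < 4:
        return 0.0
    capped = min(n_chronic, max(NONLINEAR_COUNT_SCORES))
    return NONLINEAR_COUNT_SCORES[capped]

def linear_count_score(n_chronic: int, per_condition: float = 0.05) -> float:
    """Gross-style assumption: every condition adds the same increment."""
    return n_chronic * per_condition

# Under the non-linear table, eight conditions are not simply given twice
# the score given for four conditions; under the linear model they are.
print(cms_style_count_score(8), 2 * cms_style_count_score(4))
print(linear_count_score(8), 2 * linear_count_score(4))
```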

Now let’s look under the hood of the Gross model to see how these factors play out in his risk adjustment model for direct primary care.

Gross’s model removes the demographic component.

Under Gross’s model the number of different specific conditions being scored is reduced from a hierarchy of over eighty condition categories down to a single data point: a “chronic condition”.

Gross’s model removes the interactions component.

Gross’s model counts all chronic conditions equally and assumes a linear relationship between health costs and the raw number of chronic conditions.

A theoretical advantage of Gross’s method is ease of application.

On the other side, Gross’s model is untethered to any historical utilization data whatsoever. Within the one component where there is any measure of similarity between the Gross model and the CMS model, Gross’s approach expressly contradicts the CMS actuaries’ explicit determination that a linear correlation did not accord with historical reality. (CMS actuaries did agree that there was an additive effect for each additional condition.)

On every other component Gross’s model is in even sharper contrast to the views of the professionals. Notwithstanding Dr. Gross’s model, demographics strongly correlate with costs. As noted, over 75% of patients under 65 have their risk adjustments based only on demographics, as they have no chronic conditions among the many dozens that the professionals felt needed to be included. Notwithstanding Dr. Gross’s model, among the 25% or fewer with actionable chronic conditions, costs vary profoundly from one condition to another; conditions are hierarchical. Notwithstanding Dr. Gross’s model, multiple conditions do interact with each other.

Gross’s risk adjustment model for direct primary care looks like something that has been reverse engineered from adverse data for the sole purpose of rescuing Dr Gross’s corporate brand at its largest employer.

As with all other aspects of his otherwise relentless promotion of the results at Desoto Memorial Hospital, Dr. Gross has expressly declined to make public any details of his methods of gathering and processing data. Even if the very outline of his risk methodology were not at war with those of his analytical betters, important questions would remain.

For example, what criteria did he use to identify and count chronic conditions in the DPC cohort (his own patients) and in the non-DPC patient cohort (who get primary care elsewhere)? Especially for gathering chronic condition data on those who are not his patients, what, if anything, did he do to validate data accuracy and consistency?

For these reasons, I report only this:

The raw data shows that non-Epiphany patients at Epiphany’s largest employer clinic have lower claims costs.

Dr Gross has produced a slide show that includes, as far as I can tell, all the public information about the Epiphany experience at that employer.

The slide show utilizes the Gross risk adjustment methodology for direct primary care discussed above.

On Slide 22, Dr. Gross made an error of double adjustment. Here is what appears:

Chronic Conditions            846 per 1,000    623 per 1,000
Relative Chronic Conditions   1.35             0.74

(Each cohort’s rate appears to have been divided by the other’s: 846/623 ≈ 1.35 and 623/846 ≈ 0.74. The combined relative adjustment of 1.35/0.74 ≈ 1.83 is therefore roughly the square of the true ratio of about 1.36.)

There is at least one other, similar error.

The mixed bag of Milliman earns a final grade: B

  • Skillful actuarial work on risk adjustment. A clear warning against relying on studies that ignored risk adjustment. Implicit repudiation of a decade of unfounded brags.
  • An admirable idea on “isolating the impact of the DPC model” from the bad decisions of a studied employer. But then, a failure to realize an important prerequisite for performing that isolation.
  • Milliman should have recognized that the health service resources that go into providing direct primary care are vastly more than the $8 PMPM that emerged from its modeling, and should have done more to subject the data on which that number rested to some kind of validation.
  • Upshot: there is still no solid evidence that direct primary care results in a reduced level of utilization of health care services. Milliman’s report needs to clearly reflect that.

Overview: A core truth, and a consequence.

The Milliman report on the direct primary care option in Union County contains a significant truth, an interesting but unperfected idea, and a staggering error. The core truth lay in Milliman’s determining, through standard actuarial risk adjustment, that huge selection effects, rather than the wonders of direct primary care, accounted for a 36% difference in total health care costs between DPC and FFS populations. Both Union County and the DPC provider known as Paladina Health had publicly and loudly touted cost differences of up to 28% overall as proof that DPC can save employers money. But naysayers, including me, were proven fully correct about Union County — and about a raft of other DPC boasts that lacked risk adjustment, like those regarding Qliance.1

The estimated selection pattern in our case study emphasizes the need for any analysis of cost and utilization outcomes for DPC programs to account for the health status and demographics of the DPC population relative to a control group or benchmark population. Without appropriate consideration for how differences in underlying health status affect observed claim costs and utilization patterns, analyses could attribute certain outcomes to DPC inappropriately. We urge readers to use caution when reviewing analyses of DPC outcomes that do not explicitly account for differences in population demographics and health status and do not make use of appropriate methodologies.

Page 46 of Milliman/Union County study

Still, Union County had made some choices that made the overall results worse than they needed to be. That’s where Milliman’s ingenuity came into play in what might be seen as an attempt to turn the County’s lemon into lemonade for the direct primary care industry. And that is where Milliman failed in two major ways, one of unmeasured consequence, and the other more than important enough to make the lemonade deeply unpalatable.

The Union County DPC program was even more of a lemon than Milliman reported.

To Milliman’s credit, it did manage to reach and announce the inescapable conclusion that Union County had increased its overall health care expenditure by implementing the direct primary care option. Even then, however, Milliman vastly understated the actual loss. That’s because its employer ROI calculation rested on an estimate of $61 as the average monthly direct primary care fee paid by Union County to Paladina Health; the actual average monthly fee paid was $95. There had been no need for a monthly fee estimate, as the fees were a matter of public record.

Though $1.25 million in annual savings had been claimed, the total annual loss by Union County was over $400,000. Though 28% savings had once been bragged, the County’s ROI was actually a negative 8%. Milliman’s reliance on an estimate of the fees received from the County, rather than the actual fees collected, made a fiscal disaster appear to be close to a break-even proposition.
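The fee understatement alone is easy to quantify per member. A back-of-the-envelope sketch: the member-month figure below is hypothetical, while the $61 and $95 PMPM figures come from the discussion above.

```python
# Back-of-the-envelope: an understated DPC fee flows straight into the
# employer ROI. The $61 (Milliman's estimate) and $95 (actual) PMPM figures
# are from the post; the member-month count is a hypothetical round number.

ESTIMATED_FEE = 61   # PMPM Milliman assumed
ACTUAL_FEE    = 95   # PMPM actually paid, per public record

def annual_fee_cost(pmpm: float, member_months: int) -> float:
    """Total annual subscription fees at a given PMPM rate."""
    return pmpm * member_months

member_months = 12_000  # hypothetical: roughly 1,000 members for a year
understatement = annual_fee_cost(ACTUAL_FEE - ESTIMATED_FEE, member_months)
print(f"Fees understated by ${understatement:,.0f} per year")
```

Each member-year understates the employer’s DPC spend by (95 − 61) × 12 = $408, so even a modest enrollment turns the estimate gap into hundreds of thousands of dollars.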

Milliman’s choice likely spared the county’s executive from some embarrassment.

For more detail on Union County’s negative ROI and Milliman’s understatements of it, click here and/or here.

Milliman came up with a sound idea, but flubbed the design details.

To prevent the County’s specific bad choices about cost-sharing from biasing impressions of the DPC model, Milliman developed a second approach that entailed looking only at a claim cost comparison between the two groups. According to Milliman, “this cost comparison attempts to isolate the impact of the DPC delivery model on the overall level of demand for health care services“. [Italics in original]. Milliman should be proud of thinking of this approach. It seems likely to work where both the DPC and FFS cohorts face identical cost-sharing incentives for downstream care.

But that was not the case in Union County, where DPC and FFS patients faced very different cost-sharing regimes when making downstream care choices. The County’s cost-sharing choices are imprinted on the downstream claims that fuel the isolation model; for the average FFS patient, cost-sharing discipline for downstream claims was much reduced relative to that experienced by the average DPC patient.

Until it completes the hard actuarial work needed to quantify the skew of the downstream claims cost data that fueled its isolation model, Milliman should back off its claim to have successfully determined the health care utilization impact of DPC.

For more detail on Milliman’s faulty attempt to reinvent the DPC cost reduction wheel, click here.

The Milliman calculation of 12.6% overall savings turns on a massive underestimate of the cost of the direct primary care clinic studied.

Milliman needed to determine utilization for the DPC clinic.

Assuming, arguendo, that downstream claims data for Union County are not contaminated by the Union County cost-sharing schema would bring us to the next step. Milliman’s model for relative utilization is simple: compare total utilization, primary care plus downstream care, for the DPC cohort against the same sum for the FFS cohort.

Milliman used claim costs for both downstream components of the computation, and for the FFS primary care utilization. But because DPC is paid for by a subscription fee, primary care utilization for the DPC patients cannot be determined from claims costs.

One reasonable way of estimating the health services used by DPC patients might be to use the market price of DPC subscriptions, about $61 PMPM. With this market value, the computation would have yielded a net utilization increase for DPC. Milliman eschewed that method.

Another reasonable way of estimating the health services used by DPC patients might be to estimate the costs of staffing and running a DPC clinic, using available data about PCP salaries and primary care office overhead, probably at least $40 PMPM by a conservative estimate. With this low number, the computation would have yielded a net utilization decrease for DPC of well under 5%. Milliman eschewed that method.

The lower the value used for utilization of direct primary care services, the more favorable DPC appears. Ignoring models that would have pointed to $61 and $40 values, Milliman used a methodology that produced an $8 PMPM value, which resulted in a computed 12.6% reduction in overall usage.
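The sensitivity of the headline figure to this single input can be sketched directly. The FFS-cohort total below is a hypothetical round number, and the DPC downstream figure is derived from it so that the $8 case reproduces the reported 12.6%; only the direction and rough magnitude of the swings matter.

```python
# Sensitivity sketch: the isolation model compares total utilization
# (primary care + downstream care) between cohorts, so the computed
# "overall reduction" moves dollar-for-dollar with whatever PMPM value is
# assigned to DPC primary care. The FFS total is hypothetical; the DPC
# downstream figure is back-solved so that $8 reproduces the reported 12.6%.

FFS_TOTAL_PMPM = 400.0                             # hypothetical FFS cohort total
DPC_DOWNSTREAM_PMPM = 0.874 * FFS_TOTAL_PMPM - 8   # implied by the 12.6% result

def overall_reduction(dpc_primary_pmpm: float) -> float:
    """Fraction by which DPC total utilization falls short of FFS total."""
    dpc_total = DPC_DOWNSTREAM_PMPM + dpc_primary_pmpm
    return (FFS_TOTAL_PMPM - dpc_total) / FFS_TOTAL_PMPM

for pc in (8, 40, 61):
    print(f"DPC primary care valued at ${pc} PMPM -> "
          f"{overall_reduction(pc):+.1%} overall reduction")
```

Under these illustrative figures, the $40 valuation yields a reduction under 5% and the $61 valuation flips the result to a net increase, matching the qualitative claims above.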

Milliman’s “ghost claims” method was ill-suited for DPC and vulnerable.

Milliman’s “solution”, however, turned on the stunning assumption that utilization of subscription-based, holistic, integrative direct primary care could be accurately modeled using the same billing and coding technology used in fee-for-service medicine. As a group, the DPC community loudly disparages such coding both for failing to compensate their efforts and for wasting their time. D-PCPs don’t even use billing-friendly EHRs.

Yet Milliman chose to rely on the clinic’s DPC physicians to have accurately coded all services delivered to patients, to have used those codes to prepare “ghost claims” resembling those used for FFS payment adjudication, and then to have submitted the ghost claims to the employer’s TPA, not to prompt payment, but solely for reporting purposes. The collected ghost claims were converted into the direct primary care services utilization by application of the Union County FFS fee schedule. The result was $8 PMPM.

The $8 PMPM level of clinic utilization determined by the ghost claims was absurd.

Valuing the health services utilization for patients at the direct primary care clinic at a mere $8 PMPM is at war with a host of things that Milliman knew or should have known about the particular clinic, about the costs of primary care, and about the nature of direct primary care. Clinic patients were reportedly receiving three visits a year; this alone requires more than the $96 per year that $8 PMPM represents. The length of clinic visits was stressed. The County and the clinic brag of 24/7 access and same-day appointments for 1,000 clinic patients. The clinic was staffed at one PCP per 500 members; at $96 a year, clinic revenue would have been $48,000 per PCP. This does not pass the sniff test.

The only visible path to Milliman’s $8 PMPM figure for health services demand for the delivery of direct primary care is that the direct primary care physicians’ ghost claims consistently underreported services. That is about what one might expect from “ghost claims” prepared by code-hating D-PCPs with no motivation to code and claim accurately (perhaps, even, with an opposite motivation). Milliman even knew that the coding habits of the DPC practitioners were inconsistent, in that the ghost claims sometimes contained diagnosis codes and sometimes did not. Report at page 56.

Yet, Milliman did nothing to validate the “ghost claims”.

Because the $8 PMPM is far too low, the 12.6% overall reduction figure is far too high. As noted above, substituting even a conservative estimate of the costs of putting a PCP into the field slashes 12.6% to something like 4%. If, in place of the $8 PMPM, the $61 market price determined in the survey portion of the Milliman study is used, Milliman’s model would show that direct primary care increases the overall utilization of health services.

For more detail on the erroneous primary care data fed to Milliman’s isolation model, click here.

Milliman should amend this study, first accounting for the confounding effect of Union County’s cost-sharing scheme on downstream care costs, and then adopting a credible method for estimating the level of health services utilized in delivering care at the DPC clinic.

Milliman’s good work on risk adjustment still warrants applause. Indeed, precisely because the risk adjustment piece was so important, the faulty work on utilization should be corrected, lest bad work tar good, and good work lend credibility to bad.

1 The reaction of those who had long promoted the Qliance boasts to Milliman’s making clear the necessity of risk adjustment was swift and predictable: never ignore what can be lied about and spun. The DPC Coalition is a lobbying organization co-founded by Qliance; a co-founder of Qliance is currently president of the DPC Coalition. The DPC Coalition promptly held a legislative strategy briefing on the Milliman study at which its Executive Director ended the meeting by declaring that the Milliman study had validated the Qliance data.

The raw downstream cost claims data fed into Milliman’s “isolation” model were imprinted with Union County’s (likely pro-DPC) model of downstream cost-sharing.

Note. The foregoing post was essentially completed and copied to Milliman in late May or early June 2020. In a footnote to an internet essay at the end of June, two members of the Milliman team presented new material addressing the issues raised below. I will address this new material in a new post.

In its recent assessment of the direct primary care model, Milliman took a two-track approach. An employer-ROI-based approach included comparing claims experience for a group of employees who opted to receive primary care from a DPC clinic versus those using traditional FFS PCPs; in addition, the ROI analysis also assigned to the employer the costs of the differing cost-sharing structures the employer had required of the two groups. This computation resulted in the conclusion that, in the plan under study, DPC actually increased employer costs.

In a second approach, however, Milliman looked only at a claim cost comparison between the two groups. According to Milliman, “this cost comparison attempts to isolate the impact of the DPC delivery model on the overall level of demand for health care services“. [Italics in original]. Milliman should be proud of thinking of this approach. It seems likely to work well where both the DPC and FFS cohorts face identical cost-sharing incentives for downstream care. But that was not the case in Union County.

Since Union County offered very different cost sharing landscapes to the two groups, the purported isolation is simply not possible.

For example, all FFS employees (and no DPC employees) had $750 HRAs, which present FFS employees with “use it or lose it” choices affecting their decisions on incurring claims. In that situation, FFS employees will be prone to over-consume health services, largely on downstream care, a significant swath of which is void of cost-sharing discipline. By Milliman’s calculation (Fig 12, Line I), the average FFS member does not exceed the HRA-protected level of usage in a normal year.

Accordingly, that average FFS member never reaches a second feature of the Union County plan, a 20% coinsurance level. On the other hand, all the downstream medical claims of an average DPC member are subject to the discipline of 20% coinsurance.

Combining the effect of the HRA and the coinsurance, the average DPC group member is likely to under-consume downstream health care services when compared to the average FFS member.
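The combined incentive can be sketched as a marginal-price comparison. This is a deliberately simplified toy model: the $750 HRA figure comes from the discussion above, while deductibles, out-of-pocket maximums, and other plan layers are ignored.

```python
# Toy model (plan details simplified): the marginal price an employee faces
# for the next dollar of downstream care under Union County's two regimes.
# The $750 HRA is from the post; all other plan layers are ignored.

FFS_HRA = 750.0

def ffs_marginal_price(spend_so_far: float) -> float:
    """FFS member: $0 marginal price while the use-it-or-lose-it HRA lasts,
    20% coinsurance afterward (other plan features omitted)."""
    return 0.0 if spend_so_far < FFS_HRA else 0.20

def dpc_marginal_price(spend_so_far: float) -> float:
    """DPC member: 20% coinsurance on every downstream dollar."""
    return 0.20

# Per the post's reading of Milliman (Fig 12, Line I), the average FFS
# member stays under the HRA in a normal year, so faces a $0 marginal price:
print(ffs_marginal_price(600.0), dpc_marginal_price(600.0))
```

The design choice matters: a member facing a $0 marginal price has no cost-sharing reason to forgo the next service, while the DPC member pays 20 cents on every downstream dollar from the first claim.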

These real decisions by real employees surely affected Union County’s downstream claims experience. A cost sharing plan is not merely a dispensable ornament in an ROI computation. Rather, Union County’s cost-sharing structure presents important “confounders” that Milliman’s isolation model, as originally conceived and executed, simply ignored — with a likely effect of overstating direct primary care’s ability to reduce the downstream care costs of average utilizers.

Because the raw downstream claims cost data fed into the model intended to isolate DPC impact from the vicissitudes of Union County’s cost-sharing scheme had already been imprinted with the vicissitudes of Union County’s cost-sharing scheme to an unknown degree, Milliman’s attempted reinvention of the wheel results in an indeterminately bumpy ride.

To be fully clear, it is certainly possible that, if applied across all members at all levels of downstream care consumption, Union County’s unique and complex cost-sharing landscape for downstream care for non-DPC members, and its simpler, coinsurance-focused scheme for downstream care for DPC members, might combine to have a negligible net effect, or even tilt in favor of DPC. I don’t know. As of the date of its May 2020 report, neither did Milliman.

Until it completes the hard actuarial work needed to quantify the skew of the downstream claims cost data that fueled its isolation model, Milliman should back off its claim to have successfully isolated and determined the health care utilization impact of DPC.

Milliman and Milliman wannabes would be especially well-advised to use caution before attempting to apply the Union County isolation methodology in cases where a DPC option is automatically tethered to an HDHP, even though the HDHP may be available on a voluntary basis to a quasi-control group.

Virtually all DPC practices take pains to sign on members with HDHPs, through employer arrangements or otherwise. Once such price-shoppers are signed on, they can be relied on to incur relatively modest downstream costs. The enrolling provider is, of course, happy to claim any reduced downstream costs as attributable to herself and/or to the DPC model. Borrowed valor. Good example here.

For more about Milliman or Union County, use Search.

Did Ratner Industries uncover the secret of health care cost reductions?

To kick off our open enrollment period two years ago, we at Ratner Industries held a company wide employee meeting. There we dusted off our brand new offering of a high deductible plan option. To get a rough idea how many employees planned on electing each option we offered free bags of M&Ms to employees as they left, requesting that those leaning toward the high deductible option each take a bag of peanut M&Ms, while those remaining in our standard plan were asked to take only traditional M&Ms. About equal numbers took bags of each of the two kinds of M&Ms.

We just ran some claims data (fully risk adjusted, of course) for the first full year under these plan choices. We found that those who received peanut M&Ms had 12% lower utilization of services overall compared to those receiving traditional M&Ms.

Have we proven the power of peanuts? Caiaphas, a commenter, has suggested that this is a case of borrowed valor, with the HDHP plan having done the hard work for which the pro-peanut caucus has “improperly” taken credit. Caiaphas’s anti-peanut bias is evident.

Attn: AEG/WP. Milliman study implies 12.6% downstream care cost reductions for DPC.

The AEG/WP plan still isn’t likely to work. A $95 PMPM fee, increasing at the same rate as other medical expenses, and coupled to a 12.6% downstream reduction, would evaporate all of AEG/WP’s claimed billion-dollar savings.

“Healthcare Innovations in Georgia: Two Recommendations”, the report prepared by the Anderson Economic Group and Wilson Partners (AEG/WP) for the Georgia Public Policy Foundation, made some valuable contributions to deliberations about direct primary care. The AEG/WP team clearly explained their computations and made clear the assumptions underlying their report.

This facilitated public discussion that the Georgia Public Policy Foundation sought to foster in publishing the report. I have examined those assumptions in many prior posts. A large number of them addressed a focal assumption in the AEG/WP report’s calculations: that DPC participation could reduce the claims cost for downstream care by 15%, a number represented by AEG/WP as a low end estimate. The sole support offered in the AEG/WP report for this 15% presumption is a statement that, “the factor is based on research and case studies prepared by Wilson Partners.”

In addressing the 15% claim, I looked over all evidence then available in support of it. I found a lot of corporate brags, and some propaganda from partisans with their own smash-public-spending agenda, but nothing in the way of independent, actuarially sound evidence derived from risk-adjusted data from DPC clinics.

That has changed with a study by actuaries from Milliman in a May 2020 report to the Society of Actuaries concerning a thinly disguised employer (Union County, NC). Although I take issue with some of the lessons others have drawn from that report, the report implies that deploying the direct primary care model can reduce downstream care cost by 12.6%. [Maybe even 13.2%.]

Some cautions.

  • Milliman’s is only one study, and so far one of a kind.
  • The Milliman study confirms nearly all of what I have said about how deeply flawed all the prior studies were, largely because they did not look at issues around risk adjustment.
  • In light of the Milliman study indicating reductions of 12.6%, the AEG/WP suggestion that the 15% figure represents a low-end estimate should be rejected.
  • For downstream care cost reductions, replacing AEG/WP’s 15% with Milliman’s 12.6% takes away one-sixth of AEG/WP’s claimed savings.
  • My other two major criticisms of the AEG/WP report still stand, and one of them is actually reinforced.
    • AEG/WP’s assumption that effective DPC can be purchased for $70 per month even more clearly lowballs the likely cost
      • the clinic Milliman studied was paid $95 per member
      • a $95 per member fee would reduce AEG/WP’s claimed savings in the individual market by nearly half, and would result in no net savings in the employer markets. See calculator here.
    • AEG/WP’s assumption that a $70 fee for direct primary care will remain flat for a decade is still incorrect.
  • A $95 PMPM fee, increasing at the same rate as other medical expenses, and coupled to a 12.6% downstream reduction, will evaporate all of AEG/WP’s claimed billion-dollar savings.
  • The final point.
    • There is good reason to suspect that a direct primary care clinic receiving resources of $95 PMPM will outperform a direct primary care clinic receiving resources of $70 PMPM.
      • Milliman studied a clinic that had invested $95 PMPM in direct primary care and attained a presumed 12.6% downstream cost reduction; the increase in the spend for primary care exceeded the downstream savings; the employer had a net loss for using direct primary care.
      • a $70 direct primary care clinic will need a larger patient panel than its $95 competitor; its PCPs will have less time with patients and less availability; it will be less able to deliver same day appointments; so there is strong reason to expect that AEG/WP’s proposed $70 DPC will fall well short of the 12.6% downstream cost reduction level.
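The one-sixth figure above follows from the downstream savings scaling roughly linearly with the assumed reduction factor. A hedged sketch, with a hypothetical baseline:

```python
# Rough sensitivity check on the AEG/WP savings claim. If claimed downstream
# savings scale linearly with the assumed downstream reduction factor (a
# simplifying assumption), then swapping the 15% assumption for Milliman's
# 12.6% leaves 12.6/15 = 84% of the savings, removing about one-sixth.
# The baseline figure is hypothetical, in arbitrary units.

def scaled_savings(baseline_savings: float,
                   assumed_reduction: float = 0.15,
                   replacement_reduction: float = 0.126) -> float:
    """Savings after replacing the assumed downstream reduction factor."""
    return baseline_savings * (replacement_reduction / assumed_reduction)

baseline = 1_000.0  # hypothetical units of claimed downstream savings
print(f"Fraction removed: {1 - scaled_savings(baseline) / baseline:.1%}")
```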

Milliman’s valuation of DPC health care services at $8 PMPM rests on faulty data.

If I were a direct primary care practitioner, I’d be mildly miffed at Milliman’s reducing what I do to a series of CPT codes. I’d be more worried by Milliman’s team setting the value of my health care services at $8 PMPM.
The $8 PMPM figure Milliman declared as the health care service utilization needed to deliver primary care to DPC patients was based on apparent underreporting, by the studied direct primary care provider, of a single class of data: the quantum of patient care actually delivered.
Although this data was of central importance and would have warranted a validation process for that reason alone, Milliman evidently took no steps to validate it. But there were clear warning signs warranting extra attention, including the employer’s public reports — known to Milliman — that DPC patients were visiting the DPC clinic three times a year.
Correcting the $8 PMPM to something reasonable shows that Milliman has vastly overstated net savings associated with DPC.

Note: Update of 6/24/2020.

The resources used by direct primary care go beyond what has been recorded in CPT codes. DPC docs and advocates used to be the first to tell us that. Here’s a DPC industry leader, Erika Bliss, MD, telling us “how DPC helps”.

A large amount of DPC’s success comes from slowing down the pace of work so physicians can get to know our patients. While it might sound simplistic, having enough time to know a patient is fundamental to providing proper medical attention. Every experienced DPC physician understands that walking into the exam room relaxed, looking the patient in the eye, and asking personal questions dramatically improves treatment. [Emphasis supplied.]

Slower-paced and longer visits use real resources. As do all the other elements claimed to generate DPC success, such as same day appointments, nearly unlimited access 24/7, extended care coordination. The principal justification for the subscription payment model is that too much of the effort required for comprehensive primary care escapes capture in the traditional coding and billing model.

The Milliman report found no net cost savings to Union County from the money it spent on its DPC plan, a negative ROI. But some DPC advocates seek salvation in Milliman’s claim that application of its novel, CPT-code based, isolation model to Union County’s claims data turns that lemon into lemonade.

[T]he DPC option was associated with a statistically significant reduction in overall demand for health care services(−12.64%).

Milliman report at page 7.

As noted, that computation measures overall demand reduction across the system: lowered downstream care demand is counted as part of all demanded health care services, including the health care services demanded by direct primary care itself.

Lemonade by Milliman — initial steps.

Downstream care utilization for both DPC and PPS patients, along with primary care utilization for non-DPC patients, was assumed to be represented by the County’s paid claims. Milliman, in other words, felt it was actuarially sound to use the employer’s negotiated fee schedule as the appropriate yardstick to measure health care services utilization.

But DPC providers are not paid on a claims basis; they are paid on a subscription basis for nearly unlimited 24/7 access, same day appointments, long, slowed-down visits, extensive care coordination and the like. How then is the “utilization” of direct primary care services to be determined? Is there anything comparable to Union County’s negotiated fee schedule for fee for service medical services that might fit the bill for subscription primary care?

How about Union County’s negotiated fee schedule for subscription direct primary care from the DPC? An average of $95 PMPM. Had that number been used in Milliman’s alternative model, I note, direct primary care delivered by the DPC would have been “associated with” a substantial increase in overall demand for health care services. Milliman, having found that Union County’s ability to negotiate fees was sauce for the FFS goose, did not find that Union County’s negotiating skill was an appropriate condiment for the subscription DPC gander.

How about setting the utilization of direct primary services at an approximation of market price for subscriptions to bundled primary care services, using perhaps the reports of DPC fees gathered in a survey that was part of the Milliman report? An average of $61 PMPM. Had that number been used in Milliman’s alternative model, I note, direct primary care delivered by the DPC would still have been “associated with” a modest increase in overall demand for health care services. But, hey, what do markets know? Milliman went a different route.

A cost approach, perhaps? I expect that Paladina, Union County’s DPC vendor, would have declined, if asked, to provide data on the prices it paid for the inputs needed to provide Union County with the contracted direct primary care services. And it could well be that Paladina is as bad a price negotiator as Union County itself.

But these costs can be estimated, and the result would have more general applicability. A very conservative estimate of those costs would be $39 PMPM (based on Union County’s panel size of less than 500, a low PCP compensation package of $175k/yr, and overhead at a low 33% of PCP compensation). Had that number been used in Milliman’s alternative model, I note, direct primary care delivered by the DPC would have been “associated with” a modest decrease in overall demand for health care services of about 5%. Using AAFP reports of average PCP salaries and overhead instead of conservative assumptions would turn that number negative.
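That conservative estimate is simple enough to check. Here is a quick Python sketch using only the assumptions stated above:

```python
# Conservative cost-based estimate of DPC resource use per member per month,
# using the assumptions stated in the text.
panel_size = 500            # Union County panel size (upper bound)
pcp_pay = 175_000           # low-end annual PCP compensation package, $
overhead = 0.33             # overhead as a fraction of PCP compensation

pmpm = pcp_pay * (1 + overhead) / (panel_size * 12)
print(round(pmpm, 2))  # 38.79, i.e. roughly $39 PMPM
```

Any higher salary or overhead assumption pushes the estimate further above Milliman’s $8.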

Using an estimate of the actual costs of putting a PCP into a DPC practice as a means of putting a value on the health care services demanded when a PCP is actually put into a DPC practice seems sensible.

But Milliman took a different course.

Breakthrough in Lemonading: the elements of the Milliman method for computing the health services utilization of direct primary care.

  • Assume that utilization of subscription-based holistic, integrative direct primary care can be accurately modeled using the same billing and coding technology used in fee for service medicine.
  • Ignore that the explicit justification most frequently given for subscription-based direct primary care is that the fee for service billing and coding methodology cannot accurately model holistic, integrative direct primary care.
  • Ignore that direct primary care physicians as a group loudly disparage coding as a waste of their valuable time, strongly resist it, and do not use standard industry EHRs that are designed for purposes of payment, relying instead on software streamlined for patient care only.
  • Rely on disbelieving, reluctant DPC physicians, using EHRs ill-equipped for the task, to have accurately coded all services delivered to patients, used those codes to prepare “ghost claims” resembling those used for payment adjudication, and submitted the ghost claims to the employer’s TPA, not to prompt payment, but solely for reporting purposes.
  • Have the TPA apply the FFS fee schedule to the ghost claims.
  • Carefully verify the accuracy of the FFS fee schedule amounts applied to the ghost claims.
  • Do precisely nothing to verify the accuracy of the ghost claims to which the verified FFS fee schedule amounts were applied.
  • Perform no reality check on the resulting estimate of health care services utilization
    • Do not compare the results to those articles on Union County you have consulted, referred to or even, in your study’s literature review, quoted.
    • Do not compare the results to the market prices for direct primary care services revealed in your own study’s market survey

Anyone see a potential weakness in this methodology?

This methodology resulted in $8 PMPM. That, I note, was the number which, when used in Milliman’s alternative model, showed that direct primary care delivered by the DPC was “associated with” a decrease in overall demand for health care services of 12.6%.

Milliman identifies its methodology as a tidy “apples-to-apples” comparison of FFS primary care services and direct primary care services measured by a common yardstick. But that look comes with the feeling that Milliman has emulated Procrustes, gaining a tidy fit to the iron bed of the fee schedule by cutting off the theoretical underpinnings of the direct primary care model.

DPC practitioners, however, are very much bottom-line people who will endure repudiation of their ideology in Milliman’s study details as long as the ostensible headlines serve up something they might be able to monetize: a supposedly “actuarially sound” demonstration that the direct primary care model saves big bucks.

That demonstration, however, hinges on the $8 PMPM result being somewhere near accurate. But that result is at war with reality.

Milliman’s $8 PMPM result defies known facts and common sense — and does indeed contradict the core values of the DPC model.

Whether for the average patient panel size (~450) reported in Milliman’s survey of DPC practices, or for the specific panel size (~500) for the DPC practice in Milliman’s case study, $8 PMPM ($96 PMPY) works out to less than $50,000 per PCP per year. That’s not credible.
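The implied revenue arithmetic is easy to verify:

```python
# Annual revenue per PCP implied by valuing DPC services at $8 PMPM.
pmpm = 8
for panel in (450, 500):   # survey-average and case-study panel sizes
    annual = panel * pmpm * 12
    print(panel, annual)   # 43200 and 48000, both under $50,000/yr per PCP
```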

That Union County DPC patients see their PCP around three times a year is apparent from the public statements of the employer’s then-director of human resources and his successor, and even from an article on Union County from which the Milliman study’s literature review quoted verbatim. The three visits are said to have lasted at least half an hour, as long as a full hour, and to have been available on a same-day basis. $96 a year does not pay for that.

Consider also the logical implications of accepting that the $8 PMPM yielded by Milliman’s process accurately reflected actual office visit duration and frequency for the DPC population. That’s roughly one garden-variety visit per year. In that case, what exactly is there to account for the downstream care cost reduction?

Were those reductions in ER visits caused simply by writing “Direct Primary Care” on the clinic door? Were hospital admissions reduced for patients anointed with DPC pixie dust?

What Milliman misses is magic, just not that kind.

It’s the magic of hard, but slowed down, work by DPC practitioners. It’s their time doing things for which CPT codes may not or, at least, may not yet exist.* It’s relaxed schedules that assure availability for same day appointments. It’s 24/7 commitments. It’s knowing your patient well enough to ask the personal questions that Dr Bliss mentioned. Collectively this demands more health service resources than are captured by the CPT codes for little more than a single routine PCP visit.

The data set from which Milliman calculated utilization of direct primary care services underreported the patient care given at the clinic.

The only visible path to Milliman’s $8 PMPM figure for health services demand for the delivery of direct primary care is that the direct primary care physicians’ ghost claims consistently underreported the services delivered. That’s the kind of outcome that can reasonably be anticipated when disbelieving, reluctant DPC physicians, using EHRs ill-equipped for the task, are expected to accurately code all services delivered to patients, use those codes to prepare “ghost claims” resembling those used for payment adjudication, and submit those ghost claims to the employer’s TPA, not to prompt payment, but solely for reporting purposes.

In fact, Milliman even knew that the coding habits of the DPC practitioners were inconsistent, in that the ghost claims sometimes contained diagnosis codes and sometimes did not. Report at page 56.

Yet, Milliman did nothing to validate the “ghost claims”.

Whatever the justification for Milliman’s reconstructing the demand for direct primary care health services from CPT codes collected under these circumstances, no meaningful conclusions can be drawn if the raw data used in the reconstruction is incomplete. Milliman does not appear to have investigated whether this key data set was accurate.

As a result of its apparent failure to capture the true resource costs of DPC-covered services rendered by the DPC, Milliman’s determination that the DPC model reduces overall utilization by 12.6% is far too high.

A plausible estimate of the demand for health care services represented by direct primary care could be derived from widely accepted estimates of primary care physician compensation and practice overhead. Substituting any estimate of those costs greater than $45 PMPM for the $8 PMPM at which Milliman arrived would bring the calculated OVERALL medical services utilization gap between DPC and FFS well below four percent.

Another plausible estimate of the demand for health care services for direct primary care services is the market price of DPC services. Milliman’s estimate of that number was $61 PMPM. Substituting that market price for the $8 PMPM at which Milliman arrived turns the health care services utilization gap between DPC and FFS in favor of FFS.
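These substitutions can be sketched numerically. The $400 PMPM FFS baseline below is my illustrative assumption, not a figure from the Milliman report; the point is the direction and rough size of each shift:

```python
# Sensitivity of Milliman's 12.64% "overall demand reduction" to the value
# assigned to DPC primary care. The $400 PMPM FFS baseline is an assumed,
# illustrative figure.
ffs_total = 400.0                           # assumed FFS overall demand, $ PMPM
dpc_total_at_8 = ffs_total * (1 - 0.1264)   # Milliman's DPC demand with PC at $8

for pc_pmpm in (8, 39, 45, 61):
    dpc_total = dpc_total_at_8 - 8 + pc_pmpm   # swap in an alternative PC value
    gap = (ffs_total - dpc_total) / ffs_total * 100
    print(f"PC at ${pc_pmpm} PMPM -> demand gap {gap:+.1f}%")
# roughly: $8 -> +12.6%, $39 -> +4.9%, $45 -> +3.4%, $61 -> -0.6%
```

Under this assumed baseline, the $45 cost-based substitution lands below four percent and the $61 market price flips the gap in favor of FFS, matching the directions claimed in the text.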


* After the period Milliman studied, for example, CMS came up with 99491.

ATTN: Milliman. Even if Union County had not waived the $750 deductible, the County still would have lost money on DPC.

The lead actuary on Milliman’s study of direct primary care has suggested that the employer (Union County, NC, thinly disguised) would have had a positive ROI on its DPC plan if it had not waived the deductible for DPC members. It ain’t so.

Here’s the Milliman figure presumed to support that point.

It is true that removing the $31 figure of Line H would lead to a tabulated total plan cost of $347, which would suggest net savings.

The problem is that the $61 figure of Line J of the Milliman report has been too low all along — and by more than $31.

Milliman got the $61 by estimating the plan cost of DPC membership, rather than learning what it actually was. $61 was the result of Milliman applying a 60:40 adult-to-child split to fee levels drawn from Milliman’s survey of $75 adult and $40 child. But the publicly recorded contract between the DPC provider, Paladina, and Union County set the fees at $125 adult and $50 child, and $95 is the correct composite that should have been in Line J, representing $34 PMPM missed by Milliman.
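The two composites are easy to verify:

```python
# Composite PMPM fee under a 60:40 adult-to-child enrollment split.
def composite(adult_fee, child_fee, adult_share=0.60):
    return adult_share * adult_fee + (1 - adult_share) * child_fee

survey = composite(75, 40)     # fees from Milliman's survey
contract = composite(125, 50)  # fees in the recorded Paladina contract
print(round(survey, 2), round(contract, 2), round(contract - survey, 2))
# 61.0 95.0 34.0
```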

Accordingly, even if the $31 cost that fell on the County for waiving the deductible is expunged from the calculation, the total plan costs for DPC would work out to $381 and would still exceed the total plan costs for FFS. The County’s ROI was indeed negative.

I cannot tell you why Milliman used estimated fees of $61 rather than actual fees of $95. But doing so certainly made direct primary care look like a better deal than it is.

Risk adjustment and more badly needed for KPI Ninja’s Strada-brag

Amended 6/26/20 3:15AM

The Milliman report’s insistence on the importance of risk adjustment will no doubt see the DPC movement pouring a lot of their old wine into new bottles, and perhaps even the creation of new wine. In the meantime, the old gang has been demanding attention to some of the old wine still in the old bottle, specifically, the alleged 68% care cost reductions attributed to Strada Healthcare in its work with a Nebraska plumbing company of just over 100 persons.

Challenge accepted.

KPI Ninja’s study of Strada’s direct primary care option with Burton Plumbing illustrates why so much of the old DPC wine turns to vinegar in the sunlight.


At an extreme, there will be those who anticipate hitting the plan’s mOOP in the coming year — perhaps because of a planned surgery or a long-standing record of having “mOOPed” year-in and year-out due to an expensive chronic condition; these employees will be indifferent to whether they reach the mOOP by deductible or other cost-sharing; for them, moreover, the $32 PMPM in fixed costs needed for the DPC option is pure disincentive. Furthermore, any sicker cohort is more likely to have ongoing relationships with non-Strada PCPs with whom they wish to stay.

An average non-Strada patient apparently has claims costs of about $8000 per year. With a $2000 deductible and, say, 20% coinsurance applied to the rest, that’s an OOP of $3200 and a total employee cost of about $6100; with a $3000 deductible, that’s an OOP of $4000 and a total cost of $7250. Those who expect claims experience of $8000 are unlikely to have picked the DPC/$3K plan. Why pay $1100 more and have fewer PCPs from which to choose?

But what about an employee who anticipated claims only a quarter that size, $2000? With the $2000 deductible that would come to an OOP of $2000 and a total cost of $4860. With the $3000 deductible that would come to an OOP of $2000 and a total cost of $5250. For these healthier employees, the difference between plans is now less than $400. Why not pay $400 more if, for some reason, you hit it off with the D-PCP when Strada made its enrollment pitch?
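The out-of-pocket figures in the last two paragraphs follow a standard deductible-plus-coinsurance formula. A sketch (the plan’s mOOP and premium contributions are ignored here, so these are the OOP components only, not the “total cost” figures, which also reflect premiums):

```python
# Employee out-of-pocket under a deductible-plus-coinsurance design.
def employee_oop(claims, deductible, coinsurance=0.20):
    return min(claims, deductible) + coinsurance * max(0, claims - deductible)

print(employee_oop(8000, 2000))  # 3200.0
print(employee_oop(8000, 3000))  # 4000.0
print(employee_oop(2000, 2000))  # 2000.0
print(employee_oop(2000, 3000))  # 2000.0; same OOP under either deductible
```

The sicker the expected year, the more the higher-deductible DPC plan costs you; the healthier the year, the less the deductibles matter at all.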

The sicker a Burton employee was, the harder this paired-plan structure worked to push her away. It’s a fine cherry-picking machine.

Strada’s analyst, KPI Ninja, recently acknowledged Milliman’s May 2020 report as a breakthrough in the application of risk adjustment to DPC. In doing that, KPI Ninja tacitly confessed their own failure to work out how to reflect risk in assessing DPC for its string of older reports.

To date, as far as I can tell, not one of KPI Ninja’s published case studies has used risk-adjusted data. If risk adjustment was something that Milliman invented barely yesterday, it might be understandable how KPI Ninja’s “data-analytics” team had never used it. But CMS has been doing risk adjustment since the year 2000. It’s significantly older than Direct Primary Care.

KPI Ninja should take this opportunity to revisit its Strada-Burton study and apply risk adjustment to the results. Same for its Palmetto study and for its recently publicized, but risk-adjustment-free, study for DirectAccessMD. Or this one about Nextera.

Notice that, precisely because they have a higher deductible plan than their FFS counterparts, the Strada-Burton DPC patients faced greater cost-sharing discipline when seeking downstream care. How much of the savings claim in the Strada report owes to the direct primary care model, and how much to a plan design that imposed greater shopping incentives on DPC members?

It’s devilishly clever to start by picking the low-risk cherries and then use the leveraged benefit structure to make the picked cherries generate downstream cost savings.

The conjoined delivery of Strada DPC and the enhanced HDHP makes the enhanced HDHP a “confounder” which, unless resolved, makes it virtually certain that even a risk-adjusted estimate of DPC effectiveness will still be overly favorable to Strada DPC on utilization.

I have no doubt that risk adjustment and resolution of the confounding variable will shred Strada’s cost reduction claims. But, of course, if Strada is confident that it saved Burton money, they can bring KPI Ninja back for re-examination. It should be fun watching KPI Ninja learn on the job.

I’m not sure it would be fair for KPI Ninja to ask Strada to pay for this work, however. KPI Ninja’s website makes plain that its basic offering is data analytics that make DPC clinics look good. Strada may not like the result of a data analytic approach that replaces its current, attractive “data-patina” with mere accuracy.

I’ll skip explaining why the tiny sample size of the Strada-Burton study makes it of doubtful validity. Strada will see to that itself, with vigor, the moment it hears an employer request an actuarially sound version of its Burton study.

Special bonus segment. Burton had a bit over 100 employees in the study year, and a large fraction were not even in the DPC. I’m stumped that Burton had a one-year hospital admission rate of 2.09 per thousand. If Strada/Burton had a single hospital admission in the study year, Strada/Burton would have had to have 478 covered lives to reach a rate as low as 2.09. See this spreadsheet. If even one of 200 covered lives had been admitted to the hospital, the inpatient hospitalization rate would have been 5.00.
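The admission-rate arithmetic:

```python
# Covered lives needed for a given number of admissions to yield a given
# rate per 1,000 covered lives.
def lives_for_rate(admissions, rate_per_thousand):
    return admissions / rate_per_thousand * 1000

print(round(lives_for_rate(1, 2.09)))  # 478 lives needed for one admission
print(1 / 200 * 1000)                  # 5.0 per thousand if 1 of 200 admitted
```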

The use of the 2.09 figure suggests that the hospital admission rate appearing in the whitepaper was simply reported by Strada to the KPI Ninja analyst. A good guess is that it was a hospitalization rate Strada determined for all of its patients. Often, DPC practices have a large number of uninsured patients. And uninsured patients have low hospitalization rates for a fairly obvious reason.

For Qliance, a plausible net savings is 6.8%

There are three main steps to get from a 19.6% savings claim by Qliance to a plausible number: (1) examining the validity of Qliance’s claim that it collected $251 more per employee than the employers were spending for fee-for-service primary care; (2) including the drug costs which Qliance chose to omit from the data set; and (3) borrowing a generic DPC risk adjustment per Milliman, which brings the number down to 6.8%. Still, I probably wouldn’t bet that DPC can reduce net costs.

STEP ONE — Address the credibility of Qliance’s core claim

In early 2015, Qliance issued a press release that included a table of internal data, a package purporting to show that engagement of Qliance as a direct primary care provider for a subgroup of the employers’ employees resulted in claims costs “19.6 percent less than the total claims” of those employees of the same employers who obtained primary care through traditional fee for service primary care practice. In dollar terms, the reported savings was $679 per person per year. No attempt was made to examine the degree to which the apparent savings might be due to differences in medical risk between the two populations. Some ambiguous wording in the text of the release was clarified by the table itself, which made clear that the 19.6% savings was intended to be net of Qliance’s monthly direct primary care fee. And a footnote to the table also mentioned that the claims costs analyzed included all claims data except for prescription drug claims.

Here’s the table as presented by Qliance.

The press release and table did not mention the amount of Qliance’s monthly direct primary care fees. These fees do appear, however, in contemporaneous publications such as this article about the Qliance clinic at Expedia. Qliance’s fees to employers were age-dependent, ranging from $49 to $89 per month. Assuming those 65 and older have a top bracket all to themselves, and at least roughly linear age-based pricing for the remaining employees, $64 per month ($768 per year) corresponds to a mid-point and a reasonable estimate of Qliance’s average per employee receipts.

Qliance’s table indicates that the employers’ primary care annual outlay for non-Qliance patients is $251 less than Qliance’s annual fee. That would mean these employers were paying $517 per year for primary care per employee.

Qliance’s table equates $679 and 19.6% of claims costs, excluding prescription drug costs. Dividing $679 by 19.6% yields $3464 as the total claim costs, excluding the cost of prescription drugs, for non-Qliance patients. The Health Care Cost Institute indicates that in 2014 the average annual prescription drug costs in the West Region where Qliance operates were $684 per person. Adding that amount to their $3464 of other claims costs brings the total annual employer cost of care for non-Qliance member employees to $4148.

For non-Qliance employees, then, the $517 primary care spend corresponds to roughly 12.5% of total health care spend. This is a remarkable number. The American Academy of Family Physicians expresses horror when it tells us that primary care spending falls in the 5-8% range; a recent outgoing AAFP president took great pride in his role in two state initiatives that pulled the primary care spending percentage up to 12-13%. (Source: Family Medicine for America’s Health, an alliance of the American Academy of Family Physicians and seven other family medicine leadership organizations.)

Presumably, we are to believe that, even though the non-Qliance employees were already approaching the pearly gates of primary care heaven with 12.5% invested, Qliance swooped in and brought them a further 19.6% cost reduction.

Committing $251 more to primary care while netting $679 off the total would mean that primary care for the Qliance patients reached 22% of total spend, more than fifty percent above the 14% seen in European countries thought to be top performers. A level of primary care spending unknown anywhere other than in Qliance clinics!
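Here is the chain of computation implied by the press-release figures, as I reconstruct it (the $64 monthly fee is the mid-point assumption discussed above):

```python
# Totals implied by the Qliance press release, per the figures in the text.
savings = 679                  # claimed annual savings per patient, $
savings_pct = 0.196            # claimed savings percentage
drug_cost = 684                # HCCI average annual Rx spend, West Region, 2014
qliance_fee_annual = 64 * 12   # assumed mid-point fee: $768/yr

non_q_ex_rx = savings / savings_pct                    # non-Qliance claims ex-Rx
non_q_total = non_q_ex_rx + drug_cost                  # non-Qliance total
q_total = non_q_ex_rx * (1 - savings_pct) + drug_cost  # Qliance patient total
pc_share = qliance_fee_annual / q_total                # Qliance PC share of spend

print(round(non_q_ex_rx), round(non_q_total), round(q_total))  # 3464 4148 3469
print(f"{pc_share:.1%}")  # 22.1%
```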

All this strongly suggests that Qliance’s math or method is just wrong. But Qliance has not disclosed how the calculation was done. Indeed, as already noted, the Qliance news release proudly claiming large savings did not even disclose the monthly fees being paid by employers, an amount that is central to that calculation.

The Qliance table also presents some puzzling details about downstream care. Qliance patients are noted as having 14% fewer ER visits than their FFS counterparts. But the next cell in the same table reports that the average cost of ER claims for Qliance patients was higher, by $5 per annum, than the average cost of ER claims for non-Qliance patients. In percentage terms, an average Qliance patient incurred ER costs that were slightly more than 14% higher than those of non-Qliance patients.

It is certainly plausible that Qliance patients visiting ERs might present a somewhat different case mix than their counterparts. But Qliance patients having greater average ER costs than their fee for service counterparts stands in sharp contrast to one of the most stressed talking points advanced by advocates for direct primary care.

The Direct Primary Care Coalition is a lobbyist for direct primary care. Its current chairman was one of the founders of Qliance. In its advocacy to Congress and others, the Coalition often relies on the 2015 Qliance press release. The DPC Coalition has addressed the apparent anomaly regarding ER costs in an interesting way. In a letter to members of the Senate Committee on Finance, the DPC Coalition produced a modified version of the 2015 table that solved the apparent problem — by simply deleting the data on ER visits!

Letter, DPC Coalition, 1/26/2016. Despite removal of the ER visit claims data, the Coalition version of the table continues to represent that it contains all-claims data, and has at least one error of arithmetic.

In the wake of Qliance’s nondisclosure of details and subsequent closure of operations through bankruptcy, I have done a computation based on assumptions which I believe would have been made in the course of due diligence by a potential investor from whom Qliance sought capital. The assumptions are:

  • that Qliance received, on average, the mid-point fee of $64 per member per month;
  • that $684 in prescription drug costs, being an average annual expenditure in the US West Region, should be included in total health care spend;
  • that non-Qliance patients incurred primary care claims at a rate of 6.7% of total health care spend, which corresponds to the mid-point of AAFP estimates.

Computed in this way, the non-risk adjusted percentage net savings for Qliance patients is 10.7%.

Here is a link to the computation in a downloadable spreadsheet showing all cell formulas. That spreadsheet is imaged here:

Google Spreadsheet

If there’s a better way to make this computation, or if I’ve totally blown it, let me know in the comments section.

The reader is also invited to visit the spreadsheet, get it onto their own device, and run their own variations. There is a table that notes how the “$251 assumption” and the “19.6% assumption” combine to fix the relationship between putative Qliance fees and corresponding primary care spend percentages for non-Qliance employees. If, for example, you were to assume that the average Qliance fee was $49 (despite the report that $49 was the minimum fee), the implicit non-Qliance primary care spend percentage would be 8.1% (a number actually over the top of the range that AAFP reported as typical of the US) and the implicit Qliance primary care spend would be 16.9% (still well beyond AAFP’s wildest dreams). In that scenario, the final figure, after risk adjustment and accounting for prescription meds, would still be less than half of Qliance’s initial claim of 19.6%.
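For readers who prefer code to spreadsheets, here is a sketch of that sensitivity, mirroring the spreadsheet’s logic as I understand it:

```python
# How the "$251 assumption" and "19.6% assumption" tie an assumed Qliance fee
# to the implied primary-care spend shares.
EXTRA_PC = 251                              # claimed extra PC collection, $/yr
NON_Q_TOTAL = 679 / 0.196 + 684             # implied non-Qliance total incl. Rx
Q_TOTAL = 679 / 0.196 * (1 - 0.196) + 684   # implied Qliance total incl. Rx

def shares(monthly_fee):
    annual = monthly_fee * 12
    return (annual - EXTRA_PC) / NON_Q_TOTAL, annual / Q_TOTAL

for fee in (49, 64):
    non_q, q = shares(fee)
    print(f"${fee}/mo -> non-Qliance PC share {non_q:.1%}, Qliance {q:.1%}")
# the $49 fee reproduces the 8.1% / 16.9% pair discussed above
```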

STEP TWO — Adjust computation to include prescription drug costs

The table above fills the gap created by Qliance’s exclusion of prescription drug claims. I do not know the reason for this exclusion. We do know that inclusion of prescription drug claims in “total claims” would have lowered the amount and/or percentage of savings that Qliance reported.

Iora, a direct care practice leader whose work has been featured right alongside Qliance in the legislative advocacy of the Direct Primary Care Coalition, has reported data that certainly make it seem that substantial reductions in other downstream spending can, in effect, be purchased by large increases in prescription usage. In one of its clinics, a 40% increase in prescription refills seems to have largely offset huge drops in hospital costs, including ER visits, so that the true total spending reduction remained modest.

If Qliance was like Iora in this regard, the inclusion of prescription drug expenses would have significantly reduced what was literally the bottom line of its press-release table.

We don’t know whether Qliance was like Iora.

There appears to be only one careful study explicitly addressing the usage of prescription drugs by DPC patients relative to FFS patients. It’s not Iora’s.

Milliman’s case study for the Society of Actuaries carefully compiled DPC and FFS patient utilization data in all areas of medical care services for an employer contract similar to Qliance’s contracts. For prescription drugs, they measured a 1% greater utilization by the DPC patients.

Applying the Milliman study number to the Qliance work decreases the estimated total annual savings from using DPC by $7. But the largest impact comes not from deducting $7 from the net savings. Figuring in drug costs increases the denominator of the percentage-savings calculation by $684, so that overall cost savings fall to 16.2% even without any adjustment other than including drug costs.
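The recomputation, using the figures established in step one:

```python
# Effect of folding prescription drugs back into Qliance's savings percentage,
# using Milliman's finding of ~1% higher Rx utilization among DPC patients.
savings = 679
drug_cost = 684
base_ex_rx = savings / 0.196    # implied non-Qliance claims ex-Rx, about $3464
rx_penalty = 0.01 * drug_cost   # about $7 of extra DPC drug spend

adj = (savings - rx_penalty) / (base_ex_rx + drug_cost)
print(f"{adj:.1%}")  # 16.2%
```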

STEP THREE — Risk adjustment

We urge readers to use caution when reviewing analyses of DPC outcomes that do not explicitly account for differences in population demographics and health status.

Milliman study for the Society of Actuaries

The Milliman study for the Society of Actuaries stands alone (as of June 2020) as the only examination by independent experts of the effect on health care costs of demographic and health risk differences between employees who elect direct primary care and those who elect traditional primary care. For Milliman’s employer, raw costs needed to be adjusted downward by 36% to account for health factors favorable to the employee group that elected direct primary care.

Should we assume a general similarity between the employees studied by Milliman and the employees of the employers served by Qliance, so that Milliman’s adjustment can stand in for the health risk characteristics of the sub-population that elected direct primary care? The Milliman study authors note that when an employer offers a direct primary care option — with its exclusive PCP relationship — employees with lower health care needs and fewer anticipated PCP encounters may, ceteris paribus, be more likely to elect DPC. Milliman connects this with the fact that, historically, narrow-access plans like HMOs see favorable selection effects relative to PPOs.

The heroic equating of the employee groups from the Qliance and Milliman studies is probably the best available way to address risk issues in the Qliance data. Qliance has been out of business since 2017; it left the field without giving us risk-adjusted data. Milliman is, at least for now, the best we have to try to fill that gap.

Applying the 36% adjustment from Milliman results in a plausible estimate that Qliance adoption was associated with a total cost savings (inclusive of prescription drugs) of 6.8% of all employer costs.*

An alternative way to fill the gap draws upon work that is both less sophisticated and less impartial in its analysis: the Nextera/DigitalGlobe white paper addressed in this prior post. This was also a much smaller study than Milliman’s and covered a far shorter period of time. It points to a level of risk adjustment within striking range of that from Milliman. Nextera obtained employee claims data for a five month period prior to the availability of a DPC option. The DPC cohort had pre-option claims costs that were lower than the FFS cohort’s by more than 30%. Applying an alternative Nextera-based risk adjustment to the Qliance data would result in an estimate that Qliance adoption was associated with a 7.4% overall cost reduction. Due to its provider-independent sourcing, its development by professional actuaries, its larger size and longer duration, I choose to rely on the Milliman adjustment for my headline.
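Both adjustments applied to the 10.7% figure from step one (the 31% Nextera-style factor below is my illustrative stand-in for “more than 30%”):

```python
# Risk-adjusting the 10.7% pre-adjustment savings figure.
unadjusted = 10.7                    # % net savings after steps one and two
milliman = unadjusted * (1 - 0.36)   # Milliman's 36% risk adjustment
nextera = unadjusted * (1 - 0.31)    # assumed 31% Nextera-based adjustment
print(round(milliman, 1), round(nextera, 1))  # 6.8 7.4
```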

Addendum: July 12, 2020. Benefit plan structure can have a substantial impact on costs and utilization. Although, for the foregoing analysis, I assumed that the employers involved offered employees in the two Qliance and non-Qliance cohorts effectively equivalent cost-sharing obligations, an additional layer of selection bias may well be present if these employers offered significantly different benefit structures to the different cohorts.

In all, I advise against staking anything of value on any claim that Qliance produced net cost reductions. That issue was, in effect, crowd-sourced in early 2017 when Qliance desperately searched for fresh capital before declaring bankruptcy in late Spring.

*This would imply a downstream care cost reduction of about 14%.