Do you remember when Union County’s three year DPC commitment for 2016-2018 was claimed to be saving Union County $1.25 Million per year? So why did Union County’s health benefits expenditure rise twice as fast as can be explained by the combined effect of medical price inflation* and workforce growth?
For the first year or two, a clinic owner in an employer DPC option may get away with presenting the employer with data brags that package selection bias artifact as DPC cost-effectiveness. The wiser employers will figure selection bias out before making the mistake of tossing all their employees into direct primary care based on non-risk-adjusted data.
For those employers who do not figure this out, it will be interesting to watch the DPC clinics adapt to the influx from the sicker, older population. Better, it will be outright fun to hear DPC clinic owners explain why having more employees in their clinic led to an increase in PMPMs for DPC patients. Since it will be hard for them to suddenly come to Jesus on the cherry-picking issue, where will they turn? Probably to blaming a combination of Covid-19, insurance companies, and Obamacare.
*For medical cost inflation I used the Bureau of Labor Statistics medical goods and services marketplace statistics. I thank economics student D.S. for pointing out that in the employer insurance sub-market the rate of increase was higher. Kaiser Family Foundation’s annual Employer Health Benefits Survey data indicate that for the period 2015-2018, costs increased 11%. Substituting KFF’s 11% for the 8.5% from BLS, I compute that the 47% increase in Union County was not 200% of that expected; it was a mere 175% of that expected.
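As a sanity check, here is the shape of that back-of-envelope computation in Python. The 8.5% (BLS), 11% (KFF), and 47% figures are the ones cited above; the workforce-growth number is an illustrative assumption of mine, since the actual headcount growth is not restated here.

```python
# Observed cost growth as a multiple of the growth explained by
# medical price inflation and workforce growth combined.
# Inflation rates are from the text; WORKFORCE_GROWTH is an ASSUMPTION.
OBSERVED_INCREASE = 0.47      # Union County's benefit cost growth, 2016-2018
BLS_INFLATION = 0.085         # medical goods & services (BLS)
KFF_INFLATION = 0.11          # employer sub-market (KFF survey)
WORKFORCE_GROWTH = 0.143      # assumed headcount growth over the period

def pct_of_expected(observed: float, inflation: float, growth: float) -> float:
    """Observed increase as a multiple of the increase explained by
    inflation and workforce growth, combined multiplicatively."""
    expected = (1 + inflation) * (1 + growth) - 1
    return observed / expected

ratio_bls = pct_of_expected(OBSERVED_INCREASE, BLS_INFLATION, WORKFORCE_GROWTH)
ratio_kff = pct_of_expected(OBSERVED_INCREASE, KFF_INFLATION, WORKFORCE_GROWTH)
print(f"vs BLS baseline: {ratio_bls:.0%} of expected")   # ~196%
print(f"vs KFF baseline: {ratio_kff:.0%} of expected")   # ~175%
```

The point of the sketch is only that swapping a higher inflation baseline in shrinks, but does not erase, the unexplained excess.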
Of course, I might have attempted a narrower focus. I fully realize that the average increase might have been different for North Carolina employers, or for county employers, or for county employers in North Carolina. Then, too, there are counties named Union in a total of 17 states. I confess that I deliberately refused to determine the employer health coverage cost inflation rate for the other 16 Union Counties. Life is short.
It might also be the case that bringing DPC to Union County created such great opportunities for affordable health care that DPC employees married at a higher rate, and decided to have more children. And, of course, the presence of DPC might have helped Union County recruit a large tranche of new employees with enormous families.
In 2016, the share of people between 65 and 74 who were still working was over 25%. Any of them working at an employer with more than twenty employees and a group health plan is required by law to be included in the employer’s plan. They may also enroll in Medicare Part B. Some employer plans even require their Medicare-eligible employees to enroll in Part B. When optional for the employee, the choice to add Medicare Part B and have dual coverage is typically made by relatively heavy utilizers looking to meet cost-sharing burdens by having Medicare as a secondary payer.
In any event, elder employees with dual coverage will have low, or even zero, out-of-pocket expenses, whether for primary care visit fees; for the types of services, like basic labs, often included in an employer DPC package; or for downstream care. These elders have relatively little incentive to join DPCs, especially in cases where the employee pays more for a DPC option than for a non-DPC option (such as Strada Healthcare’s plan for Burton Plumbing).
Many with dual coverage will have even more incentive to avoid DPC. A large majority of DPCs, including DirectAccessMD, Strada Healthcare, and many Nextera-branded clinics, have opted out of Medicare. Medicare-covered employees who receive ancillary services that the DPC performs for separate charge will be expected to see that the DPC gets paid, but will receive no Medicare payment for those services. A Medicare-covered employee in Nextera’s St Vrain Valley School District plan, for example, would be denied the ability to have Medicare pick up his cost-share for Nextera’s in-office labs and immunizations, Nextera’s on-site pharmacy, or Nextera’s on-site cabinet of durable medical equipment. Were a dual covered employee to choose the Nextera clinic, she would have to make a point of declining to have Nextera draw her blood work or put her in a walking boot.
Most employer workforces have a relatively small percentage of employees over age 64. But providing health coverage for these elders is apt to be very costly. The employees likely to be most costly are the very ones who will find Medicare Part B’s annual premium of less than $1750 a good bet for avoiding cost-sharing burdens like those in Nextera’s SVVSD plan – a $2000 deductible and a $4000 mOOP.
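A minimal sketch of why that bet works, using the plan figures above. The 20% coinsurance rate and the $15,000 utilization level are assumptions for illustration, and the model simplifies by treating Medicare-as-secondary as absorbing all of the primary plan’s cost-sharing.

```python
# Compare the Part B premium against the cost-sharing that Medicare as
# a secondary payer can absorb. Deductible, mOOP, and premium figures
# are the ones quoted in the text; coinsurance and claims are ASSUMED.
PART_B_ANNUAL_PREMIUM = 1735   # under $1750, per the text
DEDUCTIBLE = 2000
MAX_OOP = 4000
COINSURANCE = 0.20             # assumed plan coinsurance rate

def primary_plan_oop(annual_claims: float) -> float:
    """Employee cost-sharing under the primary plan alone."""
    if annual_claims <= DEDUCTIBLE:
        return annual_claims
    oop = DEDUCTIBLE + COINSURANCE * (annual_claims - DEDUCTIBLE)
    return min(oop, MAX_OOP)

# A heavy utilizer expecting $15,000 in claims hits the mOOP:
exposure = primary_plan_oop(15_000)          # $4,000 at risk
net_gain = exposure - PART_B_ANNUAL_PREMIUM  # premium is the cheaper bet
print(exposure, net_gain)
```

For a low utilizer the inequality flips, which is the selection story in a nutshell.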
Accordingly, those with dual coverage are likely to be high utilizers of services with nothing to gain from DPC. Or, worse: some will pay more in employee contributions; some will have added costs and/or inconvenience owing to Medicare opt-out by the DPC provider.
These high-cost, dually-covered employees will disproportionately end up in the non-DPC cohort under most employer DPC option plans. And every one of them will skew non-risk-adjusted claims data, contributing to a selection bias artifact masquerading as DPC savings.
Much the same reasoning will apply to other employees who have a secondary coverage, such as being a covered spouse. Dual coverage usually comes at a price, such as a premium add-on for spousal coverage. But the price will often be worth it for high utilizers whose primary coverage has high cost sharing burdens that can be brought to negligible levels. For these high utilizers, the incentives to select a DPC option are minimal, even negative if the DPC option comes with a larger employee contribution.
Finally, whatever the source of secondary coverage, the heavy utilizers for whom it is particularly desirable are also the very people most likely to cling to particular PCPs who have served them well in the past, rather than sign on with a DPC clinic offering a narrow choice.
Two recent DPC brags fit together in a telling way.
Nextera Healthcare reported non-risk-adjusted claims data indicating that employees of a Colorado school district who selected Nextera’s DPC option had total costs that were 30% lower than those who selected a traditional insurance option. But that employer’s benefit package confers huge cash advantages (up to $1787) on risky, older patients if they choose the non-DPC option.
Comes now DirectAccessMD with a report of non-risk-adjusted claims data that employees of a South Carolina county who select a DPC option have percentage cost “savings” that are less than half of those shown for Nextera, a mere 14% lower than those who selected a traditional insurance option. But that employer’s benefit package design has an opposite effect to Nextera’s, conferring significant cash advantages on risky, older patients if they choose the DPC option. (And, good on the County and the DPC for it!)
Nextera’s program works to push risky, older patients away from DPC; DirectAccessMD’s program welcomes them. Pushing the sick and old away helps boost Nextera’s corporate brag to more than double that of DirectAccessMD, who proudly invites the same people in — even if doing so makes it appear less successful.
Even without a fancy actuarial analysis, or even a basic one based on patient demographics, it’s apparent that most of Nextera’s brag is merely selection bias artifact masquerading as Nextera DPC cost-effectiveness.
Is DirectAccessMD’s clinic free of selection bias? Not at all. Selection bias can also arise when older and/or riskier patients make enrollment decisions based on differences in access to physicians of their choice. Older, riskier patients tend to cling to a long-standing PCP rather than select from the relative few offered by DirectAccessMD. Think of this as a primary care form of cherry-picking by narrow-networking.
We have learned that Anderson County’s DPC patients are 2.7 years younger than their non-DPC counterparts. Applying CMS risk adjustment coefficients for age and sex reduces DirectAccessMD’s “savings” from 14% to 2%.
If you would like to bet that the age difference between the two Nextera school groups discussed above is less than 2.7 years, please use the contact form. I’ll be right there.
The DirectAccessMD clinic that serves the employees of Anderson County, SC, is run by a tireless advocate for, and deep believer in, DPC, Dr J Shane Purcell. Here the employer, with Dr Purcell’s apparent support, has taken steps that seem to have somewhat mitigated the selection bias that is baked into most other direct primary care option arrangements. Specifically, the dual benefit plans here have both a lower deductible ($250) and a lower co-insurance maximum ($1250) for DPC patients than for non-DPC patients ($500, $2500). Where other benefit plan structures, like the Nextera SVVSD plan reported here, push higher risk patients away, the Anderson County plan is more welcoming to those patients. I applaud the County and Dr. Purcell.
In fact, a high risk Anderson employee can see more than $1500 per year in added costs if she declines DirectAccessMD. A patient expecting the average utilization seen for the FFS cohort (~$4750) in Anderson County would likely incur about $375 in added costs by declining DPC, where an average patient in Nextera’s SVVSD plan would have saved $925 for doing the same. Again, this important difference is a feather in Dr Purcell’s cap.
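The worst-case arithmetic behind that comparison can be sketched as follows, using only the deductible and coinsurance-maximum figures quoted above; premium differences and copays are ignored, so this captures the cost-sharing piece only.

```python
# Worst-case annual cost-sharing exposure under the Anderson County
# dual-plan design: full deductible plus the coinsurance cap.
# The $250/$1250 and $500/$2500 figures come from the text.
def max_exposure(deductible: float, coins_max: float) -> float:
    # Worst-case annual out-of-pocket for a very high utilizer.
    return deductible + coins_max

dpc_worst = max_exposure(250, 1250)       # $1,500
non_dpc_worst = max_exposure(500, 2500)   # $3,000
print(non_dpc_worst - dpc_worst)          # a high-risk employee risks
                                          # $1,500 more by declining DPC
```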
In November 2020, we applied CMS’s actuarial value calculator to compare the County’s plans. The traditional plan had an actuarial value of 82%, the DPC plan of 87%. More on this update below.
Yet, as the recent Milliman study suggests, high risk patients may be reluctant to disrupt standing relationships with their PCPs, and may choose to resist other incentives if it means having to select a new PCP from a small panel at a given DPC clinic. Consider also that older employees, even those not at high risk, are more likely than younger employees both to have deeper attachments to their long-standing PCP and to have more disposable income to spend on keeping that relationship going. On average, therefore, we would expect employees who eschewed the direct primary care package to be an older and/or riskier group. Let’s go to the tape.
Not surprisingly, raw data — without any risk adjustment — from the employer indicates a noticeably smaller percentage of purported savings than has been bragged about by other DPCs in the past. Anderson County’s net cost for DPC members came in at 9% less than for non-DPC members, but the employees in DPC paid OOP only about half of what their non-DPC counterparts did. Combining both employer and employee costs, the average total spend for Anderson County DPC patients came to about 14% less than for non-DPC patients, purported savings of $56 a month.
But note these warnings from the Milliman study: “We urge readers to use caution when reviewing analyses of DPC outcomes that do not explicitly account for differences in population demographics and health status and do not make use of appropriate methodologies.” Or this more recent one: “It is imperative to control for patient selection in DPC studies; otherwise, differences in cost due to underlying patient differences may be erroneously assigned as differences caused by DPC.”
A risk analysis of the health status of all the county’s patients, fully detailed as to all chronic conditions, may not have been financially feasible for a modest operation like Dr Purcell’s. But a sensible population demographic methodology is at hand: comparing the ages of the two populations and using that as a predictor of utilization. This is certainly a “rough approximation”. But, not only is a rough risk adjustment likely to be better than no risk adjustment at all, the reasonableness of using age as a proxy for predicted utilization is affirmed by the fact that nearly all DPC practices use age-cost bands, and no other risk-based factor, in setting their subscription rates. Basic demographics are at the core of risk adjustments used by CMS for the ACA; over 75% of ACA enrollees under age 65 have no adjustment-worthy chronic conditions; they are risk-adjusted on demographics alone.
The coefficients for age/sex risk adjustment used by CMS for ACA plans in 2020 can be seen here. Dr. Purcell’s slide pointing out the DPC cohort of Anderson County employees was 2.7 years younger than the traditional cohort is here. Going back to the tape, I estimated the risk adjusted overall medical costs for the DPC membership to be about 11.5% lower than for the traditional primary care group.
A second adjustment points the other way. All other things being equal, richer plans are known to produce “induced utilization”. CMS’s risk transfer machinery applies an induced utilization factor to adjust for benefit richness. As shown on the calculation sheet also linked above, this adjustment would increase employer costs by 3.8%.
This brings a tentative measure of savings, pending more definitive risk adjustment, to about 6% overall, about $25 PMPM.
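For readers keeping score, the dollar figures are internally consistent. A quick cross-check, using only the 14%, $56, and ~6% figures already given above:

```python
# The raw comparison put DPC total spend about 14% ($56 PMPM) below
# non-DPC. That implies the non-DPC baseline, from which the PMPM
# value of the risk-adjusted ~6% figure follows directly.
raw_savings_pct = 0.14
raw_savings_pmpm = 56.0
baseline_pmpm = raw_savings_pmpm / raw_savings_pct    # ~$400 total spend

adjusted_savings_pct = 0.06
adjusted_pmpm = adjusted_savings_pct * baseline_pmpm  # ~$24, "about $25"
print(round(baseline_pmpm), round(adjusted_pmpm))
```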
Maybe it is not what Dr. Purcell hoped, but his results are more promising than most others.
The great caveat, of course, is that proper risk adjustment could turn this estimate up or down. In the Milliman study, the difference between the actuarial values of the benefit packages of the DPC and FFS programs was modest, yet the Milliman team still found massive adverse selection into the FFS. Milliman accounted for this as the result of sicker people clinging to the trusted PCPs who had served them in the past. I think of that as adverse selection by narrow primary care panel. Whatever the explanation, Milliman found that the selection bias required an overall upward adjustment of 36% of DPC costs; and they predicted that most employer DPC option clinics would see similar selection. On the other hand, a fairly large difference in benefit packages favoring DPC members, as in Anderson County, is something the Milliman team appears not to have contemplated, and it must surely drive some of the risky into the DPC pool.
I am still not betting on DPC saving big money. But, if you call me with your proposed wager, I’ve shortened the odds.
In a prior post, I suggested that Milliman’s use of downstream claims data in assessing utilization in Union County’s employee health plans may have been distorted in favor of DPC because that downstream data had not been adjusted to reflect the effects of the County’s cost-sharing design on utilization.
In a footnote to a recent comment on a Milliman web forum, two Milliman actuaries addressed similar concerns for the first time.
In addition to increasing an employer’s share of costs, benefit changes can also affect how much care members utilize. This affect (sic) is commonly referred to by actuaries as induced utilization, and should be considered by employers when structuring DPC options and by those evaluating the impacts of DPC. For our case study, the benefit design under the DPC option was slightly richer in aggregate than the PPO option, meaning that based on benefit design differences alone, members would be expected to use slightly more services in aggregate when enrolled in the DPC option than when enrolled in the PPO option. This is due to the employer waiving cost sharing for primary care services as well as the medical deductible for all services under the DPC option. Since the difference was relatively small, we conservatively did not apply an induced utilization adjustment in our estimates. If we had, the reduction in demand corresponding to enrollment in the DPC option would have been slightly greater.[Emphasis supplied.]
I thank Milliman for putting a proper actuarial name on my concerns. I am not an actuary.
When I first saw this, I took issue, suggesting that the details of the small difference might matter if the whole thing was looked at with sufficient granularity.
I have since learned how to deploy the CMS AV calculator and CMS rules on induced utilization adjustment for risk transfer to get the granularity I sought.
Silly me. They were right. In fact, the induced utilization adjustment would be about 0.2% in favor of the DPC.
If you want an example of induced utilization, consider the Nextera SVVSD clinic — where the non-DPC cohort gets a $750 HRA unavailable to DPC members, while paying only 10% coinsurance versus 20% for the DPC members. This not only creates a huge selection bias that pushes the sick away from DPC, it drops healthy family members who were just brought along into a very favorable space for health care consumption.
At last, it dawns on me. Selection bias is baked into virtually every DPC cake.*
Direct primary care usually comes with a significant price and a package of financial incentives revolving around primary care (and, sometimes, around some downstream care). For some, the game may be worth the candle. The incentives, typically the absence of primary care visit cost-sharing and free basic labs and generic drugs, have their best value for those who expect total claims to fall near but still short of their deductibles. These people are relatively low risk.
For those expecting to have total claims that will exceed their deductibles even if they receive the incentives, the dollar value of those incentives is sharply reduced — usually to a coinsurance percentage of the claims value of the incentive. These people have risk levels that range from a bit below average to well above average.
The least healthy people have the highest claims. At a next level, and all the way up to the stratosphere, are insured patients expecting to hit their mOOP in an upcoming year. For a typical employer contract, however, these people are not necessarily extreme; for an employee with a $2000 deductible and a $4000 mOOP, this represents a $12,000 claims year. That’s not even a single knee replacement at the lowest cash price surgery center. For these, DPC’s financial incentives have essentially zero financial value.
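The $12,000 figure follows directly from the plan numbers, assuming a 20% coinsurance rate (an assumption; the rate is not stated in this paragraph):

```python
# How many dollars of claims it takes to hit the mOOP in a typical
# employer contract. Deductible and mOOP are the figures in the text;
# the coinsurance rate is an ASSUMPTION.
DEDUCTIBLE = 2000
MAX_OOP = 4000
COINSURANCE = 0.20

# Cost-sharing above the deductible until the mOOP is reached:
coinsurance_dollars = MAX_OOP - DEDUCTIBLE             # $2,000 paid at 20%
claims_after_deductible = coinsurance_dollars / COINSURANCE
claims_to_hit_moop = DEDUCTIBLE + claims_after_deductible
print(claims_to_hit_moop)   # 12000.0
```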
Higher risk patients have significantly less incentive to elect direct primary care. DPC patient panels are enriched for low risk patients while higher risk patients tend to go elsewhere.
Financial considerations apart, the higher risk patients are also likely to be the ones least interested in replacing established relationships with particular PCPs with primary care from a narrow panel DPC practice. A second reason why DPC patient panels are enriched for low risk patients while higher risk patients tend to go elsewhere.
The upshot: Virtually any employer-option DPC clinic can trot out unadjusted claims data that shows employers having lower PMPMs for DPC patients than for FFS patients. After risk adjustment, however, not so much.
*I recently came across an employer health benefit system that included both a DPC option and cost-sharing features that apparently mitigated selection bias somewhat. But note, in that program, employees who chose to retain relationships with PCPs not affiliated with the DPC clinic paid up to $1250 per year for that privilege. Outlays of that order seem likely to correlate with either profound health impairments or advanced age. I have learned that the non-DPC population at that employer is, on average, two years older than the DPC population. On a standard age-cost curve of ~4.6 to 1, every penny of the difference between the groups can be fully accounted for.
Dr. Lee Gross’s Epiphany Healthcare provides DPC services for some of the employees, and some members of some of their families, at a hospital in Florida. Some hospital employees decline Epiphany; they and some members of their families receive instead traditional insurance based primary care. Unusually for such arrangements, a recent assessment found that the dollar value of the claims experience for the DPC cohort exceeded that of the FFS cohort by about 15%.
There appears to have never been a case in which a DPC provider applied risk-based adjustment to validate raw claims cost differences that seemed to reflect well on DPC. When, on the other hand, raw data went the opposite way for him, Dr. Gross decided the time had come to become a data analytics pioneer and develop a risk adjustment model for direct primary care.
Conveniently, widely-used models developed by professional actuaries in and for CMS (in making patient population risk adjustments for use in Medicare Advantage plans and under the Affordable Care Act) and Dr Gross’s share a common feature – each adds up, in certain circumstances, the total number of a patient’s chronic conditions to develop an adjustment factor. This facilitates a comparison.
To assess the precise role of the common feature of the two models, I will summarize the more widely used CMS model and then explain some key differences between it and the Gross model.
In CMS’s method, depending on the case, up to four different components come into play. [FYI, the lettering system is mine and is intended to simplify the explanation.]
A. There is, of course, a significant demographics component scored to reflect the historical utilization by persons, specified by age, gender, and other factors. Basic demographics are at the core of risk adjustments used by CMS for the ACA; over 75% of ACA individual enrollees under 65 have no adjustment-worthy chronic conditions and are risk-adjusted on demographics alone.
B. There is a specific conditions component in which a patient is given a score for each condition she has off a list of over eighty broad health conditions, each of which has a specific score reflecting that condition’s historical correlation with the use of medical services – so much for asthma, so much for porphyria, CHF, etc.
C. There is also an “interactions” component in which the patient can receive additional scores, as many as she may qualify for, off a list of certain specific combinations of conditions from “B” above that are historically correlated with an enhanced need for services when those health conditions occur together.
D. A recently added number-of-chronic-conditions component reflects considerations generally similar to those in Dr Gross’s index.
(D)(1) There is a specific list of about two dozen chronic conditions drawn from the eighty-plus conditions in “B” above.
(D)(2) If a patient has FOUR or more of the conditions in the list in (D)(1), then the patient is given an additional score that reflects how having that many of those conditions has historically been correlated with an enhanced need for services.
Component (D) scoring is non-linear: the adjustment for five conditions reflects historical correlation between costs and having five conditions; scoring is computed separately for four, five, six, etc. Eight conditions are not simply given twice the score given for four conditions.
Now let’s look under the hood of the Gross model to see how these factors play out in his risk adjustment model for direct primary care.
Gross’s model removes the demographic component.
Under Gross’s model the number of different specific conditions being scored is reduced from a hierarchy of over eighty condition categories down to a single data point: a “chronic condition”.
Gross’s model removes the interactions component.
Gross’s model counts all chronic conditions equally and assumes a linear relationship between health costs and the raw number of chronic conditions.
Gross’s model starts the count of multiple chronic conditions needed to trigger the multiple chronic conditions factor at one (1), while the CMS model believes that number of conditions requiring a complexity factor should start at four (4).
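To make the structural contrast concrete, here is a sketch of the two approaches side by side. The coefficient values are hypothetical placeholders of mine, chosen only to show the shape of each method; they are not CMS’s published figures, and the Gross per-condition weight is likewise invented.

```python
# CMS-style multiple-chronic-condition add-on: non-linear, nothing
# below four conditions, separately estimated per count.
# Gross-style add-on: linear, every condition equal, starting at one.
# All coefficient values below are HYPOTHETICAL, for illustration only.
CMS_MCC_ADDON = {4: 0.30, 5: 0.45, 6: 0.65, 7: 0.90}  # placeholder values

def cms_style_addon(n_chronic: int) -> float:
    """Non-linear add-on: zero below four conditions, then a
    separately estimated coefficient per count (capped at the top)."""
    if n_chronic < 4:
        return 0.0
    capped = min(n_chronic, max(CMS_MCC_ADDON))
    return CMS_MCC_ADDON[capped]

def gross_style_addon(n_chronic: int, per_condition: float = 0.15) -> float:
    """Linear add-on from one condition up, every condition equal."""
    return n_chronic * per_condition

for n in (1, 3, 4, 8):
    print(n, cms_style_addon(n), gross_style_addon(n))
```

Note that under the non-linear lookup, eight conditions do not simply score twice what four do, which is exactly the point of component (D)’s separate coefficients.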
A theoretical advantage of Gross’s method is ease of application.
On the other side, Gross’s model is untethered to any historical utilization data whatsoever. Within the one component where there is any measure of similarity between the Gross model and the CMS model, Gross’s approach expressly contradicts CMS’s actuaries’ explicit determinations that a linear correlation did not accord with historical reality and that the count should begin at four (4). (CMS actuaries did agree that there was an additive effect for each additional condition, but unlike Gross they denied that the additive effect was necessarily linear.)
On every other component Gross’s model is in even sharper contrast to the views of the professionals. Notwithstanding Dr Gross’s model, demographics strongly correlate with costs. As noted, over 75% of patients under 65 have their risk adjustments based only on demographics, as they have no chronic conditions among the many dozens that the professionals felt needed to be included. Notwithstanding Dr Gross’s model, among the 25% or less with actionable chronic conditions, costs vary profoundly from one condition to another; conditions are hierarchical. Notwithstanding Dr Gross’s model, multiple conditions do interact with each other.
Gross’s risk adjustment model for direct primary care looks like something that has been reverse engineered from adverse data for the sole purpose of rescuing Dr Gross’s corporate brand at its largest employer.
As with all other aspects of his otherwise relentless promotion of the results at Desoto Memorial Hospital, Dr Gross has expressly declined to make public any details of his methods of gathering and processing data. Even if the very outline of his risk methodology were not at war with those of his analytical betters, important questions would remain.
For example, what criteria did he use to identify and count chronic conditions in the DPC cohort (his own patients) and in the non-DPC patient cohort (who get primary care elsewhere)? Especially for gathering chronic condition data on those who are not his patients, what, if anything, did he do to validate data accuracy and consistency?
For these reasons, I report only this: (See update below.)
The raw data shows that non-Epiphany patients at Epiphany’s largest employer clinic have lower claims costs.
Dr Gross has produced a slide show that includes, as far as I can tell, all the public information about the Epiphany experience at that employer.
The slide show utilizes the Gross risk adjustment methodology for direct primary care discussed above.
On Slide 22, Dr Gross made an error of double adjustment. Here is what appears.
Actually, I have decided to go a step further by announcing an opinion: I think Dr Gross’s approach is complete bullshit. And yet it still may be a bit more justifiable than the bullshit of the KPI Ninja/Nextera report; Gross, after all, is not falsely claiming the imprimatur of a prestigious academic research team.
Skillful actuarial work on risk adjustment. A clear warning against relying on studies that ignored risk adjustment. Implicit repudiation of a decade of unfounded brags.
An admirable idea on “isolating the impact of DPC model” from the bad decisions of a studied employer.
Milliman should have recognized that the health service resources that go into providing direct primary care are vastly more than the $8 PMPM that emerged from its modeling, and should have done more to subject the data on which the number rested to some kind of validation.
Upshot: there is still no solid evidence that direct primary care results in a reduced overall level of utilization of health care services. Milliman’s report needs to clearly reflect that.
Overview: A core truth, and a consequence.
The Milliman report on the direct primary care option in Union County has significant truth, an interesting but unperfected idea, and staggering error. The core truth lay in Milliman determining through standard actuarial risk adjustment that huge selection effects, rather than the wonders of direct primary care, accounted for a 36% difference in total health care costs between DPC and FFS populations. Both Union County and the DPC provider known as Paladina Health had publicly and loudly touted cost differences of up to 28% overall as proof that DPC can save employers money. But naysayers, including me, were proven fully correct about Union County — and about a raft of other DPC boasts that lacked risk adjustment, like those regarding Qliance.
The estimated selection pattern in our case study emphasizes the need for any analysis of cost and utilization outcomes for DPC programs to account for the health status and demographics of the DPC population relative to a control group or benchmark population. Without appropriate consideration for how differences in underlying health status affect observed claim costs and utilization patterns, analyses could attribute certain outcomes to DPC inappropriately. We urge readers to use caution when reviewing analyses of DPC outcomes that do not explicitly account for differences in population demographics and health status and do not make use of appropriate methodologies
Still, Union County had made some choices that made the overall results worse than they needed to be. That’s where Milliman’s ingenuity came into play in what might be seen as an attempt to turn the County’s lemon into lemonade for the direct primary care industry. And that is where Milliman failed, in two major ways each more than important enough to make the lemonade deeply unpalatable.
The Union County DPC program was even more of a lemon than Milliman reported.
To Milliman’s credit, it did manage to reach and announce the inescapable conclusion that Union County had increased its overall health care expenditure by implementing the direct primary care option. Even then, however, Milliman vastly understated the actual loss. That’s because its employer ROI calculation rested on an estimate of $61 as the average monthly direct primary care fee paid by Union County to Paladina Health; the actual average monthly fee paid was $95. There had been no need for a monthly fee estimate, as the fees were a matter of public record.
Though $1.25 million in annual savings had been claimed, the total annual loss by Union County was over $400,000. Though 28% savings had once been bragged, the County’s ROI was actually a negative 8%. Milliman’s reliance on an estimate of the fees received from the County, rather than the actual fees collected, made a fiscal disaster appear to be close to a break-even proposition.
Milliman’s choice likely spared the county’s executive from some embarrassment.
For more detail on Union County’s negative ROI and Milliman’s understatements of it, click here and/or here.
See here: I’ve changed the grade. Milliman came up with a sound idea, but flubbed the design details.
To prevent the County’s specific bad choices about cost-sharing from biasing impressions of the DPC model, Milliman developed a second approach that entailed looking only at a claim cost comparison between the two groups. According to Milliman, “this cost comparison attempts to isolate the impact of the DPC delivery model on the overall level of demand for health care services”. [Italics in original.] Milliman should be proud of thinking of this approach. It seems likely to work where it is feasible, that is, where both the DPC and FFS cohorts face near identical cost-sharing incentives for downstream care.
But that was not the case in Union County, where DPC and FFS patients faced very different cost-sharing regimes when making downstream care choices. The County’s cost-sharing choices are imprinted on the downstream claims that fuel the isolation model; for the average FFS patient, cost-sharing discipline for downstream claims was much reduced relative to that experienced by the average DPC patient.
Until it completes the hard actuarial work needed to quantify the skew of the downstream claims cost data that fueled its isolation model, Milliman should back off its claim to have successfully determined the health care utilization impact of DPC.
For slightly more detail on why I’ve changed my view on this, click here.
The Milliman calculation of 12.6% overall savings turns on a massive underestimate of the cost of the direct primary care clinic studied.
Milliman needed to determine utilization for the DPC clinic.
Assuming, arguendo, that downstream claims data for Union County are not contaminated by the Union County cost-sharing schema would bring us to the next step. Milliman’s model for relative utilization is simple: compare total health care costs, primary care plus downstream care, for the DPC cohort against the same total for the FFS cohort.
Milliman used claim costs for both downstream components of the computation, and for the FFS primary care utilization. But because DPC is paid for by a subscription fee, primary care utilization for the DPC patients cannot be determined from claims costs.
One reasonable way of estimating the health services used by DPC patients might be to use the market price of DPC subscriptions, about $61 PMPM. With this market value, the computation would have yielded a net utilization increase for DPC. Milliman eschewed that method.
Another reasonable way of estimating the health services used by DPC patients might be to estimate the cost of staffing and running a DPC clinic, using available data about PCP salaries and primary care office overhead: probably at least $40 PMPM by a conservative estimate. With this low number, the computation would have yielded a net utilization decrease for DPC of well under 5%. Milliman eschewed that method.
The lower the value used for utilization of direct primary care services, the more favorable DPC appears. Ignoring models that would have pointed to the $61 and $40 values, Milliman used a methodology that produced an $8 PMPM value, which resulted in a computed 12.6% reduction in overall usage.
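The sensitivity of the model to this one input can be sketched in a few lines. The PMPM totals below are hypothetical, back-solved only so that the $8 input reproduces the reported 12.6% reduction; the actual cohort totals are not public in this form.

```python
# Sensitivity of the "isolation" model to the PMPM value assigned to DPC
# primary care. Dollar totals are assumptions, chosen so that the $8
# input reproduces the reported 12.6% reduction.
FFS_TOTAL_PMPM = 372.0       # assumed FFS cohort total (primary + downstream)
DPC_DOWNSTREAM_PMPM = 317.0  # assumed DPC cohort downstream claims

def overall_change(dpc_primary_pmpm: float) -> float:
    """Percent change in total utilization, DPC vs. FFS (negative = savings)."""
    dpc_total = DPC_DOWNSTREAM_PMPM + dpc_primary_pmpm
    return (dpc_total - FFS_TOTAL_PMPM) / FFS_TOTAL_PMPM * 100

for label, pmpm in [("ghost claims", 8), ("staffing estimate", 40), ("market price", 61)]:
    print(f"{label:>17} (${pmpm} PMPM): {overall_change(pmpm):+.1f}%")
```

Under these assumptions, $8 yields roughly a 12.6% reduction, $40 roughly a 4% reduction, and $61 a net increase: the headline result lives or dies on the primary care input.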
Milliman’s “ghost claims” method was ill-suited to DPC and vulnerable to underreporting.
Milliman’s “solution”, however, turned on the stunning assumption that utilization of subscription-based, holistic, integrative direct primary care could be accurately modeled using the same billing and coding technology used in fee-for-service medicine. As a group, the DPC community loudly disparages such coding, both for failing to compensate their efforts and for wasting their time. D-PCPs don’t even use billing-friendly EHRs.
Yet Milliman chose to rely on the clinic’s unwilling DPC physicians to have accurately coded all services delivered to patients, to have prepared from those codes “ghost claims” resembling those used for FFS payment adjudication, and to have submitted those ghost claims to the employer’s TPA, not to prompt payment, but solely for reporting purposes. The collected ghost claims were converted into a direct primary care utilization figure by applying the Union County FFS fee schedule. The result was $8 PMPM.
The $8 PMPM level of clinic utilization determined by the ghost claims was absurd.
Valuing the health services utilization for patients at the direct primary care clinic at a mere $8 PMPM is at war with a host of things that Milliman knew or should have known about the particular clinic, knew or should have known about the costs of primary care, and knew or should have known about the nature of direct primary care. Clinic patients were reportedly receiving three visits a year; three visits cannot be delivered for $96 a year ($8 PMPM × 12 months). The length of clinic visits was stressed. County and clinic brag of 24/7 access and same-day appointments for 1,000 clinic patients. The clinic was staffed at one PCP per 500 members; at $96 a year per member, clinic revenue would have been $48,000 per PCP. This does not pass the sniff test.
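The sniff test is one line of arithmetic; here it is as a quick check, using only the figures already stated above:

```python
# Back-of-the-envelope check on the $8 PMPM figure, using numbers from the post.
ghost_pmpm = 8           # Milliman's computed DPC utilization, dollars per member per month
members_per_pcp = 500    # the clinic's reported panel size per physician

annual_per_member = ghost_pmpm * 12                       # dollars per member per year
implied_pcp_revenue = annual_per_member * members_per_pcp # implied annual revenue per PCP

print(annual_per_member)     # 96 -- supposedly covering three visits a year
print(implied_pcp_revenue)   # 48000 -- nowhere near the cost of fielding one PCP
```

$48,000 per PCP would not cover a physician’s salary, let alone staff and overhead.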
The only visible path to Milliman’s $8 PMPM figure for health services demand for the delivery of direct primary care is that the direct primary care physicians’ ghost claims consistently underreported services. About what one might expect from “ghost claims” prepared by code-hating D-PCPs with no motivation to code or claim accurately (or, perhaps, even with an opposite motivation). Milliman even knew that the coding habits of the DPC practitioners were inconsistent: the ghost claims sometimes contained diagnosis codes and sometimes did not. Report at page 56.
Yet, Milliman did nothing to validate the “ghost claims”.
Because the $8 PMPM is far too low, the 12.6% overall reduction figure is far too high. As noted above, substituting even a conservative estimate of the cost of putting a PCP into the field slashes 12.6% to something like 4%. If the $61 market price determined in the survey portion of the Milliman study is used in place of the $8 PMPM, Milliman’s model would show that direct primary care increases the overall utilization of health services.
For more detail on the erroneous primary care data fed to Milliman’s isolation model, click here.
Union County paid $95 a month to have Paladina meet an average member’s demand. That Milliman computed the value of what the members received as $8 per month is absurd.
Milliman should amend this study by adopting a credible method for estimating the level of health services utilized in delivering primary care at the DPC clinic.
Milliman’s good work on risk adjustment still warrants applause. Indeed, precisely because the risk adjustment piece was so important, the faulty work on utilization should be corrected, lest bad work tar good, and good work lend credibility to bad.
1 The reaction by those who had long promoted the Qliance boasts to Milliman’s making clear the necessity of risk adjustment was swift and predictable: never ignore what can be lied about and spun. The DPC Coalition is a lobbying organization co-founded by Qliance; a co-founder of Qliance is currently president of the DPC Coalition. The DPC Coalition promptly held a legislative strategy briefing on the Milliman study at which its Executive Director ended the meeting by declaring that the Milliman study had validated the Qliance data.
In its recent assessment of the impact of the direct primary care model, Milliman took a two-track approach. An employer-ROI-based approach compared claims experience for a group of employees who opted to receive primary care from a DPC clinic versus those using traditional FFS PCPs; in addition, the ROI analysis assigned to the employer the costs of the differing cost-sharing structures the employer had required of the two groups. This computation resulted in the conclusion that in the plan under study DPC actually increased employer costs.
In a second approach, however, Milliman looked only at a claim cost comparison between the two groups. According to Milliman, “this cost comparison attempts to isolate the impact of the DPC delivery model on the overall level of demand for health care services“. [Italics in original].
In June, I wrote that, “Given that the employer offered different cost sharing regimes to the two groups, the purported isolation is simply not possible.” There followed a lengthy argument, which I now retract.
Or so I thought.
Then, in November of 2020, I acquired the tools to more accurately assess induced utilization. I now understand that the difference between the cost-sharing regimes is negligible. Using the CMS AV calculator to compare the plans, and CMS induced utilization proportions, I compute that the difference would be less than 0.2%, in favor of DPC.
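For readers who want the shape of that computation, here is a sketch. The actuarial values and the induced-demand curve below are placeholders I invented, loosely mimicking how CMS factors scale with AV; they are not the actual CMS calculator outputs.

```python
# Placeholder sketch: how a small AV gap translates into a small
# induced-utilization difference. The AVs and the factor curve are
# assumptions, not CMS outputs; only the form of the comparison matters.

def induced_factor(av: float) -> float:
    # Hypothetical linear ramp from factor 1.00 at AV 0.60 to 1.15 at AV 0.90,
    # roughly the shape of CMS induced-demand adjustments.
    return 1.00 + 0.15 * max(0.0, av - 0.60) / 0.30

av_ffs, av_dpc = 0.834, 0.830   # placeholder actuarial values for the two plan designs
diff = induced_factor(av_ffs) / induced_factor(av_dpc) - 1
print(f"induced-utilization gap: {diff:+.2%}")  # under 0.2%
```

With plans this close in actuarial value, the cost-sharing gap induces a utilization difference of a fraction of a percent, which is why I now regard it as negligible.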
To kick off our open enrollment period two years ago, we at Ratner Industries held a company-wide employee meeting. There we unveiled our brand-new high-deductible plan option. To get a rough idea of how many employees planned on electing each option, we offered free bags of M&Ms to employees as they left, requesting that those leaning toward the high-deductible option each take a bag of peanut M&Ms, while those remaining in our standard plan were asked to take only traditional M&Ms. About equal numbers took bags of each of the two kinds.
We just ran some claims data (fully risk adjusted, of course) for the first full year under these plan choices. We found that those who received peanut M&Ms had 12% lower utilization of services overall compared to those receiving traditional M&Ms.
Have we proven the power of peanuts? Caiaphas, a commenter, has suggested that this is a case of borrowed valor, with the HDHP plan having done the hard work for which the pro-peanut caucus has “improperly” taken credit. Caiaphas’s anti-peanut bias is evident.