More than 12 million Americans selected a health insurance plan on the Affordable Care Act’s (ACA) public marketplaces in the fourth open enrollment period (OEP4; November 2016–January 2017).1 However, consumers struggle to choose health insurance plans because of the number and complexity of plan options, limited health insurance literacy, and lack of information.2,3,4 Overwhelmed consumers may even forgo insurance enrollment altogether.5

Consumers’ insurance choices strongly influence how they access (or are unable to access) high-quality health care, particularly in the primary and inpatient care settings. To understand the true cost of a health insurance plan, consumers must calculate often complicated cost sharing values (e.g., deductibles and coinsurance) to estimate out-of-pocket spending. As a result, cost-conscious but overwhelmed consumers may focus excessively on monthly premiums and make suboptimal plan choices.6 Similarly, as insurers use narrow provider networks to control costs, consumers need to know if their preferred providers are covered in their selected plan.7 Choice errors carry substantial financial consequences, as cost sharing rises and out-of-network costs are often not subject to out-of-pocket maximums (meaning that consumers bear even more financial risk).

The design of the online health insurance marketplaces influences complex health plan choices.6, 8 Our prior work has demonstrated that the public ACA marketplaces have offered various tools in prior open enrollment periods (OEPs) to help consumers make informed health plan choices.9, 10 Total cost estimators, for example, are tools that can help consumers estimate their total yearly spending on healthcare—by predicting the amount of cost sharing (based on expected healthcare utilization), adding it to the annualized premium, and subtracting any subsidy discounts. While most exchange officials see the value in providing these tools, no consensus exists on the best way to structure the estimators, nor has their accuracy been tested.11
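The arithmetic described above can be sketched in a few lines. This is a hypothetical illustration of the general approach, not any exchange’s actual algorithm; the function name, inputs, and the assumption that subsidies are applied as a monthly premium discount are ours.

```python
# Hypothetical sketch of a total cost estimator's arithmetic (not any
# exchange's actual algorithm): annualized premium plus predicted cost
# sharing, minus subsidy discounts applied to the monthly premium.

def estimate_total_cost(monthly_premium, predicted_cost_sharing,
                        monthly_subsidy=0):
    """Return an estimated yearly total cost for one plan.

    predicted_cost_sharing: expected deductible/copay/coinsurance
    spending, derived from the shopper's anticipated utilization.
    """
    annual_premium = 12 * (monthly_premium - monthly_subsidy)
    return annual_premium + predicted_cost_sharing

# Example: $250/month premium, $80/month subsidy, $1,500 expected
# cost sharing -> 12 * 170 + 1500 = 3540
print(estimate_total_cost(250, 1500, 80))
```

The hard part in practice is not this sum but predicting `predicted_cost_sharing` from a shopper’s expected utilization, which is where the exchanges’ estimators diverge.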

Health policy reforms could make the design of the marketplace choice environments even more important if consumers are offered more diverse plan options in a less regulated health insurance market (e.g., repeal of universal coverage for essential health benefits or pre-existing conditions) and if consumers with more “skin-in-the-game” are exposed to higher costs.12, 13 In this study, we build on our prior work by presenting new data on the choice environments of the state-based marketplaces and HealthCare.gov in OEP4. We compare OEP4 to data previously collected from the third open enrollment period (OEP3), as we have seen an evolution in the health insurance exchanges over time.10, 14 We also add an examination of the choice environment on privately run health insurance marketplaces. These privately run health insurance exchanges are similar to the public exchanges in that they are a “one-stop shopping” site for a variety of health plans; however, they are developed and operated by private companies, rather than by the federal or state government. Current policy debates (e.g., selling health plans across state lines, employers forming Association Health Plans, full repeal of the ACA) could lead to increased opportunities for privately run exchange websites to enter the health insurance shopping landscape.13

METHODS

Data Collection

We collected data in November–December 2016 during OEP4 from the federal and all 12 state-based marketplaces established by the ACA (Appendix Table 5). We navigated each site as a typical marketplace shopper in two contexts: “window-shopping” allows consumers to browse plans and prices without creating an account or entering detailed personal information, while “real-shopping” requires a consumer account and personal information. We collected data in both contexts because real and window-shopping have differed in prior OEPs, potentially due to the use of different vendors for each context.9, 10 We compared data from OEP4 to data previously collected using similar procedures during OEP3 in November 2015.9

We also surveyed privately run online health insurance exchanges in November–December 2016 for notable features, defined as choice architecture elements not found on the public exchanges, and total cost estimators. From a list of federally approved brokers15 and RAND-reviewed sites,16 we identified 23 private online exchanges that, like public online marketplaces, sold multiple private health plans without requiring email or phone communication (Appendix Table 5). These web-brokers completed the registration requirements for the federally facilitated marketplace and listed any on-exchange plans for the same prices as public exchange offerings. Some private online exchanges operated in geographically restricted areas.

We determined that total cost estimators for OEP4 plans were available on seven public exchanges (six state-based marketplaces, HealthCare.gov) and two of 23 federally approved private online exchanges (Stride Health, GoHealth). HoneyInsured had a total cost estimator but had not been updated for OEP4 plans, precluding price comparisons. We window-shopped on these nine exchanges as a standardized consumer with common medical conditions: a 30-year-old single male with $25,000 annual income, with type II diabetes and asthma on metformin and albuterol, with an average of four doctor visits per year and no expected medical procedures, and in “good” health.

Outcomes

We documented plan display characteristics, including default order of plans and plan features presented. We compared our findings to the choice environment features previously documented in OEP3, which were selected based on their known availability in prior enrollment periods and potential impact on health plan choice.10, 17

New data elements captured in OEP4 were how consumers could transition between window- and real-shopping platforms and whether exchanges offered a mobile application. Because our prior research illustrated substantial differences in window- and real-shopping and the transition was noted to be cumbersome, we documented if consumers were supported in moving from the browsing window-shopping environment to the real-shopping environment where they purchased plans, such as with an online shopping cart. We additionally searched for health insurance exchange mobile applications as consumers increasingly shop on their mobile devices.18

For the comparison of total cost estimators, we documented estimator inputs (i.e., questions used to generate estimates and descriptions for levels of healthcare utilization) and estimator outputs (i.e., total cost estimates for a particular silver-level plan and for different levels of healthcare utilization) on each of the nine exchanges with an estimator.

For data collection on both private and public exchange websites, all features were surveyed with screenshots and/or screen video-capture documentation by at least two researchers who worked independently. Any coding discrepancy was resolved by team consensus. This study was deemed exempt by the University of Pennsylvania Institutional Review Board. The datasets analyzed during the current study are available from the corresponding author on reasonable request.

RESULTS

Public Health Insurance Exchanges

Consumer Decision Aids

Compared to OEP3, the public marketplaces increased use of nearly all studied consumer decision aids in OEP4 (Table 1). Exchange-specific choice environment details for OEP4 appear in Appendix Table 6. In window-shopping, we found total cost estimators on over half (n = 7 of 13) of the public exchanges versus five of 14 in OEP3. In the District of Columbia, Minnesota, and Vermont, consumers could both sort and filter health plans by total cost estimate, while the rest allowed either sorting or filtering. Idaho displayed flags such as “low” or “average” to indicate the “expense estimate” without providing a specific dollar amount. In real-shopping, only California and HealthCare.gov had total cost estimators.

Table 1 Choice Environment in the Health Insurance Marketplaces, Fourth Open Enrollment Period, Window-Shopping

All sites with integrated provider lookups in OEP3 maintained them in OEP4 (window-shopping, seven; real-shopping, eight). Twice as many marketplaces included an indication of network size (n = 4 of 13 versus 2 of 14). Massachusetts used flags for health plans with a narrow network; the District of Columbia provided a list of in-network hospitals; New York provided a percent of nearby hospitals and providers that were covered; and Idaho used the labels “basic,” “standard,” or “broad” for network size relative to other plans available in the county (for an example of a network size indicator, see Appendix Fig. 2). In window-shopping, the District of Columbia added an integrated drug lookup in OEP4, joining HealthCare.gov and Colorado from OEP3. Only HealthCare.gov had an integrated drug lookup in real-shopping.

Quality ratings were more prevalent in OEP4 (n = 6 of 13 versus 4 of 14). For the first time, pop-up definitions that appeared when hovering a cursor over key health insurance terms (e.g., deductible) were available on all exchanges in both shopping contexts. For consumers qualifying for health plan discount programs, 11 of 13 public exchanges indicated that consumers who qualify for a cost sharing reduction (i.e., reduction in deductible, copay, and coinsurance amounts) should consider silver plans because the savings are only available within the silver tier.

Choice tools on four of 13 sites allowed consumers to narrow their plan options based on their responses to a series of questions. For example, Washington state’s tool asked, “Do you prefer your primary care doctor to manage your health care or do you want to have more choices about which doctors you visit?” to filter by HMO or PPO plans (for an image of the choice tool, see Appendix Fig. 3). Colorado’s choice tool was a series of five filters for preferred provider, premium range, deductible range, metal tier level, and insurance carrier. Each page offered further education; for example, “Plans with the higher deductibles usually have lower premiums and higher out-of-pocket costs at the time you receive services or obtain medications...” Rhode Island asked a series of three questions on frequency of medical use, chronic illness, and payment preferences to reorder plans.

Plan Display Characteristics

The most common default plan orders were total cost estimate and premium (Table 2; for an example of plans ordered by total cost estimate, see Appendix Fig. 4). Plan order could differ between window- and real-shopping for the same state. In window-shopping, 5 of 13 sites ordered plans by estimated total out-of-pocket costs. In real-shopping, eight state-based marketplaces ordered plans by premium. Other orders used were best fit for consumer, silver listed first for consumers who qualified for a cost-sharing reduction, standard plans listed first, or metal tier.

Table 2 Public Exchange Choice Environment in Fourth Versus Third Open Enrollment Period

Three sites (California, Idaho, Washington) provided an online shopping cart to transition from window- to real-shopping. Three public health insurance exchanges had mobile applications (Connecticut, Maryland, DC).

Private Online Health Insurance Exchange Notable Features

Private online exchanges offered several notable features for total cost estimators (Table 3). Stride Health and HoneyInsured indicated out-of-pocket estimates for specific conditions (e.g., concussion). GoHealth provided an infographic representation of the total cost estimate (Fig. 1). HoneyInsured asked a unique question to generate its estimate: “How much do you expect to spend on healthcare if you didn’t have insurance next year?”

Table 3 Notable Features on Private Online Exchange Websites
Fig. 1
figure 1

Infographic for total cost estimator on GoHealth. Source: GoHealth (www.gohealthinsurance.com).

Private exchanges personalized plan display and highlighted recommended plans in notable ways. Examples included identifying a recommended plan; presenting only recommended plans; or using flags such as “best match,” “runner-up,” “hand-picked plans,” or “cheap plans.” HoneyInsured and Stride Health featured plan partitioning, which highlighted certain plans by displaying them separately. GetInsured provided a plan score based on shopper preferences. HealthSherpa provided a map of in-network providers for each health plan. A video illustrating notable features on plan recommendation and costs for specific conditions is available online (Online Video in Appendix).

Private Online Health Insurance Exchange Notable Features Example Source: Stride Health (www.stridehealth.com). Stride Health provided a plan recommendation and out-of-pocket costs for specific conditions in their choice environment, which were notable features not seen on the public insurance marketplaces. (MP4 29746 kb)

Total Cost Estimators

For our standardized consumer, total cost estimates for the same plan in a given state varied substantially between the public and private exchanges (Table 4). The total cost estimate for a Pennsylvania plan sold on HealthCare.gov was $1905 versus approximately $3900 on the private exchanges. Conversely, the Connecticut state-based marketplace estimate ($6352) was higher than the private exchange estimates ($4792 and $4286). The average difference between public and private exchange estimates for the same plan in each state was $1526.

Table 4 Health Insurance Plan Total Cost Estimates* for a Standardized Consumer† Considering the Same Plan By State: Comparison of Estimates Between Public and Private Online Exchanges

Differences in cost estimates may be attributable, in part, to differences in the number and specificity of questions asked to calculate the estimate (Table 4) and the response categories available to consumers (Appendix Table 7). Questions included categorizing medical care and prescription use, self-reported health status, existing medical conditions, expected medical treatments, and ongoing prescriptions. Answer options differed; the lowest healthcare utilization option on HealthCare.gov was described as “minimal other medical expense,” versus “1-2 doctor visits” in California, and “4 doctors’ visits” in Colorado. Distinct utilization categories and estimate algorithms resulted in different distributions of cost estimates between the lowest and highest utilization categories for the same plan.
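The effect of differing utilization categories can be illustrated with a toy calculation. This is purely illustrative under assumptions we have invented (the premium, copay, and visit counts are hypothetical, and real estimator algorithms are more complex): two sites whose lowest utilization category assumes different visit counts will return different estimates for the same plan.

```python
# Illustrative only: two hypothetical estimators map the same shopper
# to different lowest-utilization assumptions, yielding different
# estimates for the SAME plan (premium and cost-sharing design held
# constant). Numbers are invented for illustration.

def cost_sharing(n_visits, copay=40):
    # Simplified cost sharing: a flat copay per office visit.
    return n_visits * copay

annual_premium = 3000  # same plan, same price, on both sites

# Site A's lowest category assumes minimal use (say, 1 visit);
# Site B's lowest category assumes 4 visits.
estimate_a = annual_premium + cost_sharing(1)
estimate_b = annual_premium + cost_sharing(4)
print(estimate_a, estimate_b)  # different totals for the same plan
```

Even with identical premiums and cost-sharing rules, the category definitions alone shift the estimate; differences in the underlying prediction algorithms would widen the gap further.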

DISCUSSION

The ACA’s public health insurance exchanges offered more consumer decision tools in OEP4 compared to OEP3. The increased decision support on the public exchanges is encouraging, as the complexity of selecting a plan has been clearly demonstrated and consumers may be asked to select from more diverse plan options.4, 13, 19 For the first time, we also examined the choice environment of private online health insurance exchanges, which provide alternate venues for consumers to shop for plans. These private exchanges offered notable consumer decision aid features that uniquely presented plan data or further personalized shopping for consumers.12

We also identified that the same health insurance plan considered by the same patient had widely varied total cost estimates on public and private exchanges. Theoretically, these estimates can help consumers identify the highest value health plans. However, the estimates are valuable only if accurate and understood by consumers. A substantial underestimate, for example, could have considerable financial consequences for a patient with a costly chronic condition, while an overestimate may deter a relatively low-cost patient from buying insurance. Notably, only 2 of 23 privately run exchanges, compared with 7 of 13 public exchanges, offered a total cost estimator for plans in the fourth OEP.

HealthCare.gov was the only public marketplace to offer a total cost estimator, provider lookup, and drug lookup in both window- and real-shopping experiences. These tools have been cited as current gold standards of informed consumer choice.20 The federal exchange’s relative scale and budget may have allowed development of these tools in both shopping contexts.12

Provider lookups were again found on the majority of sites in OEP4. Improved transparency of insurance networks continues to be critically important as consumers are selecting among narrow network plans or plans with changing networks of providers.7 Choosing a plan that excludes an existing primary or specialty provider can disrupt continuity of care or expose a patient to much higher out-of-network costs. While we found some indicators of overall network size (e.g., flags, descriptors, percentages of in-network providers), the narrowness of networks may differ for patients needing different specialty medical services, such as mental health or pediatric care.21, 22 Tools, such as the map of nearby primary and specialty in-network providers seen on one of the private exchanges, may better help patients assess the fit of a network for their own medical needs.

While integrated provider lookups were prominently featured on most sites, drug formularies and quality ratings were not. Prescription costs can account for substantial medical expenditures and are associated with medication non-adherence.23 Especially because Medicare.gov has long demonstrated the feasibility and importance of online drug formularies, it is surprising that more public exchanges do not include this feature.24 Quality ratings were also infrequently found. While five-star quality ratings can summarize health plan cost and quality information, no consensus exists on how to generate or explain them.16, 25 Improving the clinical relevance of these measures (e.g., quality ratings specific to patients with diabetes) could facilitate better health plan choices.

For the first time since the ACA public exchanges opened, fewer than half used premium as the default order in window-shopping, instead adopting recommendations to order plans by total cost estimate or best fit.26 Expert groups have suggested listing plans by premium can cause consumers to focus on the premium and ignore sometimes substantial out-of-pocket costs.26 Moving beyond default plan ordering, several private exchanges more boldly made plan recommendations, either explicitly with a “recommended plan” flag or by highlighting certain plans by displaying them separately on the page. Private exchanges have more choice environment flexibility, as they are not constrained by the political pressures and public contracting procedures of state and federally run exchanges. Challenges to these recommendation tools include how to account for consumer preferences and willingness to trade off costs for coverage, and identifying appropriate algorithms since research tends to focus on poor rather than optimal plan choices.16

While we describe choice environment features, we did not test the accuracy or impact of the tools on consumer choices. Well-designed tools may improve consumer shopping,8 while inaccurate or poorly explained tools could inadvertently lead to poor choices. We also may have missed certain choice environment features on the exchanges, though given our multiple years of experience shopping on these websites, any features we missed would likely not be obvious to a typical shopper. Finally, we surveyed the websites early in OEP4, similar to OEP3; the exchange websites may have been updated with new or different choice features after we completed our data collection.

While the future of the ACA and its marketplaces remain uncertain, understanding the construction and impact of the choice environments on health plan selection has implications for improving insurance choices more broadly. Public and privately run exchanges can learn from each other to develop choice environments that best support consumers in making difficult health plan selections. In addition, the decision-making process and tools on the ACA exchanges are relevant to consumers who are selecting among plan options for employer-sponsored or Medicare Advantage insurance.

Research is needed on the impact of different choice environment features. Key research questions include how current tools influence consumers’ plan choices in experimental and real-world settings and what data source and assumption differences lead to the variation in total cost estimates.27 For example, what degree of input specificity (e.g., simple classification as low or medium user, versus indicating specific conditions and medications) best predicts actual expenses? The diversity and evolution but overall similar structure of the ACA exchange choice environments present an opportunity to identify best practices and study the impact of the different tools in a natural experiment. As millions of patients are asked to be increasingly savvy health insurance consumers—facing potentially more diverse and less regulated health plan options—the next generation of online health insurance marketplaces is needed to facilitate access to the health care they want, at the lowest price.