1 Introduction

In the United States (U.S.), a consortium of private sector, non-profit, and government entities produces cybersecurity, which is defined here as the protection of digital information from malicious exploitation by preventing, detecting, and responding to attacks. Government intervention in the market exists, but it is broadly limited, as the private sector plays the dominant role in producing, monitoring, and enforcing digital security (Clarke, 2009: 33). Federal regulation applies to a limited number of industries, like healthcare with the Health Insurance Portability and Accountability Act (HIPAA) and finance with the Gramm-Leach-Bliley Act. Even within regulated industries, establishing the standards and practices against which accountability is measured is mostly delegated to the private sector, given its expertise (Thaw, 2014: 287–288). State-enacted legislation also exists; for example, all 50 states have enacted data breach notification laws. Nonetheless, the U.S. government has historically favored an approach that depends on self-regulation by software vendors and other businesses (Volz 2023).

The status quo of voluntary cybersecurity production does not receive unanimous support. Coyne and Leeson (2005: 477–478) cited the emergence of government’s role in cybersecurity as a major national security policy issue due to software vulnerabilities and cyber-attacks. The risks associated with failure and the reported economic costs from cyber-attacks have grown since the turn of the twenty-first century. The International Monetary Fund finds that cyber risk threatens “system-wide stability” of the financial markets (Adelmann et al., 2020: 5). The Federal Bureau of Investigation’s 2022 Internet Crime Report estimates the potential loss from cybercrime at $10.2 billion.

Cybersecurity is following what Friedman and Friedman (1980: 201) call the “natural history of government intervention” where a political alliance forms and demands the government solve a “real or fancied evil.” In this case, data breaches are the evil and a growing political coalition believes the market fails to adequately provision cybersecurity due to inadequacies with market self-help (Bauer & Van Eeten, 2009: 717; Sales, 2013; Gordon et al., 2015: 14; Weiss, 2015; Schneier, 2018; Clarke & Knake, 2019; Kende, 2021). Market solutions are supposedly lacking due to knowledge problems and misaligned incentives. Rapid technological advancement and the anonymity of cyber criminals, otherwise known as black hat hackers, lead to significant information challenges for cybersecurity vendors. As for incentive problems, Bruce Schneier, a prominent cryptographer and computer security professional, states that security investments will only occur if mandated by the government, challenging anyone to find an industry that voluntarily improved its security (Truta 2018).

Regulatory enactments and proposals are now burgeoning. Multiple federal organizations, like the Federal Trade Commission (FTC) and the Cybersecurity and Infrastructure Security Agency (CISA), are establishing new cybersecurity requirements across industry. Thirty-six states enacted cybersecurity legislation in 2021, including directives for cyber incident reporting and consumer data privacy (Madnick, 2022). The Biden administration's 2023 National Cybersecurity Strategy is perhaps the boldest push for state intervention to date. The strategy proclaims that firms, even large, reputable ones, are under-producing cybersecurity and asserts that the majority of cyber-attack costs are levied against the public rather than the vendors (Volz 2023). Therefore, the policy promotes crafting national requirements for customer data security and establishing vendor liability for poor cybersecurity.

The merit of a regulatory proposal depends upon a comprehensive examination of its benefits and costs relative to competing institutional arrangements. As an example, Pennington's (2010: 7) robust political economy framework comparatively assesses resource allocation between the market and government regulation by analyzing whether an institutional reform would improve the discovery and transmission of relevant knowledge and by examining the extent to which incentives are aligned with beneficial outcomes. The more robust system is the one that best addresses non-ideal, real-world challenges with both incentives and knowledge, like self-interested motivations and poorly informed individuals, respectively. Unfortunately, interventionists typically do not conduct a comparative institutional assessment. Instead, they limit their focus to addressing market inefficiencies without considering whether the prescription ameliorates the knowledge problem or improves incentive alignment.

In terms of a comprehensive assessment, the market may perform better than assumed. Critics of self-regulation may either fail to see the market eliminating cyber inefficiencies or find that non-instantaneous fixes are unsatisfactory and a sign of failure (Kirzner, 1985: 138). In this case, regulatory activists underappreciate the innovative role of the cybersecurity entrepreneur in the market process (Hodgins, 2024). In addition, state regulation can suffer from substantial cognitive limitations and self-interested behavior that culminate in worse social outcomes. The regulators may lack relevant cybersecurity expertise or possess incentives that allow for regulatory capture. Government intervention may also alter the institutional arrangements of the market participants, which can impose opportunity costs and unintended consequences on cybersecurity entrepreneurial activity.

This study conducts an economic assessment of the potential negative regulatory impacts on the market process by applying Israel Kirzner’s (1985) Perils of Regulation framework to cybersecurity. Regulation can affect entrepreneurial activity directly by changing the incentives, costs, and rewards for digital security innovations. Regulatory adjustments impact the way individuals “perceive the options available to them,” as well as the “accuracy and sensitivity of opportunity perception itself” by the entrepreneur (Kirzner, 1985: 82). Indirect impacts on any entrepreneurial activity related to cybersecurity, such as the vast and growing digital economy, are also possible.

Kirzner's (1985: 121) framework identifies three broad types of regulatory costs, which serve to structure this effort. Section 2 examines unsimulated discovery, where the government's limited cognitive abilities lead to prescriptions that do not reflect the actual and emergent market requirements. Section 3 discusses the most pernicious threat, stifled entrepreneurship, where regulation inhibits the price and profit signals for entrepreneurs by generating costs and barriers that ultimately restrict innovative activity. Section 4 analyzes superfluous discovery, where entrepreneurial incentives are directed towards unproductive and even destructive activities. Section 5 concludes the study.

2 Unsimulated entrepreneurial discovery

Unsimulated discovery is a regulatory peril that occurs when bureaucrats are unable to coordinate production outcomes that align with the underlying tastes and technologies of the market. Within a robust political economy framing, regulators may lack both the requisite cybersecurity expertise and the motivation to induce beneficial cybersecurity outcomes when compared to the market. In general, the regulatory decision-making process is less timely and precise than the market because regulation substitutes centralized, non-competitive mandates, which are unable to curtail poor decision making, for the decentralized, competitive speculation of varied market entrepreneurs (Pennington, 2017: 10–11).

Specifically, regulators fail to coordinate the market because they have neither price signals to guide investments in cybersecurity nor the profit motive that incentivizes the marketplace entrepreneur to discover new opportunities that further coordinate the market order (Kirzner, 1985: 139–140). Lacking market-induced prices and profit/loss accounting, regulators are unlikely to align cybersecurity investment with the market's underlying consumer preferences, technologies, and resource constraints. Compounding errors in both cybersecurity production and maintenance decision making are a far more likely outcome.

Regulators, as a matter of fact, possess inferior knowledge regarding digital security threats and defensive mitigations relative to market participants (Thaw, 2014: 293). As a result, cybersecurity regulations often fail to specify relevant, minimum requirements. Dynes et al. (2008: 20) interviewed information security directors on their perceptions of SEC security control obligations, one of whom lamented that the guidance defined the minimum thickness of data center doors but was mostly silent on digital security. Similarly, the SEC issued cybersecurity disclosure requirements in 2011 to improve consumer knowledge of a business's cyber posture and risks. Because the SEC lacked knowledge of what information is most useful to communicate, its guidance resulted in the reporting of low-quality information that was of little use to consumers (Ferraro, 2013: 346). Most businesses merely disclose that they encounter many cybersecurity risks but are actively investing in protective measures.

Deficiencies also occur when policy mandates conflict with cybersecurity best practices. The European Union's (E.U.'s) proposed Cyber Resilience Act includes a requirement that firms report the existence of a vulnerability to the government within 24 hours of discovery; however, developing a tested vulnerability patch typically requires far more time. Software vendors and security researchers are concerned about government organizations' ability to store and protect these unpatched flaws, which are referred to as zero-day vulnerabilities (Zurier 2023). Greater insecurity is likely because government employees possess relatively inferior knowledge regarding state-of-the-art protective measures and internalize only a nominal amount of the losses from a cyber-attack. Moreover, HIPAA requires that organizations store patients' medical data for over five years so that the information remains available and accessible (Yaraghi, 2016: 10). Extending data storage duration and increasing accessibility, though, worsen cybersecurity outcomes because they increase the probability and impact of a data breach.

HIPAA, in general, serves as an example where the cybersecurity prescriptions for protecting personal health information (PHI) are deficient. Wolff (2018: 51) notes that the current standards are not necessarily reducing data breach risks; thus, the rules may not reflect the underlying market variables for data security. Empirically, Seh et al. (2020) analyze breach information published by the Privacy Rights Clearinghouse (PRC), a non-profit dedicated to data privacy, and their findings corroborate Wolff's observation (see Table 1). The healthcare industry suffered more data breaches than all other industries combined over the study's 15-year timeframe. Hacking is the leading cause of the breaches, with poor database security and outdated security software cited as the most common failure points (Seh et al. 2020: 10).

Table 1 Data breaches by industry

Several factors besides poor cybersecurity could explain these results, such as the higher value of healthcare records and the potential underreporting of incidents in industries with less stringent reporting requirements. The personal information in healthcare records is unique. Unlike credit cards, the data does not expire and is difficult to cancel (Wolff, 2018: 53). As a result, healthcare data is more valuable on the black market, where records can sell for ten times the value of credit cards (Humer and Finkle 2014). Nevertheless, healthcare is the most heavily regulated industry and posts the worst cybersecurity outcomes. Moreover, less regulated industries, like banking and finance, are also subject to breach notification requirements and are highly desirable targets for hackers, yet they suffer a fraction of the data losses.

Existing research provides evidence that industry compliance requirements are, in fact, inadequate for protecting healthcare data. Kwon and Johnson (2013: 3979–3980) examined 250 health care security strategies for regulatory compliance, finding that an organization's self-reported compliance is unrelated to the number of data breaches it suffered. Instead, data breach avoidance was associated with an organization investing in security capabilities well outside the scope of regulatory compliance. Abraham et al. (2019: 540–542) explicitly call out risk management and breach remediation as critical cybersecurity activities that are not addressed by existing law. Overall, HIPAA focuses too much on procedures and neglects technical safeguards, which can lead to fully compliant healthcare organizations that produce poor quality cybersecurity (Kulkarni 2016).

On the opposite end of the policy spectrum, overly stringent regulatory requirements commonly occur due to the information available to and incentives generated by the regulatory institutional environment. Regulators can observe a cyber-attack and measure some of the financial losses, but they lack the knowledge and motivations to perceive the opportunity costs associated with excessive prescriptions. Regulatory bodies also possess perverse incentives to excessively minimize risk due to the asymmetric consequences of potential mistakes (Moorhouse, 2003: 131). Failure to ban an unsafe product or process is visible and likely receives backlash from the public, while the costs to the public from excessive prescriptions are mostly hidden and likely dispersed.
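The asymmetry can be made explicit with a stylized objective function; the notation below is illustrative and is not drawn from Moorhouse (2003). Let s denote regulatory stringency, q(s) the probability of a visible security failure (decreasing in s), and b(s) the compliance burden imposed on firms and consumers (increasing in s). A regulator weighs visible failures by a political penalty V and the hidden, dispersed burden by a much smaller weight h, whereas a social cost calculation weighs failures by their actual damage D and the burden at full value:

\min_{s}\; V\,q(s) + h\,b(s) \quad \text{(regulator's political calculus)} \qquad \text{versus} \qquad \min_{s}\; D\,q(s) + b(s) \quad \text{(social cost)}

Under standard convexity assumptions, whenever V/h exceeds D the stringency that minimizes the regulator's political exposure exceeds the stringency that minimizes social cost, reproducing the over-prescription described above.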

An over-supply of cybersecurity can impose inefficiencies by redirecting resources away from higher value investments and can indirectly lessen cybersecurity quality as well. As an example, the Department of Defense (DoD) issued cybersecurity guidelines, known as the Trusted Computer System Evaluation Criteria, for commercial computer system procurements in the 1980s. The products were considered very secure, but so costly that government agencies refused to buy them. Instead, agencies procured cheaper systems with lower cyber quality (Skibell, 2003: 127). In the end, the excessive prescription of cybersecurity was relatively unaffordable and inadvertently increased cyber risk for DoD agencies.

Complex government website password policies serve as a second example of unsuccessful resource allocation directives. Florêncio and Herley (2010) evaluated the password guidelines of 75 different organizations, finding that private firms, like PayPal and Amazon, impose much weaker password requirements than government organizations. The results are initially puzzling because private firms typically have significantly greater security needs due to a larger number of user accounts and a higher value of resources to protect. Upon closer inspection, the private companies effectively coordinate and balance across competing, underlying market variables. Florêncio and Herley (2010: 8) observed that the companies require less stringent passwords to enhance usability because every login is a "revenue opportunity." Instead, firms produce security with a combination of measures, like fraud detection, lockout policies, and relatively lenient passwords (Florêncio and Herley 2010: 10).

By contrast, government organizations establish more stringent requirements because the negative impacts from exceedingly stringent passwords are "less direct" (Florêncio and Herley 2010: 1–2). Indeed, government officials do not possess a profit and loss motive to accurately determine the efficient level of either usability or security. The password requirements are not only onerous to consumers in terms of timeliness and usability, but also offer small marginal improvements in cybersecurity quality (Florêncio and Herley 2010). The authors make no mention of the government agencies' use of other cybersecurity measures, but the lack of market mechanisms casts doubt on whether agencies correctly evaluate and invest across the multitude of cybersecurity options. At a societal level, the overly prescriptive password standards set by government agencies impose significant time and cognitive burdens on end users.
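A stylized back-of-the-envelope calculation illustrates how small the marginal security gain can be; the parameters are hypothetical, users are assumed to draw passwords uniformly at random from the permitted set, and the figures are not taken from Florêncio and Herley (2010). Under a lockout rule that allows an online attacker at most k guesses per account, the attacker's success probability is bounded by k divided by the number of policy-permitted passwords |S|:

\Pr[\text{account compromised}] \le \frac{k}{|S|}, \qquad |S|_{\text{lenient}} = 26^{6} \approx 3.1\times10^{8}, \qquad |S|_{\text{strict}} = 62^{8} \approx 2.2\times10^{14}

With k = 3, tightening the policy from six lowercase characters to eight mixed-case alphanumeric characters lowers an already negligible bound from roughly 10^-8 to 10^-14 per account, while the usability cost of the stricter rule is paid on every login, which is consistent with the private firms' preference for lockouts and fraud detection over stringent password composition rules.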

Regulatory perils are especially detrimental to small firms and new entrants, who are the innovation drivers in imperfect and ambiguous markets, like cybersecurity, because they possess information advantages and ownership incentives relative to larger, existing firms (Audretsch, 1995; Wiggins, 1995; Karlson et al., 2021: 83). The E.U.'s General Data Protection Regulation (GDPR) cybersecurity requirements, for example, do not align with market forces because the rules apply to all firms regardless of size or age. The homogeneous guidelines can significantly harm new and small entrants in several ways. Consider the stereotypical information technology (IT) entrant that releases an innovative product with relatively poor cybersecurity in order to benefit from a first-mover advantage. In accordance with the conventional wisdom, the firm is motivated to capture the market through network effects, complementary products, and customer lock-in (Anderson & Moore, 2006). Security is an afterthought until the market is dominated because the required time and resources slow product introduction (Anderson 2001; Lai et al., 2007; Kopp et al., 2017: 19).

Nonetheless, the decision to underinvest in cybersecurity is likely an economical choice that can still meet consumers’ security preferences. Security requirements vary across organizations, so the economizing investment also differs. Recent entrants with new products typically possess relatively limited security needs because their consumer base is relatively small and the value of the resources that require protection is relatively low. New entrants are less likely to be explicitly targeted by hackers as well, due to their relatively low exploitation value (Herley, 2013).

One can evaluate the cyber quality of software vendors with small market shares, an imperfect though reasonable proxy for new entrants. The relative investment tradeoffs are visible in the widely held, though false, perception during the 1990s that Apple, with an insignificant share of the operating system (OS) market, was more secure than Microsoft, the OS market leader. In reality, Apple suffered fewer attacks than Microsoft because the value of hacking Apple was much lower, not because its software was inherently more secure (Mills 2010). The market would likely discipline an entrepreneur who overly invests in security to the detriment of other features, unless the entrepreneur was introducing a dedicated cybersecurity product. By contrast, GDPR regulators are unencumbered by these market forces and apply strict cybersecurity requirements to all firms, regardless of their actual security needs. In summary, regulators' knowledge problems and weak motivation to appropriately evaluate the opportunity costs among competing data security options are generating adverse and inefficient market outcomes.

3 Stifled entrepreneurial discovery

3.1 Regime uncertainty

Stifled entrepreneurial discovery is a second danger, where regulation blocks the entrepreneur from detecting and engaging in profitable opportunities (Kirzner, 1985: 141–142). Designing a cybersecurity regulatory regime is extremely difficult, and the resulting policies can inhibit the price and profit signals for entrepreneurs. Wolff (2018: 252–253) finds that U.S. government agencies face significant difficulties in specifying information security practices because digital complexity prevents the regulators from measuring which controls are most effective. The uncertainty is compounded by the wide-ranging estimates of the size and scope of cybercrime, which makes it difficult for policy makers to determine an appropriate level of deterrence effort (Barrett et al., 2011).

Several activities that are necessary for deploying high quality cybersecurity are also non-technical, such as a business’s cultural integration of security into business operations and continuous risk management (Kwon & Johnson, 2012). These non-technical practices are even more difficult to codify than technical solutions, like installing anti-virus software and deploying a network firewall. Defining the punishments for failure incurs uncertainty for the regulators as well. Implementing an effective cybersecurity liability regime is highly complicated because regulators must understand the defensive measures that organizations ought to have in place (Wolff, 2018: 254). Government authorities, however, lack the knowledge to ascertain whether specific controls work because that information is only gleaned through continuous trial and error, which is limited to market participants.

Therefore, regulators may adopt nebulous and open-ended legislation in an attempt to address today’s challenges, an uncertain future, and likely incompatibilities between rapid IT innovation and slow regulatory adaptation. Interventions create opportunity costs for cybersecurity innovation that are immeasurable, but some are tractable. Uncertainty pertaining to regulatory design and enforcement, also referred to as regime uncertainty, is one type of cost that stifles innovation (Higgs, 1997). HIPAA, for example, does not direct healthcare organizations to adhere to any specific cyber risk management approach, citing the framework’s “flexibility” (Marron, 2024: 10). Nonetheless, the lack of specificity causes confusion regarding whether a healthcare organization’s security policies are in compliance with HIPAA (Yaraghi, 2016: 16–17).

Other cybersecurity regulations are equally ambiguous. California passed Senate Bill 327 in 2018, which requires manufacturers to ensure internet-connected devices are produced with "reasonable security." Due to the aforementioned complexities, the bill's details on what constitutes reasonable security are unclear and incoherent. Security features are circularly defined as "a feature of a device designed to provide security for that device." Likewise, the GDPR mandates that companies provide a reasonable level of data protection, but does not specify what reasonable protection entails (Nadeau, 2020). Variable sanctions with high maximum penalties for non-compliance exacerbate the uncertainty associated with imprecise and vague cybersecurity rules. Violations of GDPR's framework can result in a maximum fine of €20 million or 4% of a firm's annual global revenue (Schneier, 2018: 184–185).

Entrepreneurs spend less on innovation when encountering greater regulatory uncertainty because gauging a return on investment requires understanding what regulators will prioritize and enforce (Higgs, 1997). To illustrate further, the Biden administration's proposed ex post liability for cybersecurity failure may dissuade innovative software production activities. Proprietary software is routinely coupled with open-source components that are developed by freelancers and the academic community (Garg 2021: 4). Strict liability for failure may discourage today's highly collaborative and innovative software development process. In terms of the healthcare regulatory environment, Yaraghi (2016: 17–21) cites HIPAA's opaque auditing method and ambiguous penal code as causes of significant uncertainty in a health care organization's cost-benefit forecasting, which limits cybersecurity investments.

Cybersecurity is increasingly subject to overlapping regulations from many types of legislative bodies as well. Michael Heller (1998) identified this excessive interference phenomenon as the “tragedy of the anti-commons,” whereas Sobel and Leeson (2006: 56) referred to it as the “tragedy of political commons.” While the more widely understood tragedy of the commons occurs when individuals have access to rivalrous resources without exclusion due to poorly defined property rights, the tragedy of the anti-commons occurs when too many people have exclusionary power in resource allocation decision making or usage. Given the borderless nature of cyberspace, companies are subject to a growing collection of overlapping and conflicting cybersecurity mandates at the industry, state, federal, and international levels. Fuller (2017: 206) contends that cybersecurity and privacy laws are routinely created or modified in response to changes in digital technology, which induces entrepreneurial caution regarding investment opportunities.

3.2 Procedural rigidity

Regulation may also stifle entrepreneurial activity when the procedural mandates are inflexible and, therefore, subject to obsolescence. Regulatory adaptation is critical because cybersecurity market evolution is swift and continuous. Black hat hackers are motivated by fame and profits (Leeson and Coyne, 2005) to identify and exploit new software vulnerabilities. Moreover, black hats possess a strategic advantage over cyber defenders (Schneier, 2018), which necessitates continuous cybersecurity development in order to resolve emergent vulnerabilities. In addition, IT innovation continues to introduce new networks, such as operational technology (OT), the internet of things (IoT), and cloud computing (Souppaya & Scarfone, 2022: iv) that generate novel protection uncertainties and a need for new cybersecurity solutions.

Pennington's (2010: 5) robust political economy framework notes that adaptation is much timelier in the market relative to a regulated system because a spontaneous order "learns from and imitates profitable models, while no profit signal exists for the regulator." The market coordinates recurring adjustments to production output, processes, and organization. Entrepreneurs, who are alert to price changes and motivated by the lure of profit and the avoidance of loss, repeatedly provision new IT and cybersecurity solutions in response to changes in consumer preferences, resource constraints, and technological adaptation (Hodgins, 2024). Moreover, free entry and exit allow firms to join cybersecurity self-help associations that work, such as coordinated vulnerability disclosure, and abandon those that are no longer effective.

Conversely, regulatory frameworks are slow both to adopt a set of prescriptions and to adapt to changes in the marketplace. Legislative processes broadly employ procedural safeguards, such as checks and balances, to ensure accountability and constrain power, but the safeguards delay regulatory enactment as well. Due to the combination of this institutional constraint and market modernization, Rosenzweig (2011: 10) believes that most IT statutes would be outmoded by the time the government closes its public comment period and enacts the law. Once a specific rule is established, regulatory sclerosis can also prevent timely modifications. Robust political economy concludes that regulatory frameworks are typically slow to adapt because entry and exit decisions are not free. An overarching authority must be cognizant of the necessary changes to the institution and approve the updates, thus slowing the process of change (Pennington, 2017: 11–12). Moreover, Fernandez and Rodrik (1991: 1147) find that a status quo bias occurs in political institutions, where the currently accepted policy is rarely challenged due to uncertainty regarding the benefits and costs of reform.

Thus, a substantial gap between an economically efficient cybersecurity prescription and a state policy is likely to emerge. Procedural rigidity is particularly nefarious because it imposes large transaction and opportunity costs that deny businesses a return on investment in better cybersecurity. In the absence of government intervention, market entrepreneurs often modify the institutional parameters, reduce transaction costs, and remove externalities, rendering the state's policy obsolete (Foldvary & Klein, 2002). Unfortunately, current cybersecurity regulatory prescriptions provide empirical evidence in support of the comparative institutional economic theory. The following examples highlight the emergence of gaps between the procedural requirements of HIPAA and the optimal cybersecurity market solutions for healthcare.

State-level digital privacy laws for the U.S. healthcare industry have stifled technological innovation by limiting the adoption of electronic health record (EHR) systems and their accompanying cybersecurity solutions. Miller and Tucker (2009) conclude that relatively stringent regulations lowered hospital adoption rates of digital medical records because the procedural requirements raised the costs of information exchange and lowered network effects. In terms of increased costs, successful EHR implementation necessitates the sharing of sensitive data among multiple organizations, such as primary care physicians, laboratories, specialists, and medical imaging facilities. All healthcare entities must execute business associate agreements (BAAs) in order to share sensitive consumer data, such as PHI, with any other organization. However, an organization becomes subject to costly HIPAA requirements, like data safeguarding procedures and liability clauses for PHI exposure, once it signs a BAA.

The increased costs of information exchange from the HIPAA requirements subsequently reduced EHR system adoption for individual health organizations. In turn, the profitability of EHR system implementation by other health organizations in the network dissolved. Positive externalities, like smaller transaction costs from electronic information exchanges and the reduction in duplicative medical testing across a hospital network, were not internalized due to regulation. Low adoption rates slowed the exchange of information and ultimately worsened patients’ health outcomes (Miller & Tucker, 2009).

HIPAA also contributed to the stifling of healthcare video teleconferencing capabilities (e.g., telehealth). The procedural requirements, like using an allowed technology interface, adhering to privacy rules, and conducting compliance audits (Shaver, 2022: 518), levy significant costs on healthcare companies that seek to implement virtual services. Regulatory stifling is observed when comparing telehealth opinions between providers and consumers. Consumers generally viewed telehealth favorably, citing ease of use and timeliness as advantages. On the other hand, clinicians were less favorable, naming several procedural compliance cost considerations, like training requirements and liability concerns, as adoption constraints (Shaver, 2022: 518). In fact, the majority of hospitals possessed some form of telehealth service, but the actual employment of telehealth as a percentage of total processed claims was only 0.1% in 2019 (Shaver, 2022: 518–521). Similarly, Kane and Gillis (2018) conducted a related study on the adoption of telemedicine and found that only 15% of physician practices used telemedicine in 2016.

Entrepreneurial stifling is also observable when comparing the adoption and efficacy of telehealth to related products in industries with fewer procedural requirements. Videoconferencing emerged in the mid-2000s with services like WebEx and Skype. A 2014 industry survey reported that nearly 50% of businesses used videoconferencing and nearly all the remaining firms planned to implement the capability in the near future (Pratt 2015). The adoption of mobile banking, which involves high value data protection requirements, significantly predates and exceeds the prevalence of telehealth as well. According to surveys conducted by the Board of Governors of the Federal Reserve, 22% and 43% of respondents used mobile banking in 2011 and 2015, respectively (Dodini et al., 2016: 9). In addition, the cybersecurity solutions for business videoconferencing and mobile banking are likely superior to those produced under HIPAA, given the industry-level breach data from the previous section. Thus, entrepreneurial activity in less regulated industries improved both IT and cybersecurity solutions relative to the healthcare sector and its strict HIPAA procedural rules.

COVID-19 was an exogenous shock to the HIPAA regulatory framework's status quo. The Department of Health and Human Services (DHHS) Office for Civil Rights (OCR), the organization responsible for monitoring and enforcing HIPAA violations, decided early in the pandemic to apply discretion and waive the procedural requirements, including liability risks, as long as businesses acted in good faith (Bassan, 2020: 3; Linebaugh, 2020). Market innovation occurred following the exogenous shock. Telehealth's percentage of total claims increased from 0.1% in 2019 to approximately 6% during 2020 (Shaver, 2022: 520–521). Public health policies, like lockdowns, and increased consumer demand for virtual services certainly contributed to the growth. Nevertheless, the most recent statistics from the FAIR Health Monthly Telehealth Regional Tracker (2023) report a steady 5% rate well after the pandemic restrictions ended. Finally, recent telehealth cybersecurity solutions appear to be effective, despite the revocation of HIPAA procedural mandates. A recent survey of the global telehealth market by the multinational cybersecurity firm Kaspersky did not find any tangible concerns with telehealth cybersecurity, like poor patch management or a high frequency of data breaches (Namestnikova 2022).

3.3 Barriers to entry

From a robust political economy perspective, price-setting requirements or mandatory activities may diminish the motivation for a new challenger to enter the market, thereby creating a barrier to entry (Pennington, 2017: 13). As a specific example, regulation erects barriers to entry because firms must invest in the required "procedural safeguards" and expend time waiting for regulatory authority approval (Stigler, 1971: 7). Unfortunately, firms are compelled to expend effort on a recurring basis to gain clarity about their cybersecurity regulatory responsibilities.

GDPR compliance is an ongoing process that demands continual reassessment by firms (Uzialko 2023). The European Data Protection Board, which is the GDPR supervisory authority, has issued policy clarifications due to the aforementioned regime uncertainty. Regrettably, the clarifications may substantially increase a firm's compliance costs, given their recurring nature and the amount of new content. A total of 31 clarifications have been issued since 2018, covering items like the information to include in a breach disclosure and the territorial scope of the regulation. An astonishing 875 pages of material have been added to the original 88-page law. Akin to GDPR, Deitcher (2009) notes that HIPAA includes a vast amount of material across at least nine key documents. These extensive procedural requirements render HIPAA "inaccessible" to healthcare companies (Gaynor et al., 2015: 114–115).

Firms also encounter high costs from prolonged regulatory reviews. The DHHS OCR conducts audits on healthcare entities following data breaches. The duration of an investigation normally exceeds two years due to multiple rounds of interviews with OCR attorneys, the generation of lengthy reports, and long delays between communications (Yaraghi, 2016: 19–20). In addition, numerous GDPR cases against tech firms have gone unresolved for years (Burgess 2022). As a specific example of high ongoing compliance costs, GDPR authorities initially fined British Airways £183 million for inadequate security measures following a data breach in 2018. British Airways then spent over three years in legal mediation to finalize a £20 million penalty (Bray 2021).

Increased barriers to entry have a disproportionate, negative impact on small and new firms because they possess fewer resources to absorb the costs of regulation. New entrants and small businesses typically lack the administrative and legal scale economies to meet compliance requirements. Nonetheless, HIPAA applies equally to all healthcare entities regardless of the organization's revenue or number of patients (Gaynor et al., 2015: 115). In addition, California Senate Bill 327 and GDPR require consumer consent for firms to collect data, which is more onerous for small businesses and new start-ups (Campbell et al., 2015). Moreover, GDPR requires businesses to employ dedicated data protection officers, which led to large investments in new internal bureaucracies by established corporations (Fazzini 2019). Finally, Geradin et al. (2021) examined the impacts of GDPR on advertising technology competition and found that Google's market power increased to the detriment of small companies due to the policy's large implementation costs and data collection restrictions.

The regulatory stifling of entrepreneurial innovation may prevent entry altogether. Increased cybersecurity requirements redirect resources away from other innovative product features. Subsequently, IT innovation by new start-ups and would-be competitors may become economically unviable. Janssen et al. (2022) empirically measured the impact of GDPR on Google's app store offerings with a structural model of demand and firm entry. Their study concludes that GDPR imposes high costs on entrepreneurship and dramatically reduces consumer choice. Specifically, new app offerings decreased by nearly 50%, and 33% of existing offerings exited the Google app store after GDPR went into effect (Janssen et al., 2022).

Even seemingly unrelated regulation, like the U.S. immigration system, prevents entry and stifles innovation by raising the cost of human capital, which is cybersecurity's most critical input. Organizations import skilled cybersecurity talent from abroad through the H-1B visa program (Cappelli 2000; Gabberty, 2013). Immigrants have accounted for more than one-quarter of the highly skilled IT occupations across major metropolitan areas since 2000 (Otoiu & Titan, 2017). Yet, overly restrictive immigration policies prevent the market from closing the skills gap (Cappelli 2000). The H-1B visa program is capped at fewer than 100,000 individuals per year. Technology companies repeatedly advocate for regulatory reforms that would increase the allowance of skilled foreign labor, citing the cybersecurity workforce shortage as a pressing need (Gabberty, 2013; Clarke & Knake, 2019: 144), but the calls go unheeded. An employment shortage in excess of one million positions now exists and is cited as a primary challenge in producing cybersecurity (Crumpler & Lewis, 2019; Smith 2021).

4 Superfluous entrepreneurial discovery

A superfluous discovery process is a third type of regulatory cost. The regulated market imposes new constraints that alter the discovery process and create new incentives for participants (Kirzner, 1985: 144–145). In terms of a robust political economy framework, regulation may generate self-interested behavior that undermines the legislation's intent to serve the public interest. Superfluous discovery is also related to William Baumol's (1990) concept of unproductive and destructive entrepreneurship, which occurs when entrepreneurs use the new institutional rules to maximize their profits rather than engage in innovation.

Unproductive entrepreneurial activity is welfare enhancing only relative to the distortion that the regulation itself creates (March et al., 2016: 210–212). The distortion from government policy limits discovery to activities that are aligned with the regulatory requirements, alleviating artificial, rather than natural, scarcities. Regulation can redirect a company's attention towards strict compliance with the requirements rather than the firm's actual cybersecurity needs (Shackelford et al., 2015: 353). As an example, many states introduced data breach notification laws with an encryption safe harbor clause, which means that reporting a breach is not required if the stolen data was encrypted. Thaw (2014: 317–324) comments that safe harbor clauses motivated firms to invest significantly in data encryption and to modify their cybersecurity investment plans. The innovation appears to be productive because data encryption is widely considered an industry best practice.

Strong passwords are an industry standard too. Yet the earlier discussion on password strength (Florêncio and Herley 2010) revealed how market entrepreneurs correctly eschewed strict password requirements in favor of other cybersecurity investments, like fraud detection, to accurately coordinate cybersecurity production with the underlying market variables of preferences, resources, and technologies. In terms of the safe harbor clauses, the investment in encryption imposes opportunity costs on other cybersecurity activities, such as data breach preventive measures. As a singular incentive, encryption provisions deliver little stimulus for firms to adopt holistic protective solutions for their customers (Gaynor et al., 2015: 116). Thus, encryption safe harbors respond to artificial, rather than natural, scarcities.
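A stylized expected-cost comparison shows how a safe harbor can tilt investment; the notation is illustrative rather than drawn from the cited sources, and it deliberately abstracts from any loss-mitigating value of encryption in order to isolate the incentive effect. Let p be a firm's breach probability, L its private loss from a breach, and C_n the cost of mandatory notification. Suppose the firm can fund one investment: encrypting stored data, which qualifies for the safe harbor but leaves p unchanged, or a preventive control that lowers the breach probability to p' < p while leaving the notification duty intact. The firm prefers encryption whenever

p\,L \;<\; p'\,(L + C_{n}) \quad\Longleftrightarrow\quad (p - p')\,L \;<\; p'\,C_{n},

that is, whenever the exempted notification cost looms larger than the breach-risk reduction that prevention would deliver. In this stylized setting the firm rationally chooses encryption even though it does nothing to lower the probability of a breach, which is the sense in which the safe harbor relieves an artificial rather than a natural scarcity.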

Government intervention can also result in destructive entrepreneurship, where market participants influence the institutional framework to enable more opportunities for unproductive entrepreneurial activity (March et al., 2016: 212–213). Existing vendors, especially those in a concentrated market, are incentivized to engage in lobbying and bribery in order to control the entry of new firms (Stigler, 1971: 5). Portions of the IT industry are relatively concentrated due to network effects, and some evidence exists that today's industry leaders are advocating for cybersecurity regulation. Big tech firms, once opposed to regulation due to its negative impacts on innovation, now generally advocate for regulation, like data protection requirements, as these interventions help raise barriers to competition (Schaake, 2021). Ben Horowitz (2023), a cofounder of the venture capital firm Andreessen Horowitz, concurs that the lobbying efforts of the big technology firms, like Apple, Amazon, Facebook, Microsoft, and Alphabet, reflect strong interests in regulatory capture and stifling competition.

Alphabet, for example, has invoked GDPR's privacy provisions to mandate the use of its own products across its platforms, like the Google search engine and Chrome. However, Alphabet's legal interpretation of the rule is controversial and may primarily serve as an excuse to limit third party competition (Geradin et al., 2021: 78–79). In another instance, Mark Zuckerberg, the founder and chief executive officer (CEO) of Meta (Facebook), penned a 2019 op-ed calling for a U.S. version of the GDPR. Isaac (2019) postulated that Zuckerberg's advocacy for mandatory prescriptions, like cyber testing, was partially self-serving because such rules would limit future challenges to Meta's social networking dominance by raising market entrance costs. Zuckerberg is also motivated to lobby for state intervention because multinational technology companies are subject to the tragedy of the anti-commons (Heller 1998), where overlapping and conflicting requirements from many authorities lead to bloated compliance costs. If successful, the lobbying would remove a comparative cost advantage for domestic firms that are not subject to international requirements, like the GDPR (Niebel, 2021: 9).

Destructive entrepreneurship also occurs in the healthcare industry, as dominant providers cite regulatory requirements as justification to withhold data from competitors and even their own customers. Health information blocking is so pervasive that the DHHS Office of the National Coordinator for Health Information Technology (ONC) issued a report to Congress on the matter. The DHHS report (2015: 16) described how privacy and security requirements were the most commonly used rationale by healthcare IT vendors and practices to restrict data sharing, but found that at least some of these claims were specious. Healthcare providers use regulation as an excuse to retain customers by restricting patients' access to their own records. Similarly, IT incumbents can use these claims as a pretext to raise profits by locking in providers to their data ecosystem (Hollis, 2016: 4).

The EHR market leader, Epic Systems, strictly limits interoperability and data exchanges with both competitors' customers and third-party data sharing applications (Leventhal 2020). As a prominent example, Epic Systems lobbied extensively against the ONC's interoperability and information-blocking rules implementing the 21st Century Cures Act, which were crafted in direct response to persistent data blocking. Epic Systems' founder and CEO, Judy Faulkner, called on her health system customers to oppose the rules due to cybersecurity risks (Farr, 2020). The firm also boldly asserted that increasing patients' access to their own data could impose significant data privacy risks (Jennings 2021). Nonetheless, competitors and regulators alike broadly reproached Epic Systems' security claims as disingenuous given the regulatory requirements and actual safeguarding needs (Jennings 2021).

The preceding discussion establishes that government intervention has redirected the incentives of alert entrepreneurs away from market-enhancing activities, leading to inefficient outcomes. Overall, cybersecurity regulatory institutions are fragile, lacking the knowledge and incentives to appropriately monitor and discipline the market. Regulators lack both the price system to guide decision making and the property rights that incentivize cybersecurity solutions. Current and analogous regulatory efforts repeatedly fail to improve resource allocation for digital security. Many interventions have actually intensified insecurity. Furthermore, regulation imposes unintended consequences that stifle cybersecurity production or redirect efforts towards less productive, or even destructive, activities. These negative effects are most deleterious to new entrants, who are critical to both IT and cybersecurity innovation.

5 Conclusion

Understanding the relative costs and benefits of provisioning cybersecurity through a robust comparative framework is imperative. The U.S. digital economy is a key contributor to national productivity, and failures to secure cyberspace, like ransomware and malware-based cyber-attacks, can be costly to the public. As a result of these occasional shortfalls, cybersecurity garners attention as a major policy issue. The current cybersecurity regulatory debate, however, is narrowly focused on the purported ills of private cybersecurity provisioning, while ignoring both the benefits of an unregulated market and the costs of government intervention.

Though this study focused on the perils of cybersecurity regulation, the entrepreneur plays a vital role in producing cybersecurity. Regulatory justifications typically cite recent or historic cyberattacks as evidence of market failure and a need for corrective action. Nonetheless, the market is not a settled state of affairs, but a dynamic process, where the underlying institutional variables induce changes to relative prices. Alert to these price signals and motivated by profit opportunities, entrepreneurs act to resolve inefficiencies and typically move the market “systematically toward, rather than away from, the path to equilibrium” (Kirzner, 1997: 62). Proponents of market intervention rarely, if ever, check to see if entrepreneurs fixed the discoordination. Indeed, cybersecurity entrepreneurs have engaged in substantial innovation and dramatically improved productive efficiency. The proliferation of software patching processes, authentication products, and extensive vulnerability sharing among vendors serve as three noteworthy examples of entrepreneurial improvements (Hodgins, 2024).

Information security regulatory regimes are typically fragile because state regulators do not possess the knowledge and motivations, relative to the market, to produce socially beneficial cybersecurity outcomes. Cybersecurity discovery is highly dynamic, and regulators lack price signals from competing actors, thereby creating significant information challenges. Regulators also lack the financial motivation to implement and maintain the right processes and procedures. Furthermore, regulation dampens the price and profit signals for entrepreneurs to produce quality information security by imposing significant costs on entrepreneurial discovery and misaligning incentives. Several analogies where cybersecurity regulation was adopted by specific industries, like the U.S. healthcare sector, or by other jurisdictions, like the E.U., were examined using Kirzner's regulatory perils framework to gauge the scope and magnitude of regulation's opportunity costs.

HIPAA, one of the most prominent cybersecurity frameworks, provides strong evidence for all three of Kirzner's perils. The framework's focus on security procedures, rather than technical safeguards, produces suboptimal resource allocation prescriptions. Healthcare incurs the most data breaches of all U.S. industries, and its share of total breaches continues to increase over time as technological complexity compounds the regulatory errors. Sclerotic contracting requirements stifle the adoption of healthcare IT and cyber innovations, delaying welfare gains for the industry and forestalling solutions that would likely negate the very need for regulation. Finally, incumbents, as evidenced by the EHR leaders, use the vaguely defined portions of the rule set to justify destructive entrepreneurialism that protects their market position against new entrants, to the detriment of consumers.

Both the market's entrepreneurial benefits and regulation's costs are economically significant, which undermines the efficiency-improving claims of state intervention. Advocates of government intervention in the cybersecurity market are victims of the "Nirvana fallacy" because they compare the real world, with its imperfect knowledge and uncertainty, against an idealized government corrective (Demsetz, 1969). In reality, market institutions provide the information and incentives for entrepreneurs to correct market inefficiencies over time, while the costs and unintended consequences of the regulatory corrective are noteworthy. Therefore, the economic community must communicate the evidence provided here, and in related research, to policy makers so that cybersecurity regulatory discussions consider all the pertinent information.