Workplace health surveillance and COVID-19: algorithmic health discrimination and cancer survivors

Abstract

Purpose

This article examines ways COVID-19 health surveillance and algorithmic decision-making (“ADM”) are creating and exacerbating workplace inequalities that impact post-treatment cancer survivors. Cancer survivors’ ability to exercise their right to work is often limited by prejudice and health concerns. While cancer survivors can ostensibly elect not to disclose to their employers that they are receiving treatment or have a history of treatment, the use of ADM increases the chances that employers will learn of their situation regardless of their preferences. Moreover, absent significant change, these inequalities may persist or even expand.

Methods

We analyze how COVID-19 health surveillance is creating an unprecedented amount of health data on all people. These data are increasingly collected and used by employers as part of COVID-19 regulatory interventions.

Results

The increase in data, combined with the health and economic crisis, means algorithm-driven health inequalities will be experienced by a larger percentage of the population. Post-treatment cancer survivors, like people with disabilities generally, are at greater risk of experiencing negative outcomes from algorithmic health discrimination.

Conclusions

Updated and revised workplace policy and practice requirements, as well as collaboration across impacted groups, are critical in helping to control the inequalities that flow from the interaction between COVID-19, ADM, and the experience of cancer survivorship in the workplace.

Implications for Cancer Survivors

The interaction among COVID-19, health surveillance, and ADM increases exposure to algorithmic health discrimination in the workplace.

Introduction

The novel coronavirus disease (“COVID-19”) pandemic has created new and unprecedented grounds for exclusion and has intensified existing inequalities facing marginalized individuals, such as persons with existing health concerns and disabilities [1,2,3,4,5]. Some of these exclusionary grounds involve life and death decisions [6, 7], and others economic and social hardships [8,9,10,11]. Not only are people losing their jobs due to a downturn in the labor market, but employers are also expressing reluctance to hire persons with disabilities or health issues due to concerns about supporting them during the pandemic [12]. For individuals with disabilities or health concerns who are employed, the technologies and processes employers use in their employment-related decision-making are often not readily accessible [13].

One major change in work due to the pandemic has been the introduction of COVID-19 regulatory interventions that require employers to play a central role in ensuring the public safety of their employees and customers. The use of algorithmic decision-making (“ADM”) and data gathered as part of COVID-19 public health surveillance is having profound impacts on full and equal participation in employment, and in society, for marginalized groups such as people with disabilities and health concerns. This article examines how COVID-19 health surveillance and ADM are exacerbating and creating sites of inequality that impact post-treatment cancer survivors and others with chronic health conditions and disabilities.

ADM involves running computational processes over a dataset to provide machine-generated information to inform decision-making, in this instance decision-making in the workplace. ADM can be defined as “a socio-technological framework that encompasses a decision-making model, an algorithm that translates this model into computable code, the data this code uses as an input—either to ‘learn’ from it or to analyze it by applying the model—and the entire political and economic environment surrounding its use” [14]. Artificial intelligence (“AI”) refers to processes in which an algorithm “learns” by modifying its model based on large sets of information (“big data”), such as the data arising from COVID-19 society-wide health surveillance. As such, ADM may rely on AI to inform decision-making protocols.

ADM processes that rely on AI use mathematical algorithms and machine learning (“ML”) to make automated decisions based on data through pattern recognition. These processes aim to supplement or replace human decision-making, ideally to increase efficacy or remove bias. However, if unchecked, these processes can create adverse outcomes for people with health concerns such as cancer or other chronic health conditions and disabilities [15]. To avoid such outcomes, ADM processes need to be informed by the lived experiences of people across the spectrum of disability and health conditions, use data that are inclusive of people across that spectrum, and be audited to ensure accountable and valid decision-making outcomes.
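
As a minimal illustration of what such an audit might look like in practice, the sketch below computes selection rates by group from a set of ADM outcomes and applies the familiar “four-fifths” disparate-impact rule. The records, group labels, and threshold are hypothetical; this is an illustrative sketch, not a description of any particular employer’s system.

```python
from collections import defaultdict

# Hypothetical ADM outcomes: each record pairs a group label (e.g., a
# self-identified disability or chronic-condition status) with whether
# the algorithm selected the person for an opportunity.
decisions = [
    ("no_disclosed_condition", True), ("no_disclosed_condition", True),
    ("no_disclosed_condition", False), ("no_disclosed_condition", True),
    ("chronic_condition", True), ("chronic_condition", False),
    ("chronic_condition", False), ("chronic_condition", False),
]

def selection_rates(records):
    totals, selected = defaultdict(int), defaultdict(int)
    for group, chosen in records:
        totals[group] += 1
        selected[group] += chosen
    return {group: selected[group] / totals[group] for group in totals}

rates = selection_rates(decisions)
benchmark = max(rates.values())
for group, rate in rates.items():
    # Four-fifths rule: a selection rate below 80% of the most-favored
    # group's rate is conventional evidence of disparate impact.
    ratio = rate / benchmark
    flag = "potential disparate impact" if ratio < 0.8 else "ok"
    print(f"{group}: rate={rate:.2f}, ratio={ratio:.2f} -> {flag}")
```

Even an audit this simple makes the point that accountability requires access to outcomes broken down by protected attribute, which is precisely the information that opaque ADM systems rarely expose.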

The COVID-19 pandemic has spurred the use of ADM in the workplace for the purpose of monitoring and surveilling the health of workers. Unchecked, ADM use can create inequalities for people in the workplace who have acquired, or may acquire, COVID-19, as well as those otherwise affected by the pandemic. People with disabilities or chronic health conditions such as cancer are especially likely to confront discrimination “on the basis of” their disabilities and conditions during the current crisis [16]. The widespread collection of health information and use of ADM during the pandemic in the workplace raise questions about the transparency, accountability, equity, and privacy of such processes.

People with disabilities and chronic health conditions experience workplace discrimination in many ways [17]. Exclusion from work is typical and evident in the case of cancer survivorship [18, 19]. Over 10 million Americans have cancer, and improved treatment means many continue to work or can return to work [20]. Like the wider community, cancer survivors need and desire to work [21], but unlike the wider population, they confront considerable workplace discrimination on the basis of their conditions and recovery [22, 23]. Often cancer survivors can manage adverse health outcomes while continuing to work [24]. Their ability to do so, however, can be limited by managers who base decisions on stigma rather than facts and who unfairly question cancer survivors’ continued ability to work, with or without a workplace accommodation [17, 25]. Cancer survivors report that the struggle against workplace discrimination is as hard as their struggle against the disease itself [26, 27].

The Americans with Disabilities Act (“ADA”) provides that cancer of a significant and lasting degree may be a disability for purposes of its antidiscrimination provisions [28, 29]. While people undergoing cancer treatment may request and benefit from reasonable accommodations at work in reliance on ADA provisions [30], cancer survivors are not consistently using such protections and making such requests [31, 32]. Furthermore, interactive communication between employers and workers across the cancer journey remains challenging [33, 34]. Many workers are reluctant to disclose they have or have had cancer, and they often have no reason to inform future employers that they have completed treatment [25]. If a survivor does not require workplace accommodations, a simple way of avoiding attitudinal discrimination is not to inform their employer of their condition or history [35, 36].

In the “COVID-19 health surveillance” section of this article, we first argue that the combination of COVID-19 health surveillance and algorithmic processes significantly reduces the capacity of people with disabilities, and particularly cancer survivors, to appropriately keep their private health data from their employers. Before COVID-19, discrimination based on algorithms had already resulted in adverse outcomes for people with health concerns [37]. COVID-19 health surveillance is expanding the potential for such discrimination to reach even people who do not view themselves as disabled. We introduce the term “algorithmic health discrimination” to describe this broadening vulnerability to inequalities in the workplace and elsewhere in society.

We next analyze how COVID-19 health surveillance is creating an unprecedented amount of health data about people, generated and controlled by relatively few public and private actors. The information associated with COVID-19 health surveillance identifies whether a person has acquired COVID-19 or has a pre-existing condition that makes them susceptible to COVID-19, and it provides data on how people consider and respond to treatment and mitigating measures. Because so many people have been affected by COVID-19, the potential negative impacts of algorithmic health discrimination are widespread in society. We consider how COVID-19 health surveillance is providing employers new and unprecedented access to information on their workers and what impact this increased access is having upon the use of ADM.

The “Algorithmic decision-making at work” section analyzes how ADM technologies operate, with a focus on work and employment. It discusses how ADM can reinforce workplace prejudices and cause workers with various health conditions and disabilities to be overlooked in recruitment, career, and advancement opportunities, and to experience a range of other inequalities [38]. We illustrate how this can occur through ADM controlled by employers or by third parties that employers retain. In many situations, the calculations and processes behind the decision-making are invisible to both the decision-maker and the person subjected to the adverse finding.

The “Addressing algorithmic health discrimination at work” section analyzes “hard” and “soft” law and policy responses to regulating algorithmic health discrimination. Disability antidiscrimination laws, such as the ADA, are primary hard legal remedies that can be used to combat inequalities from algorithmic health discrimination at work. Soft law regulatory vehicles provide additional means to advocate for fairness and equality in ADM. So-called principled AI codes demonstrate that demands for fair use of these technologies are gaining support. Finally, the “Conclusions and recommendations” section presents our conclusions.

COVID-19 health surveillance

By design, COVID-19 health surveillance, whether legally mandated, endorsed, or otherwise supported, acts to reduce the capacity of workers to keep their medical conditions private from their employers. Workers are required to report personal information to their employers as part of public health measures, often via technology surveillance that detects health irregularities caused not only by COVID-19 but also by undisclosed medical conditions, such as cancer.

COVID-19 health surveillance and health data

To help reduce disability and health discrimination, privacy laws have historically restricted the capacity of employers to access information on their workers [39, 40]. These laws restrict employers from accessing worker-owned electronic devices and limit the questions employers can ask current and future workers about their health. Other laws limit employers’ ability to make current and potential workers provide access to their social media accounts [40]. In the USA, for example, Maryland was the first state to pass a law restricting employer access to worker social media accounts [41]. Other jurisdictions, such as California, Virginia, and Illinois, have adopted similar laws [42]. This section considers how COVID-19 health surveillance has transformed these protections in the context of a public health emergency.

Governments across the globe are now placing unprecedented public health–related restrictions upon people and collecting massive amounts of health, geospatial, and personal information to help restrict the spread of COVID-19 [43]. This development has led to concerns about how this information is being stored and used [44]. It has also led to ethical and practical concerns about the effectiveness of these new measures, including the effectiveness of digital proximity tracking programs deployed in a predominantly unregulated environment [45].

Along with the potential for mandated data collection to have a negative impact on worker privacy and job security, workers with disabilities and health conditions must confront the potential for job loss based on their own choices about how to protect their individual safety in the workplace. For example, due to personal safety concerns, some workers want to avoid the workplace [46] and prefer to work remotely [47]. But because most workers in the USA are employed “at will,” absent other contractual or legal protections, an employer can dismiss a worker who refuses to return to the workplace.

The US Occupational Safety and Health Act (“OSHA”) provides a worker the right to refuse to perform unsafe work, the National Labor Relations Act entitles workers to strike over safety concerns, and the Labor Management Relations Act entitles workers to walk off the job for abnormally dangerous conditions [48]. However, these laws were not enacted with a pandemic in mind and do not provide workers a blanket right to refuse to return to the workplace [49]. And even under the recent Families First Coronavirus Response Act, workers taking medical or other leave pursuant to the two-week paid quarantine provision are not protected from having to return to the workplace. If a worker refuses to return, and the safety concerns are not accepted by a state unemployment agency, the worker may be deemed to have voluntarily resigned and be unable to claim unemployment benefits [49].

When an employer is following a COVID-19 safety plan, workers still must return to the workplace when requested or run the risk of being fired, unless they have a medical condition that justifies their refusal. Extant research shows that cancer survivors have greater rates of severe respiratory outcomes and death than the wider population [50,51,52]. Workers with such medical conditions and disabilities who have not disclosed them to their employers are thus often forced to choose between disclosing their medical condition and claiming protection under the ADA, or risking the loss of their job [53].

Employer access to worker health data

COVID-19 regulatory interventions compel employers and businesses to play a significant role in ensuring public safety in workplaces [54] and at places of public accommodation [55]. Thoughtful balancing of individualized ADA reasonable accommodations [56] with occupational health and safety requirements can help to alleviate workplace discrimination against persons with disabilities [57]. Still, during the pandemic, employers have more capacity to exclude people on public health grounds.

Employers do have an important role in helping control COVID-19 in the workplace through monitoring and surveillance of workers’ health status. However, often this role includes obtaining access to personal information about worker health and social interactions. This data collection pits the importance of health and safety against expectations of personal privacy [58] and generates the possibility for algorithmic health discrimination.

Routine health checks are a common COVID-19 surveillance measure [58]. But accurately testing a person for COVID-19 can involve a wait time at a testing venue [59] (during which a person is unable to work) and can be expensive [60], so not all employers use this approach. Absent a medically administered test, distinguishing between a person with COVID-19 and one who is not currently infected but happens to have one of the symptoms associated with this virus is not easy. The US Centers for Disease Control and Prevention (“CDC”) recognizes this challenge and advises businesses that they may “not want to treat every worker with a single non-specific symptom (e.g., a headache) as a suspect case of COVID-19 and send them home until they meet criteria for discontinuation of isolation” [61]. Rather than needlessly excluding workers from their work, the CDC suggests businesses should consider focusing screening questions on new or unexpected symptoms as opposed to known chronic conditions [61].
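
As a schematic illustration of the CDC’s suggested approach, the sketch below screens only for COVID-associated symptoms that are new relative to a worker’s disclosed chronic baseline. The symptom list, function, and decision messages are hypothetical and are not drawn from any CDC tooling.

```python
# Hypothetical screener following the CDC's suggestion to focus on new or
# unexpected symptoms rather than symptoms of known chronic conditions.
COVID_ASSOCIATED = {"fever", "cough", "shortness_of_breath", "headache", "fatigue"}

def screen(reported: set, chronic_baseline: set) -> str:
    # Only COVID-associated symptoms that are NOT part of the worker's
    # disclosed chronic baseline trigger further action.
    new_symptoms = (reported & COVID_ASSOCIATED) - chronic_baseline
    if new_symptoms:
        return f"refer for testing: new symptom(s) {sorted(new_symptoms)}"
    return "no new COVID-associated symptoms; no exclusion from work"

# A worker with chronic migraines reporting a headache is not flagged...
print(screen({"headache"}, chronic_baseline={"headache"}))
# ...but a newly reported fever is.
print(screen({"headache", "fever"}, chronic_baseline={"headache"}))
```

Note that even this narrower approach presupposes that the worker has disclosed a chronic baseline to the employer, which is exactly the kind of disclosure many workers would prefer to avoid.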

The problem with biased, inaccurate, or overreaching COVID-19 health surveillance and response is that it may cause people with chronic medical conditions to experience exclusion from the workplace when they pose no significant risk to themselves or others. For example, researchers are trying to map the genetic correlates of COVID-19 progression [62] to promote enhanced medical responses. While certainly valuable, over-reliance on this research in screening tools increases the chance of adverse outcomes based on a potential or unfounded risk [62]. Screening can also be used, perhaps excessively, to identify workers who may be at high risk (e.g., who may pose a “direct threat” to self or others in the workplace, to use the ADA terminology) and remove them from high-exposure jobs.

Various measures of health status may reveal COVID-19 symptoms, but these same symptoms may be characteristic of other conditions as well. Many employers scan workers and customers using heat-sensing technologies [63]. These technologies identify a person with a high temperature and may result in that person being denied access to work and sent for further testing. But they may also force unwanted or unwarranted self-disclosure of a disability or health condition, as may be the case with pregnancy [64]. Other indicators associated with COVID-19, such as high blood pressure, can likewise be both an ADA-covered disability and a symptom of conditions ranging from cancer treatment and survivorship [65,66,67] to urinary tract infections [68], HIV/AIDS [69], and respiratory conditions such as asthma [70], or of other conditions unrelated to COVID-19 [71].

Businesses assess their own risk tolerance and unique circumstances and can require additional health measures [72]. As vaccines are distributed, for example, a supplemental COVID-19 public health measure may be to require workers to provide proof of antibodies. Antibody tests purport to indicate who is immune from contracting or spreading COVID-19, and thus might be used to indicate who can enter workplaces and other spaces safely [73]. Such proof-of-antibody requirements are, so far, limited to certain professions, such as health care [74].

COVID-19 arguably makes so-called immunity passports more probable at work and elsewhere, such as for air travel. The US government requires proof of COVID-19 testing and clearance for people when entering the country [75]. Some universities are now requiring that staff and students be vaccinated, with exemptions on religious and medical grounds (the latter requiring self-disclosure) [76].

Immunity passport requirements raise various issues, including socio-economic challenges: gaining access to vaccines and an immunity passport may be expensive. They also raise issues of what other medical conditions may be included in such passports. Even if passports are “voluntary,” in practice the absence of a passport may result in people not getting work or having their access to premises restricted. While individuals with weakened immune systems, such as people undergoing cancer treatment, may be able to opt out of vaccination due to safety concerns or other medical reasons [77], such an exemption necessitates disclosure of a disability or a health condition, possibly exposing workers to potential workplace bias or discrimination [78]. Of course, health “passports” in the disability community are not new. In Australia, disability assistance or service animals may have voluntary government-issued identification to reduce fraud and enable handlers and their animals to gain access [79].

Algorithmic decision-making at work

ADM is transforming how opportunities are allocated in work and in public benefits. ADM is increasingly used in autonomous technologies, such as autonomous vehicles [80, 81], drone delivery systems [82], smart-home technologies [83], virtual and augmented realities [84], and technologies in smart cities [85]. ADM partially replaces or supplements human decision-making in areas such as assigning credit scores to individuals [86], determining access to health care services [87], making decisions associated with policing and criminal justice [88], and, as analyzed below, making decisions about work and governmental benefits such as Social Security. Workplace ADM is used to increase the accuracy of predictive decision-making in human resources [89, 90], such as for assessing teamwork competency and improving formative assessment [91, 92].

Certainly, ADM can provide benefits to workers with disabilities. ADM is used to help persons with disabilities gain the skills to obtain and retain work by improving educational support provided to students with disabilities [93, 94]. Algorithmic technology is also used to support people with health concerns and conditions at work. For example, ADM can help cancer survivors manage their pain more effectively [95] and increase their capacity to work. If a cancer survivor acquires a disability, algorithmic technologies can enhance the quality of support measures, including devices that support navigation in fixed or dynamic environments [96,97,98,99] and smart mobility devices [100]. On the mental health side, they can provide medical and social interventions for persons with mental health disabilities such as depression and other conditions [101,102,103].

Workplace ADM uses highly structured data sets, including variables of gender, age, start and completion dates of work activities, and work patterns, as well as less structured data sets, such as recruitment videos that are mined for facial and vocal cues [104]. ADM can utilize metadata collected from the Internet during the job recruitment process [105]. Once people are employed, ADM provides employers access to a vast amount of structured and unstructured data on their workers [106]. Audit trail logs record when a worker uses a device or database [103], and keystroke-logging software records activity at the keyboard [107]. Workers are also surrounded by sensors that track their movements in the workplace [108]: swipe cards, wireless access points, and others. These data can be mined for a variety of insights [109].

COVID-19 has driven a massive shift to remote work, leading employers to use sensors and technical measures to monitor workers in their homes, often even when those workers are engaged in non-work-related activities [110]. Private and public social media, of course, can be additionally mined for data on individuals and their associations whether at home or in the workplace [111].

Equality concerns with algorithm-driven decision-making

All this information collected through COVID-19 health surveillance measures can be used to discriminate against workers, intentionally or unintentionally, in ways unrelated to the public health emergency. The collection of data as input into ADM and the use of resulting decisions create a range of potential concerns for workers with and without health concerns. The increased power and use of workplace surveillance technologies mean that workers’ private lives are more likely to be intruded upon [112]. This intrusion may be especially concerning for workers with invisible and previously undisclosed disabilities and medical conditions—those that cannot be readily observed when a person interacts in society, as is typically the case for post-treatment cancer, mental health, and cognitive conditions [113]. Workers who have decided not to disclose their medical conditions to their employers are now at heightened risk of having those nonvisible conditions detected and acted upon by ADM [114] through the mining of “emergent medical data” [115], a process that draws together digital traces to discover medical information about individuals [115].
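
To make the idea of “emergent medical data” concrete, the following sketch shows, in deliberately simplified form, how individually innocuous digital traces might be combined into an inferred health state. The traces, weights, and threshold are entirely hypothetical; production systems would use learned models, which are harder still to inspect or contest.

```python
# Hypothetical illustration of "emergent medical data": individually
# innocuous digital traces are combined into an inferred health state.
TRACE_WEIGHTS = {
    "searched_fatigue_management": 0.3,    # web search history
    "purchased_mobility_aid": 0.2,         # retail loyalty data
    "repeated_midday_calendar_gaps": 0.3,  # workplace calendar metadata
    "wearable_low_activity_weeks": 0.4,    # fitness-tracker exports
}
FLAG_THRESHOLD = 0.75  # arbitrary cutoff for this illustration

def inferred_risk(traces):
    # Naive additive score, capped at 1.0. Real systems would use learned
    # models, making the inference even harder to inspect or contest.
    return min(1.0, sum(TRACE_WEIGHTS.get(t, 0.0) for t in traces))

score = inferred_risk({"searched_fatigue_management",
                       "repeated_midday_calendar_gaps",
                       "wearable_low_activity_weeks"})
if score >= FLAG_THRESHOLD:
    print(f"flagged as possibly having an undisclosed condition (score={score:.2f})")
```

The worker in this example never disclosed anything to the employer; the “disclosure” is manufactured by the system itself, which is the core of the privacy concern described above.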

The introduction of new resources in the workplace that assist in managing health benefits for workers has also made the collection of health data more commonplace. Castlight’s health navigation platform offers personalized assistance to employees based on health indicators [116]. Castlight has recently added programs such as Working Well, an application to assist in planning a safe return to work during the COVID-19 pandemic and to support workers once they return [117]. Programs like Working Well rely on machine learning to forecast health trends and to assess the risk of exposure to COVID-19 [118].

As part of using these resources, employers can track workers’ COVID-19-related symptoms, conduct contact tracing, and establish COVID-19 protocols based on their needs [119]. For instance, as part of the Working Well app, workers can opt in to receive mental health care [120]. Microsoft has also introduced ProtectWell, an application to help employers screen for COVID-19 symptoms and exposure in the workplace. Utilizing AI, ProtectWell calculates the risk of infection for employees and directs workers for testing as indicated [121].

Alongside providing the crucial benefit of controlling the spread of COVID-19 in the workplace, the ubiquitous presence of technological platforms requires consideration of the equity of ADM. While data collected through such applications may be deidentified and not permanently stored, the automated decisions based on the health data collected can be discriminatory. This may be the case, for example, for those who report (or are subject to detection of) COVID-19-related symptoms resulting from other health conditions, or for those who are deemed “high risk” because of those health conditions. Automated decisions that flag cancer survivors as high risk, for example, may force individuals to miss work due to mandated testing and quarantine, as well as require disclosure of their health conditions.

Others have considered the implications of health data collection platforms for privacy violations and discriminatory decision-making. Some highlight Castlight’s mining of employee health data to identify a variety of worker health needs and predict their future risk of getting sick [122, 123]. One Castlight product predicts which employees may soon become pregnant based on information from their insurance claims [122]. While such data are shared only at the aggregate level, they risk enabling discriminatory behaviors, such as, in the extreme case, hiring fewer women.

ADM processes can therefore have negative implications for cancer survivors who, appropriately, desire to keep their medical histories private from their employers. Special attention should be paid, for example, to how wearable devices are mined for information [124], considering that cancer survivors often use wearable technologies to help manage their recovery [125]. Cancer treatment can have long-term cognitive impacts on survivors [126], and data mining can discover that a person has cognitive or mental health conditions even if they have exercised their right of privacy and decided not to inform their employer [126]. Cox-Martin and colleagues found that nearly 17% of working-age cancer survivors reported experiencing cancer-related pain [127]. While clinical interventions targeting chronic pain can have positive impacts on the continued work success of survivors, increased health scrutiny means that pain-related symptoms stand a greater chance of being detected and scored negatively in an ADM process.

ADM has also attracted attention for its potential to monitor the capacity to work for purposes of governmental social safety-net systems such as Social Security benefits [128]. Social Security benefits are meant to enable persons with disabilities, including some cancer survivors, to work by providing funds for the purchase of assistive technologies. They also provide financial and practical supports pertaining to transportation to work, access to disability-specific training, and, for people seeking employment, supports in the job search and hiring process. These supports include information and programs for assessing teamwork competency and improving formative work and job assessments [129].

Although these and other uses of ADM can create efficiencies for employers [130], they are often implemented without public transparency and are prone to biased errors. Australia’s robo-debt scandal, for example, involved the federal government erroneously determining that people had been overpaid in social security payments and under the National Disability Insurance Scheme [131,132,133]. Automated demand letters were sent to people wrongly determined to owe money, and some had their support suspended until they took steps to resolve the error.

The potential negative impacts of ADM processes can exacerbate the challenges of people undergoing cancer treatment, whose time, energy, and mental and emotional stamina have already been impacted by their conditions. Those who have acquired a secondary disability from their cancer confront even greater challenges. Workers with disabilities often perform unpaid tasks when reasonable accommodation laws fail to provide work equality [134, 135], manage a heavy load of emotional labor [136], need to identify the inequalities they can accept [137], and are called on to participate in strategic disability initiatives [138].

With the increased availability of health data and the use of ADM in the workplace, cancer survivors and other people with disabilities are potentially exposed to even more inequalities than they have previously faced. An “algocracy” can develop in which the output from ADM is uncritically regarded as reliable and valid and acted upon [139].

Addressing algorithmic health discrimination at work

Disability antidiscrimination laws such as the ADA may be used, with varying effectiveness, to combat algorithmic health discrimination at work and in governmental safety-net payments. With respect to the use of ADM during the pandemic, however, many of the protections that would otherwise shield workers from discrimination, such as the ADA, the Rehabilitation Act, the Genetic Information Nondiscrimination Act (“GINA”), and other Equal Employment Opportunity (“EEO”) laws and guidance, have been modified to allow screening and quarantining of workers exposed to COVID-19. Voluntary codes, especially with respect to principled AI frameworks (discussed below), illustrate growing acceptance that inequalities resulting from the use of these ADM technologies will need to be addressed at the industry-wide level.

Reconsidering current legal protections

The increased use of AI and ADM in the workplace and other settings has raised new concerns related to transparency, privacy, and inequality. Scholars have called for increased scrutiny of how algorithms are used in employment, how data are being used and stored, and the implications this may have for employees [140, 141]. Current laws such as the ADA, GINA, and the Health Insurance Portability and Accountability Act (“HIPAA”) do not offer clear protections for the new ways data are extracted and used in the workplace, and scholars have proposed new interpretations of these laws to protect employees [123, 141]. Some have called upon social institutions to regulate how AI systems are used and to clarify what should be considered “health” data [15, 142]. Others suggest addressing how algorithms are created and making further AI development more inclusive. Failure to capture the unique circumstances of people with disabilities in the data, and to accurately represent their experiences, is a potential source of bias in AI [15].

Antidiscrimination laws, such as the ADA, can potentially combat new forms of algorithmic health discrimination [143]. But these laws are primarily enforced by those who allege discrimination with the support of advocates [144]. This places a burden upon already under-resourced populations [145]. In addition, the person alleging discrimination must link the discriminatory treatment to their disability—requiring disclosure that some workers may not want [143].

One fundamental challenge in proving algorithmic health discrimination is a basic lack of awareness by individuals that exclusion on a prohibited ground has occurred. When employers are hiring, promoting, and assigning work tasks, it is often difficult for individuals to know the exact nature of the assessments involved. Determining whether there were appropriate reasons for, or any underlying bias associated with, hiring and advancement decisions is thus challenging [146]. Brown, Shetty, and Richardson [38] have stated that algorithmic disability discrimination during recruitment can “feel like an invisible injustice because it is one of the hardest types of employment discrimination to detect.”

To determine if persons with disabilities have been unfairly excluded by ADM on the basis of their disabilities, it is necessary to understand the underlying organizational and industry patterns in this regard. Yet employers rarely inform candidates of the details of a recruitment process [147]. Moreover, even when candidates know ADM is used in the hiring or personnel process, they are often unaware of the algorithm used. Should aggrieved individuals be able to access the code, it still may be difficult to pinpoint the precise data used to make the decision. ADM processes continually alter inputs and draw on changing sources of big data.

The complexity of unpacking the ADM process may necessitate increased transparency by employers that use it [148,149,150]. Burdon calls for reformulating information privacy laws to regulate the consequences of ubiquitous autonomous data collection [151]. In related work, Burdon and Harpur have proposed information protections to reduce the impact of algorithmic disability discrimination [39]. They argue that due process protections are needed in information privacy laws to lessen the embedded biases and inequalities of these systems.

Voluntary principled ethical frameworks

Voluntary agreements may foster innovative activities that transcend minimal compliance with the law and address algorithmic health discrimination resulting from the use of ADM at work. As evidenced by web accessibility guidelines, voluntary standards can become widely accepted and serve as a guide in developing and enforcing hard law regulatory frameworks [152].

Principled ethical frameworks help regulate how algorithms are developed, deployed, and reviewed [153,154,155]. These frameworks are relevant to promoting health equality in ADM. Fjeld and others have surveyed leading ethical guidelines and frameworks for AI [156]. Their analyses illustrate commitments to exceed minimal compliance by industry groups such as the Information Technology Industry Council (“ITI”), which has issued AI Policy Principles, and by leading companies such as Google and Microsoft, each of which has published its own AI Principles.

The Principled Artificial Intelligence Report identifies prominent AI frameworks and maps thematic trends relating to industry sector norms [156]. The principles identified include Privacy and Consent; Control over Use of Data; Recommendations for Data Protection Laws; Ability to Restrict Processing and Right to Rectification and Erasure; Accountability; Evaluation and Auditing; Verifiability and Replicability; Liability and Legal Responsibility; Creation of a Monitoring Body; Transparency and Open-Source Data and Algorithms; Notification when Interacting with an AI; and Non-discrimination and the Prevention of Bias.

Although these frameworks were created by a diverse range of governmental, non-governmental, and private actors, common themes can be identified across them. These include Privacy; Accountability; Safety and Security; Transparency; Fairness and Non-discrimination; Human Control of Technology; Professional Responsibility; and Promotion of Human Values. These themes are relevant to persons experiencing inequalities flowing from COVID-19 and algorithmic health discrimination. Yet a review of the principled AI frameworks suggests that disability receives relatively less attention than other protected attributes. Across the frameworks, gender is mentioned sixteen times and race eleven times, but disability only twice. The frameworks were published before the COVID-19 pandemic. It is time to consider how disability and health might become more central in future frameworks.

Conclusions and recommendations

COVID-19 has made individuals from all parts of society more vulnerable to algorithmic health discrimination. Almost forty-seven million Americans have survived the disease [157], and others have had to take measures after exposure to a person who acquired COVID-19. An undefined percentage have been forced to isolate and take precautions due to other vulnerabilities to the pandemic. All of these people are vulnerable to inequalities flowing from ADM, and they have limited legal recourse. This common experience has created the need for a unified approach that focuses on algorithmic fairness and equality, with an especially urgent need for a focus on disability and chronic health conditions such as cancer.

History shows that groups facing inequalities can draw together to create a legislative, research, and advocacy agenda [158,159,160]. The cancer survivorship advocacy movement illustrates the challenges and benefits of building such broad coalitions. Finding a common sense of identity can be a challenge—people who have a cancer diagnosis and those who have completed primary treatment do not always regard themselves as “survivors” [161]. Survivorship can imply “defeating” the disease, yet cancer often goes into remission rather than being survived over the longer term [162]. Nevertheless, the label of “cancer survivorship” has enabled powerful informational and advocacy efforts to emerge, which in turn have helped advance the rights of diverse people across the cancer journey [163].

The shortcomings of legislative and regulatory interventions can also motivate people to collaborate across diverse individual experiences [17, 31, 164]. Here, too, different views of labeling or identity can be a challenge, but some have been able to pursue remedies. Many people with cancer and cancer survivors may not regard themselves as “disabled,” and courts have held that some forms of cancer are not necessarily a disability for purposes of the ADA [165]. Older persons often do not regard themselves as disabled, yet they often avail themselves of disability discrimination protections [166, 167]. People who use sign language often regard themselves as a linguistic minority, yet some use disability regulatory protections [168].

Algorithmic health discrimination, spurred by COVID-19 health surveillance, differs from previous instances of inequality due to the large and diverse numbers of individuals and groups impacted. COVID-19 health surveillance cuts across society, affecting not only those with disabilities or with health concerns, the elderly, communities of color, and socio-economically disadvantaged communities, but also those who have previously been unlikely to experience disadvantage. Furthermore, it has the potential to intensify discrimination experiences of those with multiple intersecting marginalized identities [169, 170].

Today’s situation creates a unique opportunity to mobilize groups across society, whether actual victims of algorithmic health discrimination or potential future victims, to call for increased transparency to help reduce inequalities. Considering how COVID-19 health surveillance is creating new loci of algorithmic inequalities, there is a pressing need for open and broad discussions of how society constructs and addresses disability. Furthermore, the lack of transparency and the ambiguity associated with ADM and AI can make it difficult for workers to spot and report instances of discrimination, thus increasing the need for legislative reform, research, and advocacy efforts [15, 171, 172].

Harpur and Blanck have posited ways that disability-inclusive employment policies can respond to the pandemic, rapid technological changes, and the “new normal” in the world of work [47]. Their agenda includes examination of who “owns” workplace data, how the data are used and monitored, and how the outcomes are assessed. This agenda can help to foster more informed approaches for data scientists, workforce development professionals, human resource personnel, organizational managers, employers, governmental benefits specialists, and persons with disabilities. The outcomes should help to improve the capacity of systems to avoid algorithmic disability discrimination and to find means of enhancing equality at work.

For cancer survivors, and persons with disabilities generally, there is a need for enhanced access to knowledge to explore alternative paths to employment, career advancement, and job retention. Today’s norms, while unsettling, create opportunities for informed discussion about societal notions of “ability” and ideas about diversity in societies ravaged by the public health crisis. These discussions can lead to increased and better-informed scrutiny of the use of ADM technologies that impact individual privacy and rights. These prospects will, in turn, open new paths for notions of equality, uses of technology, and innovation as informed by diverse individuals, including cancer survivors and persons with disabilities.

Data availability

Not applicable.

Code availability

Not applicable.

References

  1. Fisher J, Languilaire JC, Lawthom R, Nieuwenhuis R, Petts RJ, Runswick-Cole K, Yerkes MA. Community, work, and family in times of COVID-19. Community Work Fam. 2020;23(3):247–52.

  2. Valdez RS, Rogers CC, Claypool H, Trieshmann L, Frye O, Wellbeloved-Stone C, Kushalnagar P. Ensuring full participation of people with disabilities in an era of telehealth. J Am Med Inform Assoc. 2021;28(2):389–92.

  3. Pettinicchio D, Maroto M, Chai L, Lukk M. Findings from an online survey on the mental health effects of COVID-19 on Canadians with disabilities and chronic health conditions. Disabil Health J. 2021:101085.

  4. Massicotte V, Ivers H, Savard J. COVID-19 pandemic stressors and psychological symptoms in breast cancer patients. Curr Oncol. 2021;28(1):294–300.

  5. Ayubi E, Bashirian S, Khazaei S. Depression and anxiety among patients with cancer during COVID-19 pandemic: a systematic review and meta-analysis. J Gastrointest Oncol. 2021;52:499–507.

  6. Schwartz AE, Munsell EG, Schmidt EK, Colón-Semenza C, Carolan K, Gassner DL. Impact of COVID-19 on services for people with disabilities and chronic health conditions. Disabil Health J. 2021;14(3):101090.

  7. Solomon MZ, Wynia MK, Gostin LO. COVID-19 crisis triage – optimizing health outcomes and disability rights. N Engl J Med. 2020;383(5):e27.

  8. Couch KA, Fairlie RW, Xu H. Early evidence of the impacts of COVID-19 on minority unemployment. J Public Econ. 2020;192:104287.

  9. Farrell D, Ganong P, Greig F, Liebeskind M, Noel P, Vavra J. Consumption effects of unemployment insurance during the COVID-19 pandemic. Available at SSRN 3654274. 2020.

  10. Kong E, Prinz D. Disentangling policy effects using proxy data: which shutdown policies affected unemployment during the COVID-19 pandemic? J Public Econ. 2020;189:104257.

  11. Wang J, Yang J, Iverson BC, Kluender R. Bankruptcy and the COVID-19 crisis. Available at SSRN 3690398. 2020.

  12. Locked out of the labour market: the impact of COVID-19 on disabled adults in accessing good work – now and into the future. Leonard Cheshire. 2020. https://www.leonardcheshire.org/sites/default/files/2020-10/Locked-out-of-labour-market.pdf. Accessed 05 Oct 2021.

  13. COVID-19 and the rights of persons with disabilities: guidance. United Nations Human Rights Office of the High Commissioner. 2020. https://www.ohchr.org/Documents/Issues/Disability/COVID-19_and_The_Rights_of_Persons_with_Disabilities.pdf. Accessed 05 Oct 2021.

  14. Algorithm Watch, Bertelsmann Stifung. Automated decision-making systems in the COVID-19 pandemic: a European perspective. Algorithm Watch. 2020. https://algorithmwatch.org/en/automating-society-2020-covid19/. Accessed 05 Oct 2021.

  15. Packin NG, Lev-Aretz Y. Learning algorithms and discrimination. In: Research handbook on the law of artificial intelligence. Edward Elgar Publishing; 2018.

  16. Davis LJ. Constructing normalcy: The bell curve, the novel, and the invention of the disabled body in the nineteenth century. In: Davis LJ, editor. The disability studies reader. 2nd ed. New York: Routledge; 2006. p. 3–16.

  17. Blanck P, Hyseni F, Wise FA. Diversity and inclusion in the American legal profession: discrimination and bias reported by lawyers with disabilities and lawyers who identify as LGBTQ+. Am J Law Med. 2021;47(1):9–61.

  18. de Boer A, Torp S, Popa A, Horsboel T, Zadnik V, Rottenberg Y, Bardi E, Bultmann U, Sharp L. Long-term work retention after treatment for cancer: a systematic review and meta-analysis. J Cancer Surviv. 2020;14(2):135–50.

  19. Rottenberg Y, de Boer A. Risk for unemployment at 10 years following cancer diagnosis among very long-term survivors: a population based study. J Cancer Surviv. 2020;14:151–7.

  20. Ganz PA. Cancer survivorship: today and tomorrow. New York: Springer; 2007.

  21. Hodges AC. Working with cancer: how the law can help survivors maintain employment. Wash L Rev. 2015;90(3):1039–112.

  22. Canfield L. Cancer patients’ prognosis: how terminal are their employment prospects? Syracuse L Rev. 1987;38:801–30.

  23. Strauser DR, Leslie MJ, Rumrill P, McMahon B, Greco C. The employment discrimination experiences of younger and older Americans with cancer under Title I of the Americans with Disabilities Act. J Cancer Surviv. 2020;14(5):614–23.

  24. Klaver KM, Duijts S, Engelhardt E, Geusgens C, Aarts M, Ponds R, van der Beek A, Schagen S. Cancer-related cognitive problems at work: experiences of survivors and professionals. J Cancer Surviv. 2020;14(2):168–78.

  25. Stergiou-Kita M, Pritlove C, Kirsh B. The “Big C”—stigma, cancer, and workplace discrimination. J Cancer Surviv. 2016;10(6):1035–50.

  26. Gibson SM. The Americans with Disabilities Act protects individuals with a history of cancer from employment discrimination: myth or reality. Hofstra Lab Emp LJ. 1998;16:167–200.

  27. McEvoy SA. Cancer and employment discrimination: fighting job bias is worse than fighting the disease. Labor Law J. 1990;41(6):323–36.

  28. Bonilla I. Cancer as a disability after the American with Disabilities Act Amendments Act. Fed Lawyer. 2012;59:12–4.

  29. Hoffman B. The law of intended consequences: did the Americans with Disabilities Act Amendments Act make it easier for cancer survivors to prove disability status? NYU Ann Surv Am L. 2012;68:843–90.

  30. Neumark D, Bradley CJ, Henry M, Dahman B. Work continuation while treated for breast cancer: the role of workplace accommodations. Ind Labor Relat Rev. 2015;68(4):916–54.

  31. Blanck P, Hyseni F, Wise FA. Diversity and inclusion in the American legal profession: workplace accommodations for lawyers with disabilities and lawyers who identify as LGBTQ+. J Occup Rehab. 2020;30(4):537–64.

  32. Leslie M, Strauser DR, McMahon B, Greco C, Rumrill PD. The workplace discrimination experiences of individuals with cancer in the Americans with Disabilities Act Amendments Act era. J Occup Rehab. 2020;30(1):115–24.

  33. de Rijk A, Amir Z, Cohen M, Furlan T, Godderis L, Knezevic B, Miglioretti M, Munir F, Popa AE, Sedlakova M, Torp S. The challenge of return to work in workers with cancer: employer priorities despite variation in social policies related to work and health. J Cancer Surviv. 2020;14(2):188–99.

  34. Tiedtke CM, De Casterlé BD, Frings-Dresen MH, De Boer AG, Greidanus MA, Tamminga SJ, De Rijk AE. Employers’ experience of employees with cancer: trajectories of complex communication. J Cancer Surviv. 2017;11(5):562–77.

  35. Blanck P. Why America is better off because of the Americans with Disabilities Act and the Individuals with Disabilities Education Act. Touro Law Rev. 2019;35:605–18.

  36. Schur L, Nishii L, Adya M, Kruse D, Bruyère SM, Blanck P. Accommodating employees with and without disabilities. Hum Resour Manage. 2014;53(4):593–621.

  37. Harpur P. From universal exclusion to universal equality: regulating ableism in a digital age. N KY Law Rev. 2013;40:529–65.

  38. Brown LXZ, Shetty R, Richardson M. Algorithm-driven hiring tools: innovative recruitment or expedited disability discrimination? Center for Democracy & Technology. 2020. https://cdt.org/insights/report-algorithm-driven-hiring-tools-innovative-recruitment-or-expedited-disability-discrimination/. Accessed 05 Oct 2021.

  39. Burdon M, Harpur P. Re-conceptualising privacy and discrimination in an age of talent analytics. Univ NSW Law J. 2014;37(2):679–712.

  40. Roberts JL. Protecting privacy to prevent discrimination. William Mary Law Rev. 2014;56(6):2097–174.

  41. Sprague R. No surfing allowed: a review & analysis of legislation prohibiting employers from demanding access to employees’ & job applicants’ social media accounts. Albany Law J Sci Technol. 2013;24:481–513.

  42. Lam H. Social media dilemmas in the employment context. Empl Relat. 2016;38(3):420–37.

  43. Harpur P. Symposium: how justified are restrictions in the name of health during COVID-19? COVID-19 in Australia: protecting public health through restricting civil rights and risking the democratic rule of law. Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics at Harvard Law School. 2020. https://blog.petrieflom.law.harvard.edu/2020/05/14/australia-global-responses-covid19/. Accessed 05 Oct 2021.

  44. Nojeim G. CDT statement and announcement of coronavirus: data for life and liberty task force formation. Center for Democracy & Technology. 2020. https://cdt.org/insights/cdt-statement-and-announcement-of-coronavirus-data-for-life-and-liberty-task-force-formation/. Accessed 05 Oct 2021.

  45. Thayyil J, Kuniyil V, Cherumanalil JM. COVID-19: digital contact tracing technologies and ethical challenges. Int J Community Med Public Health. 2020;7(7):2854–61.

  46. Kanter AS. Can faculty be forced back to campus? The Chronicle of Higher Education. 2020. https://www.chronicle.com/article/can-faculty-be-forced-back-on-campus. Accessed 05 Oct 2021.

  47. Harpur P, Blanck P. Gig workers with disabilities: opportunities, challenges, and regulatory response. J Occup Rehab. 2020;30(4):511–20.

  48. Ashford NA, Katz JI. Unsafe working conditions: employee rights under the Labor Management Relations Act and the Occupational Safety and Health Act. Notre Dame Law Rev. 1976;52(5):802–37.

  49. Carlisle M. Scared to return to work amid the COVID-19 pandemic? These federal laws could grant you some protections. TIME. 2020. https://time.com/5832140/going-back-to-work-coronavirus-rights/. Accessed 05 Oct 2021.

  50. Liang W, Guan W, Chen R, Wang W, Li J, Xu K, Li C, Ai Q, Lu W, Liang H, Li S. Cancer patients in SARS-CoV-2 infection: a nationwide analysis in China. Lancet Oncol. 2020;21(3):335–7.

  51. Onder G, Rezza G, Brusaferro S. Case-fatality rate and characteristics of patients dying in relation to COVID-19 in Italy. JAMA. 2020;323(18):1775–6.

  52. Robilotti EV, Babady NE, Mead PA, Rolling T, Perez-Johnston R, Bernardes M, Bogler Y, Caldararo M, Figueroa CJ, Glickman MS, Joanow A. Determinants of COVID-19 disease severity in patients with cancer. Nat Med. 2020;26(8):1218–23.

  53. Americans with Disabilities Act of 1990, ADA Amendments Act of 2008, 42 U.S.C. § 12101(a)(8) (2008).

  54. Ibekwe CS. Legal implications of COVID-19 on the employers’ duty to provide a safe work environment. J Law Policy Glob. 2020;101:196–206.

  55. Griffin F. COVID-19 and public accommodations under the Americans with Disabilities Act: getting Americans safely back to restaurants, theatres, gyms, and ‘normal.’ St Louis Univ Law J. Forthcoming 2021;65(2).

  56. Harpur P, French B. Is it safer without you? Analysing the intersection between work health and safety and anti-discrimination laws. J Health Saf Environ. 2014;30(1):167–83.

  57. Harpur P. Ableism at work: disablement and hierarchies of impairment. Cambridge: Cambridge University Press; 2019.

  58. Bodie MT, McMahon M. Employee testing, tracing, and disclosure as a response to the coronavirus pandemic. Wash Univ J Law Policy. 2021;64:31–61.

  59. Goldstein J, Hubler S, Wu KJ. Long coronavirus testing lines hit NYC again. The New York Times. 2020. https://www.nytimes.com/2020/11/13/nyregion/nyc-coronavirus-testing-lines.html. Accessed 05 Oct 2021.

  60. Kliff S. Why coronavirus tests come with surprise bills. The New York Times. 2020. https://www.nytimes.com/2020/09/09/upshot/coronavirus-surprise-test-fees.html. Accessed 05 Oct 2021.

  61. FAQs for workplaces & businesses. Centers for Disease Control and Prevention. 2020. https://www.cdc.gov/coronavirus/2019-ncov/community/general-business-faq.html. Accessed 05 Oct 2021.

  62. Field RI, Orlando AW, Rosoff AJ. Genetics and COVID-19: how to protect the susceptible. Trends Genet. 2020;37(2):106–8.

  63. Li TC. Privacy in pandemic: law, technology, and public health in the COVID-19 crisis. Loyola U Chi L J. Forthcoming 2021;52(3).

  64. Van Natta M, Chen P, Herbek S, Jain R, Kastelic N, Katz E, Struble M, Vanam V, Vattikonda N. The rise and regulation of thermal facial recognition technology during the COVID-19 pandemic. J Law Biosci. 2020:1–17.

  65. An MM, Zou Z, Shen H, Liu P, Chen ML, Cao YB, Jiang YY. Incidence and risk of significantly raised blood pressure in cancer patients treated with bevacizumab: an updated meta-analysis. Eur J Clin Pharmacol. 2010;66(8):813–21.

  66. Gibson TM, Li Z, Green DM, Armstrong GT, Mulrooney DA, Srivastava D, Bhakta N, Ness KK, Hudson MM, Robison LL. Blood pressure status in adult survivors of childhood cancer: a report from the St. Jude Lifetime Cohort Study. Cancer Epidemiol Biomark Prev. 2017;26(12):1705–13.

  67. Sagstuen H, Aass N, Fossa SD, Dahl O, Klepp O, Wist EA, Wilsgaard T, Bremnes RM. Blood pressure and body mass index in long-term survivors of testicular cancer. J Clin Oncol. 2005;23(22):4980–90.

  68. Sinha MD, Postlethwaite RJ. Urinary tract infections and the long-term risk of hypertension. Curr Paediatr. 2003;13(7):508–12.

  69. Fahme SA, Bloomfield GS, Peck R. Hypertension in HIV-infected adults. Hypertension. 2018;71(1):44–55.

  70. Salako BL, Ajayi SO. Bronchial asthma: a risk factor for hypertension? Afr J Med Sci. 2000;29(1):47–50.

  71. Weaver FM, Collins EG, Kurichi J, Miskevics S, Smith B, Rajan S, Gater D. Prevalence of obesity and high blood pressure in veterans with spinal cord injuries and disorders: a retrospective review. Am J Phys Med Rehab. 2007;86(1):22–9.

  72. Pendo E, Gatter R, Mohapatra S. Resolving tensions between disability rights law and COVID-19 mask policies. MD Law Rev. 2020;80:1–12.

  73. Kaminer D. Discrimination against employees without COVID-19 antibodies. NY Law J. 2020. https://doi.org/10.2139/ssrn.3593113.

  74. N.Y. Comp. Codes R. & Regs. tit. 10, §405.3.

  75. Diaz J. U.S. now requires all U.K. travelers to have a negative coronavirus test. NPR. 2020. https://www.npr.org/sections/coronavirus-live-updates/2020/12/25/950218997/u-s-now-requires-all-u-k-travelers-to-have-a-negative-coronavirus-test. Accessed 05 Oct 2021.

  76. Devlin D. Rutgers to require COVID-19 vaccine for students. Rutgers. 2021. https://www.rutgers.edu/news/rutgers-require-covid-19-vaccine-students. Accessed 05 Oct 2021.

  77. COVID-19 vaccines for people with underlying medical conditions. Centers for Disease Control and Prevention. 2021. https://www.cdc.gov/coronavirus/2019-ncov/vaccines/recommendations/underlying-conditions.html. Accessed 05 Oct 2021.

  78. Anderson N, Svrluga S, Stanley-Becker I, Lumpkin L, Aguilar M. Colleges want students to get a coronavirus vaccine. But they’re split on requiring the shots. Washington Post. 2021. https://www.washingtonpost.com/education/2021/06/23/colleges-covid-vaccine-mandates-divide. Accessed 05 Oct 2021.

  79. Harpur P, Bronitt S, Billings P, Verreynne ML, Pachana NA. Regulating fake assistance animals – a comparative review of disability law in Australia and the United States. Anim Law Rev. 2018;24(1):77–97.

  80. Himmelreich J. Ethics of technology needs more political philosophy. Commun ACM. 2019;63(1):33–5.

  81. Lyakina M, Heaphy W, Konecny V, Kliestik T. Algorithmic governance and technological guidance of connected and autonomous vehicle use: regulatory policies, traffic liability rules, and ethical dilemmas. Contemp Read Law Soc Justice. 2019;11(2):15–21.

  82. Kim J, Moon H, Jung H. Drone-based parcel delivery using the rooftops of city buildings: model and solution. Appl Sci. 2020;10(12):4362–82.

  83. Das SK, Cook DJ, Bhattacharya A, Heierman EO, Lin TY. The role of prediction algorithms in the MavHome smart home architecture. IEEE Wirel Commun. 2002;9(6):77–84.

  84. Fundación ONCE, ILO Global Business and Disability Network. An inclusive digital economy for people with disabilities. Disability Hub Europe. 2021. http://www.businessanddisability.org/wp-content/uploads/2021/02/inclusiveDigitalEconomy.pdf. Accessed 05 Oct 2021.

  85. Komninos N, Panori A, Kakderi C. Smart cities beyond algorithmic logic: digital platforms, user engagement and data science. In: Smart cities in the post-algorithmic era. Edward Elgar Publishing; 2019.

  86. Reddix-Smalls B. Credit scoring and trade secrecy: an algorithmic quagmire or how the lack of transparency in complex financial models scuttled the finance market. UC Davis Bus Law J. 2011;12(1):87–124.

  87. Hoffman S. The emerging hazard of AI-related health care discrimination. Hastings Cent Rep. 2021;51(1):8–9.

  88. Chiao V. Fairness, accountability and transparency: notes on algorithmic decision-making in criminal justice. Int J Law Context. 2019;15(2):126–39.

  89. Edwards MR, Edwards KA. Predictive HR analytics: mastering the HR metric. 2nd ed. London: Kogan Page; 2019.

  90. Meyers TD, Vagner L, Janoskova K, Grecu I, Grecu G. Big data-driven algorithmic decision-making in selecting and managing employees: advanced predictive analytics, workforce metrics, and digital innovations for enhancing organizational human capital. Psychosociol Issues Hum Resour Manag. 2019;7(2):49–54.

  91. Fidalgo-Blanco Á, Sein-Echaluce ML, García-Peñalvo FJ, Conde MA. Using learning analytics to improve teamwork assessment. Comput Human Behav. 2015;47:149–56.

  92. Tempelaar DT, Heck A, Cuypers H, van der Kooij H, van de Vrie E. Formative assessment and learning analytics. In: Suthers D, Verbert K, Duval E, Ochoa X, editors. Third International Conference on Learning Analytics and Knowledge (LAK 2013). New York: Association for Computing Machinery; 2013. p. 205–9.

  93. Drigas AS, Ioannidou RE. Artificial intelligence in special education: a decade review. Int J Eng Educ. 2012;28(6):1366–72.

  94. Garg S, Sharma S. Impact of artificial intelligence in special needs education to promote inclusive pedagogy. Int J Inf Educ Technol. 2020;10(7):523–7.

  95. Du Pen SL, Du Pen AR, Polissar N, Hansberry J, Kraybill BM, Stillman M, Panke J, Everly R, Syrjala K. Implementing guidelines for cancer pain management: results of a randomized controlled clinical trial. J Clin Oncol. 1999;17(1):361–70.

  96. Cortes U, Martinez-Velasco A, Barrue C, Martin EX, Campana F, Annicchiarico R, Caltagirone C. Towards an intelligent service to elders mobility using the i-Walker. In: AAAI Fall Symposium: AI in Eldercare: New Solutions to Old Problems; 2008 Nov 7; Arlington. Menlo Park: Association for the Advancement of Artificial Intelligence; 2008. p. 32–38.

  97. Lacey G, Dawson-Howe KM. The application of robotics to a mobility aid for the elderly blind. Robot Auton Syst. 1998;23(4):245–52.

  98. Morrison C, Cutrell E, Dhareshwar A, Doherty K, Thieme A, Taylor A. Imagining artificial intelligence applications with people with visual disabilities using tactile ideation. In: 19th International ACM SIGACCESS Conference on Computers and Accessibility; Baltimore. New York: Association for Computing Machinery; 2017. p. 81–90.

  99. Poggi M, Mattoccia S. A wearable mobility aid for the visually impaired based on embedded 3D vision and deep learning. In: IEEE Symposium on Computers and Communications; Messina. New York: Institute of Electrical and Electronics Engineers; 2016. p. 208–13.

  100. Kim CH, Jung JH, Kim BK. Design of an intelligent wheelchair for the motor disabled. In: Bien ZZ, Stefanov D, editors. Advances in Rehabilitation Robotics. Berlin: Springer; 2004. p. 299–310.

  101. de Andrade NN, Pawson D, Muriello D, Donahue L, Guadagno J. Ethics and artificial intelligence: suicide prevention on Facebook. Philos Technol. 2018;31(4):669–84.

  102. Fonseka TM, Bhat V, Kennedy SH. The utility of artificial intelligence in suicide risk prediction and the management of suicidal behaviors. Aust N Z J Psychiatry. 2019;53(10):954–64.

  103. McKernan LC, Clayton EW, Walsh CG. Protecting life while preserving liberty: ethical recommendations for suicide prevention with artificial intelligence. Front Psychiatry. 2018;9:650.

  104. Köchling A, Riazy S, Wehner MC, Simbeck K. Highly accurate, but still discriminatory: a fairness evaluation of algorithmic video analysis in the recruitment context. Bus Inf Syst Eng. 2021;63(1):39–54.

  105. Mondore S, Douthitt S, Carson M. Maximizing the impact and effectiveness of HR analytics to drive business outcomes. People Strategy. 2011;34(2):20–7.

  106. Köchling A, Wehner MC. Discriminated by an algorithm: a systematic review of discrimination and fairness by algorithmic decision-making in the context of HR recruitment and HR development. Bus Res. 2020;13(3):795–848.

  107. Jeske D, Calvard T. Big data: lessons for employers and employees. Empl Relat. 2020;42(1):248–61.

  108. Andrejevic M, Burdon M. Defining the sensor society. Telev New Media. 2015;16(1):19–36.

  109. Sprague R. Welcome to the machine: privacy and workplace implications of predictive analytics. Rich J Law Tech. 2014;21(4):1–46.

  110. Collins P. The right to privacy, surveillance-by-software and the “home-workplace”. UK Labour Law Blog. 2020. https://uklabourlawblog.com/2020/09/03/the-right-to-privacy-surveillance-by-software-and-the-home-workplace-by-dr-philippa-collins/. Accessed 05 Oct 2021.

  111. Hashimoto E, Ichino M, Kuboyama T, Echizen I, Yoshiura H. Breaking anonymity of social network accounts by using coordinated and extensible classifiers based on machine learning. In: Dwivedi YK, Mantymaki M, Ravishankar MN, Janssen M, Clement M, Slade EL, Rana NP, Al-Sharhan S, Simintiras AC, editors. Social media: the good, the bad, and the ugly. Berlin: Springer; 2016. p. 455–70.

  112. Ajunwa I, Crawford K, Schultz J. Limitless worker surveillance. Calif Law R. 2017;105(3):735–76.

  113. Kattari SK, Olzman M, Hanna MD. “You look fine!” Ableist experiences by people with invisible disabilities. Affilia. 2018;33(4):477–92.

  114. Marks M. Algorithmic disability discrimination. In: Cohen IG, Shachar C, Silvers A, Stein MA, editors. Disability, health, law, and bioethics. Cambridge: Cambridge University Press; 2020. p. 242–54.

  115. Marks M. Emergent medical data. Harvard Law School. 2017. https://blog.petrieflom.law.harvard.edu/2017/10/11/emergent-medical-data/. Accessed 05 Oct 2021.

  116. Castlight. Company overview. 2021. https://www.castlighthealth.com/about/. Accessed 05 Oct 2021.

  117. Castlight. Working Well. 2021. https://www.castlighthealth.com/working-well/. Accessed 05 Oct 2021.

  118. Miliard M. Castlight Health intros new tool to help plan safe workplace reopenings. Healthcare IT News. 2020. https://www.healthcareitnews.com/news/castlight-health-intros-new-tool-help-plan-safe-workplace-reopenings. Accessed 05 Oct 2021.

  119. Castlight. Castlight health launches new solution to safely navigate return to campus challenges in the wake of COVID-19. 2021. https://www.castlighthealth.com/press-releases/castlight-health-launches-new-solution-to-safely-navigate-return-to-campus-challenges-in-the-wake-of-covid-19/. Accessed 05 Oct 2021.

  120. Business Wire. Vida Health selected as preferred mental health partner for Castlight’s Working Well. 2020. https://www.businesswire.com/news/home/20200929005316/en/Vida-Health-Selected-as-Preferred-Mental-Health-Partner-for-Castlight%E2%80%99s-Working-Wel. Accessed 05 Oct 2021.

  121. UnitedHealth Group and Microsoft collaborate to launch ProtectWell™ protocol and app to support return-to-workplace planning and COVID-19 symptom screening. Microsoft News Center. 2020. https://news.microsoft.com/2020/05/15/unitedhealth-group-and-microsoft-collaborate-to-launch-protectwell-protocol-and-app-to-support-return-to-workplace-planning-and-covid-19-symptom-screening/. Accessed 05 Oct 2021.

  122. Silverman RE. Bosses tap outside firms to predict which workers might get sick. Wall Street Journal. 2016. https://www.wsj.com/articles/bosses-harness-big-data-to-predict-which-workers-might-get-sick-1455664940. Accessed 05 Oct 2021.

  123. Brown E. Supercharged sexism: the triple threat of workplace monitoring for women. 2020. https://doi.org/10.2139/ssrn.3680861.

  124. Brown EA. A healthy mistrust: curbing biometric data misuse in the workplace. Stanf Tech L Rev. 2020;23(2):252–305.

  125. Nguyen NH, Vallance JK, Buman MP, Moore MM, Reeves MM, Rosenberg DE, Boyle T, Milton S, Friedenreich CM, English DR, Lynch BM. Effects of a wearable technology-based physical activity intervention on sleep quality in breast cancer survivors: the ACTIVATE Trial. J Cancer Surviv. 2021;15(2):273–80.

  126. Ehrenstein JK, van Zon SKR, Duijts SFA, van Dijk BAC, Dorland HF, Schagen SB, Bültmann U. Type of cancer treatment and cognitive symptoms in working cancer survivors: an 18-month follow-up study. J Cancer Surviv. 2020;14:158–67.

  127. Cox-Martin E, Anderson-Mellies A, Borges V, Bradley C. Chronic pain, health-related quality of life, and employment in working-age cancer survivors. J Cancer Surviv. 2020;14(2):179–87.

  128. Henman P. Improving public services using artificial intelligence: possibilities, pitfalls, governance. Asia Pac J Public Admin. 2020;42(4):209–21.

  129. Brown LXZ, Richardson M, Shetty R, Crawford A. Challenging the use of algorithm-driven decision-making in benefits determinations affecting people with disabilities. Center for Democracy and Technology. 2020. https://cdt.org/insights/report-challenging-the-use-of-algorithm-driven-decision-making-in-benefits-determinations-affecting-people-with-disabilities/. Accessed 05 Oct 2021.

  130. Newell S, Marabelli M. Strategic opportunities (and challenges) of algorithmic decision-making: a call for action on the long-term societal effects of ‘datification.’ J Strateg Inf Syst. 2015;24(1):3–14.

  131. Carney T. Robo-debt illegality: the seven veils of failed guarantees of the rule of law? Altern Law J. 2019;44(1):4–10.

  132. Henman P. The computer says ‘DEBT’: towards a critical sociology of algorithms and algorithmic governance. Data for Policy. 2017. https://espace.library.uq.edu.au/view/UQ:8963b02. Accessed 05 Oct 2021.

  133. Park S, Humphry J. Exclusion by design: intersections of social, digital and data exclusion. Inf Commun Soc. 2019;22(7):934–53.

  134. Blanck P. Disability inclusive employment and the accommodation principle: emerging issues in research, policy, and law. J Occup Rehabil. 2020;30(4):505–10.

  135. Inckle K. Unreasonable adjustments: the additional unpaid labour of academics with disabilities. Disabil Soc. 2018;33(8):1372–6.

  136. Wilton RD. Workers with disabilities and the challenges of emotional labour. Disabil Soc. 2008;23(4):361–73.

  137. Harpur P. Naming, blaming and claiming ablism: the lived experiences of lawyers and advocates with disabilities. Disabil Soc. 2014;29(8):1234–47.

  138. Harpur P, Stein MA. Universities as disability rights change agents. N East Univ Law Rev. 2018;10(2):542–82.

  139. Danaher J. The threat of algocracy: reality, resistance and accommodation. Philos Technol. 2016;29(3):245–68.

  140. Li TC. Privacy in pandemic: law, technology, and public health in the COVID-19 crisis. Loy U Chi L J. Forthcoming 2021. https://doi.org/10.2139/ssrn.3690004.

  141. Areheart BA, Roberts JL. GINA, big data, and the future of employee privacy. Yale Law J. 2018;128(3):710–91.

  142. Nguyen A. The constant boss: work under digital surveillance. Data & Society. 2021. https://apo.org.au/sites/default/files/resource-files/2021-05/apo-nid312352.pdf. Accessed 05 Oct 2021.

  143. Blanck P. Disability law and policy. St. Paul: Foundation Press; 2020.

  144. Stein MA, Barnett P, Harpur P, Porter B, O’Cinneide C. Litigating disability social rights. In: Stein MA, Langford M, editors. Disability social rights. Cambridge: Cambridge University Press; 2020. p. 134.

  145. Pasachoff E. Special education, poverty, and the limits of private enforcement. Notre Dame Law R. 2011;86(4):1413–93.

  146. Ameri M, Kurtzberg T. Difficult disclosures: effects of timing in raising the need for accommodations in job interviews. J Cancer Surviv. 2021.

  147. Jimenez A. Whether you’re hired may depend on how an algorithm rates your video job interview. A new state law on AI screening gives you rights. Chicago Tribune. 2020. https://www.chicagotribune.com/business/ct-biz-illinois-law-limits-online-video-job-interviews-20200124-y2fuvlfzxzftnatx7olabgrule-story.html. Accessed 05 Oct 2021.

  148. Citron DK, Pasquale F. The scored society: due process for automated predictions. Wash Law R. 2014;89(1):1–34.

  149. Tomasev N, McKee KR, Kay J, Mohamed S. Fairness for unobserved characteristics: insights from technological impacts on queer communities. Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society. 2021. https://arxiv.org/abs/2102.04257. Accessed 05 Oct 2021.

  150. Crawford K, Schultz J. Big data and due process: toward a framework to redress predictive privacy harms. Boston Coll Law R. 2014;55(1):93–128.

  151. Burdon M. Digital data collection and information privacy law. Cambridge: Cambridge University Press; 2020.

  152. Blanck P. eQuality: the struggle for web accessibility by persons with cognitive disabilities. Cambridge: Cambridge University Press; 2014.

  153. Goodman B. Discrimination, data sanitization and auditing in the European Union’s General Data Protection Regulation. Eur Data Prot Law Rev. 2016;2(4):493–506.

  154. Toreini E, Aitken M, Coopamootoo KPL, Elliott K, Zelaya VG, Missier P, Ng M, van Moorsel A. Technologies for trustworthy machine learning: a survey in a socio-technical context. Cornell University. 2020. https://arxiv.org/abs/2007.08911. Accessed 05 Oct 2021.

  155. Smit K, Zoet M, van Meerten J. A review of AI principles in practice. Pacific Asia Conference on Information Systems. Association for Information Systems. 2020:198.

  156. Fjeld J, Achten N, Hilligoss H, Nagy A, Srikumar M. Principled artificial intelligence: mapping consensus in ethical and rights-based approaches to principles for AI. Berkman Klein Center for Internet and Society. 2020. https://dash.harvard.edu/handle/1/42160420. Accessed 05 Oct 2021.

  157. COVID data tracker. Centers for Disease Control and Prevention. 2021. https://covid.cdc.gov/covid-data-tracker/#datatracker-home. Accessed 05 Oct 2021.

  158. Harpur P, Stein MA. The UN Convention on the Rights of Persons with Disabilities and the global south. Yale J Int Law. Forthcoming 2022;47(1).

  159. Blanck P. On the importance of the Americans with Disabilities Act at 30. J Disabil Policy Stud. 2021:1–23.

  160. Blanck P. Thirty years of the Americans with Disabilities Act: law students and lawyers as plaintiffs and advocates. NYU Rev Law Soc Chang (Harbinger). 2021;45:8–24.

  161. Feuerstein M. Defining cancer survivorship. J Cancer Surviv. 2007;1(1):5–7.

  162. Khan NF, Rose PW, Evans J. Defining cancer survivorship: a more transparent approach is needed. J Cancer Surviv. 2012;6(1):33–6.

  163. Blanck P, Myhill WN, Vedeler JS, Morales J, Pearlman P. Individuals with cancer in the workforce and their federal rights. In: Feuerstein M, editor. Work and cancer survivors. New York: Springer; 2009. p. 255–76.

  164. Blanck P, Abdul-Malak Y, Adya M, Hyseni F, Killeen M, Wise FA. Diversity and inclusion in the American legal profession: first phase findings from a national study of lawyers with disabilities and lawyers who identify as LGBTQ+. Univ DC Law Rev. 2020;23(1):23–87.

  165. Korn JB. Cancer and the ADA: rethinking disability. South Calif Law Rev. 2000;74(2):399–454.

  166. Harpur P. Old age is not just impairment: the Convention on the Rights of Persons with Disabilities and the need for a convention on older persons. Univ Penn J Inter Law. 2016;37(3):1027–59.

  167. Harpur P, Pachana N. My Animal, my support, and my new home in a retirement village: disability discrimination, assistance animals and old age. Elder Law Rev. 2017;11(1).

  168. Harpur P. Time to be heard: how advocates can use the Convention on the Rights of Persons with Disabilities to drive change. Valparaiso Univ Law Rev. 2011;45(3):1271–96.

  169. Blanck P. Disability inclusive employment, cancer survivorship, and the Americans with Disabilities Act. J Cancer Surviv. 2021. https://doi.org/10.1007/s11764-021-01141-4.

  170. Hyseni F, Myderrizi A, Blanck P. Diversity and inclusion in the legal profession: disclosure of cancer and other conditions by lawyers with disabilities and lawyers who identify as LGBTQ+. J Cancer Surviv. 2021. https://doi.org/10.1007/s11764-021-01143-2.

  171. Paul K. California passes landmark bill targeting Amazon’s algorithm-driven rules. The Guardian. 2021. https://www.theguardian.com/us-news/2021/sep/10/california-bill-amazon-warehouse-quotas. Accessed 05 Oct 2021.

  172. Stoyanovich J. Hiring and AI: let job candidates know why they were rejected. The Wall Street Journal. https://www.wsj.com/articles/hiring-job-candidates-ai-11632244313?st=fmb5mokn34eauqh&reflink=article_email_share. Accessed 05 Oct 2021.

Acknowledgements

The authors would like to thank Paul Henman, Mary Trevor, Mitree Vongphakdi, and Arzana Myderrizi for their helpful comments on prior drafts of this article.

Funding

This research was assisted by the Social Science Research Council’s Just Tech COVID-19 Rapid Response Grants, with funds from the Ford Foundation and the MacArthur Foundation, and by an Australian Research Council Future Fellowship awarded to Paul Harpur, Grant #FT210100335. The authors acknowledge the ARC Centre of Excellence for Automated Decision-Making and Society, of which Paul Harpur is a member and which receives funding from the Australian Government. This line of study was also supported in part by grants to Peter Blanck (Principal Investigator) at Syracuse University from the National Institute on Disability, Independent Living, and Rehabilitation Research (“NIDILRR”) for the Rehabilitation Research & Training on Employment Policy Center for Disability-Inclusive Employment Policy Research, Grant #90RTEM0006-01–00; the Southeast ADA Center, Grants #90DP0090-01–00 and #90DPAD0005-01–00; and the RRTC on Employer Practices Leading to Successful Employment Outcomes Among People with Disabilities (Douglas Kruse, Principal Investigator), Grant Application #RTEM21000058. NIDILRR is a Center within the Administration for Community Living (“ACL”), US Department of Health and Human Services (“HHS”). The views provided herein do not necessarily reflect the official policies of NIDILRR, nor do they imply endorsement by the Federal Government.

Author information

Contributions

The authors contributed equally to this article.

Corresponding author

Correspondence to Peter Blanck.

Ethics declarations

Ethics approval

Not applicable.

Consent to participate

Not applicable.

Consent for publication

The authors consent to their work being published as submitted.

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Harpur, P., Hyseni, F. & Blanck, P. Workplace health surveillance and COVID-19: algorithmic health discrimination and cancer survivors. J Cancer Surviv 16, 200–212 (2022). https://doi.org/10.1007/s11764-021-01144-1

Keywords

  • COVID-19
  • Health surveillance
  • Algorithmic health discrimination
  • Cancer
  • Disability
  • Chronic illness