Abstract
On February 13, 2020, the Toronto Police Service (TPS) issued a statement admitting that its members had used Clearview AI’s controversial facial recognition technology (FRT). The admission sparked widespread outcry from the media, civil society, and community groups, and put pressure on policy-makers to address FRTs. Public consultations offered a key tool for containing the scandal in Toronto and across Canada. Drawing on media reports, policy documents, and expert interviews, we investigate four consultations held by the Toronto Police Services Board (TPSB), the Office of the Privacy Commissioner (OPC), and the parliamentary Standing Committee on Access to Information, Privacy and Ethics (ETHI) to understand how public opinion and outrage translate into policy. We find that public consultations became a powerful closure mechanism in the policy-making toolbox, inhibiting rather than furthering democratic debate. Our findings show that consultations do not advance public literacy; that opportunities for public input are narrow; that timeframes are short; and that mechanisms for inclusion are limited. Even under the best circumstances, consultations are merely one of many factors in AI governance and seldom impacted concrete policy outcomes in the cases studied here.
Data availability statement
Some data are available from the authors upon reasonable request, except for interview data, which are confidential under our research protocols approved by the Research Ethics Unit at the Office of Research at Concordia University.
Acknowledgements
This article draws on research supported by the Social Sciences and Humanities Research Council.
Ethics declarations
Competing interest
The authors have no competing interests to declare that are relevant to the content of this article.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Jones, M., McKelvey, F. Deconstructing public participation in the governance of facial recognition technologies in Canada. AI & Soc (2024). https://doi.org/10.1007/s00146-024-01952-w