Aid and AI: The Challenge of Reconciling Humanitarian Principles and Data Protection

  • Júlia Zomignani Barboza
  • Lina Jasmontaitė-Zaniewicz
  • Laurence Diver
Chapter
Part of the IFIP Advances in Information and Communication Technology book series (IFIPAICT, volume 576)

Abstract

Artificial intelligence systems have become ubiquitous in everyday life, and their potential to improve efficiency in a broad range of activities that involve finding patterns or making predictions has made them an attractive technology for the humanitarian sector. However, concerns over their intrusion on the right to privacy and their possible incompatibility with data protection principles may pose a challenge to their deployment. Furthermore, in the humanitarian sector, compliance with data protection principles is not enough: organisations providing humanitarian assistance must also comply with humanitarian principles to ensure the provision of impartial and neutral aid that does not harm beneficiaries in any way. In view of this, the present contribution analyses a hypothetical facial recognition system based on artificial intelligence that could assist humanitarian organisations in their efforts to identify missing persons. Because such a system could create risks, notably by providing information on missing persons that harmful actors could use to identify and target vulnerable groups, it ought to be deployed only after a holistic impact assessment has been made, to ensure its adherence to both data protection and humanitarian principles.

Keywords

Humanitarian action · Artificial intelligence · Facial recognition · Data protection · Humanitarian principles

Copyright information

© IFIP International Federation for Information Processing 2020

Authors and Affiliations

  • Júlia Zomignani Barboza
  • Lina Jasmontaitė-Zaniewicz
  • Laurence Diver

  1. Vrije Universiteit Brussel, Brussels, Belgium