Minds and Machines, Volume 27, Issue 1, pp 1–5

Data Philanthropy and Individual Rights


Data philanthropy, the donation of data by private companies, is becoming increasingly popular, as corporations like Genentech and Pfizer donate their data, and international organisations like the UN start to create the infrastructure needed to facilitate the sharing of corporate-owned data (Kirkpatrick 2013).

However, competing tensions over data control and ownership (Kaisler et al. 2013; Andrejevic 2014; Kostkova et al. 2016), limited technical understanding, and the lack of adequate frameworks for coordination and governance (Mayer-Schönberger and Kenneth 2013; Vayena et al. 2015) pose serious obstacles to attempts to share data among different actors, especially when these include private corporations. This was the case, for example, during the 2014 Ebola crisis in West Africa, when gaining access to mobile network operators’ data on population movement would have facilitated tracking the spread of the disease, but proved to be impossible because of issues concerning commercial interests, users’ privacy, national security, and regulatory uncertainty.

Understanding how to access these data and how to harness their value for the common good is one of the main challenges of this decade.

Many governments are […] beginning to consider adopting the technologies needed for real-time analytics, to be sure […] the data that could help give them the additional agility needed to meet the challenges of governance in the 21st century is accumulating behind corporate firewalls.

One of the most serious obstacles to meeting this challenge comes from the risks and sensitivities of maximising the accessibility and use of personal data (Taddeo 2016). For even when data are anonymised and stripped of any reference that may link back to their subjects, once shared and aggregated they can lead to the re-identification of users. The possibility of re-identification is not new, but it has increased significantly with the opportunities to access and aggregate big data sets and with the refinement of analytics techniques (Kaye et al. 2012; de Montjoye et al. 2015).

Re-identification and the subsequent breaching of individual privacy unveil a tension between individual rights and data philanthropy which, if left unaddressed, risks hindering the latter. This tension requires careful consideration, lest it invite a zero-sum approach.

Such an approach could prompt an overprotective and detrimental attitude among individuals, companies, and institutions. Individuals would easily prioritise the protection of their rights over the possible benefits of data philanthropy and restrain access to their data; private companies would do the same to secure the trust of their customers and avoid legal problems; and regulators and research institutions may avoid fostering this practice in order to elude privacy risks for individuals, de facto crippling research, especially research depending on biobanks (Gymrek et al. 2013) and medical registries with aggregated clinical data (Kaye 2012; Mascalzoni et al. 2014). The zero-sum approach would also impair data sharing for humanitarian or policy purposes (more on this presently).

Data philanthropy is morally ambiguous (Taddeo 2016), as it can either foster social development, knowledge, and the flourishing of information societies, or help steer the design of current and future societies in the opposite direction. This is not to argue against data philanthropy. It is rather to emphasise that, although there is something morally desirable about it, data philanthropy poses serious ethical problems.

Clearly, its moral ambiguity is not tantamount to moral neutrality, in that data philanthropy is more likely to foster morally good outcomes, like societal and individual welfare, scientific progress, and better governance, than the opposite. Yet, in itself, data philanthropy is not sufficient to ensure morally good results.

The moral ambiguity of data philanthropy, on the one hand, and its moral desirability, on the other, unveil the infraethical nature of this phenomenon. Infraethics is a neologism introduced by Floridi (2012) to refer to the

not-yet-ethical framework of implicit expectations, attitudes, and practices that can facilitate and promote moral decisions and actions (Floridi 2012, 738).

According to the analysis proposed in Floridi (2014), the information revolution has unveiled that morally good behaviour is the result of both moral values and an ethical infrastructure able to foster them. Much in the same way in which societies require a socio-political infrastructure to function and prosper, human interactions require an ethical infrastructure able to support the flourishing of moral actions.

The elements constitutive of a given infraethics are not good in themselves, nor are they sufficient to determine morally good outcomes, but they are likely to facilitate morally good actions. Trust, respect, and loyalty offer good examples of infraethical principles. They are often described as moral principles, but they are better understood as elements of the infraethics of a given society, because they facilitate the achievement of the goals that the members of that society may have, irrespective of their moral value. Trust, respect, and loyalty, for example, are crucial for a happy marriage to prosper; at the same time, they are essential for criminal organisations to grow and consolidate their power (Gambetta 1998; Taddeo and Floridi 2011).

The moral ambiguity of infraethics is resolved once it is combined with the right moral values. As Floridi stresses:

the best pipes may improve the flow but do not improve the quality of the water, and water of the highest quality is wasted if the pipes are rusty or leaky. […] because an infraethics is not morally good in itself, but it is what is most likely to yield moral goodness if properly designed and combined with the right moral values (Floridi 2014, 193).

The infraethics of mature information societies encompasses, among others, trust (Taddeo 2010a, b), security (Taddeo 2013, 2014), transparency (Turilli and Floridi 2009) and, as I argue, data philanthropy (Taddeo 2016). Data philanthropy has the potential to foster a host of morally good behaviours by extending our knowledge and understanding of the world, improving governance, and ultimately by favouring the development of open, pluralistic, and tolerant information societies. Good examples include the increasing use of data to support scientific research (Kurtz et al. 2005), policy making, and humanitarian processes, as in the use of social data to analyse teenagers’ attitudes towards contraception in developing countries, and the management of emergencies, as in the case of IBM donating its weather data to map the spread of the Zika virus.

The infraethical nature of data philanthropy reveals that the tension between data philanthropy and individual rights is operational, rather than structural. Thus, it can be solved once the right infrastructures and protocols are in place. A first step in this direction has been proposed by the UN Global Pulse, which envisages the creation of a data commons, where non-sensitive data can be shared after adequate anonymisation and aggregation, and the establishment of a sentinel network, where companies can share more sensitive data behind firewalls.

However, more work needs to be done in this direction, as the design of the right infrastructures and protocols depends on a better understanding of individuals’ consent to the access and use of their data; the design of auditing processes to minimise the chances of unethical consequences; the definition of individual, corporate, and institutional responsibilities to share or donate data (Floridi and Taddeo 2016); and, ultimately, a refined understanding of the way in which individual rights are understood, harmonised, and fulfilled in mature information societies. As stressed by Vayena and Tasioulas, “big data developments stimulate interactions […] that impact both the content of these rights and the ways in which they may be productively exercised” (Vayena and Tasioulas 2016). Ethical analyses are more necessary than ever to understand and shape this impact, and to ensure that the value and the opportunities to improve private and public life brought about by data philanthropy are fully harnessed.

References

  1. Andrejevic, M. (2014). Big data, big questions: The big data divide. International Journal of Communication, 8, 1673–1689.
  2. de Montjoye, Y.-A., Radaelli, L., Singh, V. K., & Pentland, A. S. (2015). Unique in the shopping mall: On the reidentifiability of credit card metadata. Science, 347(6221), 536–539. doi:10.1126/science.1256297.
  3. Floridi, L. (2012). Distributed morality in an information society. Science and Engineering Ethics, 19(3), 727–743. doi:10.1007/s11948-012-9413-4.
  4. Floridi, L. (2014). The fourth revolution: How the infosphere is reshaping human reality. Oxford: Oxford University Press.
  5. Floridi, L., & Taddeo, M. (2016). What is data ethics? Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 374(2083), 20160360. doi:10.1098/rsta.2016.0360.
  6. Gambetta, D. (1998). Can we trust trust? In D. Gambetta (Ed.), Trust: Making and breaking cooperative relations (pp. 213–238). Oxford: Basil Blackwell.
  7. Gymrek, M., McGuire, A. L., Golan, D., Halperin, E., & Erlich, Y. (2013). Identifying personal genomes by surname inference. Science, 339(6117), 321–324. doi:10.1126/science.1229566.
  8. Kaisler, S., Armour, F., Espinosa, J. A., & Money, W. (2013). Big data: Issues and challenges moving forward. In 2013 46th Hawaii International Conference on System Sciences (HICSS) (pp. 995–1004). doi:10.1109/HICSS.2013.645.
  9. Kaye, J. (2012). The tension between data sharing and the protection of privacy in genomics research. Annual Review of Genomics and Human Genetics, 13(1), 415–431. doi:10.1146/annurev-genom-082410-101454.
  10. Kaye, J., Curren, L., Anderson, N., Edwards, K., Fullerton, S. M., Kanellopoulou, N., et al. (2012). From patients to partners: Participant-centric initiatives in biomedical research. Nature Reviews Genetics, 13(5), 371–376. doi:10.1038/nrg3218.
  11. Kirkpatrick, R. (2013). A new type of philanthropy: Donating data. Harvard Business Review, March 21. https://hbr.org/2013/03/a-new-type-of-philanthropy-don.
  12. Kostkova, P., Brewer, H., de Lusignan, S., Fottrell, E., Goldacre, B., Hart, G., et al. (2016). Who owns the data? Open data for healthcare. Frontiers in Public Health. doi:10.3389/fpubh.2016.00007.
  13. Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C., Demleitner, M., & Murray, S. S. (2005). Worldwide use and impact of the NASA Astrophysics Data System digital library. Journal of the American Society for Information Science and Technology, 56(1), 36–45. doi:10.1002/asi.20095.
  14. Mascalzoni, D., Paradiso, A., & Hansson, M. (2014). Rare disease research: Breaking the privacy barrier. Applied and Translational Genomics, 3(2), 23–29. doi:10.1016/j.atg.2014.04.003.
  15. Mayer-Schönberger, V., & Kenneth, C. (2013). Big data: A revolution that will transform how we live, work, and think. Boston: Houghton Mifflin Harcourt.
  16. Taddeo, M. (2010a). Modelling trust in artificial agents, a first step toward the analysis of e-trust. Minds and Machines, 20(2), 243–257. doi:10.1007/s11023-010-9201-3.
  17. Taddeo, M. (2010b). Trust in technology: A distinctive and a problematic relation. Knowledge, Technology and Policy, 23(3–4), 283–286. doi:10.1007/s12130-010-9113-9.
  18. Taddeo, M. (2013). Cyber security and individual rights, striking the right balance. Philosophy and Technology, 26(4), 353–356. doi:10.1007/s13347-013-0140-9.
  19. Taddeo, M. (2014). The struggle between liberties and authorities in the information age. Science and Engineering Ethics, 21(5), 1125–1138. doi:10.1007/s11948-014-9586-0.
  20. Taddeo, M. (2016). Data philanthropy and the design of the infraethics for information societies. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 374(2083), 20160113. doi:10.1098/rsta.2016.0113.
  21. Taddeo, M., & Floridi, L. (2011). The case for e-trust. Ethics and Information Technology, 13(1), 1–3. doi:10.1007/s10676-010-9263-1.
  22. Turilli, M., & Floridi, L. (2009). The ethics of information transparency. Ethics and Information Technology, 11(2), 105–112. doi:10.1007/s10676-009-9187-9.
  23. Vayena, E., Salathé, M., Madoff, L. C., & Brownstein, J. S. (2015). Ethical challenges of big data in public health. PLoS Computational Biology, 11(2), e1003904. doi:10.1371/journal.pcbi.1003904.
  24. Vayena, E., & Tasioulas, J. (2016). The dynamics of big data and human rights: The case of scientific research. Philosophical Transactions A, 374(2083), 2–14.

Copyright information

© Springer Science+Business Media Dordrecht 2017

Authors and Affiliations

  1. Oxford Internet Institute, University of Oxford, Oxford, UK
  2. Alan Turing Institute, London, UK
