Akrich, M. (1992). The description of technological objects. In W. Bijker & J. Law (Eds.), Shaping technology/building society: Studies in sociotechnical change (pp. 205–224). Cambridge, MA: MIT Press.
Ananny, M., & Crawford, K. (2016). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society. https://doi.org/10.1177/1461444816676645.
Andrews, D. A., & Bonta, J. (2010). Rehabilitating criminal justice policy and practice. Psychology, Public Policy, and Law,16(1), 39.
Angwin, J., Larson, J., Kirchner, L., & Mattu, S. (2017). Minority neighborhoods pay higher car insurance premiums than white areas with the same risk. ProPublica. https://www.propublica.org/article/minority-neighborhoods-higher-car-insurance-premiums-white-areas-same-risk. Accessed 12 June 2017.
Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016a). Machine bias: There’s software used across the country to predict future criminals. And it’s biased against blacks. ProPublica. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing. Accessed 3 Aug 2016.
Angwin, J., Parris Jr, T., & Mattu, S. (2016b). Breaking the black box: When algorithms decide what you pay. ProPublica. https://www.propublica.org/article/breaking-the-black-box-when-algorithms-decide-what-you-pay. Accessed 11 Oct 2016.
Arce, D. G., & Gentile, M. C. (2015). Giving voice to values as a leverage point in business ethics education. Journal of Business Ethics,131(3), 535–542.
Bambauer, D. E. (2017). Uncrunched: Algorithms, decision making, and privacy. Second Annual Digital Information Policy Scholars Conference, George Mason University Antonin Scalia Law School, Arlington, VA (April 28, 2017).
Barocas, S., Hood, S., & Ziewitz, M. (2013). Governing algorithms: A provocation piece. http://dx.doi.org/10.2139/ssrn.2245322. Accessed 14 June 2017.
Barocas, S., & Selbst, A. D. (2016). Big data’s disparate impact. California Law Review,104, 671.
Barry-Jester, A. M., Casselman, B., & Goldstein, D. (2015). The new science of sentencing. The Marshall Project. https://www.themarshallproject.org/2015/08/04/the-new-science-of-sentencing. Accessed 15 Aug 2016.
Bijker, W. (1995). Of bicycles, bakelites, and bulbs: Toward a theory of sociotechnical change. Cambridge, MA: MIT Press.
Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and Information Technology,15(3), 209–227.
Brenkert, G. G. (2000). Social products liability: The case of the firearms manufacturers. Business Ethics Quarterly,10(01), 21–32.
Brown, K. (2016). When Facebook decides who’s a terrorist. Fusion. http://fusion.net/story/356354/facebook-kashmir-terrorism/. Accessed 11 Oct 2016.
Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society,3(1), 2053951715622512.
Byrne, E. F. (2007). Assessing arms makers’ corporate social responsibility. Journal of Business Ethics,74(3), 201–217.
Calo, R. (2013). Consumer subject review boards: A thought experiment. Stanford Law Review Online,66, 97.
Calo, R. (2014). Digital market manipulation. George Washington Law Review,82(4), 995.
Chen, A. (2017). AI picks up racial and gender biases when learning from what humans write. The Verge. https://www.theverge.com/2017/4/13/15287678/machine-learning-language-processing-artificial-intelligence-race-gender-bias. Accessed 12 June 2017.
Citron, D. K. (2007). Technological due process. Washington University Law Review,85, 1249.
Colquitt, J. A. (2001). On the dimensionality of organizational justice: A construct validation of a measure. Journal of Applied Psychology,86(3), 386.
Cormen, T. H. (2009). Introduction to algorithms. Cambridge, MA: MIT Press.
Datta, A., Tschantz, M. C., & Datta, A. (2015). Automated experiments on ad privacy settings. Proceedings on Privacy Enhancing Technologies,2015(1), 92–112.
Desai, D. R., & Kroll, J. A. (2017). Trust but verify: A guide to algorithms and the law. Harvard Journal of Law and Technology. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2959472.
Dewey, C. (2016). Facebook has repeatedly trended fake news since firing its human editors. The Washington Post. https://www.washingtonpost.com/news/the-intersect/wp/2016/10/12/facebook-has-repeatedly-trended-fake-news-since-firing-its-human-editors/?tid=sm_tw&utm_term=.b795225d264d. Accessed 2 Dec 2016.
Diakopoulos, N. (2013). Sex, violence, and autocomplete algorithms. Slate. http://www.slate.com/articles/technology/future_tense/2013/08/words_banned_from_bing_and_google_s_autocomplete_algorithms.html. Accessed 15 Aug 2016.
Diakopoulos, N. (2015). Algorithmic accountability: Journalistic investigation of computational power structures. Digital Journalism,3(3), 398–415.
Diakopoulos, N., & Koliska, M. (2017). Algorithmic transparency in the news media. Digital Journalism,5(7), 809–828.
Donaldson, T., & Dunfee, T. W. (1994). Toward a unified conception of business ethics: Integrative social contracts theory. Academy of Management Review,19(2), 252–284.
Dwork, C., & Mulligan, D. K. (2013). It’s not privacy, and it’s not fair. Stanford Law Review Online,66, 35.
Dwoskin, E. (2015). How social bias creeps into web technology. Wall Street Journal. http://www.wsj.com/articles/computers-are-showing-their-biases-and-tech-firms-are-concerned-1440102894?mod=rss_Technology. Accessed 12 Aug 2016.
Epstein, R. A. (1973). A theory of strict liability. The Journal of Legal Studies,2(1), 151–204.
Feldman, M., Friedler, S. A., Moeller, J., Scheidegger, C., & Venkatasubramanian, S. (2015). Certifying and removing disparate impact. In Proceedings of the 21st ACM SIGKDD international conference on knowledge discovery and data mining (pp. 259–268). ACM.
Friedman, B., & Nissenbaum, H. (1996). Bias in computer systems. ACM Transactions on Information Systems (TOIS),14(3), 330–347.
Garcia-Martinez, A. (2017). I’m an ex-Facebook exec: don’t believe what they tell you about ads. The Guardian. https://www.theguardian.com/technology/2017/may/02/facebook-executive-advertising-data-comment?CMP=share_btn_tw. Accessed 12 June 2017.
Garfinkel, P. (2016). A linguist who cracks the code in names to predict ethnicity. New York Times. http://www.nytimes.com/2016/10/16/jobs/a-linguist-who-cracks-the-code-in-names-to-predict-ethnicity.html?nytmobile=0. Accessed 6 Dec 2016.
Gettman, J., Whitfeld, E., & Allen, M. (2016). The war on marijuana in black and white. ACLU of Massachusetts. https://aclum.org/app/uploads/2016/10/TR-Report-10-2016-FINAL-with-cover.pdf. Accessed 11 Oct 2016.
Ghani, R. (2016). You say you want transparency and interpretability? Machine Learning, Data Science, Analytics, Obama for America, University of Chicago, Big Data, Public Policy. http://www.rayidghani.com/you-say-you-want-transparency-and-interpretability. Accessed 18 Oct 2016.
Gillespie, T. (2016). Algorithmically recognizable: Santorum’s Google problem, and Google’s Santorum problem. Information, Communication & Society,20, 1–18.
Helbing, D., Frey, B. S., Gigerenzer, G., Hafen, E., Hagner, M., Hofstetter, Y., et al. (2017). Will democracy survive big data and artificial intelligence? Scientific American. https://www.scientificamerican.com/article/will-democracy-survive-big-data-and-artificial-intelligence/. Accessed 12 June 2017.
Holder, E. (2014). Attorney General Eric Holder Speaks at the National Association of Criminal Defense Lawyers 57th Annual Meeting and 13th State Criminal Justice Network Conference. The United States Department of Justice. https://www.justice.gov/opa/speech/attorney-general-eric-holder-speaks-national-association-criminal-defense-lawyers-57th. Accessed 26 Oct 2016.
Howard, A. (2014). Data-driven policy and commerce requires algorithmic transparency. TechRepublic. http://www.techrepublic.com/article/data-driven-policy-and-commerce-requires-algorithmic-transparency/. Accessed 30 July 2015.
Hu, M. (2016). Big data blacklisting. Florida Law Review,67(5), 1735.
Introna, L. D. (2016). Algorithms, governance, and governmentality: On governing academic writing. Science, Technology and Human Values,41(1), 17–49.
Johnson, D. G. (2004). Is the global information infrastructure a democratic technology? Readings in Cyberethics,18, 121.
Johnson, D. G., & Noorman, M. (2014). Artefactual agency and artefactual moral agency. In P. Kroes & P. P. Verbeek (Eds.), The moral status of technical artefacts. Philosophy of Engineering and Technology (Vol. 17). Dordrecht: Springer.
Jones, M. L. (2017). A right to a human in the loop: Legal constructions of computer automation & personhood from data banks to algorithms. Social Studies of Science,47(2), 216–239.
Kharif, O. (2016). No credit history? No problem. Lenders are looking at your phone data. Bloomberg.com. https://www.bloomberg.com/news/articles/2016-11-25/no-credit-history-no-problem-lenders-now-peering-at-phone-data. Accessed 1 Dec 2016.
Kim, T. (2016). How an old hacking law hampers the fight against online discrimination. The New Yorker. http://www.newyorker.com/business/currency/how-an-old-hacking-law-hampers-the-fight-against-online-discrimination. Accessed 19 Oct 2016.
Kraemer, F., Van Overveld, K., & Peterson, M. (2011). Is there an ethics of algorithms? Ethics and Information Technology,13(3), 251–260.
Kramer, S. (2017). An algorithm is replacing bail hearings in New Jersey. Motherboard. https://motherboard.vice.com/en_us/article/an-algorithm-is-replacing-bail-hearings-in-new-jersey. Accessed 11 June 2017.
Kroll, J. A., Huey, J., Barocas, S., Felten, E. W., Reidenberg, J. R., Robinson, D. G., & Yu, H. (2017). Accountable algorithms. University of Pennsylvania Law Review, 165. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2765268.
LaFrance, A. (2017). Uber’s secret program raises questions about discrimination. The Atlantic. https://www.theatlantic.com/technology/archive/2017/03/uber-ghost-app/518610/?utm_source=twb. Accessed 12 June 2017.
Larson, Q. (2017). What do Uber, Volkswagen and Zenefits have in common? They all used hidden code to break the law. freeCodeCamp. https://medium.freecodecamp.com/dark-genius-how-programmers-at-uber-volkswagen-and-zenefits-helped-their-employers-break-the-law-b7a7939c6591#.3c6kga4q7. Accessed 12 June 2017.
Latour, B. (1992). Where are the missing masses? The sociology of a few mundane artifacts. In W. Bijker & J. Law (Eds.), Shaping technology/building society: Studies in sociotechnical change (pp. 225–258). Cambridge, MA: MIT Press.
Macaulay, T. (2017). Pioneering computer scientist calls for National Algorithm Safety Board. Techworld. http://www.techworld.com/data/pioneering-computer-scientist-calls-for-national-algorithms-safety-board-3659664/. Accessed 12 June 2017.
Martin, K. (2015). Ethical issues in the big data industry. MIS Quarterly Executive,14(2), 67–85.
Martin, K., & Freeman, R. E. (2004). The separation of technology and ethics in business ethics. Journal of Business Ethics,53(4), 353–364.
Miller, C. C. (2015). Algorithms and bias: Q. and A. With Cynthia Dwork. The New York Times. http://www.nytimes.com/2015/08/11/upshot/algorithms-and-bias-q-and-a-with-cynthia-dwork.html. Accessed 12 Aug 2016.
Mittelstadt, B. D., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: Mapping the debate. Big Data & Society,3(2), 1–21.
Nash, K. S. (2016). Mastercard deploys artificial intelligence to pinpoint transaction fraud. Wall Street Journal. http://blogs.wsj.com/cio/2016/11/30/mastercard-deploys-artificial-intelligence-to-pinpoint-transaction-fraud/. Accessed 1 Dec 2016.
Newman, D. T., Fast, N., & Harmon, D. (2016). When eliminating bias isn’t fair: Decision-making algorithms and organizational justice. Presented at the Society for Business Ethics in Anaheim, CA.
Neyland, D. (2016). Bearing account-able witness to the ethical algorithmic system. Science, Technology and Human Values,41(1), 50–76.
O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. New York: Crown Publishing Group.
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Cambridge, MA: Harvard University Press.
Rawls, J. (2009). A theory of justice. Cambridge, MA: Harvard University Press.
Rudner, R. (1953). The scientist qua scientist makes value judgments. Philosophy of Science,20(1), 1–6.
Sandvig, C., Hamilton, K., Karahalios, K., & Langbort, C. (2016). Automation, algorithms, and politics | When the algorithm itself is a racist: Diagnosing ethical harm in the basic components of software. International Journal of Communication,10, 19.
Seaver, N. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society,4(2), 2053951717738104.
Selbst, A. D., & Barocas, S. (2018). The intuitive appeal of explainable machines. Fordham Law Review (Forthcoming).
Shaver, K. (2012). Female dummy makes her mark on male-dominated crash tests. Washington Post. https://www.washingtonpost.com/local/trafficandcommuting/female-dummy-makes-her-mark-on-male-dominated-crash-tests/2012/03/07/gIQANBLjaS_story.html. Accessed 20 Dec 2016.
Skeem, J. L., & Lowenkamp, C. T. (2015). Risk, race, & recidivism: Predictive bias and disparate impact. Available at SSRN.
Smith, M. (2016). In Wisconsin, a backlash against using data to foretell defendants’ futures. The New York Times. http://www.nytimes.com/2016/06/23/us/backlash-in-wisconsin-against-using-data-to-foretell-defendants-futures.html. Accessed 12 Aug 2016.
Sollars, G. G. (2003). A critique of social products liability. Business Ethics Quarterly,13(03), 381–390.
Thornton, J. (2016). Cost, accuracy, and subjective fairness in legal information technology: A response to technological due process critics. New York University Law Review,91, 1821–1949.
Tufekci, Z. (2015). Algorithmic harms beyond Facebook and Google: Emergent challenges of computational agency. Journal on Telecommunications and High Technology Law,13, 203.
Urbina, I. (2013). Marijuana arrests four times as likely for blacks. The New York Times. http://www.nytimes.com/2013/06/04/us/marijuana-arrests-four-times-as-likely-for-blacks.html. Accessed 3 Aug 2016.
Waddell, K. (2016). How algorithms can bring down minorities’ credit scores. The Atlantic. http://www.theatlantic.com/technology/archive/2016/12/how-algorithms-can-bring-down-minorities-credit-scores/509333/. Accessed 6 Dec 2016.
Wexler, R. (2017). How companies hide software flaws that impact who goes to prison and who gets out. Washington Monthly. http://washingtonmonthly.com/magazine/junejulyaugust-2017/code-of-silence/. Accessed 12 June 2017.
Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121–136.
Yeung, K. (2017). Hypernudge: Big data as a mode of regulation by design. Information, Communication & Society, 20(1), 118–136.
Ziewitz, M. (2016). Governing algorithms myth, mess, and methods. Science, Technology and Human Values,41(1), 3–16.