Abstract
Academia-intelligence agency collaborations are on the rise for a variety of reasons. These can take many forms, one of which is in the classroom, using students to stand in for intelligence analysts. Classrooms, however, are ethically complex spaces, with students considered vulnerable populations, and become even more complex when layering multiple goals, activities, tools, and stakeholders over those traditionally present. This does not necessarily mean one must shy away from academia-intelligence agency partnerships in classrooms, but that these must be conducted carefully and reflexively. This paper hopes to contribute to this conversation by describing one purposeful classroom encounter that occurred between a professor, students, and intelligence practitioners in the fall of 2015 at North Carolina State University: an experiment conducted as part of a graduate-level political science class that involved students working with a prototype analytic technology, a type of participatory sensing/self-tracking device, developed by the National Security Agency. This experiment opened up the following questions that this paper will explore: What social, ethical, and pedagogical considerations arise with the deployment of a prototype intelligence technology in the college classroom, and how can they be addressed? How can academia-intelligence agency collaboration in the classroom be conducted in ways that provide benefits to all parties, while minimizing disruptions and negative consequences? This paper will discuss the experimental findings in the context of ethical perspectives involved in values in design and participatory/self-tracking data practices, and discuss lessons learned for the ethics of future academia-intelligence agency partnerships in the classroom.




Notes
Note: Pseudonyms are used for the names of all interviewees listed in this paper.
In its initial design, the Journaling tool focused on a single user, but it has since been augmented to support collaborative workflows (e.g., two or more analysts can see each other’s work, share data, and collaborate on a research problem). See Jones et al. (2017).
Note: At the time of the experiment, Vogel was involved in a separate research project for the LAS; during the course of the experiment, the LAS provided Vogel with funding to support graduate student assistance on the project.
Given this concern, we checked with the LAS management to ensure that the data would not be sold to a third party but would be used solely for NSA purposes. In addition, according to the LAS management, the students would not hold intellectual property rights for their data.
References
Agre, P. (1997). Toward a critical technical practice: Lessons learned in trying to reform AI. In G. C. Bowker, L. Gasser, S. Leigh Star, & B. Turner (Eds.), Social science, technical systems and cooperative work: The great divide (pp. 131–158). Hillsdale, NJ: Lawrence Erlbaum.
Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14.
Bødker, S. (1987). Through the interface—a human activity approach to user interface design. DAIMI Report Series, 16(224), 1–167.
Brown, A. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141–178.
Dawson, S. P. (2006). The impact of institutional surveillance technologies on student behaviour. Surveillance and Society, 4(1/2), 69–84.
Dhami, M., & Careless, K. (2015). Ordinal structure of the generic analytic workflow: A survey of intelligence analysis. Paper presented at the European Intelligence and Security Informatics Conference 2015, Manchester, UK.
Flanagan, M., Howe, D. C., & Nissenbaum, H. (2005). Values in design: Theory and practice. Working paper.
Friedman, B. (Ed.). (1998). Human values and the design of computer technology. Chicago: University of Chicago Press.
Guston, D. H., & Sarewitz, D. (2002). Real-time technology assessment. Technology in Society, 24(1–2), 93–109.
Hawisher, G. E., & Selfe, C. L. (1991). The rhetoric of technology and the electronic writing class. College Composition and Communication, 42(1), 55–65.
Janangelo, J. (1991). Technopower and technoppression: Some abuses of power and control in computer-assisted writing environments. Computers and Composition, 9(1), 47–64.
Jones, P., Sharma, S., Moon, C. & Samatova, N. F. (2017). A network-fusion guided dashboard interface for task-centric document curation. In Proceedings of the 22nd International Conference on Intelligent User Interfaces (pp. 481–491). ACM.
Jones, P., Thakur, S., Matthews, M., & Cox, S. (2016a). A versatile platform for instrumentation of knowledge worker’s computers to improve information analysis. In Proceedings of the Second International Conference on Big Data Computing Service and Applications (pp. 185–194). IEEE.
Jones, P., Thakur, S., Matthews, M., Cox, S., Streck, S., Kampe, C., et al. (2016b). Journaling interfaces to support knowledge workers in their collaborative tasks and goals. In Proceedings of the 17th International Conference on Collaboration Technologies and Systems (pp. 310–318). IEEE.
Knobel, C., & Bowker, G. C. (2011). Computing ethics: Values in design. Communications of the ACM, 54(7), 26–28.
Manders-Huits, N., & Zimmer, M. (2009). Values and pragmatic action: the challenges of introducing ethical intelligence in technical and design communities. International Review of Information Ethics, 10, 37–44.
Miller, B. H. (2008). Improving all-source intelligence analysis: Elevate knowledge in the equation. The International Journal of Intelligence and Counterintelligence, 21(2), 337–354.
Neff, G., & Nafus, D. (2016). Self-tracking. Cambridge, MA: MIT Press.
Nolan, B. R. (2013). Information sharing and collaboration in the United States intelligence community: An ethnographic study of the National Counterterrorism Center. Ph.D. dissertation, University of Pennsylvania, Philadelphia.
Office of the Director of National Intelligence (2008). Intelligence Community Directive 205: Analytic Outreach. https://fas.org/irp/dni/icd/icd-205.pdf. Accessed 11 Mar 2017.
Rabinow, P., & Bennett, G. (2012). Designing human practices: An experiment with synthetic biology. Chicago: University of Chicago Press.
Sengers, P., Boehner, K., David, S., & Kaye, J. (2005). Reflective design. In Proceedings of the 4th Decennial Conference on Critical Computing: Between Sense and Sensibility (pp. 49–58). New York: ACM. Retrieved from http://dl.acm.org/citation.cfm?id=1094569.
Shilton, K. (2009). Four billion little brothers? Privacy, mobile phones, and ubiquitous data collection. Communications of the ACM, 52(11), 48–53.
Shilton, K. (2010a). Participatory sensing: Building empowering surveillance. Surveillance and Society, 8(2), 131–150.
Shilton, K. (2010b). Technology development with an agenda: Interventions to emphasize values in design. Retrieved from http://escholarship.org/uc/item/72n146cj
Shilton, K. (2012). Values levers: Building ethics into design. Science, Technology and Human Values, 38(3), 374–397.
Shilton, K. (2014). This is an intervention: Foregrounding and operationalizing ethics during technology design. In K. D. Pimple (Ed.), Emerging pervasive information and communication technologies (PICT) (pp. 177–192). Dordrecht: Springer.
Shilton, K., & Anderson, S. (2016). Blended, not Bossy: Ethics roles, responsibilities, and expertise in design. Interacting with Computers, 29(1), 71–79.
Shilton, K., Burke, J. A., Estrin, D., Govindan, R., Hansen, M., Kang, J., & Mun, M. (2009). Designing the personal data stream: Enabling participatory privacy in mobile personal sensing. Retrieved from http://escholarship.org/uc/item/4sn741ns
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529.
The Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction, Report to the President of the United States. (2005). https://fas.org/irp/offdocs/wmd_report.pdf. Accessed 11 Mar 2017.
Treverton, G. F. (2008). Assessing the tradecraft of intelligence analysis. Santa Monica, CA: RAND Corporation.
U.S. National Research Council. (2011). Intelligence analysis for tomorrow: Advances from the behavioral and social sciences. Washington, DC: U.S. National Academies Press.
Van Wynsberghe, A., & Robbins, S. (2014). Ethicist as designer: A pragmatic approach to ethics in the lab. Science and Engineering Ethics, 20, 947–961.
Acknowledgements
This material is based upon work supported in whole or in part with funding from the Laboratory for Analytic Sciences (LAS). Any opinions, findings, conclusions, or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the LAS and/or any agency or entity of the United States Government. The author wishes to thank the reviewers of this paper for insightful comments and suggestions for revisions.
Cite this article
Kampe, C., Reid, G., Jones, P. et al. Bringing the National Security Agency into the Classroom: Ethical Reflections on Academia-Intelligence Agency Partnerships. Sci Eng Ethics 25, 869–898 (2019). https://doi.org/10.1007/s11948-017-9938-7
Keywords
- Intelligence
- Prototype
- Research ethics
- Participatory sensing
- Self-tracking
- Values in design