Bringing the National Security Agency into the Classroom: Ethical Reflections on Academia-Intelligence Agency Partnerships

  • Christopher Kampe
  • Gwendolynne Reid
  • Paul Jones
  • Colleen S.
  • Sean S.
  • Kathleen M. Vogel
Original Paper


Academia-intelligence agency collaborations are on the rise for a variety of reasons. These can take many forms, one of which is in the classroom, using students to stand in for intelligence analysts. Classrooms, however, are ethically complex spaces, with students considered vulnerable populations, and become even more complex when layering multiple goals, activities, tools, and stakeholders over those traditionally present. This does not necessarily mean one must shy away from academia-intelligence agency partnerships in classrooms, but that these must be conducted carefully and reflexively. This paper hopes to contribute to this conversation by describing one purposeful classroom encounter that occurred between a professor, students, and intelligence practitioners in the fall of 2015 at North Carolina State University: an experiment conducted as part of a graduate-level political science class that involved students working with a prototype analytic technology, a type of participatory sensing/self-tracking device, developed by the National Security Agency. This experiment opened up the following questions that this paper will explore: What social, ethical, and pedagogical considerations arise with the deployment of a prototype intelligence technology in the college classroom, and how can they be addressed? How can academia-intelligence agency collaboration in the classroom be conducted in ways that provide benefits to all parties, while minimizing disruptions and negative consequences? This paper will discuss the experimental findings in the context of ethical perspectives involved in values in design and participatory/self-tracking data practices, and discuss lessons learned for the ethics of future academia-intelligence agency partnerships in the classroom.


Keywords: Intelligence · Prototype · Research ethics · Participatory sensing · Self-tracking · Values in design



This material is based upon work supported in whole or in part with funding from the Laboratory for Analytic Sciences (LAS). Any opinions, findings, conclusions, or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the LAS and/or any agency or entity of the United States Government. The authors wish to thank the reviewers of this paper for insightful comments and suggestions for revisions.



Copyright information

© 2017. This is a U.S. government work and its text is not subject to copyright protection in the United States; however, its text may be subject to foreign copyright protection.

Authors and Affiliations

  1. Communication, Rhetoric, and Digital Media Program, North Carolina State University, Raleigh, USA
  2. Oxford College of Emory University, Oxford, USA
  3. Laboratory for Analytic Sciences, North Carolina State University, Raleigh, USA
  4. School of Public Policy, University of Maryland, College Park, College Park, USA
