Fluidity and Rigour: Addressing the Design Considerations for OSINT Tools and Processes

  • B. L. William Wong
Part of the Advanced Sciences and Technologies for Security Applications book series (ASTSA)


Compared with traditional intelligence analysis, OSINT requires different methods of identifying, extracting and analysing data. Analysts need tools that enable them to flexibly, tentatively and creatively generate anchors to start a line of inquiry, to develop and test their ideas, and to transition fluidly between methods and between thinking and reasoning strategies in order to construct critical and rigorous arguments as a line of inquiry is finalised. This chapter illustrates, from a design perspective, how analysts think, and discusses the integration of fluidity and rigour as two conflicting design requirements. It further proposes that designs for OSINT tools and processes should support the fluid and rapid construction of loose stories: a free-form approach to assembling data, making inferences and generating conclusions, so that the story can evolve rapidly into one rigorous enough to withstand interrogation. We also propose that such designs should encourage the analyst to adopt a questioning mental stance that promotes self-checking, helping to identify and remove dubious or low-reliability data.





The research leading to the results reported here has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) through Project VALCRI, European Commission Grant Agreement N° FP7-IP-608142, awarded to B. L. William Wong, Middlesex University and partners.


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Interaction Design Centre, Middlesex University, London, UK