Abstract
This chapter is about relation artefacts Type IV, which are socio-technical interventions in organizational and wider contexts. Three subtypes of these relation artefacts move the design from the technical to the social: interaction interoperability checkups, digital legacy interventions, and organizational strategy alignments. The HWID platform is shown to provide a massive push toward such interventions through the continuous relation building between empirical work analysis and interaction design activities, which aims to create new local sociomaterial realities for the stakeholders involved. Ways of avoiding the ‘techno-determinism’ trap, and instead keeping the relations of the technical to the social, and vice versa, are introduced. The chapter ends with a summary of how to do socio-technical HCI design interventions.
Notes
To increase the transparency of the data analysis process, the authors published anonymized interview transcripts, the NVivo 10 project file of the interview analysis, and the survey data in Excel format.
References
Alon, L., & Nachmias, R. (2020). Anxious and frustrated but still competent: Affective aspects of interactions with personal information management. International Journal of Human-Computer Studies, 144, 102503.
Baets, W. (1992). Aligning information systems with business strategy. The Journal of Strategic Information Systems, 1(4), 205–213.
Bang, A. L., & Eriksen, M. A. (2014). Experiments all the way in programmatic design research. Artifact: Journal of Design Practice, 3(2), 1–4.
Bardoel, E. A., & Drago, R. (2016). Does the quality of information technology support affect work–life balance? A study of Australian physicians. The International Journal of Human Resource Management, 27(21), 2604–2620.
Cajander, Å., Larusdottir, M., Eriksson, E., & Nauwerck, G. (2015). Contextual personas as a method for understanding digital work environments. IFIP Advances in Information and Communication Technology, 468, 141–152. https://doi.org/10.1007/978-3-319-27048-7_10.
Chavan, A. L. (2005). Another culture, another method. Proceedings of the 11th International Conference on Human-Computer Interaction, 21(2). (Citeseer).
Chivukula, S. S., Brier, J., & Gray, C. M. (2018). Dark Intentions or Persuasion? UX Designers’ Activation of Stakeholder and User Values. In Proceedings of the 2018 ACM Conference Companion Publication on Designing Interactive Systems, 87–91.
Clemmensen, T., Hertzum, M., & Abdelnour-Nocera, J. (2020). Ordinary user experiences at work: A study of greenhouse growers. ACM Transactions on Computer-Human Interaction (TOCHI), June(Article no 16), 1–31. https://doi.org/10.1145/3386089.
Clemmensen, T., Iivari, N., Rajanen, D., & Sivaji, A. (2021). Organized UX professionals. In HWID2021 Unpublished Proceedings (pp. 1–25). Retrieved from https://www.hwid2021.com/.
Clemmensen, T., & Katre, D. (2012). Adapting e-gov usability evaluation to cultural contexts. In Usability in government systems: User experience design for citizens and public servants (pp. 331–344). https://doi.org/10.1016/B978-0-12-391063-9.00053-5.
Clemmensen, T., & Nørbjerg, J. (2019). (not) Working (with) collaborative robots in a glass processing factory. Worst Case Practices Teaching Us the Bright Side.
Erlandsson, M., & Jansson, A. (2007). Collegial verbalisation–A case study on a new method on information acquisition. Behaviour & Information Technology, 26(6), 535–543.
Fallman, D. (2003). Design-oriented human-computer interaction. In CHI’03 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 225–232). Ft. Lauderdale, Florida, USA: ACM. 05–10 Apr 2003.
Feinberg, M. (2017). A design perspective on data. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 2952–2963). https://doi.org/10.1145/3025453.3025837.
Feltovich, P. J., Prietula, M. J., & Ericsson, K. A. (2006). Studies of expertise from psychological perspectives. In K. A. Ericsson, N. Charness, P. J. Feltovich, & R. R. Hoffman (Eds.), The Cambridge handbook of expertise and expert performance. New York, NY: Cambridge University Press.
Følstad, A., & Hornbæk, K. (2010). Work-domain knowledge in usability evaluation: Experiences with Cooperative Usability Testing. Journal of Systems and Software, 83(11), 2019–2030. https://doi.org/10.1016/J.JSS.2010.02.026.
Friedland, L. (2019). Culture eats UX strategy for breakfast. Interactions, 26(5), 78–81.
Gothelf, J., & Seiden, J. (2013). Lean UX: Applying lean principles to improve user experience. Sebastopol, CA: O’Reilly Media.
Gruning, J., Bullard, J., & Ocepek, M. (2015). Medium, access, and obsolescence: What kinds of objects are lasting objects? In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (pp. 3433–3442).
Gulotta, R., Odom, W., Forlizzi, J., & Faste, H. (2013). Digital artifacts as legacy: Exploring the lifespan and value of digital data. In CHI’13. https://doi.org/10.1145/2470654.2466240.
Gutwin, C., Cockburn, A., & Gough, N. (2017). A field experiment of spatially-stable overviews for document navigation. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 5905–5916). https://doi.org/10.1145/3025453.3025905.
Hertzum, M. (2020). Usability testing: A practitioner’s guide to evaluating the user experience. Synthesis Lectures on Human-Centered Informatics, 13(1), i–105.
Hertzum, M., & Simonsen, J. (2010). Clinical overview and emergency-department whiteboards: A survey of expectations toward electronic whiteboards. In Proceedings of the 8th Scandinavian Conference on Health Informatics (pp. 14–18).
Hevner, A. R., March, S. T., Park, J., & Ram, S. (2004). Design science in information systems research. Management Information Systems Quarterly, 28(1), 75–105.
Jansen, B. J., Salminen, J. O., & Jung, S.-G. (2020). Data-driven personas for enhanced user understanding: Combining empathy with rationality for better insights to analytics. Data and Information Management, 4(1), 1–17. https://doi.org/10.2478/dim-2020-0005.
Jansson, A., Erlandsson, M., & Axelsson, A. (2015). Collegial verbalisation–the value of an independent observer: an ecological approach. Theoretical Issues in Ergonomics Science, 16(5), 474–494. https://doi.org/10.1080/1463922X.2015.1027322.
Jansson, A., Erlandsson, M., Fröjd, C., & Arvidsson, M. (2013). Collegial collaboration for safety: Assessing situation awareness by exploring cognitive strategies. In Workshop at INTERACT 2013–14th IFIP TC13 Conference on Human-Computer Interaction, Cape Town, South Africa, September 2013 (pp. 35–40).
Jarzabkowski, P. (2004). Strategy as practice: recursiveness, adaptation, and practices-in-use. Organization Studies, 25(4), 529–560.
Joshi, S. G., & Bratteteig, T. (2016). Designing for prolonged mastery. On involving old people in participatory design. Scandinavian Journal of Information Systems, 28(1).
Kehr, F., Bauer, G., Jenny, G. F., Güntert, S. T., & Kowatsch, T. (2013). Towards a design model for job crafting information systems promoting individual health, productivity and organizational performance.
Khadka, R., Batlajery, B. V., Saeidi, A. M., Jansen, S., & Hage, J. (2014). How do professionals perceive legacy systems and software modernization? In Proceedings of the 36th International Conference on Software Engineering-ICSE 2014, 36–47. https://doi.org/10.1145/2568225.2568318.
Kolko, J. (2015). Design thinking comes of age. Harvard Business Review, 93(9), 66–71. Retrieved from https://hbr.org/2015/09/design-thinking-comes-of-age.
Koskinen, I., Binder, F. T., & Redström, J. (2008). Lab, field, gallery, and beyond. Artifact: Journal of Design Practice, 2(1), 46–57.
Kwon, G. H., Smith-Jackson, T. L., & Bostian, C. W. (2011). Socio-cognitive aspects of interoperability: Understanding communication task environments among different organizations. ACM Transactions on Computer-Human Interaction (TOCHI), 18(4), 1–21.
Lachner, F., Naegelein, P., Kowalski, R., Spann, M., & Butz, A. (2016). Quantified UX: Towards a common organizational understanding of user experience. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction-NordiCHI’16 (pp. 56:1–56:10). https://doi.org/10.1145/2971485.2971501.
Lárusdóttir, M., Cajander, Å., & Gulliksen, J. (2014). Informal feedback rather than performance measurements-User-centred evaluation in Scrum projects. Behaviour and Information Technology. https://doi.org/10.1080/0144929X.2013.857430.
Meneweger, T., Wurhofer, D., Fuchsberger, V., & Tscheligi, M. (2018). Factory workers’ ordinary user experiences: An overlooked perspective. Human Technology, 14(2), 209–232. https://doi.org/10.17011/ht/urn.201808103817.
Mertens, R., Brusilovsky, P., Vornberger, O., & Ishchenko, S. (2009). Bridging the gap between time-and structure-based navigation in web lectures. International Journal on E-Learning, 8(1), 89–105.
Miaskiewicz, T., Sumner, T., & Kozar, K. A. (2008). A latent semantic analysis methodology for the identification and creation of personas. In Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems (Vol. 1, pp. 1501–1510). https://doi.org/10.1145/1357054.1357290.
Nahum-Shani, I., Qian, M., Almirall, D., Pelham, W. E., Gnagy, B., Fabiano, G. A., Waxmonsky, J. G., Yu, J., & Murphy, S. A. (2012). Experimental design and primary data analysis methods for comparing adaptive interventions. Psychological Methods, 17(4), 457.
Neate, T., Bourazeri, A., Roper, A., Stumpf, S., & Wilson, S. (2019). Co-created personas: Engaging and empowering users with diverse needs within the design process. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems-CHI’19 (pp. 1–12). https://doi.org/10.1145/3290605.3300880.
Neave, N., Briggs, P., McKellar, K., & Sillence, E. (2019). Digital hoarding behaviours: Measurement and evaluation. Computers in Human Behavior, 96, 72–77.
Nielsen, L. (2019). Personas-User focused design. https://doi.org/10.1007/978-1-4471-7427-1.
Nielsen, L., Hansen, K. S., Stage, J., & Billestrup, J. (2015). A template for design personas. International Journal of Sociotechnology and Knowledge Development, 7(1), 45–61. https://doi.org/10.4018/ijskd.2015010104.
Paay, J., Raptis, D., Kjeldskov, J., Skov, M. B., Ruder, E. V, & Lauridsen, B. M. (2017). Investigating cross-device interaction between a handheld device and a large display. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 6608–6619). https://doi.org/10.1145/3025453.3025724.
Pfister, J. (2017). “This will cause a lot of work.” Coping with transferring files and passwords as part of a personal digital legacy. In Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (pp. 1123–1138).
Plaskoff, J. (2017). Employee experience: the new human resource management approach. Strategic HR Review, 16(3), 136–141. https://doi.org/10.1108/SHR-12-2016-0108.
Rai, A. (2017). Editor’s comments: Diversity of design science research. MIS Quarterly, 41(1), iii–xviii.
Raptis, D., Kjeldskov, J., & Skov, M. B. (2016). Continuity in multi-device interaction. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction-NordiCHI’16. https://doi.org/10.1145/2971485.2971533.
Reinecke, K., & Bernstein, A. (2013). Knowing what a user likes: A design science approach to interfaces that automatically adapt to culture. MIS Quarterly, 427–453.
Seidelin, C. (2020). Towards a co-design perspective on data: Foregrounding data in the design and innovation of data-based services. Ph.D. thesis, IT University of Copenhagen.
Sein, M.K., Henfridsson, O., Purao, S., Rossi, M., & Lindgren, R. (2011). Action design research. MIS Quarterly. https://doi.org/10.2307/23043488.
Shin, Y., Hur, W., Kim, H., & Cheol Gang, M. (2020). Managers as a missing entity in job crafting research: Relationships between store manager job crafting, job resources, and store performance. Applied Psychology, 69(2), 479–507.
Sturm, C., Aly, M., von Schmidt, B., & Flatten, T. (2017). Entrepreneurial & UX mindsets: Two perspectives-one objective. In Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services (pp. 60:1–60:11). https://doi.org/10.1145/3098279.3119912.
Tharanathan, A., Bullemer, P., Laberge, J., Reising, D. V., & Mclain, R. (2012). Impact of functional and schematic overview displays on console operators’ situation awareness. Journal of Cognitive Engineering and Decision Making, 6(2), 141–164.
Vertesi, J., Kaye, J., Jarosewski, S. N., Khovanskaya, V. D., & Song, J. (2016). Data Narratives: Uncovering tensions in personal data management. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (pp. 478–490).
Wania, C. E., Atwood, M. E., & McCain, K. W. (2006). How do design and evaluation interrelate in HCI research? In Proceedings of the 6th Conference on Designing Interactive Systems (pp. 90–98).
Xin, X., Cai, W., Zhou, W., Baroudi, S. El, & Khapova, S. N. (2020). How can job crafting be reproduced? Examining the trickle-down effect of job crafting from leaders to employees. International Journal of Environmental Research and Public Health, 17(3), 894.
Yang, R., Ming, Y., Ma, J., & Huo, R. (2017). How do servant leaders promote engagement? A bottom-up perspective of job crafting. Social Behavior and Personality: An International Journal, 45(11), 1815–1827.
6.5 Appendix
Evaluations with persona in standard (legacy) work scenarios
Part of treating legacy systems as socio-technical systems is to evaluate them in terms of legacy (systems) work scenarios. In such work scenarios, interventions are taken up and used by expert workers with extensive practice on the relevant circumscribed set of tasks in that particular work environment (Feltovich, Prietula, & Ericsson, 2006). Evaluation of legacy systems interventions will therefore have to be carried out in such scenarios.
The collegial verbalization (CV) evaluation method
When evaluating with classical UX and usability thinking-aloud methods in scenarios where the employees are highly experienced, it is possible to go beyond the established protocols of concurrent and retrospective verbalization and include a third, ‘conspective’ protocol to obtain the needed insights (Erlandsson & Jansson, 2007; Jansson, Erlandsson, & Axelsson, 2015; Jansson, Erlandsson, Fröjd, & Arvidsson, 2013). In a concurrent verbal protocol, the verbalizer performs the work task; in a retrospective verbal protocol, the verbalizer recalls their own performance; and in a conspective verbalization, the observer thinks aloud while observing a colleague’s performance, in real time or as a recording of past performance. Jansson et al. (2015) suggested an evaluation method they called collegial verbalization (CV) that combines the three protocols. The CV evaluation method is relevant for evaluating legacy work scenarios, since it builds on the assumption that when domain knowledge is shared between colleagues, they may also share cognitive strategies; hence, a colleague thinking aloud while observing colleagues’ performance is just as valid as thinking aloud while performing one’s own tasks. Evaluation can thus be done with independent observers (colleagues) commenting, in the form of conspective protocols, on the behavior of target users in legacy work scenarios well known to all involved employees.
To identify the target users (the expert workers) to take part in the evaluation, a modified persona method might be helpful. The idea of a persona might deliver an approach to utilizing the independent source of data in the CV method (Jansson et al., 2015), if the persona is cast as an expert colleague. Hence, personas for selecting CV participants for relation artefacts Type IV, such as interaction designs for digital legacy, should be co-designed among work colleagues (the experts) and be data-driven, with a substantial understanding and description of the legacy work scenario(s).
Co-designing Data-Driven Personas for Organizational UX
Personas is a method that has for several years been applied in multiple domains. Though there is no standard definition of a persona, there is agreement that it is a description of an end-user based on data, with or without fictitious elements. Some authors emphasize that the persona description should contain a one-to-one correlation between data and description. Other authors allow fictitious elements in the description for the sake of enhancing empathy for the users (Nielsen, 2019).
However, the idea that personas should be based on data is being challenged by the recent trends of Lean UX and co-design. Lean UX supports the integration of UX activities into widespread agile development by using personas based on the design team’s assumptions. With the Lean UX perspective, the concept of assumption-based personas is spreading. The argument is that such personas are fast and easy to generate and serve as a starting point for “proper” data-driven personas (Gothelf & Seiden, 2013). Similarly, within the participatory design framework, the notion of co-designed personas has emerged. Co-created personas encourage users to engage in co-design sessions (Neate, Bourazeri, Roper, Stumpf, & Wilson, 2019). Common to both assumption-based and co-created personas is that neither rests on underlying data. However, co-designing personas should focus on collaboration around the data foundation of the persona, and this is particularly important for organizational UX.
Data for personas can take many forms, both qualitative (Miaskiewicz, Sumner, & Kozar, 2008) and quantitative (Jansen, Salminen, & Jung, 2020). The content is most often background information such as demographics, together with information related to the system to be designed, such as work tasks, use of technology, and relation to the domain. Data may also include general business and marketing information, such as brand relationship, business objectives, and market segment size (Nielsen, Hansen, Stage, & Billestrup, 2015).
For employee-personas in legacy workplace scenarios aimed at designing for organizational UX, data may include data about organizational structure, employees, work tasks, work systems, and their interrelations. That is, who the persona will be with, where they are, what they are doing during the workday, and with what systems (Clemmensen et al., 2020). Often this will reveal a quite ordinary set of user experiences for the persona (Clemmensen et al., 2020; Meneweger, Wurhofer, Fuchsberger, & Tscheligi, 2018). Data collection for personas in workplace scenarios may require consideration of technology as a design element and of how the relation between the users and the technical systems is transformed for this purpose.
Co-creating data understandings is a collaborative practice around using data for persona creation, inspired by the work of Feinberg (2017), which views data as something that may be designed through practices, and the work of Seidelin (2020), which suggests a collaborative practice of designing with and of data and data structures. Employee data may be understood as the most important data for personas when co-designing personas for organizational UX, which requires data about structure, people, tasks, technology, and their interrelations. Employee data could be thought of as data about people; such data is found in organizations’ HR departments. However, employee data should be considered more broadly and in relation to employee experience, that is, centering the organizational and interaction design around the employee’s total experience of the workplace (Plaskoff, 2017). We imagine that employees and other stakeholder participants can in different ways help collect data about structure, people, tasks, technology, and their interrelations. It is thus not enough to parse the HR department’s database; interviews and surveys with employees and managers at different levels may be required before entering co-design workshops to create personas. An example of co-design of personas that includes collection of data about structure, people, tasks, technology, and their interrelations is the creation of contextual personas for the Swedish defense reported by Cajander et al. (2015).
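The kind of employee-persona data discussed above can be sketched as a simple data structure. The following Python sketch is illustrative only: the class names, fields, and example values (an operator working with a collaborative robot, echoing the Clemmensen and Nørbjerg scenario) are assumptions, not a scheme taken from the chapter. It shows how structure, people, tasks, technology, and their interrelations might be held together in one persona record.

```python
from dataclasses import dataclass, field

@dataclass
class WorkSystem:
    """A technical system in the workplace (hypothetical structure)."""
    name: str
    tasks_supported: list  # work tasks this system is used for

@dataclass
class EmployeePersona:
    """An illustrative employee-persona record covering the chapter's
    five data categories: structure, people, tasks, technology,
    and their interrelations."""
    name: str
    role: str
    department: str                                  # organizational structure
    work_tasks: list = field(default_factory=list)   # what they do
    systems: list = field(default_factory=list)      # WorkSystem instances
    colleagues: list = field(default_factory=list)   # who they work with

    def systems_for_task(self, task):
        """Interrelation query: which systems support a given task?"""
        return [s.name for s in self.systems if task in s.tasks_supported]

# Hypothetical example values.
robot = WorkSystem("collaborative robot", ["load glass sheets", "polish edges"])
persona = EmployeePersona(
    name="Expert operator",
    role="Glass processing operator",
    department="Production",
    work_tasks=["load glass sheets", "quality control"],
    systems=[robot],
    colleagues=["apprentice operator", "shift manager"],
)
```

A record like this could be filled in collaboratively during a co-design workshop, with the interrelation queries making visible which tasks lack system support.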
Using co-created data understandings for designing personas for legacy workplace scenarios may provide a solid empirical basis for designing alternative technology solutions to deal with wicked issues in organizations. Employees can be involved in co-design of personas through digital tools that support and empower employee-focused data collection about personas. An example is the automation scenario of manufacturing with a (legacy) collaborative robot proposed by Clemmensen and Nørbjerg (2019). Here, employees use a digital tutoring tool that nudges them toward systematic data collection about a target user group of (expert) colleagues. The tool then supports the employees in identifying, prototyping, and evaluating interactions with the robot, and the social arrangement around the robot, based on the personas coming out of the data (Clemmensen & Nørbjerg, 2019).
In summary, (1) co-design of personas should primarily mean co-design of data-driven personas; (2) co-design of data-driven personas may be particularly important, and possible, when designing for organizational UX, since it can focus on the organization-specific employee experience; and (3) co-designed data-driven personas focused on organizational UX provide a solid basis for creative variation of socio-technical designs, and thus contribute to evaluation of legacy workplace designs.
Design case with HWID prototyping and interventions in a university course
The design task was to do a quick-and-dirty relation artefact Type III prototype and a relation artefact Type IV intervention. The context was a university course at a Scandinavian business school, which used an LMS (learning management system) with one module per teaching session. The users were mostly students with a need for feeling competent, and the organizational problem was to support easy access for all students to teaching material. A one-button solution, Fig. 6.4, was created based on relation artefact Type III subcategories, and evaluated.
Figure 6.4 shows a prototype for both individual and collective use, as the LMS course was used by individuals and on a class basis. The organizational action hypothesis was that giving students a better overview of course modules would enhance the course’s capacity for learning. The prototyped worker experience was a single-step interaction that moves the student directly to today’s session to give a feeling of overview. The field evaluations and UX tests were done in the course setting, with actual students interacting with the prototype; the prototyped hypothesis to be tested was that the prototype would show positive indications of an overview experience among students.
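The core of the single-step interaction can be sketched in a few lines. This is a minimal sketch, not the actual prototype: the schedule dates, module names, and function name are invented for illustration. It shows the idea of one button jumping the student straight to the module for the most recent session, instead of scrolling through all modules.

```python
from datetime import date

# Hypothetical course schedule: one module per teaching session.
# Dates and module names are made up for illustration.
schedule = [
    (date(2021, 2, 1), "Module 1"),
    (date(2021, 2, 8), "Module 2"),
    (date(2021, 2, 15), "Module 3"),
]

def todays_module(today, schedule):
    """Single-step interaction: return the module for the most recent
    session on or before today's date (or the first module before the
    course starts), so one click lands the student on today's session."""
    past = [(day, module) for day, module in schedule if day <= today]
    return past[-1][1] if past else schedule[0][1]
```

The design choice here is that the lookup is deterministic and needs no input from the student, which is what makes the interaction single-step.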
The data collection consisted of individual and organizational UX and usability measures, with home-made “overview”-experience scales and qualitative interviews. The sociomaterial analysis focused on relation artefact Type IV subcategories. The interaction interoperability checks asked whether the solution would give the student an interaction experience similar to that in other parts of the LMS course environment. The digital legacy intervention asked whether there was a way for the student to see the previous overviews produced by the user interactions with the prototype for this specific course. The organizational strategy alignment asked whether a one-button solution of this design would be the kind of experience that the business school would want its students to have.
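Scoring a home-made experience scale at both the individual and the organizational level might look as follows. The responses, item counts, and participant IDs below are invented for illustration; the chapter does not report the actual scale items or data.

```python
from statistics import mean

# Hypothetical responses to a 5-point "overview"-experience scale,
# one list of item scores per participant (values are made up).
responses = {
    "student_1": [4, 5, 4],
    "student_2": [3, 4, 4],
    "student_3": [5, 5, 4],
}

# Individual UX measure: average the items into one overview score
# per student (assumes all items are scored in the same direction).
individual_scores = {s: mean(items) for s, items in responses.items()}

# Organizational (class-level) UX measure: mean of the individual
# scores, giving each student equal weight.
class_score = mean(individual_scores.values())
```

Qualitative interview data would then be used alongside such scores, since a single scale number cannot by itself show whether the experience was one of “overview”.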
The theory and methodology in the project included theory about clinical overview in emergency rooms (Hertzum & Simonsen, 2010), field experiments with overviews for document navigation (Gutwin, Cockburn, & Gough, 2017), and particularly time- and structure-based navigation in web lectures (Mertens, Brusilovsky, Vornberger, & Ishchenko, 2009).
The design contribution with relation artefact Type IV was thus the one-button answer to the questions of how the desired “experience of overview” in online course settings fitted the usual interactions of the students, whether it was ‘their’ overview, and whether this was a university-certified experience. The relation artefact Type IV discussions then opened up further discussion about whether “overview” was the experience being designed for, or whether it was perhaps the more classic “situation awareness”, with its known socio-technical conditions and implications; see for example Tharanathan, Bullemer, Laberge, Reising, and Mclain (2012).
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Clemmensen, T. (2021). Relation Artefacts Type IV. In: Human Work Interaction Design. Human–Computer Interaction Series. Springer, Cham. https://doi.org/10.1007/978-3-030-71796-4_6
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-71795-7
Online ISBN: 978-3-030-71796-4