
Towards standardisation of user models for simulation and adaptation purposes

  • Long paper
  • Published in: Universal Access in the Information Society

Abstract

User models can be very valuable when developing accessible and ergonomic products and services that take into account users’ specific needs and preferences. Simulating user–product interaction with user models may reveal accessibility issues at the early stages of design and development, resulting in a significant reduction in costs and development time. Moreover, user models can be used in adaptive interfaces, enabling the personalised customisation of user interfaces that enhances the accessibility and usability of products and services. This paper presents the efforts of the Virtual User Modelling and Simulation Standardisation ‘VUMS’ cluster of projects towards the development of an interoperable user model, able to describe both able-bodied people and people with various kinds of disabilities. The VUMS cluster consists of the VERITAS, MyUI, GUIDE, and VICON FP7 European projects, all involved in user modelling from different perspectives. The main goal of the VUMS cluster was the development of a unified user model that could be used by all participating projects and that could form the basis of a new user model standard. Currently, within the VUMS cluster, a common user model has been defined, and converters have been developed that transform each project’s specific user model into the VUMS user model and vice versa, thus enabling the exchange of user models between the projects.
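The converter idea described above can be sketched in a few lines. This is an illustrative sketch only: the attribute names (`generalInfo`, `visualAcuity`, `gripStrengthN`, etc.) and the shape of the project-specific profile are assumptions made for this example, not the actual VUMS Exchange Format schema.

```python
# Illustrative converter between a hypothetical project-specific user
# profile and a VUMS-style common profile. All field names here are
# assumptions for illustration, not the real VUMS schema.

def project_to_vums(project_profile: dict) -> dict:
    """Map a hypothetical project-specific profile to a VUMS-style one."""
    return {
        "generalInfo": {"userId": project_profile["id"]},
        "visual": {
            # e.g. map a project-specific acuity value onto a common scale
            "visualAcuity": project_profile.get("acuity"),
        },
        "mobility": {
            "gripStrengthN": project_profile.get("grip_strength"),
        },
    }

def vums_to_project(vums_profile: dict) -> dict:
    """Inverse conversion, so profiles can round-trip between projects."""
    return {
        "id": vums_profile["generalInfo"]["userId"],
        "acuity": vums_profile["visual"]["visualAcuity"],
        "grip_strength": vums_profile["mobility"]["gripStrengthN"],
    }

# A profile converted to the common format and back is unchanged,
# which is the property that makes cross-project exchange possible.
p = {"id": "user-42", "acuity": 0.8, "grip_strength": 180.0}
assert vums_to_project(project_to_vums(p)) == p
```

The key design point, as in the VUMS cluster itself, is that each project only needs one converter pair (to and from the common format) rather than one per partner project.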



Notes

  1. http://www.vaalid-project.org/.

  2. http://cloud4all.info/.

  3. https://docs.google.com/spreadsheet/ccc?key=0AnAwpf4jk8LSdDd3TEJWLUtmN290YzVfTkNvcHYyMUE&authkey=CPOO65oE.

  4. The complete UML class diagram of the VUMS Exchange Format can be found at http://160.40.50.183/VUMS/VUMSExchangeFormat.jpg.

  5. A literate witness must sign (if possible, this person should be selected by the participant and should have no connection to the research team). Participants who are illiterate should include their thumb print as well.


Author information

Corresponding author

Correspondence to N. Kaklanis.

Appendices

Appendix 1: VUMS user profile example

A sample user profile in VUMS Exchange Format

Appendix 2: VUMS Exchange Format—UML class diagrams

See Figs. 6, 7, 8, 9, 10, 11, and 12.

Fig. 6: Affected (by the disabilities) tasks and general information about the user—UML class diagram

Fig. 7: Anthropometric variables—UML class diagram

Fig. 8: Visual variables—UML class diagram

Fig. 9: Auditory variables—UML class diagram

Fig. 10: Speech variables—UML class diagram

Fig. 11: Cognitive variables—UML class diagram

Fig. 12: Mobility variables—UML class diagram
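The variable categories enumerated in Figs. 6–12 can be sketched as a set of nested classes. This is an illustrative sketch only: the class and field names below are assumptions made for this example, not the actual attributes of the VUMS Exchange Format (those are defined in the UML diagrams above).

```python
from dataclasses import dataclass, field
from typing import Optional

# Illustrative sketch of a VUMS-style user profile structure, mirroring
# the categories of Figs. 6-12. Field names are assumptions, not the
# standard's actual attributes.

@dataclass
class AnthropometricVariables:   # cf. Fig. 7
    stature_mm: Optional[float] = None
    shoulder_breadth_mm: Optional[float] = None

@dataclass
class VisualVariables:           # cf. Fig. 8
    visual_acuity: Optional[float] = None
    contrast_sensitivity: Optional[float] = None

@dataclass
class AuditoryVariables:         # cf. Fig. 9
    hearing_threshold_db: Optional[float] = None

@dataclass
class UserProfile:               # general information, cf. Fig. 6
    user_id: str
    affected_tasks: list = field(default_factory=list)
    anthropometric: AnthropometricVariables = field(
        default_factory=AnthropometricVariables)
    visual: VisualVariables = field(default_factory=VisualVariables)
    auditory: AuditoryVariables = field(default_factory=AuditoryVariables)

# A profile starts empty and is filled per measured user capability.
profile = UserProfile(user_id="user-01")
profile.visual.visual_acuity = 0.8
```

Grouping variables by modality in this way is what lets a simulation or adaptation engine query only the categories relevant to the task at hand (e.g. only visual and mobility variables for a touchscreen interaction).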

Appendix 3: Common informed consent

Title of the project:

Coordinator:

Local Principal Researcher:

Institution:

Financed by:

Project duration:

3.1 Participant’s name

The study described in this document is a part of the project called ‘Title of the project’, financed by the European Commission under the 7th Framework Programme (Consortium Agreement: Number of Consortium Agreement).

This consent sheet may contain words you do not understand. Please ask either the contact researcher or any professional in the study to explain any word or give any further information. You may take a copy of this consent sheet to think about it or talk to your family before making a decision. At all times, we strive to ensure compliance with current legislation.

3.2 Introduction

You have been invited to participate in a research study. Before deciding whether you want to participate, we would kindly request that you read this consent carefully. Please ask any questions that may come to your mind in order to make sure you understand all the procedures of the study, including the risks and benefits.

3.3 Purpose of the study

The main aim of the project was… (describe here the aim of the project). In the document entitled ‘Information Page’, you will find more information about the purpose of the study.

3.4 Type of research intervention

This research will consist of focus groups, completion of questionnaires, etc. Your participation would consist of……

3.5 Participant selection

Explain why a person was selected to participate in the research.

3.6 Participants in the study and possible participation in it

We kindly request your voluntary participation in this research study. This informed consent includes information about the study. We would like to ensure that you are fully informed about the purpose of our study and what your participation in it implies.

Please ask us to clarify any section in this information document that may be necessary. Please do not sign if you are not sure that you have understood all the aspects of the study and its objectives.

In this part of the study, we would like to know (complete with the concrete aim of the study for which the participation of the participant is required).

3.7 Voluntary participation

Your participation in this research is entirely voluntary. It is your choice whether to participate or not. You may withdraw at any time without penalty or loss of benefits.

3.8 The participants will be elderly people older than 60 years

The travel costs from your home to the laboratory will be covered by us.

3.9 Duration

The research takes place over ___ (number of) days or ___ (number of) months in total. During that time, we will contact you ____ times at ____ intervals, and each interview will last about ____ hour(s).

3.10 Risks or inconveniences

No risk or damage is foreseen during the test application.

3.11 Benefits

It is probable that you will not receive any personal benefit from your participation in this study. In any case, the data collected in this study may contribute to better knowledge of, and future interventions for, elderly people.

3.12 Reimbursements

E.g.: You will not be provided any incentive to take part in the research. However, we will give you [provide a figure, if money is involved] for your time and travel expenses (if applicable).

3.13 Privacy and confidentiality

We will record your answers in notes that will not contain any information identifying you, nor will it be possible to identify you later on. In other words, when a person agrees to participate in the research, they receive a code number, and from that moment on all personal data are stored under that code, so that no one can know to whom the data belong. The information will be processed during the analysis of the data obtained and will appear in the project deliverables, but again only in a way that makes it impossible to identify from whom we received the information, ensuring at every moment compliance with (include the national laws that will be guaranteed).

The results of this research may be published in scientific journals or presented at gerontological sessions, always guaranteeing complete anonymity.

The authorisation for the use of and access to the information for research purposes is entirely voluntary. This authorisation applies until the end of the study unless you cancel it earlier, in which case we will stop using your data. All the data will be destroyed 5 years after the end of the project.

If you decide to withdraw your consent later on, we will ask you to contact the principal researcher of this study and let them know that you are withdrawing from the study.

3.14 Sharing results

Nothing that you tell us today will be shared with anybody outside the research team, and nothing will be attributed to you by name. The knowledge that we get from this research will be shared with you and your community before it is made widely available to the public. Each participant will receive a summary of the results. There will also be small meetings in the community and these will be announced. Following the meetings, we will publish the results so that other interested people may learn from the research.

3.15 Right to refuse or withdraw

You do not have to take part in this research if you do not wish to do so, and choosing not to participate will not affect your rights in any way. You may stop participating in the study at any time. We will give you an opportunity at the end of the study to review your remarks, and you may ask to modify or remove portions of them if you do not agree with our notes or if we did not understand you correctly.

From the moment of your withdrawal, your data will not be processed in any further phases of the research project. However, it will not be possible to alter already published documents or completed project deliverables.

3.16 Who to contact

The principal researcher can be contacted under the following address:

Name of the Contact Person

Organisation name

Street

City

Telephone

For further information about your rights as a research participant, or if you are not satisfied with the manner in which this study is being conducted, or if you have any questions, sustain any injury during the course of the research, or experience any adverse reaction to a study procedure, please contact the principal researcher.

3.17 Consent certificate

Your participation in the study is possible only if you sign a stand-alone consent form that would authorise us to use your personal information and the information about your health status. If you do not wish to do so, please do not take part in this study.

I confirm that I have read and understood the information sheet dated……………………. for the above study.

  • I have had the opportunity to consider the information, ask questions, and have had these answered satisfactorily. YES/NO

  • I understand that my participation is voluntary and that I am free to withdraw at any time, without giving any reason, without my medical care or legal rights being affected. YES/NO

  • I understand that relevant sections of my data collected during the study may be looked at by responsible individuals from [company/organisation name], where this is relevant to my taking part in this research. I give permission for these individuals to have access to my anonymised records. YES/NO

  • I consent voluntarily to be a participant in this study. YES/NO

Name of participant

 

Date

 

Signature

Name of Person taking consent (if different from researcher)

Date

 

Signature

 

Researcher

 

Date

 

Signature

When completed, one copy will be for the participant and one copy for the researcher’s site file.

If illiterate (see Footnote 5):

I have witnessed the accurate reading of the consent form to the potential participant, and the individual has had the opportunity to ask questions. I confirm that the individual has given consent freely.

Rights and permissions

Reprints and permissions

About this article


Cite this article

Kaklanis, N., Biswas, P., Mohamad, Y. et al. Towards standardisation of user models for simulation and adaptation purposes. Univ Access Inf Soc 15, 21–48 (2016). https://doi.org/10.1007/s10209-014-0371-2

