Trusted Autonomy in Training: A Future Scenario

  • Leon D. Young
Open Access
Part of the Studies in Systems, Decision and Control book series (SSDC, volume 117)


Being able to trust your teacher has been a pivotal assumption within training systems. The advent of autonomous systems capable of delivering training in both innovative and traditional ways raises a number of questions. The premise of this book allows us to examine how autonomous systems, that is, non-human systems, will impact training and learning environments. The following chapter explores the future of trusted autonomy within a training context through both an extrapolation of current trends and creative thought.

19.1 Introduction

Why are you reading this book? Are you a student hoping to learn the right answer, or perhaps a teacher looking for guidance in the development of a course? Regardless, you undoubtedly hope to learn something new. You are expecting to improve yourself and perhaps, by extension, others. If you accept this proposition then you will accept that at some level you trust the authors of this book. You trust that what we write is accurate and that we want to improve you. In effect you have ceded responsibility for your learning, your training, to us, the authors.

Why is this important? Quite simply, trusted autonomy in a training or learning environment is not new. It existed before Alexander the Great sat at Aristotle’s knee. The autonomous entity has always been a human teacher, and trust has always been ceded to the teacher. Should this change? Will this change? The premise of this book allows us to examine how autonomous systems, that is, non-human systems, will impact training and learning environments. The following section seeks to explore the future of trusted autonomy within a training context through both an extrapolation of current trends and creative thought. There is no expectation that this work is predictive; however, it should show you what is possible and perhaps even what could be plausible.

19.2 Scan of Changes

When investigating what the future could look like, it is important to understand the baseline and the potential for change within the environment. Often we look to extreme changes, as this allows us to understand the inherent risks and uncertainties contained within those radical changes.

Training and education have quietly been going through a long-term change from didactic, teacher-centred and subject-based teaching to the use of interactive, problem-based, student-centred learning [10]. We are witnessing a massive change in schools and worksites, where individual learner needs are being regarded as one of the central pillars of learning. There is an acceptance that the learners of the future will be provided lessons that “are custom tailored to their specific and individual needs” [12].

This social trend demonstrates the ongoing change in the way teaching is delivered; however, it is strongly enabled by technological advances. These advances, though, are only useful if they are accepted by the general population, and there appears to be a growing acceptance of technological intervention in our lives. Some of the recognised factors that influence the acceptance of technology into our lives include expected benefits, available alternatives, need for the technology and social influence (external pressure) [9].

For instance, the rapid rise of the Massive Open Online Course (MOOC) has taken many people by surprise, even while the reasons for uptake remain opaque [6]. What we can ascertain is that MOOCs offered a perceived benefit (tertiary education); there were few alternatives (part-time and distance providers); the enabling technology, such as rapid virtual communication, was available; and social pressure was predominantly positive due to low risk and high potential. The combination of these factors has led to the delivery of thousands of courses to millions of students, through hundreds of universities, after only a few years. The popularity of the MOOC has seen it placed alongside interactive gaming, social learning, on-demand training and mobile learning as the changes most likely to affect corporate training [7].

Taking these trends a step further, Kevin Young, head of SkillSoft EMEA, envisions that the future of education will be (1) trainee-led and (2) holistic. For instance, “imagine an alert popping up in the corner of your device offering to show you how to complete the task you have just done more effectively, in a quick, five-minute burst” [1]. This use of micro-learning appears to empower the learner as it maximises individual ‘device’ compatibility, decreases up-front information (akin to flash cards) and increases interaction [5].

The volume of data, and our ability (some say inability) to access it, grows exponentially every year. In 2015, IBM was quoted as stating that over 2.5 EB (1 EB = 1,000,000 TB) of data was being generated per day [4]. The volume of data, the speed of data transfers, the variety of data collected, the potential value of the information and the veracity of the information collated have led to rapid development in the field of data analytics. The addition of cloud computing has allowed smaller organisations to leverage the technology and analytical algorithms required to take advantage of this potential gold mine [3].
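To put these figures in perspective, the unit conversion quoted above can be checked with a few lines of arithmetic. This is a minimal illustrative sketch, assuming decimal (SI) units; the 2.5 EB/day figure is the one cited from IBM, and everything else follows from it:

```python
# Decimal (SI) units, as used in the chapter: 1 EB = 1,000,000 TB.
TB = 10**12  # bytes in a terabyte
EB = 10**18  # bytes in an exabyte

daily_volume_eb = 2.5  # IBM's 2015 figure: ~2.5 EB generated per day
daily_volume_tb = daily_volume_eb * EB / TB
per_second_tb = daily_volume_tb / (24 * 60 * 60)

print(f"{daily_volume_tb:,.0f} TB per day")   # 2,500,000 TB per day
print(f"{per_second_tb:,.1f} TB per second")  # roughly 28.9 TB every second
```

That is, the quoted daily volume corresponds to nearly 29 TB of new data every second, which is the scale of problem that the data-analytics and cloud-computing developments cited above are responding to.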

19.3 Trusted Autonomy Training System Map

Simplistically, we can develop a system map for a trusted autonomy training system (Fig. 19.1) that allows us to visualise the dominant drivers that are likely to change the future of trusted autonomy in training. In this case we see three key drivers: autonomous systems, training systems, and trust (between entities).

Autonomous systems refers to all those advances that allow an entity to sense, effect and choose a response from a relevant knowledge base. It has been identified previously that the ability to choose an action, the appearance of free will, engenders trust on both sides [2]. While ‘autonomous systems’ generally implies technological solutions, human solutions are equally relevant. The training system includes all of the parts necessary for training or learning to occur: the teacher and learner, the training space and the knowledge transfer required. Additionally, we include measuring change in the learner, as this is how the success of a training system is quantified.
Fig. 19.1

Trusted autonomy training system map

The final driver, trust, is probably the most significant, as it allows complete embedding of the autonomous system within the training system. Trust requires a recognised need for the effect, an acceptance of the technological solution, shared values between trustor and trustee and, finally, that all alternatives are less appealing.

19.4 Theory of Change

We now have an understanding of the baseline and trends relevant to trusted autonomy and training. The system map is a simple method that allows us to see the significant drivers that could affect the future of trusted autonomy in training. The final step before we develop a narrative on the future is to construct an appropriate theory that helps us understand the change.

Of the three drivers illustrated in Fig. 19.1, the one that appears most significant is trust. So, in simple terms, what does trust depend upon? The strength of interpersonal trust is often dependent upon both a cognitive and an affective component [8]. While trust requires a cognitive recognition of ability, benevolence, integrity and predictability, a trusted relationship also requires an affective recognition of shared values between both entities [11]. It is this affective component of trust that appears to be a significant factor in the acceptance of autonomous systems within training environments. We can speculate, with some confidence, that trust between two entities is a function of their familiarity. That is, the more familiar you are with someone, the more likely you are to trust them. Regarding trusted autonomy, this is illustrated as a function over time in Fig. 19.2.
Fig. 19.2

Trust as a function of familiarity over time

For many centuries, humans have been taught by humans. The classic example is the master and the apprentice: the apprentice trusts the master by dint of the master’s reputation and familiarity. As we move into the 20th century, teachers became dislocated from the contextual environment within schooling systems, and it is reasonable to see trust diminish slightly (due to generation gaps, the loss of familial and communal ties, and the divorce from practical application); yet trust remained high.

As we move through the 21st century, who would you trust more, an experienced human mentor or a clever computer algorithm? This is the basis of the trust function. As our algorithms and technology improve, and even surpass the human equivalent, the form of the ‘teacher’ becomes less familiar. Over time, however, this familiarity should increase, as either generational change allows greater acceptance of what is currently unfamiliar, or the autonomous system becomes less obtrusive and more ‘natural’. While this theory may not be exact, or even correct, it feels right and allows us to explore a set of future scenarios.
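The dip-and-recovery shape of this trust function can be sketched as a toy model. Everything here is an assumption made for illustration, not a result from the chapter: the Gaussian dip in familiarity, its depth and width, and the mapping from familiarity to trust are all hypothetical choices that merely reproduce the qualitative shape described above.

```python
import math

def familiarity(t, dip_at=0.0, width=30.0):
    """Toy familiarity curve: high historically, dipping when an
    unfamiliar autonomous 'teacher' appears (around t = dip_at),
    then recovering as generations acclimatise. The Gaussian dip,
    its depth (0.6) and width are assumptions for the sketch."""
    return 1.0 - 0.6 * math.exp(-((t - dip_at) / width) ** 2)

def trust(t):
    """Model trust as a monotone function of familiarity
    (the square is an arbitrary illustrative choice)."""
    return familiarity(t) ** 2

# Years relative to the (hypothetical) introduction of autonomous teachers.
for year in (-60, -30, 0, 30, 60):
    print(f"t={year:+d}: trust={trust(year):.2f}")
```

Running the sketch shows trust high in the human-teacher past, dipping at the moment the unfamiliar autonomous teacher arrives, and recovering afterwards, which is the qualitative story the theory of change tells.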

19.5 Narratives

19.5.1 The Failed Promise

Fiona felt frustrated. This was meant to be the age of enlightenment. Finally we had come to understand that the differences between learners required a learner-centric approach. Rapid advancement in teaching methodologies followed. Unfortunately, the variety of pedagogical responses required an increased number of teachers. The rapid increase in demand for teachers, coupled with low remuneration and weak state support, created a significant gap in capability. Fortunately, it seemed at the time, science had an answer.

The teacher was a deep-learning, online bot designed to provide the best information available. However, there was no doubt about the non-natural system nor its origin. Everyone sat in front of curved screens within airy classrooms. While, technically, learning could take place at home - the information was cloud-based and really it was just a stack of circuits - the training systems were very expensive and clunky. No school wanted to risk de-linking the training system from the traditional training space. You could almost say it was a matter of state control.

That said, despite the obvious artificiality of the training artifact, the information was presented well. It took into account whether you were a visual, aural or kinesthetic learner and adapted the delivery of material appropriately. Though, admittedly, there was little these systems could do for kinesthetic learners. The frustration, however, came from the requirement to learn a new training system. Fiona felt like she had used a different system every year. Sure, in the old days, the human teachers rotated like a carousel, but at least they looked the same and the teaching was the same. Why couldn’t we just get more teachers?

19.5.2 Fake It Until You Break It

Alex was frustrated. The rapid increase in autonomous systems had led to a noisy revolution in the workplace. All of a sudden, those simple menial tasks at the bottom of the work food chain were being completed by bots. Robot waiters, robot cleaners and automated financial advisors were ubiquitous throughout the service industries. What initially felt like a boon quickly became a social nightmare. The problems with autonomous systems taking all of the low-skilled jobs were manifold. Firstly, it created a large unemployed and disenchanted sub-culture who were, apparently, incapable of upskilling into the positions that were still available. And who was to say those jobs wouldn’t also quickly disappear? Alex’s younger brother was in this group. This time, though, he was not the source of this particular frustration.

No. Alex was a victim of the middle-manager curse. Normally, when you started with an organisation, it was customary to rotate through the low-skill jobs. There was little expectation that you would remain the mailman, or spend the rest of your career developing simple algorithms to solve basic problems. There was an expectation that exposure to these jobs, particularly many of them, gave you a detailed understanding of the inner workings of the organisation. The benefits of this understanding, whilst initially painful to acquire, became obvious when you moved into executive roles. You were able to intuitively understand how decisions would impact the organisation and how changes in the environment could present opportunities or uncertainties. Alex didn’t benefit from that exposure and was rapidly becoming undone by a number of poor decisions. How could that experience be gained now that the robots had taken away the opportunities in the name of efficiency?

19.5.3 To Infinity, and Beyond!

Ari was excited. Super excited. Of all the possible employment opportunities available, building the space bridge between the Home Solar System and Kepler-442 in the Lyra constellation was not only the most exciting, it was also the most ambitious. Settlers had been migrating to Kepler-442b (now affectionately named Susanna after Johannes Kepler’s first living daughter) for close to fifty years; however, the support mechanisms were too slow and, quite frankly, relied on luck. The space bridge was intended to set up way stations, similar to the original postal services on Earth, until the J.T. Kirk Project finally delivered a sustainable FTL capability (if that was even possible).

Ari had never been to space before. That was OK. Ari had no experience in structural engineering in a zero G environment. That was OK. Ari was taking his lifetime mentor with him and all of the training would be on-the-job. Ari learnt best through experience. He hated being stuck in a room trying to memorise abstract concepts or scrolling through historical exemplars. Fortunately, his mentor knew this and was built to exploit Ari’s strengths. His mentor was an autonomous training system embedded within Ari at birth. This system grew as he grew and learnt its place in the world as it developed its relationship with Ari.

They knew each other intimately. Ari’s mentor - called Jaws in an archaic throwback - understood how Ari learnt and had access to the world’s knowledge. Knowledge was delivered through augmented visual (bionic eyes), aural and tactile cues. This created a formidable team. Ari brought quick intuitive creativity with a flexible ability to physically affect the environment. Jaws could trust Ari to complete the mission. Similarly, Ari trusted Jaws to deliver the right mentoring at the right time. Theirs was a symbiotic relationship that was replicated throughout society. It is no wonder that these symbionts were able to quickly breach the solar system and extend to the stars.


  1. What will training look like in the workplace of the future. Accessed 05 August 2016
  2. H.A. Abbass, E. Petraki, K. Merrick, J. Harvey, M. Barlow, Trusted autonomy and cognitive cyber symbiosis: open challenges. Cognitive Computation 8(3), 385–408 (2016)
  3. M.D. Assunção, R.N. Calheiros, S. Bianchi, M.A.S. Netto, R. Buyya, Big data computing and clouds: trends and future directions. Journal of Parallel and Distributed Computing 79, 3–15 (2015)
  4. G. Bello-Orgaz, J.J. Jung, D. Camacho, Social big data: recent achievements and new challenges. Information Fusion 28, 45–59 (2016)
  5. L. Callisen, Micro learning: the future of training in the workplace. eLearning Industry, 2016. Accessed 05 August 2016
  6. G. Christensen, A. Steinmetz, B. Alcorn, A. Bennett, D. Woods, E.J. Emanuel, The MOOC phenomenon: who takes massive open online courses and why? Working paper, available at SSRN 2350964 (2013)
  7. Kangan Institute, The future of corporate training
  8. D.J. McAllister, Affect- and cognition-based trust as foundations for interpersonal cooperation in organizations. Academy of Management Journal 38(1), 24–59 (1995)
  9. S.T.M. Peek, E.J.M. Wouters, J. van Hoof, K.G. Luijkx, H.R. Boeije, H.J.M. Vrijhoef, Factors influencing acceptance of technology for aging in place: a systematic review. International Journal of Medical Informatics 83(4), 235–248 (2014)
  10. L. Samarakoon, T. Fernando, C. Rodrigo, S. Rajapakse, Learning styles and approaches to learning among medical undergraduates and postgraduates. BMC Medical Education 13(1), 1 (2013)
  11. J.-J. Wu, Y.-H. Chen, Y.-S. Chung, Trust factors influencing virtual community members: a study of transaction communities. Journal of Business Research 63(9), 1025–1032 (2010)
  12. B. Yonk, State of training and development: the future of learning looks bright. The Next Level, Education (LinkedIn Pulse, 2015)

Copyright information

© The Author(s) 2018

Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  1. Department of Defence, War Research Centre, Canberra, Australia