
Sports Medicine, Volume 45, Issue 1, pp 1–7

The Illusion of Competency Versus the Desirability of Expertise: Seeking a Common Standard for Support Professions in Sport

  • Dave Collins
  • Veronica Burke
  • Amanda Martindale
  • Andrew Cruickshank
Current Opinion

Abstract

In this paper we examine and challenge the competency-based models which currently dominate accreditation and development systems in sport support disciplines, largely the sciences and coaching. Through consideration of exemplar shortcomings, the limitations of competency-based systems are presented as failing to cater for the complexity of decision making and the need for proactive experimentation essential to effective practice. To provide a better fit with the challenges of the various disciplines in their work with performers, an alternative approach is presented which focuses on the promotion, evaluation and elaboration of expertise. Such an approach resonates with important characteristics of professions, whilst also providing for the essential ‘shades of grey’ inherent in work with human participants. Key differences between the approaches are considered through exemplars of evaluation processes. The expertise-focused method, although inherently more complex, is seen as offering a less ambiguous and more positive route, both through more accurate representation of essential professional competence and through facilitation of future growth in proficiency and evolution of expertise in practice. Examples from the literature are also presented, offering further support for the practicalities of this approach.

Key Points

The paper examines limitations in the commonly applied competency method of evaluation for support professions and promotes an alternative, expertise-focused approach.

The expertise approach goes beyond the use of competency-based systems, and even the definitions of competence provided in this paper, to evaluate and facilitate capacities for more elaborative and adaptive thinking, judgment and growth.

Bodies responsible for professional development and evaluation need to lead a long overdue, widespread shift from competency-driven to expert practice across the spectrum of science and coaching in sport, reflecting the situation already common in medicine.

1 Introduction

As support professions in sport science and medicine evolve, two distinct lines of accreditation and consequent development have emerged. The first, built around evaluation against a prescribed list of competencies, has become a standard feature of many accreditation pathways in the sciences [1] and the core support role of coaching [2]. While the demonstration of such competencies is undeniably necessary and important, however, higher-level proficiency, or the development of professional expertise, requires more than these inherently limited prescriptions [3]. Perhaps as a consequence, a second and more expertise-based system of training and accreditation has developed, led largely by the medical professions. Somewhat confusingly, this approach is often referred to as the evaluation of competence; we examine the essential differences between these two apparently synonymous terms in Sect. 2 of this paper. Expressly, however, and extending beyond the ‘general response to general challenge’ patterning of competency-based models, this expertise/competence focus is grounded in the assumption that a multiplicity of solutions often exists for a particular problem and that optimum solutions often require specific or even idiosyncratic blends [4, 5, 6]. Given that professional bodies must develop practitioners for complex and multifaceted environments, we argue that an elevation in the standards and reputation of sport science and coaching, as well as the efficacy of their interactions, requires a greater emphasis on expertise than is currently afforded.

Indeed, while the acquisition of specific competencies may be a valuable building block for initial development (e.g. providing the basic tools of the trade), discrepancies across professions with regard to their competency or expertise/competence orientation during final accreditation/continued assessment phases also pose particular issues for multi- and interdisciplinary support provision. More explicitly, the use of these different approaches is, we suggest, illogical, suboptimal and perhaps even divisive. To clarify, first, it is strange for parallel professions, working in the same domain and in increasingly close interdisciplinary harmony, to be trained and evaluated in such contrasting ways; consider, for example, medics being evaluated by expertise/competence, including the appropriate weighting of factors to meet specific but diverse challenges, as opposed to coaches, who are usually evaluated on behavioural competency alone [5, 6, 7]. Second, when two almost opposite styles coexist, one must be suboptimal relative to the other. Third, we see the contrast as divisive (at least potentially) in that the two approaches make clearly contrasting statements about the nature of professionalism and the ways in which the professions should work: in simple terms, practice is grounded either in judgment and decision making or in the reproduction of (often prescribed) behaviour. Accordingly, it seems that reconsideration of this matter is overdue.

In undertaking this reconsideration, we suggest that competency-based approaches are not only inherently limited but also unsuitable for facilitating high-level proficiency in the sports science, medicine and coaching professions. This contention is not new. In sport psychology, for example, and despite ongoing support for competency-based approaches to training and continued professional development [8], it has been acknowledged that learning from the ‘recipe-like’ experiences of expert practitioners (i.e. what they did) is limited unless considered in tandem with why they did it [9]. In similar fashion, Jones and Wallace [10] have highlighted how the ambiguities inherent in coaching require a much broader adaptive expertise [11] if one is to deal effectively with the role’s regular challenges. In strength and conditioning, an increased recognition of the need for individualised [12] and evidence-based [13] prescription is also reflective of this thrust. Unfortunately, despite this growing awareness, the positive examples set by medical disciplines (e.g. Girot [4] and the General Medical Council [6]), and even explicit and detailed coverage of what competence assessment should look like (see Kaslow et al. [14]; developed in psychology but, so far in our experience, not followed by sport psychology organisations), competency models nonetheless remain an industry standard.

Our case for expertise-based (rather than competency-based) approaches to supporting and guiding sports disciplines along pathways to expert performance is made in four parts. First, we offer some clarification of the various terms which serve to obfuscate debate. Second, we consider some limitations of competency-based models. Third, we examine some exemplars of how expertise-based models can work to comparatively greater effect. Finally, we conclude by suggesting some simple steps for action, together with a call for this issue to be placed at the forefront of organisational debate over professional accreditation and development systems.

2 Competency, Competence, Expertise and Professionalism

While the competency approach retains popularity across many interpersonal settings, the inherent difficulty of defining it, as either a specific or a generic term, is illustrated by the tautological definition of Dooley et al. [15]: “competency based behavioral anchors are defined as performance capabilities needed to demonstrate knowledge, skill and ability (competency) acquisition”. According to this view, and problematically, competency is therefore a subdivision of itself. Unsurprisingly, competency has been described as a “fuzzy concept” [16], and the few attempts to establish a coherent terminology appear to have had little impact [17]. As such, typical competencies such as “arrives before the start of each session in order to plan and prepare appropriately” offer apparent clarity but leave much unanswered (e.g. what needs to be planned and what is appropriate?).

In contrast to competency, competence is more positively defined by Epstein and Hundert [18] (p. 227) (in relation to medical physicians) as “the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and community being served”. Crucial for our argument, these authors, Kaslow et al. [14] and Schön [19] see professional competence as more than the acquisition and application of knowledge to simple problems. Rather, “it is defined by the ability to solve ambiguous problems, tolerate uncertainty, and make decisions with limited information” [18] (p. 227). This definition, we suggest, resonates more closely with the type of problem likely to be met by professionals across the performance sport environment. Additionally, and although Epstein and Hundert [18] (p. 227) still class the “demonstration of [more than] isolated competencies” as a “competence”, it also fits within the construct of expertise, which has been defined in terms of: (a) cognitive development (progression from superficial and literal understanding to articulated, conceptual and principled understanding); (b) knowledge structure (more sophisticated knowledge organisation, and more elaborate mental models); and (c) reasoning processes (enhanced perceptual skill, more case-based reasoning and greater reasoning flexibility) [20]. Finally, it also matches Carr’s fifth distinguishing characteristic of a profession; namely, that which requires “a high degree of individual autonomy—independence of judgment—for professional practice” [21] (p. 34).

In summarising this overview of definitions, we would highlight four issues which stand out as requirements for professional practice additional to subject knowledge; namely, judgment, elaboration, flexibility and decision making. We return to these factors later; for now, they should be borne in mind as criteria against which any standard of professional practice may be measured. The key differences between competence and competency are, we hope, now demonstrably a lot more than mere semantics.

3 Competency-Based Problems

3.1 Apparent Comprehensiveness Masks Over-Simplification

As a core feature of competency-based models, the number of statements that comprise a particular ability suggests a careful and, at first sight, creditable attention to detail from those who oversee professional training and evaluation. On more careful consideration, however, this must be questioned. For example, at the time of writing, the British General Medical Council set 16 “outcomes” that must be realised in the 5,500 h of training required for doctors [22]. Acknowledging that single, correct solutions can rarely be prescribed—as practitioners cope with uncertainty and dynamic challenge in complex and individual ways [4]—training and evaluation in this setting is, therefore, inherently thematic [5, 6]. In contrast, qualification as a UK Level 1 sports coach (capable only of assisting other coaches) requires the development and assessment of some 18 “competency units”, each with numerous subdivisions, in a 33-h period [7]. Allowing for the various sub-components, and diverging from medicine’s use of broader criteria to enable adaptive and creative problem solvers, those who aspire to this coaching award (clearly of a much lower level than medical training) must therefore satisfy a set of 123 learning criteria! Despite the complexity faced, such lists of standalone abilities and activities are also found in many other support discipline qualifications [1, 23, 24].

To emphasise our point, addressing such an extensive range of attributes is both practically impossible and epistemologically questionable, in that practitioners are being trained and assessed in a way that is at odds with their operational environment. Thus, competency-based models provide an apparently comprehensive yet ultimately deceptive portrayal of practice requirements. The completeness of the competency-based descriptor is clearly compromised by the volume of items covered, making it virtually impossible to address all facets. As a consequence, examiners must opt to focus more on some criteria than others and, paradoxically, thereby defy the logic on which the competency approach is founded. By contrast, in an expertise-based approach, the differential weighting of factors (some are clearly more important than others, and this differential co-varies with time) is made explicit and overtly situated as a part of the evaluation, if only because fewer factors are considered but in more comprehensive detail [14] (also see our worked example in Sect. 4.2). This approach, we suggest, is much closer to the real-world challenges inherent in interpersonal tasks.

3.2 The Problems of Relevance, Balance and Complexity

Contrary to optimally impactful real-world practice, the key competencies currently espoused by many professions can be viewed as context independent, generic and apparently applicable across different settings, occupations and tasks [1, 23, 24]. If, as Bolden and Gosling [25] suggest, competencies are derived from practical job analyses, then they are primarily functional, simplistic and possess little applicability to the development and training of professionals. To an extent, therefore, job competencies are limited to the expression of what is measurable, tangible and technical.

As such, one critical issue is relevance; in short, there is a fundamental lack of fit between the basic premise of the competency approach and its practical applicability to interpersonal settings in general, and even less to specific sport environments. For example, how does the notion of competency relate to the moral, emotional and relational dimensions of client/patient/performer interaction? The problem, we argue, resides with the competency approach’s preoccupation with a set of job performance measures which (presumably) represent the desired standard across environments. Thus, even when a Likert scale is employed for measurement, the use of competencies implies that there is a right and wrong way to perform; obviously a situation that is sometimes correct but usually not so in the more complex challenges which typify the interpersonal elements of coaching and science support [26, 27]. In effect, the emphasis on whether or not an individual is competent patently neglects the essential subtleties of executional decision making, and emphasis on the ‘what’ instead of the ‘why’ represents satisfaction of a minimum rather than the far more desirable expert standard.

Additionally, competencies are commonly concerned with an extremely broad but undifferentiated range of skills. In the sport-support profession of psychology, for example, the application of ethical principles, conducting research, delivering presentations and (of greatest relevance) planning consultancy are all presented as equally weighted competencies [23]. Furthermore, as these ‘whats’ are often presented as equivalent, both in importance and complexity, the practitioner’s ability to monitor fundamental client/patient/performer safety or comfort is presented with the same weighting as his/her ability to form effective relationships, discern and design optimum actions/interventions for each situation, or even make long-term, interdisciplinary plans with a broad range of support staff [26]. Such issues exemplify the challenges of balance that are left unaddressed by the competency approach.

Finally, competency frameworks are also somewhat limited in their sensitivity to and management of complexity. In coaching, for instance, it could be argued that key activities such as safety checks and basic planning fit well with competency criteria. When applied to a more esoteric and crucial responsibility, however, this framework is far less pertinent. For example, when managing change in high-level coaching the landscape is characterised by a level of uncertainty, unpredictability and discretion that runs counter to the essence of the competency model (i.e. to separate and silo work roles rather than to represent them holistically). Arguably, the notion of competency represents only a fraction of the complexity. On this premise, the acceptance of competencies as a basis for evaluating complex performance seems particularly problematic and misplaced [28].

3.3 Inherently Limited Applications for Optimising Performance

Despite the prevalence of competency frameworks across a host of domains, Mintzberg [29] has observed that “acquiring various competencies does not necessarily make an individual competent”. Indeed, simply exhibiting a competency in the test environment, or meeting a baseline requirement, does not guarantee that the competency will be used appropriately in other settings; nor does the absence of a competency in a test make one incompetent, unless the reasons for its omission are considered. Since the measurement-driven approach also fails to consider the appropriateness of using a particular behaviour in a particular context, such data are unlikely to provide an accurate picture of a professional’s performance, or to facilitate optimally critical and informative feedback. For instance, the overuse of a normally beneficial competency can become a weakness in certain circumstances, as studies on organisational derailment have demonstrated [30, 31]. This is acknowledged in some competency frameworks, although such acknowledgements would seem to reflect a move towards the more reason-focused expertise approach described in Sect. 4 of this paper.

Furthermore, and problematically, the assessment-driven nature of competency-based performance measures clearly undermines their applicability for formative purposes [14]. For example, if individuals feel that they are being assessed, this can significantly impair the criticality and openness required for a developmental process to work. In addition, the ‘experimenter mentality’ [32] requires a tolerance for the drop in performance that often results from engaging in development-focused activities. In simple terms, competencies are commonly too gross to account for the important nuances, or shades of grey, that are often the subtle tipping points between success and failure in high-level sport [33, 34].

4 Advantages and Exemplars of Expertise-Based Solutions

4.1 What Does it Take to Get Better? Pursuing a Developmental Focus

Perhaps if competency frameworks were used to suggest what individuals ‘could do’, rather than what they ‘should do’ (i.e. proficiency scaling), this would offer a productive way forward. In this manner, switching the focus toward exploring the factors affecting progression, including the ability to learn, reflect and adapt [35], would facilitate the evolution of new variants and mental models of professional service delivery [36]. Focused on individual and organisational needs, competencies could then be deployed as hypothesis-generating (rather than hypothesis-testing) tools to drive development-oriented (rather than assessment-oriented) conversations [37]. This fits well with the view of practitioner as experimenter [32] and would lead to even greater benefit from the skills of reflective practice which, perhaps inappropriately, currently coexist with competency models: culminating in a problematic mix of shades of grey with black and white! Unfortunately, while professional adaptability and judgment require such an experimental approach, this is not an inherent feature of competency evaluations. In short, the ‘it depends on the context’ outcomes of carefully considered critical reflection are often inherently at odds with the ‘do it this way to pass’ specificity of competency assessments; at least, as they are currently employed in many sports settings.

In fact, the potential to focus on features of effective performance evolution (evaluating ongoing growth rather than just current competence) is already well-established in sport, with the characteristics of both the developing individual [38, 39] and the optimum development environment [40, 41] having been established, applied and successfully exploited. In the support practitioner domain, it is interesting to see that coaches think similar features apply to their own profession; indeed, an orientation to which they might aspire [42]. As such, the ‘skills to become more expert’ are already apparent and tacitly accepted, offering an important potential for growth [42].

4.2 An Exemplar of the Expertise Approach: A Focus on Decision Making

As eloquently stated by Smith and colleagues [43] (p. 4), “academic research generally and our society particularly have largely neglected the fact that sound judgment and decision making are the crux of many professions. By understanding and communicating what professional decision makers do and how they do it well, we make valuable contributions both to our field and to the professional community at large”. Of course, a much wider range of foci would be used in an expertise approach; for example, the development of more self-driven, autonomous approaches to development. For the present, however, we outline the understanding and development of declarative reasoning as an exemplar focus that may effectively address our identified four-part curriculum of judgment, elaboration, flexibility and decision making. It is in this vein that we see the scenario-based training and formative testing of expertise in support professionals as offering an opportunity for facilitating expert learning; enabling practitioners to form more complete mental models of practice; providing a “cognitive apprenticeship” model which makes thinking “visible” to peers and supervisees [44]; and establishing “cognitive authenticity” [45]. Significantly, fewer factors are considered, but in far more detail, with the underpinning rationale of decisions and choices explicitly explored. With regard to the weighting issues highlighted earlier, only key factors are considered, whilst other, less important aspects are examined only if they impact on these core issues.

Given that time on the job alone is insufficient for developing expertise [46], teaching the structures of ‘ideal’ thinking [47], rather than ideal solutions, holds great promise for professional training and evaluation. Once again, there is already a good start in this direction; for example, the recent work by Kahneman and Klein [48] on the blending of systematic analysis and skilled intuition. Teaching and assessing the skills of professional judgment also offers a structure to the more widespread (although often suboptimally applied) ideas of Schön [32] on critical thinking. Significantly, this approach offers a means to enhance aspects of expertise that seemingly play no role in the existing evaluative structures of competency. Moreover, the existence of a strong literature base [49] means that application of expertise would be more strongly grounded than the competency-based models that represent the pillars of effective practice (at least as it is currently defined).

As a means of briefly demonstrating the differences underpinning the two approaches, take the evaluation process to be followed with a coach under a competency or an expertise focus. In the former case, the evaluator would look for behavioural or verbal examples of satisfying the criteria; typical examples would include “identify the types of information needed to plan an activity within sessions” or, from a higher-level award, “explain how to structure language during instruction that is appropriate to participants”. In contrast, an expertise-focused evaluation would consider the processes and meta-processes associated with these target behaviours. In exploring the why of a behaviour (i.e. the reasoning underpinning its selection and use), candidates would also be asked about alternatives; namely, what other options were considered, why they were rejected, and what would need to change for a different option to be taken [50]. Through this focus on decision making, training routes would therefore help to develop the aforementioned thinking structures, adaptability and critical analysis that allow practitioners to prosper in their dynamic, complex and (eventually) unsupervised applied environments.

Furthermore, specific reference to underpinning principles (e.g. which jargon terms were and were not essential and why, or the need for declarative knowledge in certain kinds of learner) would be required, further increasing the candidate’s ability to make judgments and adapt in situations different from that presented as the test environment [4, 27]. Such approaches would seem essential if trainees are to go beyond clear knowledge that X means Y towards the more subtle blending and elaboration necessary for professional practice [51]. As another example, we would highlight the use of validated measures of reflective thinking, once again using a range of simulations, which are highly predictive of effective clinical thinking and decision making later in training [52].
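
To make the contrast above concrete in a different medium, the sketch below (ours alone, written in Python purely for illustration; the class names, criteria and probe content are hypothetical and taken from no actual accreditation scheme) captures the difference in what each evaluation style records: the competency item reduces to a binary observation, whereas the expertise probe documents rationale, alternatives and the conditions under which a different choice would be preferable.

```python
# A minimal, hypothetical sketch of the two evaluation styles discussed
# above; neither structure is taken from any real accreditation scheme.

from dataclasses import dataclass, field
from typing import List


@dataclass
class CompetencyItem:
    """Competency focus: tick the box if the prescribed behaviour appears."""
    criterion: str
    observed: bool = False


@dataclass
class ExpertiseProbe:
    """Expertise focus: explore the reasoning behind one target behaviour."""
    target_behaviour: str
    rationale: str                          # why this option was chosen
    alternatives_considered: List[str] = field(default_factory=list)
    rejection_reasons: List[str] = field(default_factory=list)
    conditions_for_change: str = ""         # what would favour another option


# Competency-style evaluation: many items, each answered yes/no.
checklist = [
    CompetencyItem("Identifies the information needed to plan an activity"),
    CompetencyItem("Structures instructional language appropriately"),
]

# Expertise-style evaluation: fewer items, each explored in depth.
probe = ExpertiseProbe(
    target_behaviour="Chose demonstration over verbal instruction",
    rationale="Novice group; a visual model reduces reliance on jargon",
    alternatives_considered=["full verbal walkthrough", "guided discovery"],
    rejection_reasons=["jargon-heavy for novices", "too slow for the session"],
    conditions_for_change="Experienced athletes sharing a technical vocabulary",
)
```

Note that the competency structure can only tell an evaluator whether a behaviour appeared, while the expertise structure preserves exactly the ‘what would need to change for a different option to be taken’ reasoning described above.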

5 Conclusion

In concluding this brief overview, we should stress that not all practitioners who utilise a competency-based approach are guilty of the problems identified in this paper. As with so many prescribed methods, these approaches are sometimes, as we have observed, used solely for guidance, with the assessment process encouraging broader evaluative debate so as to offer formative direction to the candidate. It is interesting, however, that such a reflective coaching approach has sometimes been criticised as ‘going beyond’ the process. Accordingly, in support of more effective professional practice and skilful practitioners across the board, we would hope that an expertise-based approach would be encouraged as a core modus operandi rather than an infrequent and unregulated extra.

Indeed, our message is that competency approaches are simply too simplistic for all but the most basic of the roles and responsibilities apparent in the sports world. As an alternative, the expertise approach seems to fit better with the characteristics of professionalism, even going beyond the definition of competence (as distinguished throughout this paper from competency) to evaluate and facilitate capacities for more elaborative and adaptive thinking, judgment and growth. Of course, this approach is inherently more complex (matching the situations it is designed to test for), but its complexities are both lower in difficulty and higher in reward than those of staying with the existing, albeit well-established, system of competency-based evaluation.

Finally, we should stress that the differences between competence and competency evaluations are far from simple semantics. The first has a well-grounded and theoretically consistent basis while the second seems to have emerged from administration-heavy assessment systems (see, for example, the criteria applied by the UK Coaching Certificate, BASES sport science accreditation, or SESNZ sport science accreditation) [1, 2, 24], with little or no theoretical or empirical support.

Accordingly, we hope that this paper has presented a strong case for change. From a sports perspective, expertise and professional judgment and decision making have already been well-examined in sport psychology [53, 54], coaching [26], and strength and conditioning [55], and therefore provide a strong base from which these approaches can be exploited. There are also, notably, training and evaluation methods already available in the public domain [27]. As a consequence, we hope that bodies responsible for professional development and evaluation recognise and harness this evidence base and lead the long-overdue, widespread shift from competency-driven to expert practice across the spectrum of science, medicine and coaching in sport.

Acknowledgments

Preparation of this manuscript was not funded and we see no author conflicts of interest. We gratefully acknowledge the contributions of several colleagues, particularly Andy Abraham and Áine MacNamara, in the evolution of these ideas.

References

  1. British Association of Sport and Exercise Sciences. Supervised experience competency profile. 2013. http://www.bases.org.uk/SE-Application-Documents-and-Guidelines. Accessed 11 Nov 2013.
  2. UKCC level 1 guide. 2013. http://www.sportscoachuk.org/sites/default/files/UKCC-Level-Guide.pdf. Accessed 12 Nov 2013.
  3. Hoffman RR, Andrews DH, Feltovich PJ. What is “accelerated learning”? Cogn Technol. 2012;17(1):7–10.
  4. Girot EA. Graduate nurses: critical thinkers or better decision makers? J Adv Nurs. 2000;31:288–97.
  5. van der Vleuten C, Schuwirth L. Assessing professional competence: from methods to programmes. Med Educ. 2005;39:309–17.
  6. General Medical Council. Tomorrow’s doctors. 2009. http://www.gmc-uk.org/static/documents/content/Tomorrow_s_Doctors_0414.pdf. Accessed 22 Oct 2013.
  7. Sports Coach UK. Level 1 generic mapping template. Internal planning document applied to UK Coaching awards. 2011.
  8. Fletcher D, Maher J. Toward a competency-based understanding of the training and development of applied sport psychologists. Sport Exerc Perform Psychol. 2013;2:265–80.
  9. Martindale A, Collins D. But why does what works work? A response to Fifer, Henschen, Gould, and Ravizza. Sport Psychol. 2010;24:113–6.
  10. Jones RL, Wallace M. Another bad day at the training ground: coping with ambiguity in the coaching context. Sport Educ Soc. 2005;10(1):119–34.
  11. Hatano G, Inagaki K. Two courses of expertise. In: Stevenson H, Azuma H, Hakuta K, editors. Child development and education in Japan. New York: WH Freeman; 1986. p. 262–72.
  12. Kiely J. Planning for physical performance: the individual perspective. In: Collins D, Button A, Richards H, editors. Performance psychology: a practitioner’s guide. Oxford: Elsevier; 2011. p. 139–60.
  13. English K, Amonette W, Graham M, et al. What is “evidence-based” strength and conditioning? Strength Cond J. 2012;34(3):19–24.
  14. Kaslow NJ, Bebeau MJ, Lichtenberg JW, et al. Guiding principles and recommendations for the assessment of competence. Prof Psychol Res Pr. 2007;38(5):441–51.
  15. Dooley KE, Lindner JR, Dooley LM, et al. Behaviorally anchored competencies: evaluation tool for training via distance. Hum Resour Dev Int. 2004;7(3):315–32.
  16. Boon K, van der Klink M. Competencies: the triumph of a fuzzy concept. Academy of Human Resource Development Annual Conference; 27 Feb–3 Mar 2002; Honolulu. Proceedings 2002;1:327–34.
  17. Winterton J, Delamare-Le Deist F, Stringfellow E. Typology of knowledge, skills and competencies: clarification of the concept and prototype. Research report for Cedefop (European Centre for the Development of Vocational Training). Cedefop Reference series; 64. Luxembourg: Office for Official Publications of the European Communities; 2006.
  18. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287(2):226–35.
  19. Schön DA. The reflective practitioner: how professionals think in action. New York: Basic Books; 1983.
  20. Hoffman RR. How can expertise be defined? Implications of research from cognitive psychology. In: Williams R, Faulkner W, Fleck J, editors. Exploring expertise. New York: Macmillan; 1998. p. 81–100.
  21. Carr D. Professional education and professional ethics. J Appl Philos. 1999;16(1):33–46.
  22. General Medical Council. Assessment in undergraduate medical education: advice supplementary to Tomorrow’s Doctors (2009). 2011. http://www.gmc-uk.org/Assessment_in_undergraduate_web.pdf_38514111.pdf. Accessed 22 Oct 2013.
  23. British Psychological Society. Key role competency checklist grid. 2013. http://www.bps.org.uk/careers-education-training/society-qualifications/sport-exercise-psychology/sport-exercise-psycholog. Accessed 22 Oct 2013.
  24. Sport and Exercise Science New Zealand. SESNZ accreditation criteria and competencies. 2014. http://www.sesnz.org.nz/Accreditation/. Accessed 29 Apr 2014.
  25. Bolden R, Gosling J. Leadership competencies: time to change the tune. Leadership. 2006;2:147–63.
  26. Abraham A, Collins D. Taking the next step: ways forward for coaching science. Quest. 2011;63:366–84.
  27. Martindale A, Collins D. The development of professional judgment and decision making expertise in applied sport psychology. Sport Psychol. 2013;27:390–8.
  28. Carroll B, Levy L, Richmond D. Leadership as practice: challenging the competency paradigm. Leadership. 2008;4:363–78.
  29. Mintzberg H. Managers not MBAs: a hard look at the soft practice of managing and management development. London: FT Prentice-Hall; 2004.
  30. Stein M. When does narcissistic leadership become problematic? Dick Fuld at Lehman Brothers. J Manag Inq. 2013;22:282–93.
  31. Cooper D. Leadership risk: a guide for private equity and strategic investors. Hoboken: Wiley; 2010.
  32. Schön D. Educating the reflective practitioner. San Francisco: Jossey-Bass; 1987.
  33. Collins D, Trower J, Cruickshank A. Coaching high performance athletes and the high performance team. In: De Bosscher V, Sotiriadou P, editors. Managing high performance sport. Abingdon: Routledge; 2012. p. 205–20.
  34. Collins D, Cruickshank A. Preparing Team GB for London 2012. In: Girginov V, editor. Handbook of the 2012 London Olympic and Paralympic Games. London: Routledge; 2012. p. 114–29.
  35. Knowles Z, Gilbourne D, Cropley B, et al. Reflective practice in the sport and exercise sciences: contemporary issues. Abingdon: Routledge; 2013.
  36. Nordhaug O. Human capital in organizations. Oslo: Scandinavian University Press; 1993.
  37. Alimo-Metcalfe B, Alban-Metcalfe J. Leadership in public sector organizations. In: Storey J, editor. Leadership in organizations: current issues and key trends. Milton Park: Routledge; 2004. p. 173–202.
  38. MacNamara Á, Button A, Collins D. The role of psychological characteristics in facilitating the pathway to elite performance. Part 1: identifying mental skills and behaviours. Sport Psychol. 2010;24:52–73.
  39. MacNamara Á, Button A, Collins D. The role of psychological characteristics in facilitating the pathway to elite performance. Part 2: examining environmental and stage-related differences in skills and behaviours. Sport Psychol. 2010;24:74–96.
  40. Martindale RJJ, Collins D, Abraham A. Effective talent development: the elite coach perspective within UK sport. J Appl Sport Psychol. 2007;19(2):187–206.
  41. Martindale RJJ, Collins D, Douglas C, et al. Examining the ecological validity of the talent development environment questionnaire. J Sports Sci. 2013;31(1):41–7. doi:10.1080/02640414.2012.718443.
  42. Stoszkowski J, Collins D. Communities of practice, social learning and networks: exploiting the social side of coach development. Sport Educ Soc. 2014;19(6):773–88.
  43. Smith K, Shanteau J, Johnson P. Psychological investigations of competence in decision making. Cambridge: Cambridge University Press; 2004.
  44. Collins A, Brown JS, Holum A. Cognitive apprenticeship: making thinking visible. Am Educ. 1991;6(11):38–46.
  45. Ross KG, Pierce LG. Cognitive engineering of training for adaptive battlefield thinking. Proc Hum Factors Ergon Soc Annu Meet. 2000;44(11):410–3.
  46. Ericsson KA. Development of professional expertise. Cambridge: Cambridge University Press; 2009.
  47. Nuthall G. Learning how to learn: the evolution of students’ minds through the social processes and culture of the classroom. Int J Educ Res. 1999;31(3):139–256.
  48. Kahneman D, Klein G. Conditions for intuitive expertise: a failure to disagree. Am Psychol. 2009;64(6):515–26.
  49. Yates JF, Tschirhart MD. Decision-making expertise. In: Ericsson KA, Charness N, Hoffman RR, Feltovich PJ, editors. The Cambridge handbook of expertise and expert performance. Cambridge: Cambridge University Press; 2006. p. 421–38.
  50. Collins L, Collins D. Integration of in-action reflective practice as a component of professional judgment and decision making in high level adventure sports coaching practice. J Sports Sci. 2014. doi:10.1080/02640414.2014.953980.
  51. Kassebaum DG, Eaglen RH. Shortcomings in the evaluation of students’ clinical skills and behaviours in medical school. Acad Med. 1999;74:842–8.
  52. Brailovsky C, Charlin B, Beausoleil S, et al. Measurement of clinical reflective capacity early in training as a predictor of clinical reasoning performance at the end of residency: an experimental study on the script concordance test. Med Educ. 2001;35:430–6.
  53. Martindale A, Collins D. Professional judgment and decision making: the role of intention for impact. Sport Psychol. 2005;19(3):303–17.
  54. Martindale A, Collins D. Enhancing the evaluation of effectiveness with professional judgment and decision making. Sport Psychol. 2007;21(4):458–74.
  55. Collins D, Moody J. Role and competency for the S & C coach. In: Moody J, editor. The UKSCA handbook of strength and conditioning (in press).

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Dave Collins (1)
  • Veronica Burke (2)
  • Amanda Martindale (3)
  • Andrew Cruickshank (1)

  1. Institute of Coaching and Performance, University of Central Lancashire, Preston, UK
  2. Centre for Management Development, Cranfield University, Bedford, UK
  3. Institute of Physical Education, Sport and Health Sciences, University of Edinburgh, Edinburgh, UK
