1 Introduction

The cognitive ability to focus our attention enables us to conduct many mundane as well as more complex tasks successfully. Yet, it is difficult to sustain voluntary attention for relatively long periods without training. Cognitive neuroscience aims to understand the basic neural processes that underlie complex, higher-order cognitive operations such as sustained attention and its functional domains (e.g., Ertan et al., 2021). Sport psychology and cognitive-motor neuroscience assess variation between expert and novice performance in terms of psychology and functional neuroanatomy (Kim et al., 2008). Such studies highlight the stimulation or development of neurocognitive traits when people practice certain actions habitually. For example, the sensorimotor adjustments inherent in sports training may temper or regulate the way the brain processes information, stimulating the ability to filter out irrelevant sensory information to better perform the task at hand (Lo et al., 2019). Experienced athletes display higher-order cognitive operations by adopting advantageous strategies, making rapid decisions, and maintaining heightened situation awareness (Carrillo et al., 2011).

Material Engagement Theory argues that the use of a technology or material culture is one of the factors that shape the brain and how it thinks (Malafouris, 2019). For example, Liu et al. (2023) demonstrated how a single bout (45 min) of Chinese archery affects the performance of preadolescent children in three subdomains of the core executive functions that underlie goal-directed behaviour, namely inhibitory control (the ability to control attention, behaviour, thoughts and/or emotions), working memory, and cognitive flexibility. Magnetic resonance imaging (MRI) of the brains of badminton, tennis, and table tennis athletes demonstrates how the human brain undergoes neuroplastic adaptations caused by such visuospatial technical skill training, increasing the neural efficiency of brain regions associated with attentional-motor modulation and executive control (Yang et al., 2020). The cognitive-motor neuroscience associated with sports that involve the use of multi-part technologies or tools in spatiotemporal contexts, such as bats-and-balls or bows-and-arrows, could thus serve as a neuro-cognitive proxy for similar material engagement outside the sporting arena.

Cognitive archaeology aims to reconstruct aspects of past human cognition through the study of ancient cultural material (e.g., Coolidge & Wynn, 2016), whereas palaeo-neurology aims to reconstruct the evolution of brain morphology and sometimes the associated cognitive adaptations (e.g., Bruner, 2021). Most attempts at neuro-archaeology, wherein neurological data about human tool engagement is generated, discuss the making of tools (e.g., Stout & Hecht, 2015), instead of considering the habitual use of ancient technologies and how practicing such use may have shaped our minds. With this contribution, I explore the cognitive-motor neuroscience of modern archery to reflect on the neurocognition of a bimanual, multi-part toolset – the bow-and-arrow – as an example of how the habitual use of such ancient machines during the African Middle Stone Age may have contributed to the shaping of the sapient mind in terms of attention.

2 Modern archers and their attentive minds as a proxy for the minds of ancient bow hunters

Modern archery is referred to as a ‘mental sport’, strongly linked to motivation, concentration, anxiety management, and emotion control (Xu et al., 2023). Physically it is relatively static, involving only six body phases, i.e., stance, nock-arrow, pre-draw, full-draw, release and follow-through (Ariffin et al., 2020). The process requires the development of strength and endurance in both arms: the bow arm holds the weight of the bow, while the draw arm creates the tensile energy for firing the arrow. The draw arm requires focus to steady and aim, whilst the bow arm requires force to resist gravitational and tensile energy (Ariffin et al., 2020). Cognitively, archery is a sport of precision and focus, wherein an archer must coordinate fine-motor execution precisely with visual information processing and focussed attention (Behan & Wilson, 2008; Ertan et al., 2021).

Proficient archery requires long-term physical and mental training. Even the smallest misreading of the relationships between the archer’s body, the bow-and-arrow set, the target, and the surrounding circumstances (sound, light, movement, wind direction, etc.) will cause inaccurate shooting. Somatosensory information processing plays a key role in such meticulous, spatiotemporal, goal-directed technical engagement. Practicing archery therefore stimulates the selective, yet simultaneous, processing of both internal and external information, so that the archer’s attention becomes focussed concurrently on multiple aspects relevant to firing a successful shot – ignoring or buffering everything else (Baumeister et al., 2008).

The ability to pay attention in such a complex manner – by holding several things in mind simultaneously whilst buffering others, in the way an archer does – is key to sapient cognition. Bruner & Colom (2022) define it as the ability to maintain a selective coordination of specific cognitive processes through time, regardless of conflicting stimuli, to achieve a specific goal; its neuro-biological underpinnings probably changed throughout the course of human evolution. For example, the specialisation of Homo sapiens, compared with extinct human groups, in terms of enhanced causal reasoning, working memory and visuospatial integration (Bruner & Lozano Ruiz, 2014; Lombard & Gärdenfors, 2021; Wadley, 2013; Wynn & Coolidge, 2011), may also reflect variation in their abilities to focus attention, deal with meta-awareness, consciously control mind wandering, resist distractors, and manage emotions (Bruner & Colom, 2022).

Some have claimed that, in terms of athletic performance, there may be no aspect of cognitive psychology more important than attention, which comes in a range of manifestations, such as alert/aroused, focussed, sustained, selective, alternating and divided attention (Li et al., 2021; Pei et al., 2022). These can be grouped into exteroceptive attention, interoceptive attention, and executive control (Wang et al., 2019). Attention toward the self-status, specifically interoceptive attention (Wang et al., 2019), is a key cognitive ability in closed-skill, self-paced and far-aiming sports such as archery or target shooting (Li et al., 2021).

Similar to archery, current and past hunter-gatherer bow hunting requires the development of the necessary physical traits, precision and focus, but it has many more physical phases, carries existential implications, and may be fraught with danger. Neurocognitive studies associated with modern archery may therefore have limited ecological validity, being distanced from real-life hunting scenarios (Behan & Wilson, 2008). Neuroimaging studies, even when imagined and actual motion share the same neural substrates, also cannot capture the neurocognitive complexity of embodied action unfolding in real time (Dietrich, 2008). Yet, I suggest that the information gained through the minds of modern archers provides a middle-range application for learning about the probable baseline neurocognitive architecture necessary for Stone Age bow hunting (see Table 1 for current lines of Stone Age bow-hunting evidence before 30 thousand years ago). Here I use information gained from the minds of modern archers as bridging theory (Coolidge et al., 2016), to explore whether ancient bimanual techno-behaviours such as bow hunting may have contributed to the ability of the sapient mind to ‘pay attention’.

Table 1 Current lines of evidence for Stone Age bow hunting before 30 thousand years ago presented from oldest to youngest (ka = thousands of years ago)

3 Attention in the minds of modern archers and its neurology

Elite archers have more efficient attention networks than novices or non-archers, so that they reach the alert/aroused state more quickly (paying full attention to the situation, enabling a swift response), make better use of environmental information, and suppress interference from distractors more efficiently (Lu et al., 2021; Wang et al., 2022). Increases in attention, relaxation and parasympathetic activity may enhance archery performance, and these are skills developed through training (Li et al., 2019). Archery is a contemplative practice (Baars, 2013), with advanced archers spending thousands of hours repeating their actions without experiencing boredom. Instead, they frequently report ‘silent states’ of absorption and pleasure during practice, wherein absorption is exclusive conscious engagement with a single stream of thought. Practicing increases synaptic network efficiency so that the task requires less energy, resulting in a ‘relaxation response’ (Baars, 2013).

These observations relate to experienced archers’ ability to maintain longer quiet-eye periods, during which they organise visual attention and mentally control movement parameters (e.g., direction, force) for accurate aiming. The quiet-eye phase is sensitive to emotional interference such as anxiety. Thus, in addition to maintaining tight coordination between visual and motor attention, it is also necessary to develop the ability to self-regulate emotional states (Behan & Wilson, 2008). Gonzalez et al. (2017) showed that expert archers develop quiet-eye phases more quickly and maintain them longer under high levels of noise interference, compared to novices and non-archers. The longer quiet-eye phases represent more efficient mental programming during which accurate predictions are facilitated by attention control. The combination of inhibitory mechanisms with the control and maintenance of attention reflects higher-order cognitive control (Gonzalez et al., 2017).

Neuro-cognitive work suggests that repeated practice and improvement in aiming accuracy may result in plastic changes in brain areas associated with spatial attention (e.g., Berti et al., 2019; Seo et al., 2012). Archery practice is not limited to real-time experience, but consists of a substantial amount of inner focus, or the mental rehearsal of motor acts without much body movement. Chang et al. (2011) suggest that there is an important difference between highly trained archers and beginners in terms of the mental rehearsal phase that precedes voluntary movement. They found higher cerebellar activity in non-archers when learning archery, suggesting neural pressure during archery motor learning. Expert archers, on the other hand, show a more efficient neural network for specialised motor planning that integrates visual information with motor commands whilst using less neural energy in the cerebellum and basal ganglia (associated with motor planning) and the parietal cortex, including the precuneus (associated with motor imagery) (Chang et al., 2011).

EEG (electroencephalogram) studies show that learning a complex bimanual motor skill such as archery is associated with a leftward shift in activity over central-parietal areas, which may reflect increased cortical efficiency of task-relevant processing (Rampp et al., 2022). This was first observed by Salazar et al. (1990), who discussed it in terms of attentional processes, wherein hemispheric dominance shifts from right to left when an attention-demanding skill is learned. Interestingly, however, there also seems to be a simultaneous reduction in verbal-analytic information processing in the left hemisphere in favour of an increase in visuospatial information processing (Cooke, 2013). This may relate to observations that archers obtained better results under a condition of externally oriented attention focus, compared to an internally oriented focus of attention (Vrbik et al., 2021). Thus, whilst archery requires the development of interoceptive attention, expert archers are able to silence the ‘inner voice’ or emotional processing when focussing their attention on hitting a target. This interpretation is in line with findings that elite archers show increases in both attention and relaxation, whereas mid-level archers show increased attention but decreased relaxation, and that elite archers attain higher levels of attention at the release phase than mid-level and novice archers (Lee, 2009). An EEG study by Vogt et al. (2017) also suggests that archery practice develops attentional orienting towards initiating motor movement, accompanying central neuronal preparatory states.

Kim and colleagues (2008) performed a functional magnetic resonance imaging (fMRI) study to compare variation between expert and novice archer neural networks. They divided their analysis into a 3-second resting period and a 3-second aiming period. Here I focus on the attention-demanding aiming period only. Compared to novices, experts showed significantly higher activation of the right anterior cingulate cortex, middle occipital gyrus, left fusiform gyrus, superior temporal gyrus, and middle temporal gyrus (Fig. 1a orange areas #1–5). Compared to other regions in their own brains, experts also showed significantly higher activity in the left middle frontal gyrus, left inferior frontal gyrus, right postcentral gyrus, precuneus, lingual gyrus, right extra-nuclear/thalamus, right insula and right para-hippocampal gyrus (Fig. 1a yellow areas #6–13). Compared to experts, novices showed more activation in the superior frontal gyrus, inferior frontal gyrus, medial frontal gyrus, precuneus, middle temporal gyrus and corpus callosum (Kim et al., 2008: 239) (Fig. 1b green areas #5, 7, 9, 14–17). When the experts started aiming after resting, the middle occipital gyrus and inferior occipital gyrus were activated, whereas the frontal area was mainly activated when the novices were aiming. Kim et al. (2008) interpret this as the expert archers’ ability to focus their minds on the target by using only the necessary occipital areas, without needing to recruit various other areas of the brain, as a result of their long training and experience, suggesting an ‘expertise effect’ associated with perceptual learning. Three important areas overlap in both the expert and novice aiming phases, i.e., the precuneus, left inferior frontal gyrus and the middle temporal gyrus (Fig. 1c blue areas #5, 7, 9). These areas are activated more than other areas in expert brains, and more in novice brains than in expert brains (Kim et al., 2008), showing where the brain is pressured to perform when aiming an arrow at a distant target.

Fig. 1

Important brain regions during the aiming/attention phase of archery. a) Expert regions with significantly higher activation than in novices: [1] right anterior cingulate cortex, [2] middle occipital gyrus, [3] left fusiform gyrus, [4] superior temporal gyrus, [5] middle temporal gyrus. Expert regions with significantly higher activity than other regions in their own brains: [6] left middle frontal gyrus, [7] left inferior frontal gyrus, [8] right postcentral gyrus, [9] precuneus, [10] lingual gyrus, [11] right extra-nuclear/thalamus, [12] right insula, [13] right para-hippocampal gyrus. b) Novice regions more active than in experts: [14] superior frontal gyrus, [7] inferior frontal gyrus, [15] medial frontal gyrus, [9] precuneus, [5] middle temporal gyrus, [16] corpus callosum, [17] posterior cingulate gyrus. c) Regions with high activity in both expert and novice archers: [5] middle temporal gyrus, [7] inferior frontal gyrus, [9] precuneus, [18] cerebellum (ML created image with information from: Kim et al., 2008; Lo et al., 2019)

Another finding of the Kim et al. (2008) study was that the posterior cingulate gyrus of the limbic lobe was activated only in the novice archers (Fig. 1b #17). This region is broadly associated with emotion formation and processing, learning and memory, so the results may indicate that the experts were able to aim at the target without emotional interference, whilst the novices experienced more tension and anxiety. Similar to studies of golfers, the archery study demonstrates how the posterior cingulate gyrus and parietal precuneus are activated when novices struggle to pay selective attention and filter out unnecessary stimuli to the same degree as the experts (Kim et al., 2008). When the novice archers aimed at the target, the superior, inferior, and medial frontal gyrus areas were all activated, showing that whereas experts pay attention to the task, or adapt to it, immediately, novices need more time to suppress their emotions (for similar studies/discussions also see Kim et al., 2014; Li & Smith, 2021).

Lo et al. (2019) also found a pattern of reduced cerebellar activation accompanying higher sensory cortical activity in archers. In non-athletic control participants, by contrast, the visual network worked in concert with extensive cerebellar activation, with the cerebellum playing a supportive role for the cerebral cortex in sensory data acquisition. According to this hypothesis, the cerebellum supports the processing capabilities of the various brain regions, especially when the computational demands of sensory information require careful control of incoming data. Whilst the cerebellum (Fig. 1 purple area #18) does not seem to contribute to a particular neuropsychological function directly, it facilitates the processing efficiency of other brain regions in terms of tactile discrimination, auditory processing, spatial orientation and judgment, visuospatial functions, semantic discrimination and duration discrimination (Lo et al., 2019).

Seo et al.’s (2012) fMRI study of a visuospatial working memory task demonstrated that expert archers, compared to novices, displayed higher activation in cortical areas associated with visuospatial attention and working memory, as well as stronger task-related deactivation in cortical areas related to the default network, including the paracentral cortex/precuneus and the anterior and posterior cingulate cortex. The default network is also known as the network for introspective attention that remains connected with what is happening around us. Such deactivation may relate to a redistribution of attentional resources during cognitive tasks (Seo et al., 2012). Neurologically, the default network may be involved across a wide variety of cognitive tasks and correlates closely with working memory performance. The precuneus is known to be involved in directing attention in space, shifting attention between motor targets, and coordinating movement when attention shifts to different spatial locations; it is also active in the resting-state default network in the absence of intentional sensory-motor activity (Seo et al., 2012).

Cumulatively, Seo et al. (2012) interpret the negative correlations between the paracentral cortex (precuneus) in the default mode network and the activated brain regions as suggesting that visuospatial mental operations are enhanced in archery experts, with their internally directed cognitive activity largely detached from external stimuli. They go on to suggest that: “This ‘internal mentation model’ for DMN [default mode network] posits that the DMN has a role in constructing dynamic mental simulations based on personal past experiences, such as thinking and imagining alternative perspectives and scenarios” (Seo et al., 2012: 182). Their results thus suggest that archery practice develops a strategy demanding greater use of the neural correlates associated with visuospatial working memory and attention, and a greater use of the DMN in visuospatial working memory tasks not directly tied to the archers’ domain of expertise (Seo et al., 2012). This implies that the neuro-cognitive changes gained through engaging in a complex bimanual visuospatial task such as archery can be applied to other aspects of technical and/or cognitive engagement. Such flexibility or plasticity is a key characteristic of the sapient mind today (Lourenço & Bacci, 2017).

4 The evolutionary soft- and hardware for paying attention

Neurogenetically, adaptive evolution is associated with selection for excitatory neurons and synaptic function. Kaczanowska et al.’s (2022) atlas of neurogenetic selection associated with human cognitive evolution shows that functional networks already started to shift from motor control to attention in ancient hominoids by 26–19 million years ago, and in ancestral Homininae by 19–7.4 million years ago. Selection in functional networks for language emerged with early hominins after 7.4 million years ago, continuing with adaptive evolution in functional networks for strategic thinking from about 800 thousand years ago and throughout the split of H. sapiens from ancestral groups after about 600 thousand years ago. They argue that these observations reflect increasingly complex cognitive demands throughout human evolution, and that the co-evolutionary selection for language alongside strategic thinking may have separated archaic Denisovan and Neanderthal groups from their sapient counterparts (Kaczanowska et al., 2022).

For example, the Denisovan split from ancestral groups bi-clustered with functional networks for motor control (sensorimotor, motor-hands, and motor-feet), affective attention/introspection (salience/default mode), affective processing/impulse/emotion control (corticolimbic, prefrontal-accumbens/amygdala), active/passive attention (dorsal/ventral attention), and action planning (frontoparietal) networks (Kaczanowska et al., 2022) (Table 2). The Neanderthal split seems to be correlated with functional network selection for strategic thinking (gambling), working memory, and mathematical skill. The H. sapiens split is neuro-genetically characterised by further selection for functional networks of working memory, motor control (motor-hands), language, emotion recognition, relational processing (causal cognition), abstract thinking – and a notable emphasis on strategic thinking (gambling-reward) compared to any other group (Kaczanowska et al., 2022).

Table 2 Some variation in neuro-genetic adaptive selection for cognition amongst recent big-brained hominins, modified from Kaczanowska et al. (2022: Figs. 4D and 5C). Key: D = Denisovans; HN = Homo neanderthalensis; HS = Homo sapiens; * = attention genes associated with the precuneus; underlined entries = genes associated with neuronal plasticity; # = attention not reported for genes by Kaczanowska et al. (2022), but by other authors

An additional literature survey (Table 2) reveals attention as a cognitive trait associated with more of the genes highlighted by Kaczanowska et al. (2022). This provides the first detailed analysis of the possible overlap and/or variation between these late, big-brained humans and the evolution of our ability to pay attention. For example, of the 44 listed genes, 26 are associated with attention. None of these ‘attention’ genes are shared by all three groups, i.e., the Denisovans, Neanderthals and H. sapiens. However, we (H. sapiens) share five attention genes (i.e., ADAMTS9, ARHGEF11, CHL1, LAMB3, MKKS) with the Denisovans, and a different one (ADGRV1) with the Neanderthals. Based on our genetic makeup, it therefore seems that we have more in common with the Denisovans than with the Neanderthals in terms of the ability to pay attention. Fourteen of the attention genes associated with neurogenetic selection for human cognitive evolution are exclusive to H. sapiens (Table 2). These observations suggest different neuro-genetic pathways for developing cognition in terms of how our direct African ancestors, compared to their Eurasian contemporaries, were able to pay attention and engage with technology.
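For readers who wish to reproduce this kind of overlap tally from Table 2, the reasoning reduces to simple set comparisons between the per-group gene lists. The following minimal Python sketch is illustrative only (the script and variable names are mine, not part of any cited analysis) and includes just the attention genes named above; the remaining genes, including the 14 sapiens-exclusive ones, would be read from Table 2.

```python
# Illustrative sketch of the overlap tally behind the attention-gene comparison.
# Only the genes explicitly named in the text are included; the full lists of
# 44 genes (26 attention-related) would come from Table 2.

denisovans = {"ADAMTS9", "ARHGEF11", "CHL1", "LAMB3", "MKKS"}  # attention genes shared with H. sapiens
neanderthals = {"ADGRV1"}                                      # attention gene shared with H. sapiens
sapiens = denisovans | neanderthals                            # plus 14 exclusive genes omitted here

shared_by_all = sapiens & denisovans & neanderthals    # empty: no attention gene shared by all three groups
shared_with_denisovans = sapiens & denisovans          # the five genes above
shared_with_neanderthals = sapiens & neanderthals      # {"ADGRV1"}

print(len(shared_by_all), len(shared_with_denisovans), len(shared_with_neanderthals))  # 0 5 1
```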

Aspects of the subsequent and continued evolution of attention as a cognitive ability can also be explored through the hominin fossil record (Bruner & Colom, 2022). In this context, the parietal cortex plays a central role in the attention network, and physiological changes and differences in the parietal lobes of big-brained H. neanderthalensis and H. sapiens suggest some functional variation between the two species (Bruner & Colom, 2022). In H. sapiens, the precuneus is distinctively larger and connected with the cingulate and prefrontal cortex, forming the main nexus of the attention system. The cingulate region is a topological connection between the anterior and posterior cerebral areas and is sensitive to the proximity between prefrontal and parietal regions. Bruner and Colom (2022) suggest that because the parietal areas are involved in the integration of visual and bodily stimuli, and the cingulate region is important for the attention system, the evolution of human attention probably went through several changes over the last 600 thousand years since the split of the two populations. In the H. sapiens fossil record, the attention system seems to have become specialised sometime between 300 and 100 thousand years ago with the gradual and continued globularisation (becoming more spherical or globe-shaped) of the cranium (Fig. 2).

Fig. 2

Fossil record with schematic brain-shape representations for the globularisation (becoming more spherical or globe-shaped) of the Homo sapiens cranium or braincase over the last 300 thousand years

Aspects of this specialisation and globularisation show continued development between about 100 thousand and 35 thousand years ago, especially in parietal and cerebellar bulging (Neubauer et al., 2018). For example, the Manot Cave calvaria from Israel, dated to about 55 thousand years ago, is similar in shape to recent African skulls and to European skulls from the Upper Palaeolithic period (Hershkovitz et al., 2015). Its endocranial features display a considerable development of parietal features such as the supramarginal lobules, and a slight posterior projection of the occipital lobes (Grimaud-Hervé et al., 2021). For the parietal regions, such changes indicate a continued adaptation towards improved orientation, attention, perception of stimuli, sensorimotor transformations underlying planning, visuospatial integration, imagery, self-awareness, working and long-term memory, numerical processing, and tool use.

Parietal bulging is not associated with an increase in outer parietal surface area. It is thus likely that a size increase in the precuneus, a central node of the default mode network related to cognitive specialisations in H. sapiens, contributed to the bulging (Bruner et al., 2017; Neubauer et al., 2018). The cerebellum is associated not only with motor-related functions, such as the coordination of movements and balance, but also with spatial processing, working memory, language, social cognition, and affective/emotional processing. Expansion of the cerebellum in H. sapiens, compared to other humans, is linked to an involvement in higher cognitive functions (Pereira-Pedro et al., 2020), and has been associated with the capacity for cognitive abstraction necessary for the invention of the bow-and-arrow (Shipton, 2023). Interestingly, all three of these brain features (the default mode network, precuneus and cerebellum) are associated with neurological activity or pressure during modern archery practice (Fig. 1). But as Bruner and Colom (2022) point out, inferences about the evolution of attention based on the partial fossil record of extinct human species alone may be speculative. Sapient-specific gene regions associated with both the precuneus and attention (Table 2*) therefore provide a valuable additional strand of evidence to strengthen and constrain palaeo-neurological inferences based on fossil remains.

Fluid coordination of all the attention-related processes can be achieved through cognitive flexibility and brain plasticity. Brain or neuronal plasticity is the brain’s tendency to be shaped by external influences such as ecological, social and technical interaction (Gomez-Robles & Sherwood, 2017; Lombard & Högberg, 2021). Such plasticity is also reflected in our genetic coding. For example, SRGAP2 is a gene involved in neocortical development and a good candidate for driving human-specific neural developmental changes related to brain plasticity since about 3–2 million years ago, contemporaneous with the emergence of the genus Homo (Gomez-Robles & Sherwood, 2017). The human version of FOXP2 (mutations of which are related to severe speech disabilities) may also relate to varying forms of brain plasticity in late hominin species. Some coding changes in the FOXP2 sequence may have evolved before the divergence of the clade that includes the Denisovans, Neanderthals and H. sapiens, probably followed by regulatory changes unique to H. sapiens sometime after 300 thousand years ago (Gomez-Robles & Sherwood, 2017).

Kuhlwilm and Boeckx (2019) published a catalogue of 647 single nucleotide changes in 571 genes that potentially distinguish H. sapiens from archaic humans. They highlighted some as potentially affecting the H. sapiens brain-growth trajectory (e.g., CCND2, HERC5, KIF26B, SPAG5), and others as affecting neuronal/cognitive functioning (e.g., SLITRK1, SLC6A15), both of which may impact plasticity. If we accept the relatively late (between about 160 and 100 thousand years ago) specialisation of the parietal region/precuneus in H. sapiens only (Fig. 2), it may correlate with variations in genes associated with the precuneus that show a significant signature of selection in H. sapiens. Of these genes, eleven (i.e., BZRAP1, CKAP5, CUL4B, EPN2, FAAH, NCOA6, PCLO, RB1CC1, SLC6A15, SLITRK1, SPAG5) stand out as being associated with attention and/or neuronal plasticity (Table 2).

Cumulatively, the fossil and genetic records show that both the soft- and hardware for paying attention have long and complex evolutionary histories. We still do not know all the details, but here I focussed deliberately on the newest approaches in neuro-genetics and palaeo-neurology that provide detailed and evidence-based proxies for some of the mechanisms that may have been associated with the evolution of attention in the human lineage. Both these lines of evidence indicate that whilst we may share several cognitive traits with our Denisovan and Neanderthal cousins, the H. sapiens populations that remained in Africa developed unique cognitive traits in how they pay attention before some of them ventured into Eurasia and eventually became the only extant human group.

5 Discussion

Archaeological evidence suggests that bow hunting emerged in southern Africa sometime during the Middle Stone Age, between about 80 and 60 thousand years ago (Table 1), and that it had arrived in France with early sapient settlers by about 54 thousand years ago (Metz et al., 2023). Although the parietal lobe and precuneus reached their modern size range by about 100 thousand years ago (Bruner et al., 2017), parietal and cerebellar bulging saw sustained development until about 35 thousand years ago (Hershkovitz et al., 2015; Neubauer et al., 2018). Such continual modification suggests that neuronal and cognitive adaptations associated with these areas were ongoing in the sapient brain when people started to use bows and arrows.

Thus far, evidence for complex bimanual technologies such as the bow-and-arrow is unique to H. sapiens foragers (Lombard, 2021, 2024). Attention and memory play pivotal roles in bow hunting, from the tracking phase of a hunt to the final shot (Liebenberg, 1990). San trackers of the Kalahari are acutely aware of their surroundings. They pay close attention to animal spoor and other signs, yet avoid focussing all their attention on the tracks. Instead, they maintain a keen consciousness of everything else around them. Tracking thus requires both selective and intermittent attention, a constant refocussing between changes in the minute details of the spoor and the greater environment. When stalking an animal, the most important thing is not to attract attention with sudden movements. Hunters thus take their time, moving slowly when the prey animal is not looking, and not moving when the animal is looking in their direction, whilst also being careful not to disturb other animals (Liebenberg, 1990). Finally, they aim with quiet-eye precision to bring down their prey – not unlike expert archers today.

San bow hunters have developed a full range of attention strategies to cope with the challenges of the hunt. Divided or intermittent attention during the tracking and stalking phases allows them to devote attentional resources to more than one stimulus at a time – cognitive multi-tasking (e.g., Lu et al., 2021). More selective attention, which narrows the range of salient stimuli in the environment, is necessary during the final stalking, aiming and shooting phases. They maintain a fluid flexibility between the zooming-in (interoceptive) and zooming-out (exteroceptive) effects of attention, enabling them to adapt quickly to changes in the environment. Paying attention simultaneously to the prey animal, other herd animals and hunting partners means that they can also split attention between spatial locations that are not necessarily adjacent to each other (e.g., Awh & Pashler, 2001).

Having the proper arousal level and an effective attentional focus for the task to be performed, particularly during the aiming and shooting phases, is critical for successful execution. Whereas attention is key for both modern archery and ancient bow hunting, the hunting scenario also demands quick reflexes and the ability to aim accurately at a moving/speeding target. Bow hunting therefore requires hunters to make split-second decisions, coordinate their limbs within multiple degrees of freedom, and maintain fine motor control under physical and mental fatigue – all while operating under the stress imposed by the fear of disturbing the prey or missing the shot, and thereby failing to secure food. The ability to adapt and refocus in the face of distractions is thus considered one of the key mental skills for bow hunters to develop – not unlike their modern archer counterparts (Pei et al., 2022). Becoming too anxious, or ‘choking’, may result in sub-optimal hunting performance, in the same way athletes may choke. Choking is a complex process involving the interplay of several cognitive, attentional, emotional, and situational factors (Wilson, 2012). Maintaining attention for the duration of a competition, despite environmental or emotional interference, is an essential aspect of modern archery (Terzioğlu & Çakir, 2020), and something that must be practiced.

Similar to observations about modern archers and athletes (e.g., Wu et al., 2010), current African bow hunters practice throughout their careers, usually beginning during early childhood (MacDonald, 2007), continuously improving their task-specific physical and cognitive skills. For example, Hadza boys from Tanzania are given tiny bows when they are only 2–3 years old, to carry with them and practice daily on inanimate targets or small birds (Jones & Marlowe, 2002). Older boys make their own bows of increasing size and pull weight, but do not use poisoned arrows until around age 14, when they start spending much time away from camp, occasionally hunting antelopes. Hadza accuracy at target shooting increases into middle age, seemingly peaking around 40, after which it plateaus before diminishing in old age (Jones & Marlowe, 2002). Small Kalahari San boys in southern Africa also practice their bow-and-arrow skills around the camp on reptile and bird targets (Lee, 1979). Between the ages of 15 and 22, young bow hunters work hard on improving their hunting acuity, peaking at 30–45 years old when physical fitness and strength are optimally combined with skill, wisdom and experience (Lee, 1979).

Studies on the age-related development of attention generally show that selective and sustained attention, attentional working memory, and executive functions improve rapidly during childhood and continue to develop into adolescence, whereas selectivity and processing speed decline during later adulthood (Gómez-Pérez & Ostrosky-Solís, 2006). Object-based attention likewise develops from early childhood into early adulthood. Different types of attention show different rates of development and peak at different ages, suggesting that the various aspects of visual, object-based attention rely on different neural pathways (Dye & Bavelier, 2010). Visuo-spatial attention – involving both sensory-level and executive attentional control processes – changes with age, becoming slower and less accurate after about 66 years (Nagamatsu et al., 2011). The prime age of bow hunters is thus not determined only by body size and arm/hand strength (e.g., Jones & Marlowe, 2002), or by fitness, skill, wisdom and experience (e.g., Lee, 1979). The age-related development of different aspects of attention also contributes to the development, peaking and decline phases of a bow hunter’s performance.

In the light of Material Engagement Theory (e.g., Malafouris, 2019), developing the skills to perform as successful bow hunters during the Middle Stone Age may have been one of the techno-behaviours that helped to increase various aspects of the sapient attentional range. These aspects probably included the regulation of focussed and cued attention, deliberate attentional allocation and increasing attentional breadth, as well as the modulation of spatial and visual attention (Lu et al., 2021). To have been effective, ancient bow hunters must also have developed the ability to regulate or buffer their own emotional stimuli, whilst sustaining attention throughout a hunt that might last many hours or even stretch over a day or two. Then they needed the ability to heighten their attention, reaching a hyper-focussed state for the final accurate shot – or stand to lose their prey and go hungry.

6 In conclusion

Bruner & Colom (2022) argued that the interaction between brain, body and tools often requires a range of attentional resources, and that attention may be involved in the coordination of actions, motor planning and task orientation, as well as in deeper cognitive mechanisms such as the integration of tools into a person’s body and neural schemes. They further suggest that noticeable changes in technology demonstrate corresponding changes in the human attention system, culminating in the cultural and technological complexity of our recent ancestors (Bruner & Colom, 2022). The origin of bow hunting represents a major change in human techno-behaviour that is, as far as we know, exclusive to H. sapiens. Above, I presented neuro-cognitive work that demonstrates how modern archery training helps to develop the attentional resources of participants, and how paying attention to shoot an arrow accurately from a distance plays out in the human brain.

If we accept that practicing archery affects an archer’s brain and how it pays attention, it is reasonable to suggest that ancient human brains went through similar neuroplastic adaptations when using bimanual technologies that required attentional visuospatial training to function accurately over the distance between hunters and their prey. Habitual practice and use of such technologies would have stimulated the ability to pay internal and external attention both selectively and concurrently to multiple factors, whilst deliberately ignoring everything else. This interplay between attentional range and buffering is key to sapient cognition today. Below I summarise the attentional aspects enhanced through modern archery practice and necessary for effective bow hunting:

  • Coordination between fine-motor execution, visual information processing and focussed attention.

  • Ability to focus attention on multiple aspects synchronously, whilst buffering against paying attention to aspects that are unimportant for successful task execution.

  • Attention toward the self-status, specifically interoceptive attention as the conscious awareness and coordination of bodily sensations with imagined audio-visual feedback.

  • Efficiency in attention networks to facilitate a speedy transition into full attention that enables a swift situational response.

  • Attention control to achieve quiet-eye phases quickly and maintain them despite high levels of interference.

  • Ability to silence the ‘inner voice’ or emotional processing during focused attention.

For paying attention to develop in this manner, a co-evolutionary feedback loop of incremental changes in brain, DNA and material engagement was probably required (Lombard & Högberg, 2021). Although the neuro-genetic shift from motor control to attention already started with the hominoids more than 19 million years ago, a distinctly sapient enhancement is echoed in the suite of genes associated with attention and how these vary between the recent big-brained human groups (Table 2). Pressure on brain regions such as the precuneus and the default mode network, which facilitate the coordination of visuospatial accuracy and attentional plasticity during bimanual technical engagements, may have been one of the stimulants for their unique development in the sapient brain. The development of visuospatial integration and attention as facilitated by the precuneus may already have started with early throwing weapons – before the Neanderthal-sapiens split. But such throwing does not require the fine-tuned bimanual coordination and range of attention represented by bow hunting. The earliest evidence for bow hunting in Africa appears not long after (in evolutionary terms) fossil evidence for the unique globularisation of the H. sapiens skull, which fell within the modern range by about 100 thousand years ago and continued to develop afterwards. Considering all these factors, I suggest that although we may not be able to ‘excavate’ the minds of the earliest Middle Stone Age bow hunters, this ‘noticeable change’ in technology signifies an important turning point in the evolution of the human ability to pay attention.