Journal of Autism and Developmental Disorders, Volume 49, Issue 4, pp 1342–1351

Sex Differences in Social Attention in Infants at Risk for Autism

  • Johan Lundin Kleberg
  • Pär Nyström
  • Sven Bölte
  • Terje Falck-Ytter
Open Access
Original Paper

Abstract

We studied visual attention to emotional faces in 10-month-old infant siblings of children with ASD (ASD-sibs; N = 70) and siblings of typically developing children (N = 29) using static stimuli. Contrary to our predictions, we found no evidence for atypical gaze behavior in ASD-sibs when boys and girls were analyzed together. However, a sex difference was found in ASD-sibs' visual attention to the mouth. Male ASD-sibs looked more at the mouth across emotions compared to male controls and female ASD-sibs. In contrast, female ASD-sibs looked less at the mouth compared to female controls. These findings suggest that some aspects of early emerging atypical social attention in ASD-sibs may be sex specific.


Keywords: Autism Spectrum Disorder (ASD) · Eye tracking · High-risk infants · Emotion · Broader autism phenotype · Face processing

Siblings of children with autism spectrum disorder (ASD) are a population with a highly elevated prevalence of ASD, as well as of other neurodevelopmental and psychiatric challenges. Longitudinal studies following this population (hereafter ASD-sibs) from infancy to early childhood have shown that around 25% of male infants and 10% of female infants in this group are later diagnosed with ASD, as compared to around 1–2% in the general population (Idring et al. 2015; Messinger et al. 2015; Ozonoff et al. 2011). ASD-sibs who do not fulfill the full criteria for an ASD diagnosis often have elevated levels of subclinical ASD symptoms (Messinger et al. 2013) or other clinical conditions, including language disorders, ADHD, and externalizing and internalizing disorders (Jones et al. 2014; Messinger et al. 2013; Ozonoff et al. 2011). Therefore, comparisons between ASD-sibs and infant siblings of children without familial risk for ASD can provide new leads on potential early differences related to later neurodevelopmental and psychiatric problems in general, and ASD in particular.

In the present study, we focused on visual attention to faces in ASD-sibs and a control group of infants without familial risk for ASD. There are at least two important reasons to study this topic. First, reduced or atypical attention to social information such as faces and biological motion is commonly seen in toddlers and young children with ASD (Chawarska and Shic 2009; de Wit et al. 2008; Falck-Ytter et al. 2013; Guillon et al. 2014; Kleberg et al. 2017; Moriuchi et al. 2017), although existing research points to considerable variation across experimental tasks (e.g. Guillon et al. 2014; Falck-Ytter and von Hofsten 2011). Since ASD-sibs are at high risk for ASD and autistic symptoms, early atypical attention to faces could represent an early sign of ASD symptomatology. Second, visual attention to faces is likely to be highly important in infant development beyond core ASD symptomatology. Previous studies have documented that typically developing infants are highly attentive to faces (e.g. Bakker et al. 2011; Gredebäck et al. 2012; Oakes and Ellis 2013), which in turn is linked to the development of social cognition, language acquisition (e.g. Tenenbaum et al. 2013; Gredebäck et al. 2012), and the development of the social brain (e.g. Johnson et al. 2015). Early atypical attention to faces could therefore have cascading consequences that ultimately lead to behavioral difficulties. A better understanding of visual attention to faces in ASD-sibs can thus inform multiple areas of their early development. In the following sections, we briefly review the literature on visual attention to faces in infants without elevated risk for ASD, before turning to the literature on visual attention to faces in ASD-sibs.

Visual Attention to Faces in Infancy

Face scanning undergoes rapid development during the first year of life in typically developing infants. At 4 months, infants direct their visual attention mainly to the eyes and relatively little to the mouth. Although the eye and mouth regions doubtless remain important sources of social information from infancy and throughout development, a relative shift towards the mouth and a gradual decrease in attention to the eyes occurs during the second half of the first year, peaking between 8 and 12 months (Lewkowicz et al. 2012; Oakes and Ellis 2013; Tenenbaum et al. 2013). This increase in looking time at the mouth has been related to language acquisition (Lewkowicz et al. 2012).

During the second half of the first year, infants also develop an increasing ability to differentiate between facial emotions. Whereas infants are sensitive to facial expressions of happiness from the first weeks of life (Field et al. 1982; Grossmann et al. 2007), fearful faces are reliably detected from around 7 months of age (see Leppänen and Nelson 2012 for a review). Eye tracking studies have shown that from 7 months of age, infants look longer at fearful faces than at faces displaying other emotions (Peltola et al. 2008), and also distribute their visual attention more broadly between areas within fearful faces than within faces showing other expressions (Hunnius et al. 2011; Gredebäck et al. 2012). As a consequence, infants may look less at the eyes of fearful faces than at the eyes of happy or neutral faces (Hunnius et al. 2011). The fact that looking time at the eyes is decreased for fearful faces may seem counterintuitive in light of other studies showing that the eyes are typically the most diagnostic region for identifying fear (Adolphs 2008), but it may represent a ‘vigilant’ form of attention driven by the potential presence of a threat (Hunnius et al. 2011; Gredebäck et al. 2012). To sum up, infants are attentive to faces during the first year, and their visual attention becomes increasingly sensitive to the emotional valence of those faces.

Sex differences in some aspects of face processing have been found in typical development. For example, Pascalis et al. (1998) reported earlier maturation of face processing in 3- to 6-month-old male infants, and Rennels and Cummings (2013) reported differences in the visual scanning strategies of female and male infants at 3–6 and 9–10 months. In the latter study, male infants were more likely than female infants to shift their gaze between external and internal features of the face, whereas female infants made more gaze shifts within the internal regions of the face. Sex differences in visual scanning could relate to differences in cognitive processing. For example, female infants have often been found to perform better in face recognition tasks (McClure 2000).

Visual Attention to Faces in ASD-Sibs

A number of previous studies have examined visual attention to faces in ASD-sibs. Of these, studies using static images of smiling or neutral faces as stimuli found highly similar visual scanning of core face regions in ASD-sibs and controls during the first year of life (Dundas et al. 2012; Key and Stone 2012; Young et al. 2009). These studies compared ASD-sibs to control groups without familial risk for ASD. Further, longitudinal studies have examined whether visual attention to faces in ASD-sibs predicts a diagnosis of ASD: a recent study reported that 6-month-old ASD-sibs who were later diagnosed with ASD did not differ from a matched control group in overall looking time at images of mothers' and strangers' faces. However, ASD-sibs who were not later diagnosed with ASD looked less at the stimuli than both ASD-sibs with a later diagnosis and controls (Wagner et al. 2016). In contrast, a small number of eye tracking studies using dynamic videos as stimuli have found atypical face scanning in infants later diagnosed with ASD (Chawarska et al. 2013; Jones and Klin 2013; Shic et al. 2014). To our knowledge, only one study has examined the effect of facial emotion on visual attention in ASD-sibs. Wagner et al. (2016) examined looking time to happy, fearful, and neutral faces in a group of 9-month-old ASD-sibs who did not fulfill the criteria for an ASD diagnosis at a subsequent 36-month visit and in a typically developing control group. Both groups looked more at the eyes of fearful faces and more at the mouth of happy faces. In addition, the ASD-sibs who did not develop ASD had larger pupil dilation (an index of autonomic nervous system arousal) when viewing faces than controls, regardless of emotional expression (Wagner et al. 2016).

As noted previously, typically developing infants increase their looking time at the mouth during the second half of the first year, a change believed to be related to verbal development. An interesting question is therefore whether the same relation between visual attention to the mouth area of faces and concurrent or later language ability is seen in ASD-sibs. In support of this hypothesis, two studies have reported that attention to the mouth in ASD-sibs at nine (Elsabbagh et al. 2014) and six (Young et al. 2009) months predicts expressive language skills at 24–36 months in both ASD-sibs and controls. Another study found that more gaze at the eyes at 6 months predicted worse expressive language at 24 months in ASD-sibs but not in controls (Wagner et al. 2016). Together, these studies suggest that individual differences in language acquisition may be related to face scanning in both ASD-sibs and controls, but that the relationships may differ between the two populations.

Sex Differences in ASD-Sibs

Male ASD-sibs are at a two- to threefold higher risk of ASD compared to female ASD-sibs (e.g. Messinger et al. 2016; Ozonoff et al. 2011). This has led to an interest in sex differences in early social attention in this group. One aim of this line of research is to identify potential compensatory mechanisms or protective factors against ASD in female infants. For example, Chawarska et al. (2016) reported that female ASD-sibs (regardless of subsequent diagnostic outcome) looked longer at the face of a speaking actress compared both to male ASD-sibs and to control infants. Increased attention to faces was associated with better socio-communicative skills at 24 months in both sexes, as measured with the Autism Diagnostic Observation Schedule (ADOS). This suggests that visual attention to faces may be related to protective or compensatory processes. It is also possible that partly different mechanisms lead to ASD in male and female infants. A recent study by Bedford et al. (2016) examined the longitudinal predictive relationships between three previously identified behavioral markers of ASD at 14 months and autistic symptoms at 36 months, broken down by sex. The three markers represented non-social attention (visual disengagement), social attention (gaze following), and a composite symptom measure (the Autism Observation Scale for Infants; AOSI; Bryson et al. 2008). Previous studies have reported that these three tasks predict an ASD diagnosis in ASD-sibs, but Bedford et al. (2016) found that these relations held only in males. Taken together, these studies suggest that sex differences are important to examine in studies of ASD-sibs. However, it should be noted that, since the base rate of ASD symptoms is higher in males, studies of sex differences in young infants with ASD or in ASD-sibs typically have lower power to detect atypicalities in females.

Aims and Hypotheses

The present study was designed to compare visual attention to emotional faces in 10-month-old ASD-sibs to that of a control group without familial risk for ASD. In line with previous studies of infants, we expected fearful faces to elicit lower relative looking time at the eyes than happy faces. We hypothesized that this effect of emotion would be smaller in ASD-sibs than in controls. In light of recent reports of sex differences in attention and developmental pathways in ASD-sibs (e.g. Chawarska et al. 2016), sex was added as a factor in all analyses, but we did not have an a priori hypothesis related to this factor. Similarly, previous studies indicate that there may be differences between typical infants and ASD-sibs in looking time to the eyes and mouth (e.g. Jones and Klin 2013; Chawarska et al. 2013), but here, again, we did not specify the direction of these effects, as previous data are rather mixed. Finally, we analyzed linear relationships between looking time at the eyes and mouth and the expressive and receptive language subscales of the Mullen Scales of Early Learning (MSEL; Mullen 1995). We consider these analyses exploratory.


Methods

Participants

Data from 99 infants (70 ASD-sibs) were included in the analysis. Data were collected as part of an ongoing longitudinal study following infants from the first year of life (the Early Autism Sweden, EASE, study). All infants in the ASD-sibs group had one or more full siblings with a community diagnosis of ASD. The diagnosis of the older sibling was confirmed through consultation of medical records. The control group was recruited from a database of families who had expressed interest in developmental research. All infants in the control group had one or more typically developing siblings and no family history of ASD up to second-degree relatives. All infants were born full term (> 36 weeks) and did not have any confirmed or suspected medical problems, including visual/auditory impairments. As can be seen in Table 1, the two groups did not differ in age at assessment, gender distribution, or verbal and non-verbal cognitive development, as measured with the MSEL. There were also no gender differences within either controls or ASD-sibs (lowest p = 0.19). In addition to the sample reported here (N = 99), six infants were initially tested but excluded from analysis because they were half-siblings of an older child with ASD. One infant in the control group was excluded from the analysis because of a subsequent ASD diagnosis, and 14 infants (7 ASD-sibs) were seen but excluded because too little valid data was recorded (due to either equipment failure or calibration problems). One infant in the ASD-sibs group was considered an outlier on the eye-mouth index (see definition below) and was therefore excluded from further analysis (see "Data Reduction").

Table 1

Gender, age, and cognitive development

                                                    ASD-sibs (N = 70)   Control group (N = 29)
Gender (% female)a
Age (days); M (SD)b                                 311.5 (11.4)        307.7 (14.0)
MSEL Early Learning Composite raw score; M (SD)b    99.72 (13.51)       103.30 (11.40)
MSEL Visual Reception raw score; M (SD)b            13.92 (2.17)        14.20 (1.19)
MSEL Fine Motor raw score; M (SD)b                  13.63 (1.73)        14.12 (1.20)
MSEL Expressive Language raw score; M (SD)b         10.04 (1.97)        10.33 (2.23)
MSEL Receptive Language raw score; M (SD)b          10.80 (2.10)        10.53 (2.08)

MSEL: Mullen Scales of Early Learning. Numbers represent raw scores.
a χ2-test (two-tailed)
b t-test (two-tailed)

Ethics Approval and Consent to Participate

Parents provided written informed consent, and the study was approved by the Regional Ethical Board in Stockholm. The study was conducted in accordance with the standards specified in the 1964 Declaration of Helsinki.

Data Collection and Analysis

Infants watched the stimuli seated in the lap of a parent. Stimuli were presented on a computer monitor placed at a distance of approximately 60 cm. The experimental stimuli were presented in random order, interleaved with stimuli from other experiments (including inverted faces) not analyzed here. Gaze data were recorded using Tobii corneal reflection eye trackers (Tobii Technology, Danderyd, Sweden). A change in equipment took place during the period of data collection as follows: data from 48 infants (34 ASD-sibs) were recorded at a sample rate of 50 Hz with a Tobii 1750 (screen resolution: 1280 × 1024 pixels; screen size: 17″), and data from 17 infants (11 ASD-sibs) were recorded at 120 Hz and from 34 infants (25 ASD-sibs) at 300 Hz with a Tobii TX300 system (screen resolution: 1600 × 1200 pixels; screen size: 23″). Stimuli were presented in the same size (342 × 274 mm) on both eye trackers.

The proportions of ASD-sibs versus controls, χ2 (1) = 0.004, p = 0.948, and of boys versus girls, χ2 (1) = 0.482, p = 0.488, did not differ between the two eye tracking systems. The proportion of rejected samples was slightly higher in the older T1750 eye tracker, but the difference was not significant, t (96) = 1.68, p = 0.089. To control for potential equipment differences, all analyses were first calculated with eye tracker (T1750, TX300) included as a fixed effect. No significant main or interaction effect involving eye tracker was found (lowest p = 0.58). We therefore pooled the data and excluded eye tracker as a covariate from the final models. A cognitive assessment with the MSEL was performed during the same visit as the eye tracking experiment.

Stimuli

Stimuli consisted of static images of adult faces from the Karolinska Directed Emotional Faces library (Lundqvist et al. 1998), showing female and male models displaying either a fearful or a happy expression. During the experiment, infants saw four fearful and four happy faces, each presented for 5 s. For each infant, the images were randomly selected from a larger set of 16 stimulus pictures. All trials were preceded by a moving animation that attracted the infant's attention to the center of the screen. Due to a technical error, 24 infants (20 ASD-sibs) saw five instead of four presentations. In these cases, the fifth trial was removed from further analysis to ensure that the maximum number of analyzed trials was equal across participants. The proportion of male stimulus faces was 53% in the ASD-sibs group and 51% in controls.

Data Reduction

Raw gaze coordinates were analyzed using custom scripts written in MATLAB (Mathworks Inc., CA, USA). Average values for the right and left eye were used in the analysis. Standard fixation parsing algorithms may not be reliable in infants (e.g. Wass et al. 2013). Therefore, we analyzed accumulated looking time based on the raw data. To compensate for data loss due to movement artefacts and blinks, we interpolated linearly over gaps in the data shorter than 150 ms. In order to reduce noise (i.e. rapid changes in gaze position that are likely technical artefacts), data were filtered using a moving median filter with a window corresponding to 80 ms.
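These preprocessing steps can be sketched as follows. This is a hypothetical re-implementation in Python (the study used custom MATLAB scripts); the function name, and the convention of coding missing samples as NaN, are our illustrative assumptions.

```python
import numpy as np

def clean_gaze(trace, sample_rate_hz=50, max_gap_ms=150, median_win_ms=80):
    """Interpolate short gaps and median-filter a 1-D gaze coordinate trace.

    Missing samples (blinks, movement artefacts) are assumed to be NaN.
    Interior gaps shorter than max_gap_ms are linearly interpolated; the
    trace is then smoothed with a moving median over ~median_win_ms.
    """
    x = np.asarray(trace, dtype=float).copy()
    n = x.size
    max_gap = int(round(max_gap_ms / 1000.0 * sample_rate_hz))
    # --- 1. Linear interpolation over short interior gaps only ---
    isnan = np.isnan(x)
    i = 0
    while i < n:
        if isnan[i]:
            j = i
            while j < n and isnan[j]:
                j += 1
            if (j - i) <= max_gap and i > 0 and j < n:
                x[i:j] = np.interp(np.arange(i, j), [i - 1, j], [x[i - 1], x[j]])
            i = j
        else:
            i += 1
    # --- 2. Moving median filter (ignores any remaining NaNs) ---
    win = max(1, int(round(median_win_ms / 1000.0 * sample_rate_hz)))
    out = x.copy()
    for k in range(n):
        seg = x[max(0, k - win // 2):min(n, k + win // 2 + 1)]
        seg = seg[~np.isnan(seg)]
        if seg.size:
            out[k] = np.median(seg)
    return out
```

At the 50 Hz sample rate of the Tobii 1750, the 150 ms gap limit corresponds to roughly 8 samples and the 80 ms median window to about 4 samples.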

Trials with less than 750 ms of valid looking time (15% of the trial) were discarded. Using these criteria, 494 trials from 71 ASD-sibs (34 male; average proportion of valid trials per participant: 91%) and 204 trials from 29 controls (12 male; average proportion of valid trials per participant: 92%) were included. We defined areas of interest (AOIs) covering (1) the whole screen; (2) the eyes; and (3) the mouth (see Fig. 1).

Fig. 1

Areas of interest (AOIs) shown on a fearful (a) and happy (b) stimulus image

Dependent Variables

The dependent variables were (1) total looking time at the screen in milliseconds; (2) looking time at the eyes (relative to total looking time at the screen); (3) looking time at the mouth (relative to total looking time at the screen); and (4) the eye-mouth index (henceforth, EMI), defined as looking time at the eyes divided by the summed looking time at the eyes and mouth. Higher EMI values therefore indicate more looking time at the eyes relative to the mouth. The EMI provides a composite measure of the relative distribution of gaze between the eyes and mouth, and can therefore contribute information about gaze behavior during face perception beyond the separate AOI measures. It has been used in many previous studies of face scanning in ASD and ASD-sibs (e.g. Falck-Ytter 2008; Young et al. 2009; Merin et al. 2007).
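As a concrete illustration, the EMI reduces to a simple ratio of looking times. The sketch below (the function name is ours, not from the paper) treats the index as undefined when neither region was fixated:

```python
def eye_mouth_index(eyes_ms, mouth_ms):
    """Eye-Mouth Index: looking time at the eyes divided by the summed
    looking time at the eyes and mouth.

    Values near 1 indicate mostly eye-looking, values near 0 mostly
    mouth-looking. Undefined (None) when neither AOI was fixated.
    """
    total = eyes_ms + mouth_ms
    return eyes_ms / total if total > 0 else None
```

For example, an infant who spends 3000 ms on the eyes and 1000 ms on the mouth gets an EMI of 0.75.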

Statistical Analysis

All statistical analyses were performed in MATLAB (version 2016b, Mathworks, Inc.). Prior to the analyses, all variables were inspected for outliers, both in terms of single responses and average responses for each participant. One female participant in the ASD-sibs group was excluded from analyses of the eyes, mouth, and EMI variables because her average value deviated more than three standard deviations from the mean of both the full sample and the group of female ASD-sibs. The EMI and mouth variables were negatively skewed and were therefore arcsine transformed. When the analyses were performed on the untransformed data, all significant effects remained unchanged. Data were analyzed using linear mixed effects (LME) models fitted with the ‘fitlme’ function, with random intercepts for subject. Emotion (happy, fearful), group (ASD-sibs, controls), and gender (male, female) were fixed factors (predictors). LME models are useful for analyzing data with inter-individual variability and an uneven number of trials between participants (Baayen et al. 2008), and have been used in previous infant eye tracking studies (Chawarska et al. 2016).
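The arcsine transform referred to above is the standard arcsine-square-root variance-stabilizing transform for proportions; a minimal sketch in Python (the analyses themselves were run in MATLAB):

```python
import math

def arcsine_transform(p):
    """Arcsine(square-root) transform for a proportion in [0, 1].

    Maps [0, 1] monotonically onto [0, pi/2] and stabilizes the
    variance of proportion data near the boundaries.
    """
    if not 0.0 <= p <= 1.0:
        raise ValueError("proportion must lie in [0, 1]")
    return math.asin(math.sqrt(p))
```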

Since previous literature suggests that face scanning at 10 months is strongly related to language development, and since infants varied widely in language level, we added the expressive and receptive language subscales of the MSEL as predictors in all analyses. However, all reported significant results remained when these covariates were removed from the models. We tested the significance of effects by comparing models with and without the fixed effects using likelihood ratio tests (LRT) computed with the ‘compare’ function in MATLAB (see Baayen et al. 2008). In preliminary analyses, we also added trial number, model gender (male, female), and MSEL Fine Motor and Visual Reception scores as predictors. No significant main or interaction effects involving these measures were found (lowest p = 0.09), and these covariates were therefore dropped from the main analysis. Residual plots indicated that residuals were approximately normally distributed in all analyses.
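Each likelihood ratio test compares nested models via twice the difference in maximized log-likelihoods, referred to a chi-square distribution. A minimal sketch for the common one-degree-of-freedom case (the function name is ours; MATLAB's ‘compare’ performs the equivalent computation, assuming nested models fit by maximum likelihood):

```python
import math

def lrt_df1(loglik_reduced, loglik_full):
    """Likelihood ratio test for one extra fixed effect (df = 1).

    Assumes both models are nested and fit by maximum likelihood.
    The statistic is 2 * (llf_full - llf_reduced); for df = 1 the
    chi-square survival function has the closed form erfc(sqrt(stat / 2)).
    """
    stat = 2.0 * (loglik_full - loglik_reduced)
    p = math.erfc(math.sqrt(stat / 2.0)) if stat > 0 else 1.0
    return stat, p
```

For example, a log-likelihood improvement of 1.92 gives the familiar boundary of stat ≈ 3.84, p ≈ 0.05.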

Results

Preliminary Analysis

Looking time at the whole screen decreased in later trials, χ2 (1) = 27.35, p < 0.001, b = − 201.36, SE = 37.77, but there were no interactions between trial and group, χ2 (1) = 0.18, p = 0.672, trial and emotion, χ2 (1) = 1.57, p = 0.210, or trial and sex, χ2 (1) = 0.15, p = 0.694. There were also no significant three- or four-way interactions involving trial order for looking time at the screen (all p > 0.20). Looking time at the eyes decreased in later trials, χ2 (1) = 7.24, p = 0.007, b = − 0.02, SE = 0.01, but there was no interaction between trial and group, χ2 (1) = 0.37, p = 0.542, trial and sex, χ2 (1) = 2.12, p = 0.146, or trial and emotion, χ2 (1) = 1.62, p = 0.204. No relation was found between trial order and looking time to the mouth or the EMI (p > 0.07).

Looking Time at the Screen

Looking time at the screen was not related to emotion, χ2 (1) = 0.03, p = 0.861, b = − 14.36, SE = 81.91, group, χ2 (1) = 1.07, p = 0.300, b = 180.20, SE = 173.49, or sex, χ2 (1) = 0.13, p = 0.718, b = 56.47, SE = 156.47. No significant interactions were found between group and emotion, χ2 (1) = 0.01, p = 0.929, b = 16.15, SE = 180.17, group and sex, χ2 (1) = 0.13, p = 0.722, sex and emotion, χ2 (1) = 0.11, p = 0.741, or group, sex, and emotion, χ2 (1) = 1.58, p = 0.209.

Looking Time at the Eyes

For looking time at the eyes, we found no significant main effects of group, χ2 (1) = 0.01, p = 0.909, b = − 0.01, SE = 0.05, emotion, χ2 (1) = 0.47, p = 0.492, b = 0.01, SE = 0.02, or sex, χ2(1) = 0.22; p = 0.641; b = 0.02; SE = 0.04. There were also no significant interaction effects between group and emotion, χ2 (1) = 0.05, p = 0.816, sex and group, χ2 (1) = 1.82, p = 0.177, or sex and emotion, χ2 (1) = 1.05, p = 0.306. These data are shown in Fig. 2.

Fig. 2

Average proportion of looking time to the eyes (a), mouth (b), and mean Eye-Mouth Index (EMI) (c) in ASD-sibs and controls as a function of sex. Error bars represent 95% confidence intervals of the mean. *p < 0.05, **p < 0.01

Looking Time at the Mouth

There were no significant main effects of group, χ2 (1) = 0.11, p = 0.736, b = 0.04, SE = 0.12, or sex, χ2 (1) = 0.21, p = 0.650, b = 0.05, SE = 0.11. However, we found a significant main effect of emotion, χ2 (1) = 8.64, p = 0.003, b = − 0.11, SE = 0.04, driven by a lower proportion of looking time to the mouth of happy faces. Contrary to our predictions, we found no interaction effect between group and emotion, χ2 (1) = 2.65, p = 0.104. Since we had an a priori hypothesis about this effect, we ran the analysis in the two groups separately. When the analysis was broken down by group, a significant effect reflecting a lower proportion of looking time to the mouth of happy faces was found in the control group, χ2 (1) = 8.18, p = 0.004, b = − 0.20, SE = 0.07. This effect was in the same direction but not significant in ASD-sibs, χ2 (1) = 2.62, p = 0.106, b = − 0.07, SE = 0.04.

There was a significant interaction effect between sex and group, χ2 (1) = 8.44, p = 0.004. No significant interaction effect was found between sex and emotion, χ2 (1) = 0.12, p = 0.733, or emotion, sex, and group, χ2 (1) = 0.18, p = 0.671. Follow-up comparisons showed that male ASD-sibs looked more at the mouth than male controls, χ2 (1) = 7.01, p = 0.008, b = − 0.45, SE = 0.16, but that female ASD-sibs looked less at the mouth than female controls, χ2 (1) = 5.40, p = 0.020, b = 0.36, SE = 0.15. Within the ASD-sibs group, females looked less at the mouth than males, χ2 (1) = 4.01, p = 0.045, b = 0.24, SE = 0.12. In controls, a trend towards longer looking time at the mouth in females was found, χ2 (1) = 3.46, p = 0.063, b = − 0.39, SE = 0.20. These data are shown in Fig. 2.

Eye-Mouth Index

Analyses of the EMI yielded highly similar results as the analyses of looking time at the mouth. No main effects of group, χ2 (1) = 0.02, p = 0.885, b = − 0.02, SE = 0.16, or sex, χ2 (1) = 0.14, p = 0.704, b = − 0.06, SE = 0.15 were found. The EMI was higher (i.e. longer looking time at the eyes relative to the mouth) when infants watched happy as compared to fearful faces, χ2 (1) = 4.77, p = 0.029, b = 0.10, SE = 0.05. We found no interaction effect between emotion and group, χ2 (1) = 0.91, p = 0.340. When the analysis was broken down by group, a significant effect of emotion reflecting higher EMI values for happy faces was found in the control group, χ2 (1) = 3.89, p = 0.048, b = 0.17, SE = 0.08, but not in ASD-sibs, χ2 (1) = 1.74, p = 0.187, b = 0.07, SE = 0.05.

As in the analysis of the mouth AOI, we found a significant interaction effect between infant sex and group for the EMI index, χ2 (1) = 8.59, p = 0.003. No significant interactions were found between sex and emotion, χ2 (1) = 1.05, p = 0.306, or emotion, sex, and group, χ2 (1) = 0.14, p = 0.711. Follow-up comparisons showed that male ASD-sibs had lower EMI values than male controls, χ2 (1) = 7.35, p = 0.007, b = 0.64, SE = 0.23, but that female ASD-sibs had higher EMI values than female controls, χ2 (1) = 4.99, p = 0.026, b = − 0.47, SE = 0.20. Within the ASD-sibs group, a trend towards higher EMI values in females as compared to males was found, χ2 (1) = 3.63, p = 0.057, b = − 0.32, SE = 0.17, whereas lower EMI values were found in female than in male controls, χ2 (1) = 4.00, p = 0.045, b = 0.55, SE = 0.27. These data are shown in Fig. 2.

Relations Between Face Scanning and Concurrent Language Development

In controls, looking time at the eyes was negatively related to expressive, χ2 (1) = 5.69, p = 0.017, b = − 0.05, SE = 0.02, but not receptive language, χ2 (1) = 1.29, p = 0.256, b = − 0.03, SE = 0.02. No significant relations were found between looking time at the mouth and expressive, χ2 (1) = 3.35, p = 0.067, b = 0.09, SE = 0.05, or receptive, χ2 (1) = 0.79, p = 0.373, b = 0.05, SE = 0.05, language. In ASD-sibs, looking time at the eyes was not significantly related to expressive language, χ2 (1) = 0.88, p = 0.348, b = − 0.01, SE = 0.01, or receptive language, χ2 (1) = 0.56, p = 0.455, b = − 0.01, SE = 0.01. We also found no evidence for a relation between looking time at the mouth and either expressive, χ2 (1) = 0.01, p = 0.906, b = 0.00, SE = 0.03, or receptive, χ2 (1) = 0.01, p = 0.931, b = 0.00, SE = 0.03, language in ASD-sibs.

Discussion

The aim of the present study was to examine visual attention to fearful and happy faces in infant siblings of children with ASD (ASD-sibs) and controls. Our hypothesis was that ASD-sibs would show reduced differentiation between emotional expressions in terms of visual attention. The data did not support the hypothesis, as we did not find the expected interaction between group and emotion. This finding points to an area of preserved face processing in ASD-sibs. Across groups, a higher proportion of looking time was directed at the mouth of fearful as compared to happy faces. However, it is notable that the difference between emotions was only marginally significant in ASD-sibs, despite a relatively large sample size, whereas a strong effect of emotion was found in the control group.

A rather strong sex difference was found in visual attention to the mouth, both in proportion of total looking time and relative to the eyes. Male ASD-sibs scanned the mouth region more than male controls and female ASD-sibs, both relative to the eyes and in proportion of total looking time at the screen. The reverse pattern was found in female ASD-sibs—i.e. reduced attention to the mouth compared to female controls. This means that in both sexes, atypical scanning of emotional faces was found, but in different directions compared to sex matched controls. Sex differences in social attention in ASD-sibs during infancy may reflect a compensatory mechanism in females, or a higher accumulated load of risk factors in males (Chawarska et al. 2016; Messinger et al. 2015; Robinson et al. 2013). Longitudinal studies are needed to determine how the present results relate to subsequent outcome.

We also found a sex difference within the control group, with female infants looking more at the mouth relative to the eyes than males. Sex differences in social attention have previously been reported in typically developing infants (e.g. Rennels and Cummings 2013; but see Bakker et al. 2011). It is possible that the observed sex difference in the control group reflects differences in the processing of the visual characteristics of faces (e.g. Rennels and Cummings 2013). Alternatively, the sex difference in face scanning in the control group may be a sign of emerging sex differences in language development. Female infants tend to develop earlier in the domain of language (e.g. Messinger et al. 2015), and relatively more attention to the mouth is related to later expressive language skills (Lewkowicz et al. 2012). It should be noted, however, that no sex differences were found in the MSEL language measures. Interestingly, looking time at the eyes was negatively related to concurrent expressive language in controls only, not in ASD-sibs, despite the ASD-sibs sample being substantially larger.

Attention to the eyes and attention to the mouth are both important for infant social development, but are likely related to different socio-cognitive processes. Whereas attention to the eyes provides opportunities to learn about others' intentional states and focus of attention (Batki et al. 2000; Senju and Csibra 2008), attention to the mouth during the second half of the first year is related to language acquisition (Lewkowicz et al. 2012). A speculative interpretation of our results is therefore that increased attention to the eyes in female ASD-sibs reflects a protective factor against social-cognitive impairments, but may also predict worse language development. Consistent with this prediction, Carter et al. (2007) reported that female toddlers with ASD had lower language functioning than male toddlers with ASD (but see Reinhardt et al. 2015). This notion can be tested once follow-up data from the current sample are available.

In conclusion, our results suggest that female and male infant siblings of children with autism attend differently to the eyes and mouth of emotional faces. These findings, particularly if corroborated by larger studies with later ASD outcome data, could contribute to the understanding of the early development of infant siblings at risk for autism, and underscore the importance of studying the development of ASD separately in females and males.



Acknowledgments

We would like to thank all the participating families and the EASE-team (Sheila Achermann, Linn Andersson Konke, Karin Brocki, Elodie Cauvet, Gustaf Gredebäck, Elisabeth Nilsson Jobs, Emilia Thorup). This research was supported by grants to Terje Falck-Ytter from the Swedish Research Council (2015-03670), Stiftelsen Riksbankens Jubileumsfond (NHS14-1802:1) and the Strategic Research Area Neuroscience at Karolinska Institutet (StratNeuro). Sven Bölte was supported by the Swedish Research Council (Grant No. 523-2009-7054).

Author Contributions

JLK analyzed the data and wrote the manuscript under the supervision of TFY. PN contributed to the design of the experiment, data analysis and coordination of the study. SB contributed to the coordination of the study and to the interpretation of the results. TFY designed the study, analyzed the data and contributed to the coordination of the study.


Funding

This research was supported by grants to Terje Falck-Ytter from the Swedish Research Council (2015-03670), Stiftelsen Riksbankens Jubileumsfond (NHS14-1802:1; Pro Futura Scientia programme) and ALF Medicin (Stockholm County Council). Sven Bölte was supported by the Swedish Research Council (Grant No. 523-2009-7054).

Compliance with Ethical Standards

Conflict of interest

The authors declare that they have no conflict of interest.

Supplementary material

Supplementary material 1 (JPG 40 KB)


References

  1. Adolphs, R. (2008). Fear, faces, and the human amygdala. Current Opinion in Neurobiology, 18(2), 166–172.
  2. Baayen, R. H., Davidson, D. J., & Bates, D. M. (2008). Mixed-effects modeling with crossed random effects for subjects and items. Journal of Memory and Language.
  3. Bakker, M., Kochukova, O., & von Hofsten, C. (2011). Development of social perception: A conversation study of 6-, 12- and 36-month-old children. Infant Behavior and Development, 34, 353–370.
  4. Batki, A., Baron-Cohen, S., Wheelwright, S., Connellan, J., & Ahluwalia, J. (2000). Is there an innate gaze module? Evidence from human neonates. Infant Behavior and Development, 23, 223–229.
  5. Bedford, R., Jones, E. J., Johnson, M. H., Pickles, A., Charman, T., & Gliga, T. (2016). Sex differences in the association between infant markers and later autistic traits. Molecular Autism, 7(1), 21.
  6. Bryson, S. E., Zwaigenbaum, L., McDermott, C., Rombough, V., & Brian, J. (2008). The Autism Observation Scale for Infants: Scale development and reliability data. Journal of Autism and Developmental Disorders, 38(4), 731–738.
  7. Carter, A. S., Black, D. O., Tewani, S., Connolly, C. E., Kadlec, M. B., & Tager-Flusberg, H. (2007). Sex differences in toddlers with autism spectrum disorders. Journal of Autism and Developmental Disorders, 37(1), 86–97.
  8. Chawarska, K., Macari, S., Powell, K., Dinicola, L., & Shic, F. (2016). Enhanced social attention in female infant siblings at risk for autism. Journal of the American Academy of Child & Adolescent Psychiatry, 55(3), 188–195.
  9. Chawarska, K., Macari, S., & Shic, F. (2013). Decreased spontaneous attention to social scenes in 6-month-old infants later diagnosed with autism spectrum disorders. Biological Psychiatry, 74(3), 195–203.
  10. Chawarska, K., & Shic, F. (2009). Looking but not seeing: Atypical visual scanning and recognition of faces in 2- and 4-year-old children with autism spectrum disorder. Journal of Autism and Developmental Disorders, 39(12), 1663–1672.
  11. de Wit, T. C. J., Falck-Ytter, T., & von Hofsten, C. (2008). Young children with autism spectrum disorder look differently at positive versus negative emotional faces. Research in Autism Spectrum Disorders, 2(4), 651–659.
  12. Dundas, E., Gastgeb, H., & Strauss, M. S. (2012). Left visual field biases when infants process faces: A comparison of infants at high- and low-risk for autism spectrum disorder. Journal of Autism and Developmental Disorders, 42(12), 2659–2668.
  13. Elsabbagh, M., Bedford, R., Senju, A., Charman, T., Pickles, A., & Johnson, M. H. (2014). What you see is what you get: Contextual modulation of face scanning in typical and atypical development. Social Cognitive and Affective Neuroscience, 9(4), 538–543.
  14. Falck-Ytter, T. (2008). Face inversion effects in autism: A combined looking time and pupillometric study. Autism Research, 1(5), 297–306.
  15. Falck-Ytter, T., & von Hofsten, C. (2011). How special is social looking in ASD: A review. Progress in Brain Research, 189, 209–222.
  16. Falck-Ytter, T., Rehnberg, E., & Bölte, S. (2013). Lack of visual orienting to biological motion and audiovisual synchrony in 3-year-olds with autism. PLoS ONE, 8(7), e68816.
  17. Field, T., Woodson, R., Greenberg, R., & Cohen, D. (1982). Discrimination and imitation of facial expression by neonates. Science, 218(4568), 179–181.
  18. Gredebäck, G., Eriksson, M., Schmitow, C., Laeng, B., & Stenberg, G. (2012). Individual differences in face processing: Infants’ scanning patterns and pupil dilations are influenced by the distribution of parental leave. Infancy, 17(1), 79–101.
  19. Grossmann, T., Striano, T., & Friederici, A. D. (2007). Developmental changes in infants’ processing of happy and angry facial expressions: A neurobehavioral study. Brain and Cognition, 64, 30–41.
  20. Guillon, Q., Hadjikhani, N., Baduel, S., & Rogé, B. (2014). Visual social attention in autism spectrum disorder: Insights from eye tracking studies. Neuroscience & Biobehavioral Reviews, 42, 279–297.
  21. Hunnius, S., de Wit, T. C. J., Vrins, S., & von Hofsten, C. (2011). Facing threat: Infants’ and adults’ visual scanning of faces with neutral, happy, sad, angry, and fearful emotional expressions. Cognition & Emotion, 25(2), 193–205.
  22. Idring, S., Lundberg, M., Sturm, H., Dalman, C., Gumpert, C., Rai, D., … Magnusson, C. (2015). Changes in prevalence of autism spectrum disorders in 2001–2011: Findings from the Stockholm Youth Cohort. Journal of Autism and Developmental Disorders, 45(6), 1766–1773.
  23. Johnson, M. H., Senju, A., & Tomalski, P. (2015). The two-process theory of face processing: Modifications based on two decades of data from infants and adults. Neuroscience & Biobehavioral Reviews, 50, 169–179.
  24. Jones, W., & Klin, A. (2013). Attention to eyes is present but in decline in 2–6-month-old infants later diagnosed with autism. Nature, 504(7480), 427–431.
  25. Jones, E. J. H., Gliga, T., Bedford, R., Charman, T., & Johnson, M. H. (2014). Developmental pathways to autism: A review of prospective studies of infants at risk. Neuroscience & Biobehavioral Reviews, 39, 1–33.
  26. Key, A. P. F., & Stone, W. L. (2012). Processing of novel and familiar faces in infants at average and high risk for autism. Developmental Cognitive Neuroscience, 2(2), 244–255.
  27. Kleberg, J. L., Thorup, E., & Falck-Ytter, T. (2017). Visual orienting in children with autism: Hyper-responsiveness to human eyes presented after a brief alerting audio-signal, but hyporesponsiveness to eyes. Autism Research, 10(2), 246–250.
  28. Leppänen, J. M., & Nelson, C. A. (2012). Early development of fear processing. Current Directions in Psychological Science, 21(3), 200–204.
  29. Lewkowicz, D. J., & Hansen-Tift, A. M. (2012). Infants deploy selective attention to the mouth of a talking face when learning speech. Proceedings of the National Academy of Sciences of the United States of America, 109(5), 1431–1436.
  30. Lundqvist, D., Flykt, A., & Öhman, A. (1998). The Karolinska Directed Emotional Faces (KDEF). CD ROM from Department of Clinical Neuroscience, Psychology Section, Karolinska Institutet.
  31. McClure, E. B. (2000). A meta-analytic review of sex differences in facial expression processing and their development in infants, children, and adolescents. Psychological Bulletin, 126(3), 424–453.
  32. Merin, N., Young, G. S., Ozonoff, S., & Rogers, S. J. (2007). Visual fixation patterns during reciprocal social interaction distinguish a subgroup of 6-month-old infants at-risk for autism from comparison infants. Journal of Autism and Developmental Disorders, 37(1), 108–121.
  33. Messinger, D., Young, G. S., Ozonoff, S., Dobkins, K., Carter, A., Zwaigenbaum, L., … Sigman, M. (2013). Beyond autism: A Baby Siblings Research Consortium study of high-risk children at three years of age. Journal of the American Academy of Child & Adolescent Psychiatry, 52, 300–308.e1.
  34. Messinger, D. S., Young, G. S., Webb, S. J., Ozonoff, S., Bryson, S. E., Carter, A., … Zwaigenbaum, L. (2015). Early sex differences are not autism-specific: A Baby Siblings Research Consortium (BSRC) study. Molecular Autism, 6(1), 32.
  35. Moriuchi, J. M., Klin, A., & Jones, W. (2017). Mechanisms of diminished attention to eyes in autism. American Journal of Psychiatry, 174(1), 26–35.
  36. Mullen, E. (1995). Mullen Scales of Early Learning. Circle Pines, MN: AGS.
  37. Oakes, L. M., & Ellis, A. E. (2013). An eye-tracking investigation of developmental changes in infants’ exploration of upright and inverted human faces. Infancy, 18(1), 134–148.
  38. Ozonoff, S., Young, G. S., Carter, A., Messinger, D., Yirmiya, N., Zwaigenbaum, L., … Stone, W. L. (2011). Recurrence risk for autism spectrum disorders: A Baby Siblings Research Consortium study. Pediatrics.
  39. Pascalis, O., De Haan, M., Nelson, C. A., & De Schonen, S. (1998). Long-term recognition memory for faces assessed by visual paired comparison in 3- and 6-month-old infants. Journal of Experimental Psychology: Learning, Memory, and Cognition, 24(1), 249.
  40. Peltola, M. J., Leppänen, J. M., Palokangas, T., & Hietanen, J. K. (2008). Fearful faces modulate looking duration and attention disengagement in 7-month-old infants. Developmental Science, 11(1), 60–68.
  41. Reinhardt, V. P., Wetherby, A. M., Schatschneider, C., & Lord, C. (2015). Examination of sex differences in a large sample of young children with autism spectrum disorder and typical development. Journal of Autism and Developmental Disorders, 45(3), 697–706.
  42. Rennels, J. L., & Cummings, A. J. (2013). Sex differences in facial scanning: Similarities and dissimilarities between infants and adults. International Journal of Behavioral Development, 37(2), 111–117.
  43. Robinson, E. B., Lichtenstein, P., Anckarsäter, H., Happé, F., & Ronald, A. (2013). Examining and interpreting the female protective effect against autistic behavior. Proceedings of the National Academy of Sciences of the United States of America, 110(13), 5258–5262.
  44. Senju, A., & Csibra, G. (2008). Gaze following in human infants depends on communicative signals. Current Biology.
  45. Shic, F., Macari, S., & Chawarska, K. (2014). Speech disturbs face scanning in 6-month-old infants who develop autism spectrum disorder. Biological Psychiatry, 75(3), 231–237.
  46. Tenenbaum, E. J., Shah, R. J., Sobel, D. M., Malle, B. F., & Morgan, J. L. (2013). Increased focus on the mouth among infants in the first year of life: A longitudinal eye-tracking study. Infancy, 18(4), 534–553.
  47. Wagner, J., Luyster, R. J., Moustapha, H., Tager-Flusberg, H., & Nelson, C. A. (2016). Differential attention to faces in infant siblings of children with autism spectrum disorder and associations with later social and language ability. International Journal of Behavioral Development.
  48. Wagner, J. B., Luyster, R. J., Tager-Flusberg, H., & Nelson, C. A. (2016). Greater pupil size in response to emotional faces as an early marker of social-communicative difficulties in infants at high risk for autism. Infancy, 21(5), 560–581.
  49. Wass, S. V., Smith, T. J., & Johnson, M. H. (2013). Parsing eye-tracking data of variable quality to provide accurate fixation duration estimates in infants and adults. Behavior Research Methods, 45(1), 229–250.
  50. Young, G. S., Merin, N., Rogers, S. J., & Ozonoff, S. (2009). Gaze behavior and affect at 6 months: Predicting clinical outcomes and language development in typically developing infants and infants at risk for autism. Developmental Science, 12(5), 798–814.

Copyright information

© The Author(s) 2018

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  • Johan Lundin Kleberg (1, 2), email author
  • Pär Nyström (1)
  • Sven Bölte (3, 4)
  • Terje Falck-Ytter (1, 3, 5)

  1. Department of Psychology, Uppsala Child and Baby Lab, Uppsala University, Uppsala, Sweden
  2. Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
  3. Department of Women’s and Children’s Health, Center of Neurodevelopmental Disorders (KIND), Karolinska Institutet, Stockholm, Sweden
  4. Child and Adolescent Psychiatry, Center for Psychiatry Research, Stockholm County Council, Stockholm, Sweden
  5. The Swedish Collegium for Advanced Study (SCAS), Uppsala, Sweden
