Comparison of Different Interaction Formats for Automatized Analysis of Symptoms in Children with Autism Spectrum Disorder

  • Conference paper
  • Published in: Universal Access in Human-Computer Interaction (HCII 2023)

Abstract

Observation and assessment of interactional difficulties in children with suspected Autism Spectrum Disorder (ASD) are part of the gold standard in the ASD diagnostic process. The risk for an ASD diagnosis can be assessed across the three typical symptom categories: speech/language, facial expression, and interaction. However, identifying ASD is staff-intensive and time-consuming, often delaying the start of therapy and support for the child and the family. Automated analyses, for example of eye contact or facial (mimic) responses, could facilitate early screening and thereby contribute to more efficient diagnostic routines. Such automated screening can build on advanced speech and facial expression processing technology that is already available. However, the validity of automated screening depends on the elicitability of behavior that signifies ASD. Therefore, a proof of principle is required to demonstrate that a mediated approach does indeed elicit ASD symptoms that are usually observed during natural face-to-face interaction. Research aim: The goal of this paper is to present preliminary results on validating the diagnostic comparability of real and simulated interactions between a child with ASD and an examiner. Specifically, simulated interactions would be most useful for automated screening if they could be based on pre-recorded stimuli rather than only being mediated in real time. Method: For this proof-of-principle approach we apply a within-subjects design with five conditions in which the symptoms emerging in children diagnosed with ASD are contrasted with regard to speech, facial expressions, and communication behavior. Both the authenticity of the communication situation (face-to-face vs. video call vs. pre-recorded video) and the child's interlocutor (real person vs. video recording vs. digital avatar) are varied, and the conditions are fully counterbalanced. After each condition, the child is prompted to reflect on the perceived authenticity of the test situation and their interactional involvement. Participants are verbally fluent boys of elementary school age with an established ASD diagnosis. Results and implications: Although data collection is still ongoing, we present preliminary results based on two children. The observations suggest that digitally delivered content is highly appealing and perceived as appropriate for children with ASD. Further data collection and analyses will inform whether typical and relevant ASD symptoms in the participants will be observable in mediated and even fully automated conditions. However, our first impressions already suggest great potential for automated measurements with low personnel effort.
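
To make the counterbalancing mentioned in the Method description more concrete, the following is a minimal Swift sketch that assigns condition orders via a cyclic 5×5 Latin square. It is an illustration under stated assumptions only: the five condition labels and the Latin-square scheme are placeholders, since the abstract does not specify the exact conditions or the counterbalancing procedure used.

```swift
// Hypothetical illustration: condition labels and the cyclic Latin-square
// scheme are assumptions, not the authors' actual protocol.
let conditions = [
    "face-to-face with examiner",
    "live video call with examiner",
    "pre-recorded video of examiner",
    "live video call with avatar",
    "pre-recorded avatar video"
]

/// Returns one condition order per participant such that every condition
/// appears exactly once in each serial position across n participants.
func latinSquareOrders(_ items: [String]) -> [[String]] {
    let n = items.count
    return (0..<n).map { shift in
        (0..<n).map { position in items[(position + shift) % n] }
    }
}

for (index, order) in latinSquareOrders(conditions).enumerated() {
    print("Participant \(index + 1): \(order.joined(separator: " -> "))")
}
```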

Notes

  1. The research project IDEAS (Identification of autism spectrum disorder using speech and facial expression recognition) is funded by the German Federal Ministry of Education and Research (BMBF), funding code: 13GW0584D.

  2. For the research purposes of the IDEAS project, KIZMO GmbH has developed a native iOS app (KIZMO Face-Analyzer) for use on iPads to capture facial and gaze data on demand. The app is not available to the public; a hypothetical sketch of this kind of on-device capture follows these notes.

  3. KIZMO GmbH has written code in MATLAB to analyze the raw data from the KIZMO Face-Analyzer application. The code is not available to the public.
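
To illustrate the capture step described in Note 2, the following is a minimal Swift sketch of on-demand facial and gaze capture on an iPad using ARKit face tracking. This is an assumption-laden sketch: the actual KIZMO Face-Analyzer implementation is not public, and the class name, the logged values, and the choice of ARKit are hypothetical.

```swift
import Foundation
import ARKit

// Hypothetical sketch only: the real KIZMO Face-Analyzer is not public; this class,
// its logging format, and the use of ARKit are assumptions for illustration.
final class FaceCaptureSession: NSObject, ARSessionDelegate {
    private let session = ARSession()

    /// Starts on-demand face tracking (requires a device with a TrueDepth camera).
    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    /// Stops tracking, e.g. at the end of an experimental condition.
    func stop() {
        session.pause()
    }

    // Delegate callback for each updated face anchor: logs one facial blend-shape
    // coefficient and the estimated gaze (look-at) point, raw data that a
    // downstream analysis (such as the MATLAB pipeline in Note 3) could consume.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            let smileLeft = face.blendShapes[.mouthSmileLeft]?.doubleValue ?? 0
            let gaze = face.lookAtPoint  // expressed in the face anchor's coordinate space
            print("t=\(Date().timeIntervalSince1970) smileLeft=\(smileLeft) gaze=\(gaze)")
        }
    }
}
```

In practice the per-frame values would be written to a file rather than printed, so that the raw stream can be analyzed offline.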

Author information

Correspondence to Larissa Pliska.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Pliska, L., Neitzel, I., Buschermöhle, M., Ritterfeld, U. (2023). Comparison of Different Interaction Formats for Automatized Analysis of Symptoms in Children with Autism Spectrum Disorder. In: Antona, M., Stephanidis, C. (eds) Universal Access in Human-Computer Interaction. HCII 2023. Lecture Notes in Computer Science, vol 14020. Springer, Cham. https://doi.org/10.1007/978-3-031-35681-0_42

  • DOI: https://doi.org/10.1007/978-3-031-35681-0_42

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35680-3

  • Online ISBN: 978-3-031-35681-0

  • eBook Packages: Computer Science, Computer Science (R0)
