Reading differences in eye-tracking data as a marker of high-functioning autism in adults and comparison to results from web-related tasks


Yaneva V., Ha L. A., Eraslan Ş., Yılmaz Y., Mitkov R.

Neural Engineering Techniques for Autism Spectrum Disorder: Volume 2: Diagnosis and Clinical Analysis, Elsevier, pp. 63-79, 2022

  • Publication Type: Book Chapter / Research Book
  • Publication Date: 2022
  • DOI: 10.1016/b978-0-12-824421-0.00011-4
  • Publisher: Elsevier
  • Pages: pp. 63-79
  • Keywords: autism in adults, autism spectrum disorder, eye tracking, High-functioning autism, machine learning, web tasks
  • Affiliated with Middle East Technical University Northern Cyprus Campus: Yes

Abstract

Automated detection of high-functioning autism in adults is a highly challenging and understudied problem. In search of a way to automatically detect the condition, we explore how eye-tracking data from reading tasks can be used. Our previous work shows that eye-tracking data from web page-processing tasks provide a suitable signal for detecting high-functioning autism, achieving 75% accuracy for our sample. This line of research leads us to ask further: how do reading passages and reading tasks compare to web pages and web-processing tasks for the purpose of autism detection? In this chapter, we hypothesize that the reading differences between adults with and without high-functioning autism, captured through eye-tracking data, would be a useful marker for differentiating between the two groups. In this new study, we collect data from 20 reading passages and 60 comprehension questions associated with these passages, organized into three separate data subsets. These data are used to train a variety of machine learning classifiers using various combinations of features and granularity levels for gaze data extraction. The best-performing classifiers achieve accuracies of 57%, 83%, and 70% for the three subsets of data. These results are discussed in terms of their implications for autism detection and reading research, as well as in comparison to similar studies using web pages and images as stimuli.
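To make the classification setup described in the abstract concrete, the following is a minimal, hypothetical sketch of the general approach: training a classifier on per-participant gaze features and evaluating it with cross-validated accuracy. The feature names, group sizes, synthetic distributions, and choice of random forest are illustrative assumptions, not the chapter's actual features, data, or classifiers.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic per-participant gaze features (illustrative only):
# mean fixation duration (ms), fixation count, regression rate.
n_per_group = 20
asd = np.column_stack([
    rng.normal(260, 30, n_per_group),    # assumed longer fixations
    rng.normal(110, 15, n_per_group),    # assumed more fixations
    rng.normal(0.18, 0.04, n_per_group), # assumed more regressions
])
control = np.column_stack([
    rng.normal(230, 30, n_per_group),
    rng.normal(95, 15, n_per_group),
    rng.normal(0.12, 0.04, n_per_group),
])
X = np.vstack([asd, control])
y = np.array([1] * n_per_group + [0] * n_per_group)

# Cross-validated accuracy over the labeled participants.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean CV accuracy: {scores.mean():.2f}")
```

In the actual study, features of this kind would be extracted from recorded gaze data at different granularity levels (e.g., per passage or per question) rather than simulated.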