Development of an audiovisual speech perception app for children with autism spectrum disorders

Julia Irwin, Jonathan Preston, Lawrence Brancazio, Michael D'angelo, Jacqueline Turcios

Research output: Contribution to journal › Article › peer-review

19 Scopus citations

Abstract

Perception of spoken language requires attention to acoustic as well as visible phonetic information. This article reviews the known differences in audiovisual speech perception in children with autism spectrum disorders (ASD) and specifies the need for interventions that address this construct. Elements of an audiovisual training program are described. This researcher-developed program, delivered via an iPad app, presents natural speech in the context of increasing noise, but supported with a speaking face. Children are cued to attend to visible articulatory information to assist in perception of the spoken words. Data from four children with ASD, ages 8-10, are presented, showing that the children improved their performance on an untrained auditory speech-in-noise task.

Original language: English (US)
Pages (from-to): 76-83
Number of pages: 8
Journal: Clinical Linguistics and Phonetics
Volume: 29
Issue number: 1
State: Published - Jan 1 2015

Keywords

  • Audiovisual app
  • Autism spectrum disorder
  • Speech perception

ASJC Scopus subject areas

  • Language and Linguistics
  • Linguistics and Language
  • Speech and Hearing
