Speaker
KEYWORDS
listening
multimodality
phonology
ABSTRACT
Listening is obviously an auditory process, involving the monitoring and mental processing of acoustic information and attending to meaning. However, there are documented links between visual and acoustic information processing in the brain during listening. The presentation will first provide an easily digestible overview of the links between vision and listening from neuroscience, psychology, and language teaching. The next part of the presentation details a quasi-experimental study and a pilot study, both conducted in a classroom setting, examining discrimination of the English vowels /æ/, /ʌ/, /ɜː/, and /ɔː/. The quasi-experimental study involved two groups exposed to audio-only or audiovisual training stimuli, with a total sample size of 40. Although at the time of submission data are still being collected for a delayed post-test and initial data analysis has not been completed, the pilot study found that while immediate post-test differences were small, on the delayed post-test the audiovisual group discriminated vowels more accurately than the audio-only group. Finally, the presentation returns to the overview of the literature presented in the introduction, linking the findings of the studies to modality and discussing ramifications for the classroom, including whether audiovisual or audio-only input is more useful for language acquisition and comprehension.
| TITLE | Does what you see affect listening? Multimodality in listening |
|---|---|
| RELEVANT SIG | Listening, Literature in Language Teaching |
| FORMAT | Research-oriented oral presentation, face-to-face (25 minutes, including Q&A) |