17–19 May 2024
Meijo University Nagoya Dome Campus
Asia/Tokyo timezone

Pose detection using an AI-based web application offers a novel approach to observing gesture use in second language learning.

18 May 2024, 15:00
30m
DN 409 (North Building)

Practice-based Presentation (30 minutes) | AI for Teaching | DN 409: Virtual Reality

Speaker

Robert Olexa (National Institute of Technology, Hakodate College)

Description

Traditional classroom-based language teaching often lacks the context and natural elements crucial for effective language acquisition. Although research has shown the importance of turn-taking, disfluency, embodiment, and gesture use in natural conversation, gathering quantitative data to identify patterns has been challenging for language teachers. Previous research relied on annotating video stills, which is informative but not quantitative, making it difficult to identify significant differences in gestures over temporal units of conversation. Recent AI advances have led to tools that allow teachers to document and analyze situated language use. This presentation introduces a web application developed by researchers in the Yamada Lab at the National Institute of Technology, Hakodate College, which enables amateur researchers to apply pose detection in their own research. The application outputs a CSV file for statistical analysis, adding rigor and helping to identify typological differences in gesture use. The presentation will demonstrate a case study observing iconic gestures in online VR language exchanges and provide participants with hands-on experience using a Quest 3 headset and the pose-detection app developed by the Yamada Lab. By raising awareness of authentic gesture use and embodiment in speech, we can develop more effective language materials that improve students' communication skills and help move language teaching forward.
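The abstract describes a workflow in which pose detection is run over recorded interaction and the results are exported as a CSV file for statistical analysis. The sketch below is not the Yamada Lab application itself; it is a minimal illustration of that general workflow, assuming the open-source MediaPipe Pose model and OpenCV, with hypothetical file names (session.mp4, pose_landmarks.csv).

# Minimal sketch of a pose-detection-to-CSV pipeline (NOT the Yamada Lab app):
# read a recorded session video, run MediaPipe Pose on each frame, and write
# one CSV row per frame for later statistical analysis.
import csv
import cv2
import mediapipe as mp

VIDEO_PATH = "session.mp4"       # hypothetical input recording
CSV_PATH = "pose_landmarks.csv"  # hypothetical output file

mp_pose = mp.solutions.pose

with mp_pose.Pose(static_image_mode=False) as pose, \
        open(CSV_PATH, "w", newline="") as f:
    writer = csv.writer(f)
    # Header: frame index, then x / y / visibility for each of the
    # 33 MediaPipe pose landmarks.
    header = ["frame"]
    for i in range(33):
        header += [f"lm{i}_x", f"lm{i}_y", f"lm{i}_vis"]
    writer.writerow(header)

    cap = cv2.VideoCapture(VIDEO_PATH)
    frame_idx = 0
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            row = [frame_idx]
            for lm in results.pose_landmarks.landmark:
                row += [lm.x, lm.y, lm.visibility]
            writer.writerow(row)
        frame_idx += 1
    cap.release()

Because each row corresponds to one video frame, gesture measures derived from the landmark columns can be aggregated over temporal units of conversation in any statistics package, which is the kind of quantitative analysis the abstract points to.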

Primary authors

Kazumasa Yamada (National Institute of Technology, Hakodate College)
Robert Olexa (National Institute of Technology, Hakodate College)

Presentation materials

There are no materials yet.