#4266

Poster Session

Biosignal-Adaptive Language Learning In Virtual Reality

Time not set

We introduce an approach to adaptive language learning that uses embodied intelligent virtual agents as personalized tutors within an immersive extended reality (XR) environment. At its core is the use of real-time biosignals, specifically heart rate data collected through wearable devices such as the Samsung Watch 7, to personalize educational interactions. By continuously monitoring changes in the learner's emotional and cognitive state, indicated by fluctuations in heart rate, the virtual tutor can adapt its instructional approach. For example, if a Japanese learner's heart rate rises while written errors are being corrected, indicating possible stress or difficulty, the tutor might slow its speech or simplify the language task to keep the learner comfortable and engaged. Our current adaptation method relies primarily on average heart rate, but we are also exploring more detailed analyses, such as tracking inter-beat intervals over time and detecting patterns in specific heart rate variability frequency bands. We are further investigating whether additional biosignals, such as electrocardiography (ECG) and photoplethysmography (PPG), can improve the accuracy of stress prediction. The full system integrates digital human tutors powered by ConvAI, advanced language models, XR environments, and efficient data handling. Early results indicate strong potential for enhancing language learning through personalized, engaging, and responsive experiences.
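
The following is a minimal sketch of the average-heart-rate adaptation rule described in the abstract, assuming a hypothetical interface: the function name, threshold values, and tutor settings are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass
from statistics import mean


@dataclass
class TutorAdaptation:
    speech_rate: float    # 1.0 = normal speaking speed
    task_difficulty: str  # "simplified" or "standard"


def adapt_to_heart_rate(recent_bpm: list[float],
                        resting_bpm: float,
                        stress_margin: float = 10.0) -> TutorAdaptation:
    """Return tutor settings from a window of recent heart rate samples.

    If the learner's average heart rate rises noticeably above their
    resting baseline (a simple stress proxy), slow the tutor's speech
    and simplify the current task; otherwise keep default settings.
    The stress_margin of 10 bpm is an illustrative assumption.
    """
    avg_bpm = mean(recent_bpm)
    if avg_bpm > resting_bpm + stress_margin:
        return TutorAdaptation(speech_rate=0.8, task_difficulty="simplified")
    return TutorAdaptation(speech_rate=1.0, task_difficulty="standard")


# Example: samples streamed from a wearable during an error-correction task
settings = adapt_to_heart_rate([88, 92, 95, 97], resting_bpm=72)
print(settings)
```

In a full system, the returned settings would presumably be forwarded to the virtual tutor (e.g., its speech synthesis rate and task selection), and the averaging window could be replaced by the HRV-based analyses the abstract mentions as future work.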