Multimodal AI Agent to Support Students’ Motion-Based Educational Game Play
Peer reviewed, Journal article
Published version
Date
2021
Original version
CEUR Workshop Proceedings. 2021, 2902, 102-112.

Abstract
The increased accessibility of lightweight sensors (e.g., eye trackers, physiological wristbands, and motion sensors) enables the extraction of students' cognitive, physiological, skeletal, and affective data as they engage with Motion-Based Educational Games (MBEG). Real-time analysis of this Multi-Modal Data (MMD) affords a deeper understanding of students' learning experiences and opens new opportunities for timely, contextual, and personalised feedback to support the student. In this work-in-progress, we present the MMD-AI Agent for Learning: an MMD-driven, Artificially Intelligent (AI) agent-based ecosystem composed of three separate software components that work together to facilitate students' learning during their interactions with MBEG. The Crunch Wizard receives MMD from eye trackers, physiological wristbands, a web camera, and motion sensors worn by a student during game play, and derives relevant cognitive, physiological, and affective measurements. The AI agent identifies and delivers appropriate feedback mechanisms to support a student's MBEG learning experience. The Dashboard visualises these measurements to keep teachers informed of a student's progress. We discuss the foundational work that motivated the ecosystem's design, report on the design and development accomplished thus far, and outline future directions.