Measuring Multimodal Data in an XR-Based Training Simulation Environment
DOI: https://doi.org/10.56198/ITIG24XNY
Keywords: XR-based Simulation, HoloLens, Multimodal Data, Learner Performance
Abstract
This pilot study investigates the potential of multimodal data for predicting a user's performance in an XR-based training simulation environment. XR-based learning simulations have rapidly gained popularity in training fields such as medicine, nursing, and STEAM education because they provide training opportunities in an immersive and authentic environment. In this study, multimodal data such as eye-tracking and behavioral data (task completion time, accuracy, head movement, and hand movement) were collected, along with participants' perceptions of the simulation gathered through a subjective survey questionnaire. We aim to investigate how, and what kinds of, multimodal data can be collected in an XR-based learning environment to optimize the training curriculum, improve the user experience, and enhance learning outcomes. To achieve this, we collected multimodal data on attention, cognitive load, and performance behavior (head movement, hand movement, and eye information) from 22 medical residents who participated in an XR-based simulation focused on strabismus diagnosis for resident training. As a result, we were able to collect fifteen multimodal data types for attention and nine data types for performance behavior. These results can serve as foundational work toward predicting learners' performance states in XR-based training simulations.
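The abstract does not specify the paper's logging pipeline, but the behavioral measures it names (task completion time, head movement, hand movement, gaze) can be derived from per-frame sensor samples exported by an XR headset. The following is a minimal illustrative sketch, not the authors' implementation: all type and field names (MultimodalSample, summarize_session, gaze_target, etc.) are hypothetical, and it assumes the application logs timestamped head, hand, and gaze readings per frame.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple
import math

Vec3 = Tuple[float, float, float]

@dataclass
class MultimodalSample:
    """One logged frame of XR sensor data (field names are illustrative)."""
    timestamp: float      # seconds since task start
    head_position: Vec3   # headset position in world space (meters)
    hand_position: Vec3   # dominant-hand position in world space (meters)
    gaze_target: str      # label of the object the eye-gaze ray hit, if any

def summarize_session(samples: List[MultimodalSample]) -> Dict:
    """Derive simple behavioral and attention measures from a logged session."""
    if len(samples) < 2:
        return {}
    # Path length of head and hand trajectories between consecutive frames.
    head_path = sum(math.dist(a.head_position, b.head_position)
                    for a, b in zip(samples, samples[1:]))
    hand_path = sum(math.dist(a.hand_position, b.hand_position)
                    for a, b in zip(samples, samples[1:]))
    # Dwell time per gaze target as a coarse attention measure.
    dwell: Dict[str, float] = {}
    for a, b in zip(samples, samples[1:]):
        dwell[a.gaze_target] = dwell.get(a.gaze_target, 0.0) + (b.timestamp - a.timestamp)
    return {
        "task_completion_time_s": samples[-1].timestamp - samples[0].timestamp,
        "head_movement_total_m": head_path,
        "hand_movement_total_m": hand_path,
        "gaze_dwell_time_s": dwell,
    }
```

Summaries of this kind (per-task completion time, movement totals, and gaze dwell per object) are one plausible way to turn raw multimodal streams into the per-learner features the study uses to characterize attention and performance behavior.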
License
Copyright (c) 2023 The Immersive Learning Research Network

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
The papers in this book comprise the proceedings of the meeting mentioned on the cover and title page. They reflect the authors' opinions and, in the interests of timely dissemination, are published as presented and without change. Their inclusion in this publication does not necessarily constitute endorsement by the editors or the Immersive Learning Research Network.
Contact: publications@immersivelrn.org