Integrating Procedural Information into XR Simulation for Basic Life Support Training
DOI: https://doi.org/10.56198/4qwqar51
Keywords: XR Simulation, Procedural Information, Basic Life Support (BLS) Training
Abstract
This study presents a design case for developing procedural information in extended reality (XR)-based Basic Life Support (BLS) training simulations, focusing on aligning immersive learning environments with XR-specific features. Procedural information was developed using the 4C/ID model, incorporating demonstration, just-in-time information, and corrective feedback to support learners in effectively acquiring and automating BLS skills. Demonstrations provided visual and auditory guidance so learners could observe complete task execution, while just-in-time information offered step-specific procedural rules to facilitate cognitive integration during task performance. Corrective feedback enabled learners to identify and address errors, ensuring accurate procedural execution. The simulation emphasized spatial interaction and engagement to provide learners with authentic, immersive practice experiences that bridge theoretical knowledge and practical application. Expert reviews and iterative pilot testing refined the design, improving usability and instructional effectiveness. This study highlights the importance of procedural information design in creating XR-based simulations that promote learner engagement and skill acquisition. Future research should examine the effectiveness of multimodal interaction and specific XR design features in supporting cognitive and procedural learning outcomes, with the aim of optimizing knowledge retention and transfer across diverse educational contexts.
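To make the described structure concrete, the sketch below illustrates one way the three kinds of procedural information named in the abstract (just-in-time prompts tied to individual BLS steps, checks on learner actions, and corrective feedback on errors) could be organized in an XR simulation runtime. It is not taken from the paper: the class names, step definitions, thresholds, and the shape of the learner-action data are all hypothetical assumptions made for illustration only.

```python
# Illustrative sketch (not from the paper): a minimal sequencer showing how
# just-in-time (JIT) prompts and corrective feedback could be attached to BLS
# steps in an XR simulation. All names, steps, and thresholds are hypothetical.

from dataclasses import dataclass
from typing import Callable

@dataclass
class BLSStep:
    name: str
    jit_prompt: str                  # step-specific procedural rule shown on entry
    check: Callable[[dict], bool]    # validates learner action data from the XR runtime
    corrective_feedback: str         # shown when the check fails

STEPS = [
    BLSStep("check_responsiveness",
            "Tap the shoulders and shout: 'Are you okay?'",
            lambda a: a.get("shoulder_tapped", False),
            "You skipped the responsiveness check. Tap both shoulders before proceeding."),
    BLSStep("call_for_help",
            "Point to a bystander and ask them to call emergency services.",
            lambda a: a.get("help_requested", False),
            "Delegate the emergency call before starting compressions."),
    BLSStep("chest_compressions",
            "Compress at 100-120/min, 5-6 cm deep, in the centre of the chest.",
            lambda a: 100 <= a.get("compression_rate", 0) <= 120,
            "Your compression rate is outside 100-120/min. Follow the metronome cue."),
]

def run_step(step: BLSStep, learner_action: dict) -> bool:
    """Present JIT information, evaluate the learner's action, give feedback."""
    print(f"[JIT] {step.jit_prompt}")
    if step.check(learner_action):
        print(f"[OK] {step.name} completed correctly.")
        return True
    print(f"[FEEDBACK] {step.corrective_feedback}")
    return False

if __name__ == "__main__":
    # Simulated learner data for one practice attempt (compression rate too low).
    attempt = {"shoulder_tapped": True, "help_requested": True, "compression_rate": 95}
    for step in STEPS:
        run_step(step, attempt)
```

In such a design, the demonstration phase described in the abstract would precede this loop (e.g., a recorded or avatar-led walkthrough of all steps), while the per-step prompts and feedback above correspond to the just-in-time and corrective components.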
License
The papers in this book comprise the proceedings of the meeting mentioned on the cover and title page. They reflect the authors' opinions and, in the interests of timely dissemination, are published as presented and without change. Their inclusion in this publication does not necessarily constitute endorsement by the editors or the Immersive Learning Research Network.
Contact: publications@immersivelrn.org