Senior Lecturer National University of Singapore Singapore, Singapore
Introduction:: Laboratory (lab) investigations play an integral role in the undergraduate science curriculum and form part of the assessment process. Students are typically assessed at the end point via a lab report, which has unfortunately evolved into a creatively plagiarized submission. Because every procedure involves multiple steps, students receive no feedback on which specific step they performed incorrectly; they see only an end result, which limits their learning. Learning outcomes with regard to competency and self-efficacy are adversely affected, with repercussions for students' future biomedical careers. In addition, despite our best efforts, a local university in Singapore reported several cases of cheating in online exams. Cheating in online assessments became a grave concern during the pandemic and serves as a wake-up call for us to reinvent the way students are assessed. Education needs to change, and we need a new approach to assessment, because success in today's complex, dynamic world is not easily or optimally measured by multiple-choice responses on simple knowledge tests that support neither deep learning nor the acquisition of complex competencies. The objective of this research was to design, develop and iteratively evaluate whether a metaverse learning environment in the form of virtual reality (VR) lab simulations can be used not only for training and experiential learning, but also as a mode of assessing learners' competence.
Materials and Methods:: VR lab simulations were designed, developed (using Unreal Engine) and deployed on a single cloud-based platform for training and stealth assessment, consisting of a serious-games content repository, a centralized databank, and a dashboard module. As students performed tasks, their in-game interactions triggered events, and these event data were channelled to the analytics module. The event data were analysed in real time, and the results were reflected in the game dashboards, illustrating the competencies assessed.
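As a minimal sketch of the event-to-dashboard pipeline described above, in-game events can be mapped to competency scores by a rubric and aggregated as they stream in. The event names, competency labels, and point values below are hypothetical illustrations, not taken from the actual platform.

```python
from collections import defaultdict

# Hypothetical rubric mapping in-game event names to the competency they
# evidence and the points awarded (negative points for errors).
RUBRIC = {
    "pipette_correct_volume": ("liquid_handling", 2),
    "reagent_added_in_order": ("protocol_adherence", 3),
    "sample_contaminated":    ("aseptic_technique", -2),
}

def score_events(events):
    """Aggregate a stream of in-game event names into per-competency
    scores of the kind a real-time dashboard could display."""
    scores = defaultdict(int)
    for name in events:
        if name in RUBRIC:
            competency, points = RUBRIC[name]
            scores[competency] += points
    return dict(scores)

demo = ["pipette_correct_volume", "reagent_added_in_order",
        "pipette_correct_volume"]
# score_events(demo) → {"liquid_handling": 4, "protocol_adherence": 3}
```

In a deployed system each event would arrive with a timestamp and learner ID so the dashboard can attribute competencies per student; the sketch keeps only the scoring step.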
The simulations were designed based on the curriculum for BN3402 Bioanalytics for Engineers, a technical elective offered to undergraduate BME students (n=40). The overall goal of the simulations was for learners to conduct RT-PCR experiments for the detection of SARS-CoV-2 in nasopharyngeal swab specimens taken from residents of a high-risk apartment block in Singapore. Learners used a standard diagnostic kit across three lab experiments, delivered as three separate games. The games required an HP Reverb G2 headset and a set of controllers.
After completing the simulations, students filled out a 16-item design validation questionnaire (on a Likert scale) covering purpose, content, fiction/narration, mechanics, aesthetics, framing and motivation. Because student perception may not correlate with actual learning, direct evidence of learning was gathered by comparing five students who had passed the module one semester earlier with five students who had never enrolled in it. Students' individual in-game scores were captured on the dashboard and analysed statistically.
Results, Conclusions, and Discussions:: The BN3402 module's rating score was excellent, and the VR lab simulation-based assessment constituted 13% of the final grade. Students' feedback on the VR labs in the module report included, “Technology introduced was innovative and new”, “VR experience was engaging and something different from usual lectures in school”, “VR lab was a new experience”, “Assignments were very interesting and had significant relevance to real-world problem”, and “Got to learn about the different bioanalytical methods and experience experimental procedures virtually.”
All students enrolled in BN3402 (n=39) went through the VR lab simulation assessment at the end of the module. Twenty of them voluntarily and anonymously completed the design validation questionnaire post-assessment. The simulations achieved scores of 13.3/15 for Purpose, 17.65/20 for Content, 8.75/10 for Fiction/Narrative, 4.4/5 for Mechanics, 13.7/15 for Aesthetics, 13.45/15 for Framing, and 12.7/15 for Motivation/Emotion/Memory. The total Likert score was 84/100.
The mean score difference between students who had completed the module previously and students who had never enrolled was 18.28% for game 1 and 43.54% for game 2. Unpaired t-tests confirmed that the mean score differences for both simulations were statistically significant (p < 0.05), providing direct evidence that the simulations measured the skills specified in the module's measurable outcomes.
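For illustration, an unpaired (Student's) two-sample t-test of the kind reported above can be computed with the pooled-variance formula. The two score lists below are hypothetical stand-ins for the captured in-game scores, not the study's data.

```python
from statistics import mean, variance

def unpaired_t(a, b):
    """Student's two-sample t statistic with pooled variance."""
    na, nb = len(a), len(b)
    # Pooled variance weights each sample's variance by its degrees of freedom.
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# Hypothetical in-game scores (%) for 5 previously enrolled students
# versus 5 students who never took the module.
enrolled = [82, 78, 85, 80, 76]
control  = [60, 55, 64, 58, 62]

t = unpaired_t(enrolled, control)
# Two-tailed critical value for df = 5 + 5 - 2 = 8 at alpha = 0.05 is 2.306,
# so |t| > 2.306 corresponds to p < 0.05.
significant = abs(t) > 2.306
```

In practice a statistics package (e.g. SciPy's `ttest_ind`) would report the exact p-value directly; comparing against the tabulated critical value keeps this sketch dependency-free.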
Content validation by four external experts was conducted so that this new method of assessment could be readily accepted by the community. It was the experts' first experience of VR, and they took the first 15 minutes to familiarize themselves with the metaverse environment and game controllers. They then executed the steps correctly, were immersed in the flow of the standard operating protocol, and validated its accuracy. They appreciated the high fidelity of the environment and expressed that this is a unique way to assess students' individual lab competence, which would not have been possible otherwise.
In conclusion, stealth assessment using VR lab simulations integrated with an analytical dashboard shows promise for accurately training and assessing a learner's lab procedural competence and theoretical knowledge of a bioanalytical method.
Acknowledgements:: I would like to acknowledge funding from innovPLUS 2020 (IAL/SUSS). I would like to thank my solution partners, Ivan Boo from Serious Games Asia and Mark Wong from FX Media, and my final year project students Xavier Tan Wei Chen and Tan Jeng Woon from the Department of Biomedical Engineering, National University of Singapore.