Original article

Virtual Reality Instructional Design in Orthopedic Physical Therapy Education: A Mixed-Methods Usability Test

Simulation & Gaming, 2022, Vol. 0(0) 1–24. © The Author(s) 2022. Article reuse guidelines: sagepub.com/journals-permissions. DOI: 10.1177/10468781211073646. journals.sagepub.com/home/sag

Aaron J. Hartstein (1), Margaret Verkuyl (2), Kory Zimney (3), Jean Yockey (4), and Patti Berg-Poppe (5)

1 Division of Physical Therapy, Shenandoah University, Winchester, VA, USA
2 School of Community and Health Studies, Centennial College, Toronto, ON, Canada
3 Department of Physical Therapy, University of South Dakota, Vermillion, SD, USA
4 Department of Nursing, University of South Dakota, Vermillion, SD, USA
5 Department of Physical Therapy, University of South Dakota, Vermillion, SD, USA

Corresponding Author: Aaron J. Hartstein, Division of Physical Therapy, Shenandoah University, 1775 North Sector Court, Winchester, VA, USA. Email: [email protected]

Abstract

Background. Physical therapy education benefits from innovative and authentic learning opportunities. However, factors that influence the acceptance of educational technology must be assessed prior to curricular adoption. The purpose of this study was to assess the perceived ease of use and perceived usefulness of a virtual reality (VR) learning experience developed to promote the clinical decision-making of student physical therapists.

Methods. A VR learning experience was developed, and an established two-stage usability test assessed player experience as well as the user’s perception of both ease of use and usefulness. Two experts evaluated the VR learning experience and provided feedback. Six student physical therapists and five faculty members completed the VR experience, responded to two questionnaires, and participated in a semi-structured interview to further assess ease of use and utility.

Results. High levels of perceived ease of use, perceived usefulness, and positive player experiences were reported by both faculty and student users. Faculty users

perceived a significantly greater amount of educational and clinical utility from the VR simulation than did student users. Semi-structured interviews revealed themes related to ease of use, benefits, modeling of professional behaviors, and realism.

Conclusion. Quantitative data supported faculty and student users’ perceptions of ease of use, utility towards learning, practical application, and several constructs related to user experience. Qualitative data provided recommendations to modify design features of the VR experience. This study provides a template to design, produce, and assess the usability of an immersive VR learning experience that may be replicated by other health professions educators where current evidence is limited.

Keywords

virtual reality, clinical decision-making, physical therapy education, health professions, simulation, usability testing

Background

Effective and efficient clinical reasoning skills have been described as the cornerstone of physical therapy practice and are associated with improved outcomes (Prokop, 2018; Wainwright et al., 2011). Although these skills remain essential features of autonomous practice, the most effective methods for their development have not yet been identified (Vendrely, 2005; Wainwright et al., 2011). To meet the needs of today’s patient population, Jensen et al. (2017) recommend a transformation of physical therapy education that is informed by other disciplines, grounded in adult learning theory, and includes the use of technology to create authentic learning experiences. Experiential learning activities that encourage the application of foundational knowledge, provide feedback, and promote self-reflection may advance the clinical decision-making skills necessary to appropriately shape professional identity (Huhn et al., 2018; Jensen et al., 2017).

The use of innovative technology and virtual learning experiences is frequently found within the medical and nursing education literature. While virtual simulations encourage learning across many domains (Foronda et al., 2020), their effects on clinical reasoning, critical thinking, or clinical decision-making among physical therapy and other graduate health professions students are less certain and insufficiently represented (Kononowicz et al., 2019). When compared to traditional instruction, several studies favor virtual simulation for its effects on clinical decision-making (Bediang et al., 2013; Botezatu et al., 2010; Diehl et al., 2017; Lehmann et al., 2015; Mardani et al., 2020; Middeke et al., 2018; Schittek-Janda et al., 2004; Weiner et al., 2016). However, marked variability among operational definitions of the virtual patient and the sporadic application of adult learning theory limit the generalizability of these findings without clarification of their specific design variants (Cant et al., 2019; Cheng et al., 2016; Kardong-Edgren et al., 2019). When design features and instructional design are

considered, however, studies that described virtual simulation experiences with high levels of fidelity, moderate levels of immersion, an interactive patient depiction, and adult learning features such as pre-briefing, debriefing, and consistent feedback frequently reported significant improvements in clinical decision-making measures (Diehl et al., 2017; Kleinert et al., 2015; Middeke et al., 2018).

Virtual reality (VR) simulation describes a computer-generated educational modality in which users interact with an authentic clinical scenario in a highly immersive sensory environment to achieve specific learning goals (Cant et al., 2019; Kardong-Edgren et al., 2019; Lioce et al., 2020). Beyond a two-dimensional or screen-based learning experience, VR instruction offers design features associated with engagement, motivation, and deep learning (Huang and Liaw, 2018; Plotzky et al., 2021). The fully immersive sensory environment of VR, often achieved through a head-mounted display, is thought to create opportunities for presence learning and to stimulate an emotional response that resembles authentic patient interactions (Pottle, 2019). Additionally, it has been shown that VR instruction improves students’ self-efficacy, communication skills, and clinical judgment, and allows the safe, contextual application of knowledge – all features commonly associated with effective clinical decision-making (Huang and Liaw, 2018; Makransky et al., 2019; Real et al., 2017).

Principles of constructivism and experiential learning theory offer support for the use of VR simulation to advance clinical decision-making skills. VR experiences encourage learners to actively construct meaning through contextualized practice in low-stakes environments (Pantelidis, 2009). VR learning experiences also acknowledge the importance of environmental context, social interaction (through pre- and/or debriefing sessions), and guided reflection as meaning is extracted from a learning activity (Dennick, 2016). VR clinical simulations exemplify these principles as they provide multiple representations of reality, present real-world authentic scenarios in meaningful contexts, and use case-based learning to test hypotheses and facilitate reflection (Dennick, 2016). The process of responding to emerging data, selecting necessary assessment techniques, and consistently analyzing results in comparison to a working hypothesis also parallels the experiential adult learning cycle described by Kolb (1984).

Despite their instructional benefits, no studies have explored the development or use of an immersive VR environment to facilitate clinical decision-making skills in physical therapy education. Considering its educational potential, and in response to the critical need for authentic and situated instruction in physical therapy education informed by andragogy (Jensen et al., 2017; Wainwright and Gwyer, 2017), a VR learning experience was developed to promote students’ clinical decision-making skills. Beyond development, however, the efficacy and uptake of new educational technology are heavily dependent on user experience (Foronda et al., 2016). Although usability testing is an often overlooked step in health professions education, it is necessary to identify challenges that may otherwise negatively affect the user’s perception of utility and thus inhibit learning outcomes (Verkuyl et al., 2018).
The purpose of this study was to examine the perceived usability of a VR instructional method specifically designed to enhance physical therapy students’ clinical decision-making abilities.

Intervention

To facilitate the development of student physical therapist clinical decision-making, an immersive VR learning experience was designed and created. To simulate a patient encounter, an Insta360 Pro 360-degree single camera was used to capture subjective and objective examination techniques. Raw audiovisual footage was edited in post-production and separated into more than 50 distinct assessment techniques as MP4 files, which ranged in duration from 11 seconds to five minutes, each with a verbalized response of the clinical findings. Assessment techniques were grouped together and presented in the order of a typical orthopedic physical examination. The authentic environment in which clinical decision-making occurs was considered, as the VR experience was filmed in an outpatient physical therapy clinic that included typical sights, sounds, and equipment (McBee et al., 2018). Additionally, a virtual clipboard feature was embedded to present relevant pre-examination paperwork, such as a physician’s order for physical therapy, a medical screening form, and a region-specific standardized outcome measure.

Reflecting recent suggestions to unify the nomenclature when describing virtual clinical simulation by including specifics regarding fidelity, immersion, and patient depiction, the design features of this VR learning experience included a high level of fidelity, a high level of concrete, objective sensory immersion (through a head-mounted display), and patient depiction by a standardized patient (Cant et al., 2019; Kardong-Edgren et al., 2019). For convenience, the learning experience was designed to work on multiple platforms with various specifications. In this study, however, users experienced the VR instructional tool with an Oculus Quest 2 with a single fast-switch LCD display, a resolution of 1832x1920 per eye, and a 90 Hz refresh rate. Users were free to select the order of assessment techniques in a branching format to encourage clinical decision-making (See Figure 1).

Figure 1. Branching Format within VR Experience.
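In software terms, the branching catalog described above is a set of tagged video clips grouped by examination category. The sketch below illustrates one way such a catalog could be represented; it is a minimal Python illustration, and every name and field in it is hypothetical rather than taken from the authors' implementation.

```python
# Minimal sketch of a branching clip catalog; all names and fields are
# hypothetical and not taken from the authors' implementation.
from dataclasses import dataclass

@dataclass
class AssessmentClip:
    clip_id: str
    title: str        # e.g., a named special test
    category: str     # grouping shown to the user (e.g., "Range of motion")
    video_file: str   # MP4 edited from the 360-degree footage
    duration_s: int   # clips ranged from 11 seconds to five minutes
    feedback: str     # appropriateness/utility feedback shown after viewing

CATALOG = [
    AssessmentClip("rom_01", "Active shoulder flexion", "Range of motion",
                   "rom_01.mp4", 45, "High utility early in the examination..."),
    AssessmentClip("sp_07", "Hawkins-Kennedy test", "Special tests",
                   "sp_07.mp4", 30, "Helps implicate subacromial structures..."),
    # ...the full experience contained more than 50 distinct techniques
]

def clips_in_category(category: str) -> list[AssessmentClip]:
    """Users browse by category and view clips in any order (branching format)."""
    return [c for c in CATALOG if c.category == category]
```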

The content of the virtual simulation depicted a patient with left shoulder pain, consistent with primary external subacromial impingement syndrome (See Figure 2).

Figure 2. Patient Depiction as Viewed within Oculus Quest 2 Head-Mounted Display. Printed with permission.

Following an instructional pre-briefing video clip, all users viewed a standardized subjective examination. Users were then prompted to select a primary working hypothesis and rank two additional differential diagnoses from a list of 24 possible pathologies. After committing to a working hypothesis and ranking their differential diagnosis list, users continued to examine the virtual patient and test their hypotheses by selecting relevant objective tests and measures from the available options. Following each category of assessment, users again selected their primary working hypothesis and ranked their differential diagnoses accordingly. Users continued to examine the patient in the immersive VR setting for up to 60 minutes, or until they had received enough information to confirm their working hypothesis.

Features of the experience were designed with the needs of the adult learner in mind and aimed to promote reflection, engagement, and motivation in a contextually relevant environment. Throughout the VR learning experience, for example, users were provided with feedback for each assessment procedure selected regarding its appropriateness and clinical utility in differential diagnosis. Users were also provided with feedback describing their decision-making process in relation to an expert user’s order of test selection and hypothesis ranking, thereby allowing comparison of both diagnostic accuracy and efficiency.
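Procedurally, the encounter reduces to a loop: choose a test, view its clip, read the embedded feedback, re-rank the hypothesis list, and stop once the working hypothesis is confirmed or the 60-minute limit is reached, while logging the selection order for comparison against an expert's. The following is a minimal, runnable sketch of that loop under those stated rules; the data, names, and stopping conditions are entirely illustrative, not the authors' code.

```python
# Hypothetical sketch of the encounter loop and expert-order comparison;
# data, names, and stopping rules are illustrative, not the authors' code.
import random

CATALOG = {"rom_01": 45, "strength_03": 60, "sp_07": 30}   # clip_id -> duration (s)
EXPERT_ORDER = ["rom_01", "strength_03", "sp_07"]          # an expert's selection order
TIME_LIMIT_S = 60 * 60                                     # encounters capped at 60 min

def run_encounter(choose_next_test, hypothesis_confirmed):
    """choose_next_test(seen) -> clip_id; hypothesis_confirmed(seen) -> bool."""
    seen, elapsed = [], 0
    while elapsed < TIME_LIMIT_S and not hypothesis_confirmed(seen):
        clip = choose_next_test(seen)   # branching: user picks any unseen technique
        seen.append(clip)
        elapsed += CATALOG[clip]        # each clip also triggers embedded feedback
    # Efficiency feedback: positional agreement between user and expert ordering.
    agreement = sum(a == b for a, b in zip(seen, EXPERT_ORDER)) / len(EXPERT_ORDER)
    return seen, agreement

# Toy run: pick unseen clips at random and stop after three selections.
order, agreement = run_encounter(
    lambda seen: random.choice([c for c in CATALOG if c not in seen]),
    lambda seen: len(seen) >= 3,
)
print(order, f"agreement with expert order: {agreement:.0%}")
```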

Methods

Study Design

This concurrent mixed-methods usability study was conducted to identify challenges, promote future adjustments, and ultimately improve users’ experiences (Nielsen, 1994; Verkuyl et al., 2018). This study sought to answer the following questions: What is the usability (perceived ease of use and perceived usefulness) of a VR learning experience that has been developed for physical therapy students? What are the experiences of users exposed to an immersive VR instructional method? Additionally, acknowledging the correlation between faculty acceptance and future classroom application of technology (Hsu, 2010), this study sought to determine: How do faculty and student user perceptions of ease of use, usefulness, and user experience compare to one another?

Consistent with previous usability tests, the Technology Acceptance Model (TAM) was used to guide the initial assessment of this VR learning experience (Davis, 1989). The TAM suggests that acceptance and adoption of a new technology are dependent on two key determinants – the user’s perception of both usefulness and ease of use (Davis, 1989). An established two-stage usability test concurrently captured quantitative data (via surveys) and qualitative data (via interviews, observations, and heuristic checklists) to assess users’ perceptions related to ease of use and usefulness of a VR learning experience (Verkuyl et al., 2016; Verkuyl et al., 2018).

Ethical Approval

This study was conducted within the school of health professions of a small, private institution. Prior to any data collection, informed consent was received from all participants, and ethical approval was obtained from the sponsoring universities’ Institutional Review Boards.

Participants

Convenience sampling methods were used to recruit six second-year physical therapy students and five faculty members to test the VR learning experience. The faculty and student users included seven (63.6%) women and four (36.4%) men. Student users’ ages ranged from the 20- to 22-year-old group to the 27- to 29-year-old age group. Faculty users’ ages ranged from the 30- to 32-year-old group to the 51- to 53-year-old age group. Four (36.4%) participants reported they had experienced VR before, while seven (63.6%) participants noted they had never experienced an immersive VR environment. More faculty users (n = 3) had experience with VR than student users (n = 1). Six (54.5%) participants noted they never use any educational technology, while five (45.5%) reported occasional use.

Research Protocol and Instruments

Expert usability test (Stage 1). In the first stage of usability testing, two expert VR developers completed the physical therapy VR learning experience. As described by Nielsen (1994), previously established guidelines, or heuristics, were used to review this VR instructional method. Each expert individually completed the VR learning experience and, using a checklist, identified challenges related to the visibility of the system, the terminology used, user control, the consistency of the experience, error prevention, efficiency of use, and aesthetics of design, and provided recommendations for improvements (Verkuyl et al., 2016; Verkuyl et al., 2018).

User usability test (Stage 2)

Think Aloud. First, all students and faculty members completed the VR learning experience using the think-aloud approach (Nahm et al., 2004). In a one-on-one setting, each participant was asked to verbally share their thoughts while immersed in the VR patient encounter. While each participant independently completed the experience, anecdotal notes were taken to document participant comments and observed areas of struggle, confusion, and progress. A checklist was used to record information and participants’ responses regarding the viewing of initial instructions and a pre-briefing video clip, the viewing of the subjective examination clip, the initial selection of a primary working hypothesis and at least two differential diagnoses, subsequent selections of physical examination tests and measures, the response to embedded feedback, their attitude, their level of engagement or interest, and any other additional comments verbalized (Verkuyl et al., 2018).

Orthopedic Clinical Simulation Survey. Second, after completing the VR learning experience, all students and faculty members completed the Orthopedic Clinical Simulation Survey (OCSS; Verkuyl et al., 2016; Verkuyl et al., 2018). The OCSS, which is based on previously described usability testing measures, such as the Pediatric Clinical Simulation Survey (Verkuyl et al., 2016), and the TAM framework, was adapted with assistance and permission. This survey included 17 Likert-style questions, ranging from strongly disagree = 1 to strongly agree = 5, to assess the users’ perceived usefulness and ease of use of the VR learning experience. When added together, the total score on the OCSS was out of a possible 85. Scores on individual items of the OCSS were averaged, with a maximum score of five per item. Internal consistency measures of similar, non-adapted simulation surveys have demonstrated acceptable levels of reliability, with Cronbach’s alpha results ranging from 0.70 to 0.93 (Albu et al., 2015; Atack et al., 2010; Verkuyl et al., 2016; Verkuyl et al., 2018).
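Scoring the OCSS is simple arithmetic: each respondent's 17 five-point items are summed (maximum 85), each item is averaged across respondents (maximum 5), and internal consistency is summarized with Cronbach's alpha. The snippet below sketches those computations; the alpha formula is standard, but the response matrix is randomly generated for illustration and is not the study data.

```python
# Sketch of OCSS scoring and Cronbach's alpha; the response matrix below is
# randomly generated for illustration and is not the study data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of Likert scores."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of respondent totals
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
ocss = rng.integers(1, 6, size=(11, 17))   # 11 respondents x 17 items, scored 1-5

totals = ocss.sum(axis=1)                  # per-respondent total out of 85
item_means = ocss.mean(axis=0)             # per-item mean out of 5
print(totals, item_means.round(2), round(cronbach_alpha(ocss), 2))
```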

Player Experience Inventory. Participants also completed the 30-item Player Experience Inventory (PXI; Abeele et al., 2020). This seven-point Likert-style scale, ranging from strongly disagree = −3 to strongly agree = +3, measures functional consequences of a virtual experience, such as ease of control, and emotional or psychosocial consequences, such as immersion (Abeele et al., 2020). When added together, the total score on the PXI was out of a possible 90. Scores on individual items of the PXI were averaged, with a maximum score of three per item. The multiple constructs on the PXI assess the user’s sense of meaning, mastery, immersion, autonomy, curiosity, ease of control, and challenge in relation to the VR experience (Abeele et al., 2020). The PXI has demonstrated both discriminant and convergent validity, performed well among various sample sizes, and each of its 10 constructs has exhibited acceptable composite reliability, with values ranging between 0.73 and 0.92 (Abeele et al., 2020).

Semi-Structured Interviews. Third, following the completion of the OCSS and PXI, a semi-structured interview was completed with each participant concerning their experience with the VR instructional method. An interview guide, including several scripted questions and open-ended questions, was used for questioning both the students and the faculty members. Questions focused on each user’s initial reaction, areas of challenge or confusion, technical problems encountered, assessment of utility, thoughts on embedded feedback, recommendations for its application, areas of design improvement, and suggestions for integration of the experience into the existing curriculum. Each interview was captured on a high-quality recording device.

Statistical Analysis

SPSS Version 27 (IBM, Armonk, New York) was used to analyze the quantitative data from this user usability test. Specifically, descriptive statistics, including reliability coefficients, were used to analyze the data from the student and faculty responses on both subsections of the OCSS and the PXI (Abeele et al., 2020; Verkuyl et al., 2016). Means and standard deviations were calculated for both the perceived ease of use and perceived usefulness portions of the OCSS (Verkuyl et al., 2016; Verkuyl et al., 2018) as well as the individual items on the PXI (Abeele et al., 2020). Cronbach’s alpha was used to measure the internal consistency reliability of the responses on the two subsections of the OCSS and the 10 individual constructs on the PXI (Abeele et al., 2020). Reflecting the small but sufficient sample size of usability methodology, non-parametric inferential statistics were also used to compare the mean perceived ease of use, mean perceived usefulness, and mean PXI responses of faculty members to those of student users.
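With five faculty and six student users, the non-parametric group comparison reported later (see Table 2) is a Mann-Whitney U test on the subscale scores. The study used SPSS; the sketch below reproduces the same style of analysis in Python with placeholder scores, including the standard normal (z) approximation of the kind reported alongside U in Table 2.

```python
# Mann-Whitney U comparison with placeholder scores (not the study data).
import numpy as np
from scipy.stats import mannwhitneyu

faculty = np.array([4.9, 5.0, 4.8, 4.9, 5.0])        # 5 faculty subscale scores
students = np.array([4.2, 4.0, 4.3, 4.1, 4.5, 3.8])  # 6 student subscale scores

res = mannwhitneyu(faculty, students, alternative="two-sided")
n1, n2 = len(faculty), len(students)
# Normal (z) approximation to U, without tie correction.
z = (res.statistic - n1 * n2 / 2) / np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
print(f"U = {res.statistic}, z = {z:.2f}, p = {res.pvalue:.3f}")
```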

An inductive, conventional qualitative content analysis was used to describe how students and faculty experienced the VR instructional technique (Hsieh and Shannon, 2005). To analyze the content of the qualitative data collected, results from the expert usability heuristic tool and the think-aloud user checklists were documented, while the recorded semi-structured interviews were transcribed verbatim. To enhance the credibility of the qualitative analysis, all qualitative data were reviewed by two researchers (Merriam and Tisdell, 2016). First, all data were reviewed several times to gain an understanding of the multiple forms of written and verbal information collected. Next, all qualitative data were independently coded to identify categories and subcategories. Primary themes emerged out of this analysis by combining similar areas, or units, of content. Consensus agreement between the two reviewers produced a final list of categories (Thomas, 2006). While data were viewed collectively to capture overarching themes regarding users’ experience with a VR learning platform, individual responses were also assessed as they encouraged future adjustments to the experience. Concepts or themes that emerged from this analysis added depth and meaning to the calculated reliability coefficients and inferential data and uncovered features that could inhibit the adoption of this technology.

Results

Expert Testing

Two expert VR developers reviewed the immersive orthopedic VR experience and provided detailed feedback using a heuristic evaluation checklist (Verkuyl et al., 2016; Verkuyl et al., 2018). Agreement between the two experts was described regarding the controls and freedom of the experience, the flexibility and efficiency of the user interface, the aesthetics and minimalist design of the VR experience, and the common language used to prevent user error. Overall, both experts reported that the VR learning experience was easy to use, fun, and an innovative instructional method to facilitate decision-making in student physical therapists.

Suggestions were made to enhance the user experience. For example, the use of a tooltip feature – an informative graphical and/or text representation of an interface feature, such as the controls of the VR controller – was recommended to ensure users’ knowledge of hardware terminology. To minimize cognitive load, it was recommended that the user interface remember the most recently selected category and return the user to this section of the catalog after viewing an objective video. The addition of titles for each examination clip, an indication of item completion or a progress bar, and a help button were also suggested. Both experts offered major recommendations to decrease the recall demands of the instructional pre-briefing video that preceded the VR experience. For example, the experts suggested editing the instructional video into shorter interactive tutorials with audiovisual cues that would allow the user to prepare and practice the skills needed to view the virtual clipboard, organize hypotheses, select various tests and measures from the catalog of possibilities, watch a video, and receive feedback. Prior to user testing, each of the suggested modifications was made to the VR clinical simulation.

User Testing

Eleven subjects (five faculty members and six students) participated in the usability testing. This sample size is supported by Rubin and Chisnell (2008), who reported that usability studies commonly include five to seven participants, with the identification of most challenges and the saturation of feedback occurring after four users experience a new technology.

Think aloud. All participants watched the instructional pre-brief video without difficulty. Intuitively, all participants progressed to the next part of the experience, where patient information was presented on a virtual clipboard. Due to a lack of resolution within the visual field of the head-mounted display, eight of the participants described minor challenges reading the text on the forms of the virtual clipboard. Some participants required cuing to recognize the multiple pages of patient information that were available on the virtual clipboard and recommended an explicit cue, such as an arrow or text on the bottom of the clipboard.

Participants started the subjective examination video without difficulty and recognized the third-person perspective of the learning experience. Prior to selecting their first objective test, all participants explored and selected a primary working hypothesis and a ranked differential diagnosis list using the drag-and-drop feature. All users appeared to select the first objective examination video without difficulty and described the process as intuitive. Subsequent objective tests were also selected as users navigated the available list of examination techniques with ease. As intended, all users selected objective examination techniques in a non-linear fashion, appearing to choose follow-up tests and measures with the purpose of either confirming their primary hypothesis or refuting a competing differential diagnosis.

Most users identified the icon that provided feedback about their most recent selection. One student and one faculty user (18.2%), however, had forgotten the verbal instruction that was initially provided in the pre-brief and failed to read the feedback without cuing. Following the first few objective examination selections, users appeared to recognize the common clinical routine of completing a specific test or measure, reading the provided feedback, and then revisiting their working hypothesis and adjusting their rank-ordered list, if necessary. Although all users eventually returned to their working hypothesis and list of competing differential diagnoses throughout the encounter, two students and two faculty users (36.4%) required prompting to consistently return to this screen. For one student and one faculty user (18.2%), the head-mounted display positioned the hypothesis and differential diagnosis selection window slightly out of the user’s visual field, which required additional scanning for recognition.

All users appeared engaged and interested throughout the orthopedic VR learning experience. Participants were frequently observed moving their own shoulder around throughout the encounter and verbalizing their thought process by commenting, “okay, that is what I expected to see,” “I think I know what is causing her symptoms now,” or “I think I will do one more test to make sure I am right.” Further, users reported a high level of interest and a positive attitude throughout their experiences, frequently expressing opinions such as, “this is just so cool,” “this will be great for students to learn,” and “I want to stay in longer and explore more.”

All participants completed the VR experience within 45 minutes. There were no major technical difficulties encountered during the think-aloud portion of this usability testing. Additionally, while it was not a primary outcome, no adverse effects, such as dizziness, headache, fatigue, or nausea, were reported during or after the immersive experience.

Orthopedic Clinical Simulation Survey. The total mean score on the OCSS was 74.82 (SD 6.55) out of a possible 85, and total results ranged from 59 to 81 out of 85 (Verkuyl et al., 2016; Verkuyl et al., 2018). The mean scores for all items on the ease of use subscale ranged from 3.64 to 4.91, with a maximum score of 5 (See Table 1). No statistically significant difference (p > .05) was noted in group means between faculty and student users on the questions relating to ease of use (See Table 2). The mean scores for all items on the usefulness subscale ranged from 4.17 to 4.83, with a maximum score of 5 (See Table 1). A statistically significant difference (p < .05) was noted between faculty and student user means on the questions relating to usefulness and favored the faculty perspective (See Table 2). Faculty reported higher mean scores on all items in the usefulness subscale.

Participants’ mean scores between agree and strongly agree suggest that the VR simulation was fun (mean 4.82/5), was presented in a realistic manner (mean 4.91/5), would help students prepare for clinical practice (mean 4.83/5), and would be a useful addition to clinical experience for students (mean 4.75/5). The item, “The text information presented on the screen was easy to read” (Verkuyl et al., 2016, p. 84), received the lowest mean score of 3.64. Cronbach’s alpha was 0.83 for the ease of use subscale and 0.81 for the usefulness subscale.

Player Experience Inventory. The total mean score on the PXI was 63.82 out of a possible 90, and total results ranged from 30 to 82 out of 90 (See Table 3) (Abeele et al., 2020). The mean score for each of the 10 constructs ranged from 0.36 (Progress Feedback) to 2.82 (Clarity of Goals) (See Table 3). Four participants, who required cuing to view embedded feedback, selected either slightly disagree or disagree on all three questions related to feedback. No statistically significant difference (p > .05) was noted in group means between faculty and student users on the questions relating to user experience (See Table 2).

Cronbach’s alpha for the PXI was 0.91. Of the 10 individual constructs on this measure, six provided evidence of reliability, including mastery (0.87), progress feedback (0.90), audiovisual appeal (0.92), challenge (0.94), ease of control (0.94), and clarity of goals (1.00). The individual construct of curiosity (0.71) demonstrated acceptable internal consistency, while three individual constructs, including meaning (0.55), immersion (0.40), and autonomy (0.63), demonstrated poor internal consistency.

Semi-Structured Interviews. Several common themes were reported by both faculty and student participants as they described their experiences, the utility and benefit of learning in an immersive environment, challenges they encountered, and recommendations for future modifications (See Table 4).

Table 1. Orthopedic Clinical Simulation Survey (OCSS): Mean Scores and Standard Deviations.

| Survey Item (maximum score: 5) | Faculty Mean (SD) | Student Mean (SD) | Overall Mean (SD) |
| --- | --- | --- | --- |
| Ease of Use | | | |
| 1. It was easy to learn how to use the orthopedic VR simulation | 4.60 (0.55) | 4.50 (0.55) | 4.55 (0.52) |
| 2. The text information presented on the screen was clear | 4.40 (0.55) | 4.17 (0.75) | 4.27 (0.65) |
| 3. The text information presented on the screen was easy to read | 4.0 (0.70) | 3.33 (0.82) | 3.64 (0.81) |
| 4. It was easy to know what to do at each stage of the VR simulation | 3.60 (0.54) | 4.17 (1.17) | 3.88 (0.94) |
| 5. I didn’t have any technical problems using the VR simulation | 4.20 (0.83) | 4.0 (1.26) | 4.09 (1.04) |
| 6. The visual quality of the video was good | 4.60 (0.55) | 4.33 (0.52) | 4.45 (0.52) |
| 7. It was fun using the VR simulation | 5.0 (0) | 4.67 (0.52) | 4.82 (0.40) |
| 8. The pace of the “action” was good | 5.0 (0) | 4.33 (0.52) | 4.64 (0.50) |
| 9. The audio quality of the video was good | 4.40 (0.55) | 3.67 (0.52) | 4.00 (0.63) |
| 10. I wanted to continue working through the VR simulation | 5.0 (0) | 4.33 (0.52) | 4.64 (0.50) |
| 11. The situation presented seemed realistic | 5.0 (0) | 4.83 (0.41) | 4.91 (0.30) |
| Usefulness | | | |
| 12. I think the VR simulation will help students prepare for clinical practice | 5.0 (0) | 4.67 (0.52) | 4.83 (0.40) |
| 13. I think the VR simulation could help me develop communication/clinical skills | 5.0 (0) | 3.33 (0.82) | 4.17 (1.04) |
| 14. The orthopedic VR simulation will be a useful addition to clinical experience for students | 5.0 (0) | 4.5 (0.55) | 4.75 (0.47) |
| 15. The orthopedic VR simulation improved my knowledge of an orthopedic physical therapy assessment | 5.0 (0) | 3.83 (0.98) | 4.42 (0.92) |
| 16. The VR simulation helped me develop my orthopedic assessment skills | 5.0 (0) | 4.17 (0.41) | 4.58 (0.52) |
| 17. I think the VR simulation helped me develop my ability to organize and prioritize assessment strategies | 4.40 (0.49) | 4.33 (0.82) | 4.37 (0.67) |

Note. VR, virtual reality. From “Virtual Gaming to Develop Students’ Pediatric Nursing Skills: A Usability Study,” by M. Verkuyl, L. Atack, R. Mastrilli, and D. Romaniuk, 2016, Nurse Education Today, 46, 81-85 (https://doi.org/10.1016/j.nedt.2016.08.024). Adapted with permission.

Table 2. OCSS and PXI Faculty and Student User Comparison.

| Measure | Faculty Mean Rank (n = 5) | Student Mean Rank (n = 6) | U | z | p |
| --- | --- | --- | --- | --- | --- |
| OCSS—Ease of Use | 7.1 | 5.08 | 9.5 | −1.03 | 0.30 |
| OCSS—Usefulness | 8.7 | 3.75 | 1.5 | −2.54 | 0.01* |
| PXI | 6.8 | 5.33 | 11 | −0.73 | 0.46 |

Note. *p < .05.

Discussion

The purpose of this study was to determine the usability of a VR learning experience designed to promote student physical therapists’ clinical decision-making skills. The primary results of this study support the acceptance of an orthopedic VR instructional experience for student physical therapists. Overall, both faculty and students reported high levels of usefulness and frequently commented on the authenticity, educational utility, and the potential professional and developmental benefits of the application of this technology. Although the immersive VR experience was intentionally designed and influenced by principles of adult learning theory (Aiello et al., 2012), an established process of usability testing was necessary to assess the factors known to affect the uptake of educational technology. Instructional goals that aim to advance clinical decision-making may otherwise be threatened by poor levels of usability.

Results from the OCSS (Verkuyl et al., 2016; Verkuyl et al., 2018) suggest overall satisfaction with the orthopedic VR learning experience from both faculty (79.2/85) and student (71.2/85) participants, with a total mean score of 75.2/85, or 88.5/100. These findings are similar to the results of previous usability studies investigating the acceptance of educational technology and virtual simulation in other health professions, such as nursing (Padilha et al., 2018). For example, in the usability study of a virtual game for pediatric nursing skills, Verkuyl et al. (2016) reported a mean of 87.6/100 on the Pediatric Clinical Simulation Survey. Verkuyl et al. (2018) and Verkuyl et al. (2019) both described comparable results in usability studies assessing the adoption of virtual gaming designed to promote mental health assessment (90/100) and a digital simulation aimed at enhancing clinical decision-making (78.9/100) when measured by their respective simulation surveys.

Mean values on the OCSS (Verkuyl et al., 2016; Verkuyl et al., 2018), the PXI (Abeele et al., 2020), and interview data suggest consensus agreement related to ease of learning, ease of control, clarity of text instruction, technical functionality, visual quality, pace of the experience, intrigue, clarity of goals, and realism while immersed in the experience. These findings are consistent with the application of virtual learning and serious gaming in other health professions education curricula, such as pharmacy and nursing (Benedict et al., 2013; Forsberg et al., 2011; Lynch-Sauer et al., 2011). Although no statistically significant difference was noted between faculty and student perceived ease of use means (p > .05), three items demonstrated notable descriptive differences between faculty and student user perception. For example, student users more frequently noted difficulty with the visual clarity of the text information on the screen and reported more challenges with the quality of the audio presented. Faculty users noted difficulty progressing through the virtual experience, receiving embedded feedback, and applying directions from the pre-brief.

Table 3. Player Experience Inventory (PXI): Mean Scores and Standard Deviations.

| Construct | Survey Item (maximum score: 3) | Faculty Mean (SD) | Student Mean (SD) | Overall Mean (SD) |
| --- | --- | --- | --- | --- |
| Meaning | Playing the game was meaningful to me. | 2.60 (0.55) | 2.50 (0.84) | 2.55 (0.69) |
| | The game felt relevant to me. | 3.0 (0) | 2.67 (0.52) | 2.82 (0.40) |
| | Playing this game was valuable to me. | 2.80 (0.45) | 2.50 (0.55) | 2.64 (0.50) |
| Curiosity | I wanted to explore how the game evolved. | 2.80 (0.45) | 2.17 (1.17) | 2.45 (0.93) |
| | I wanted to find out how the game progressed. | 2.80 (0.45) | 2.50 (0.55) | 2.64 (0.50) |
| | I felt eager to discover how the game continued. | 2.80 (0.45) | 2.33 (0.52) | 2.55 (0.52) |
| Mastery | I felt I was good at playing this game. | 2.0 (1.22) | 1.50 (1.64) | 1.73 (1.42) |
| | I felt capable while playing the game. | 2.40 (0.55) | 2.17 (1.17) | 2.27 (0.90) |
| | I felt a sense of mastery playing this game. | 2.20 (0.84) | 1.17 (1.72) | 1.64 (1.43) |
| Autonomy | I felt free to play the game in my own way. | 2.80 (0.45) | 2.17 (0.98) | 2.45 (0.82) |
| | I felt like I had choices regarding how I wanted to play this game. | 2.80 (0.45) | 2.50 (0.84) | 2.64 (0.67) |
| | I felt a sense of freedom about how I wanted to play this game. | 2.60 (0.55) | 1.83 (1.60) | 2.18 (1.25) |
| Immersion | I was no longer aware of my surroundings while I was playing. | 1.4 (1.52) | 1.0 (1.67) | 1.18 (1.53) |
| | I was immersed in the game. | 2.40 (0.55) | 1.83 (1.17) | 2.09 (0.94) |
| | I was fully focused on the game. | 2.60 (0.55) | 1.83 (0.98) | 2.18 (0.87) |
| Progress Feedback | The game informed me of my progress in the game. | −0.40 (1.14) | 0.83 (1.94) | 0.27 (1.68) |
| | I could easily assess how I was performing in the game. | 0.0 (1.58) | 0.5 (1.64) | 0.27 (1.56) |
| | The game gave clear feedback on my progress towards the goals. | 0.40 (0.55) | 0.67 (1.51) | 0.55 (1.13) |
| Audiovisual Appeal | I enjoyed the way the game was styled. | 2.60 (0.55) | 2.0 (0.63) | 2.27 (0.65) |
| | I liked the look and feel of the game. | 2.60 (0.55) | 1.83 (1.17) | 2.18 (0.98) |
| | I appreciated the aesthetics of the game. | 2.60 (0.55) | 1.83 (1.17) | 2.18 (0.98) |
| Challenge | The game was not too easy and not too hard to play. | 2.80 (0.45) | 2.33 (0.82) | 2.55 (0.69) |
| | The game was challenging but not too challenging. | 2.60 (0.55) | 2.0 (1.55) | 2.27 (1.19) |
| | The challenges in the game were at the right level of difficulty for me. | 2.80 (0.45) | 2.17 (1.17) | 2.45 (0.93) |
| Ease of Control | It was easy to know how to perform actions in the game. | 2.60 (0.55) | 1.83 (0.75) | 2.18 (0.75) |
| | The actions to control the game were clear to me. | 2.20 (0.84) | 1.83 (0.75) | 2.0 (0.89) |
| | I thought the game was easy to control. | 2.40 (0.55) | 2.00 (0.77) | 2.18 (0.75) |
| Clarity of Goals | I grasped the overall goal of the game. | 2.80 (0.45) | 2.83 (0.41) | 2.82 (0.40) |
| | The goals of the game were clear to me. | 2.80 (0.45) | 2.83 (0.41) | 2.82 (0.40) |
| | I understood the objectives of the game. | 2.80 (0.45) | 2.83 (0.41) | 2.82 (0.40) |

Note. From “Development and Validation of the Player Experience Inventory: A Scale to Measure Player Experience at the Level of Functional and Psychosocial Consequences,” by V. V. Abeele, K. Spiel, L. Nacke, D. Johnson, and K. Gerling, 2020, International Journal of Human-Computer Studies, 135, 1-12 (https://doi.org/10.1016/j.ijhcs.2019.102370). Reprinted with permission.

While these findings did not meet a level of statistical significance (p > .05) and were, in some cases, singular in frequency, the usability methodology suggests these perceptions may limit future acceptance of VR technology and negatively affect learning outcomes (Davis, 1989). In addition to its perceived utility, Cheng et al. (2021) described the relationship between instructors’ ability to negotiate and navigate educational technology and their eventual classroom implementation. Many of the faculty and student users’ recommendations regarding text clarity, audio quality, explicit instruction, and pre-briefing modifications were easily amended.

In addition to being described as easy to use, the orthopedic VR learning experience was perceived as useful by both faculty and student users. Mean values on the usefulness scale of the OCSS (Verkuyl et al., 2016; Verkuyl et al., 2018) between 4.17 and 4.83 out of 5 and the narrow range of these responses suggest high levels of agreement related to the perceived utility of the VR simulation. High mean scores on the PXI and internal consistency of questions relating to meaning and challenge also support the perceived utility of this learning experience and are comparable to other recent usability studies (Verkuyl et al., 2018; Verkuyl et al., 2019). Although group means from both faculty and student users indicate high levels of usefulness (4.9 and 4.14, respectively), the faculty users’ perspective was significantly more favorable than that of the student users (p < .05). Notable differences were found between these groups for items relating to the potential for the virtual simulation to improve communication and clinical skills (item 13), advance the students’ knowledge of orthopedic physical therapy assessment (item 15), and develop the students’ orthopedic assessment skills (item 16).

Table 4. Interview Themes and Recommendations.

Theme: Ease of use

Faculty users:
- “The layout and intent made sense as a clinician and matches the thought process we attempt to teach.”
- “I was surprised by how clear things were.”

Student users:
- “It was really cool, and it worked! I was interested in trying to pick the case apart. It seemed like I was in the clinic, and it let me work through the case because it did what I wanted the clinician to do.”
- “The controls were mostly intuitive, and the features look like the ones on YouTube.”
- “I like that it gives you enough prompts but also allows you freedom.”

Theme: Benefits

Faculty users:
- “I think this is going to be an invaluable learning experience.”
- “We have always struggled with teaching clinical decision-making and connecting the information you glean from each part of the exam and how that relates to your hypothesis.”
- “An effective bridge between classroom and clinical education.”
- “VR seems more engaging and interactive than the traditional observation hours that are required to apply to physical therapy school.”
- “I felt excited and intrigued to continue to move through the experience.”
- “I felt engaged because I had to remain present to work through the case.”

Student users:
- “The nice thing is that it feels like being in clinic with you.”
- “The experience responds to one of the challenges in our musculoskeletal courses by seeing an exam in its entirety, seeing all of the examination options laid out in front of you, and guiding you to select what would be helpful to rule in your primary hypothesis.”
- “Replaying an objective test was helpful and the options made me less fearful of making a mistake.”
- “Using VR would be very helpful as clinic is scary at first and a little stressful. The virtual encounter allows students to see the flow of an exam in a typical outpatient clinic and it would have helped my confidence in how to handle a patient interaction.”

(continued)

Table 4. (continued)

Theme: Modeling

Faculty users:
- “...powerful effects of viewing expert practice at work.”
- “The experience provided a gold standard for effective communication between a clinician and a patient that we might not explicitly teach.”

Student users:
- “I picked up on subtle things that I had not considered, such as the completeness of the subjective examination, the clarity that was revealed with follow-up questioning about the location of pain, non-verbal and verbal interactions between you and the patient, and handling skills.”
- “Seeing an actual physical therapist do the skills and complete the examination has benefit, even without doing the skills.”

Theme: Realism

Faculty users:
- “It’s as close to the real thing as it can be.”
- “Responds to the needs of the visual learner and more useful than a case study, from that perspective.”
- “As opposed to a case study where information is given to a student that they may not have ever considered questioning or collecting, this forces decisions to be made and data to be extracted.”

Student users:
- “The standardized nature of the VR experience was more realistic than a role-playing situation where students are limited by the knowledge of their partner, dependent on their acting skills, and may have a lab partner who is not as serious about the experience.”
- “A realistic way to get information.”
- “It seemed like I was in the clinic with an expert.”

Theme: Experience with technology
- “Difficulty viewing font size, color, and clarity of the text on the pre-subjective forms.”
- “I was a little overwhelmed by the amount of information in the pre-briefing video.”
- “I know this was in the instructions, but I have since forgotten.”

Theme: Recommendations for improvement
- An interactive tutorial or video demonstrating the expected user actions
- Additional instructions for non-auditory learners
- Explicit language on buttons to progress the user through the objective section
- More explicit direction to prompt users to receive feedback
- Move the hypothesis selection window to the left side of the viewer screen

Considering the instructional goal of advancing student physical therapists’ clinical decision-making, the identified difference between faculty and student perceptions of usefulness may be explained, in part, by the typical development of this multifactorial process. Second-year student physical therapists, for example, may not possess the external perspective that more frequent clinical exposure fosters, where declarative and procedural aspects of knowledge are eventually transformed into conceptualized reasoning skills and practical application (Furze et al., 2015; Gilliland and Wainwright, 2017).

Further, the discrepancy noted in perceived usability and several constructs of user experience reflects a difference in the expertise level and clinical competence of the faculty user when compared to the student user. Unlike the master clinician, who is reflective, internally motivated, able to recognize contextual influences, and emotionally engaged during decision-making, the novice learner is often bound by rules, adopts a slower, analytical reasoning approach, and is often unable to recognize broad contextual factors (Carraccio et al., 2008). This difference in user expertise is noted in both the quantitative and qualitative data. For example, while faculty users commented that the realistic VR clinical experience forced decision-making and provided a bridge between classroom knowledge and direct patient-care application, student users appeared to perceive VR instruction as a realistic learning experience instead. As opposed to the competent or expert clinician, who takes responsibility for the clinical decision, perhaps the novice student user perceived their role as the observer in the interaction, where limited responsibility and ownership of decision-making is assumed (Carraccio et al., 2008). A difference in perceived autonomy was similarly noted in the PXI, where faculty user means were higher than student user means on all three items related to autonomy. Rather than limiting the utility of this VR experience, these findings exemplify the continuum of expertise development and offer insight that may direct future instruction aimed at promoting ownership, responsibility, and the autonomy associated with clinical decision-making.

Data from the semi-structured interviews provided additional context to support the educational utility of this VR experience. Student users consistently noted the benefit of the authentic and realistic environment that simulated the clinical experience of treating a patient with a faculty member present. Further, student users frequently expanded on the benefit of a VR encounter when compared to a case study or other traditional role-playing experiences. These findings parallel a recent scoping review of nursing instructional practices that suggests greater student engagement, ease of access, and perceived safety when virtual simulation is used to enhance diagnostic reasoning (Duff et al., 2016). The low stakes associated with this virtual learning environment allowed students to make mistakes without penalty, watch and re-watch examination techniques multiple times, and experiment with their judgment in a trial-and-error fashion, with the intermittent guidance of feedback.

As recommended, several modifications were made to the VR learning experience to enhance perceived usability and future application.
To encourage more frequent use of embedded feedback, highlighted arrows and text boxes were added to draw user attention to these instructional features.

Additionally, consistent with the typical scanning and reading pattern of moving from left to right, the hypothesis selection window was moved to the user’s left to encourage more frequent reflection and assessment of working hypotheses after each objective examination selection.

The results of usability testing are dependable and transferable, as they routinely highlight areas of concern, capture user satisfaction, and generate solutions to enhance future application (Reijonen and Tarkkanen, 2015). Although generalizability is not typically associated with qualitative research or usability methods, features of this study have implications beyond the specific learning activity tested and the small sample of students and faculty intentionally selected (Leung, 2015). The results are generalizable to future users, and the assessment of this learning modality’s practical utility is generalizable to its future use (Reijonen and Tarkkanen, 2015). Considering the consistent evidence supporting learning outcomes through virtual simulation (Tolarba, 2021) and the anticipated adoption rate of VR instruction in health professions education (Kardong-Edgren et al., 2019), a greater understanding of design features and user experience adds to the literature for developers and educators who plan to implement immersive technology. From this perspective, rather than restricting these findings to the limited population in this study, an analytical view of generalization supports the broader application of usability testing, its procedures, its results, and its influence on future designs and instruction (Brinkmann and Kvale, 2015).

Several limitations may have influenced the findings of this usability study. The convenience sample, which included students from a single cohort and faculty from the same physical therapy program, may not reflect all student experiences. Additionally, the small sample size of this usability study may have affected the power of the inferential analyses comparing faculty and student user experiences. Although modifications to the OCSS (Verkuyl et al., 2016; Verkuyl et al., 2018) are primarily semantic in nature and reflect the difference in clinical setting from the reliable and valid Pediatric Clinical Simulation Survey, the psychometric qualities of this questionnaire are unknown. Constructs of the PXI (Abeele et al., 2020), however, which assess similar features of technology adoption, offered an opportunity for the triangulation of data. The number of items on the OCSS and PXI may have also been prohibitive and created fatigue among student and faculty responders. Recommended modifications were made to the VR learning experience, but time and budgetary constraints prohibited a second round of usability testing prior to comparative study.

Conclusion

As the use of technology in health professions education continues to increase, a thorough assessment of factors known to influence acceptance and adoption becomes a critically necessary step. This study describes the usability testing completed on a VR learning experience designed to enhance clinical decision-making skills among student physical therapists. Both faculty and student users perceived the VR experience as easy to use and useful with regard to learning and clinical development. The two phases of

this usability study revealed quantitative data supporting faculty and student perceptions, positive player experiences, and high levels of internal consistency for both subsections of the OCSS and six of the 10 constructs on the PXI. Qualitative data provided recommendations to modify design features that may further enhance ease of use and usefulness and positively influence learning outcomes. Findings from this study offer educators an effective template to design, produce, and assess the usability of immersive virtual learning experiences that can be replicated in health professions programming where current evidence is limited.

Declaration of Conflicting Interests

The authors declare no conflicts of interest regarding the research, authorship, and publication of this article.

Funding

The author(s) received no financial support for the research, authorship, and/or publication of this article.

Contributorship

From the School of Health Sciences, University of South Dakota, Vermillion, SD (AH, PB-P, KZ, and JY), Division of Physical Therapy, Shenandoah University, Winchester, VA (AH), and School of Community and Health Studies, Centennial College, Toronto, ON (MV).

ORCID iDs

Aaron J. Hartstein: https://orcid.org/0000-0003-2469-3684
Margaret Verkuyl: https://orcid.org/0000-0002-7714-5449
Kory Zimney: https://orcid.org/0000-0002-7513-2126

Supplemental Material

Supplemental material for this article is available online.

References

Abeele, V. V., Spiel, K., Nacke, L., Johnson, D., & Gerling, K. (2020). Development and validation of the player experience inventory: A scale to measure player experiences at the level of functional and psychosocial consequences. International Journal of Human-Computer Studies, 135, 1-12. doi: 10.1016/j.ijhcs.2019.102370

Aiello, P., D’Elia, F., Di Tore, S., & Sibilio, M. (2012). A constructivist approach to virtual reality for experiential learning. E-Learning and Digital Media, 9(3), 317-324. doi: 10.2304/elea.2012.9.3.317

Albu, M., Atack, L., & Srivastava, I. (2015). Simulation and gaming to promote health education: Results of a usability test. Health Education Journal, 74(2), 244-254. doi: 10.1177/0017896914532623

Atack, L., Gignac, P., & Anderson, M. (2010). Getting the right information to the table: Using technology to support evidence-based decision making. Healthcare Management Forum, 23(4), 164-168. doi: 10.1016/j.hcmf.2010.08.009

Bediang, G., Franck, C., Raetzo, M. A., Doell, J., Kamga, Y., Baroz, F., & Geissbuhler, A. (2013). Developing clinical skills using a virtual patient simulator in a resource-limited setting. Studies in Health Technology and Informatics, 192, 102-106. doi: 10.3233/978-1-61499-289-9-102

Benedict, N., Schonder, K., & McGee, J. (2013). Promotion of self-directed learning using virtual patient cases. American Journal of Pharmaceutical Education, 77(7). doi: 10.5688/ajpe777151

Botezatu, M., Hult, H., Tessma, M. K., & Fors, U. (2010). Virtual patient simulation: Knowledge gain or knowledge loss? Medical Teacher, 32(7), 562-568. doi: 10.3109/01421590903514630

Brinkmann, S., & Kvale, S. (2015). Interviews: Learning the craft of qualitative research interviewing (3rd ed.). Los Angeles, CA: Sage.

Cant, R., Cooper, S., Sussex, R., & Bogossian, F. (2019). What’s in a name? Clarifying the nomenclature of virtual simulation. Clinical Simulation in Nursing, 27, 26-30. doi: 10.1016/j.ecns.2018.11.003

Carraccio, C. L., Benson, B. J., Nixon, L. J., & Derstine, P. L. (2008). From the educational bench to the clinical bedside: Translating the Dreyfus developmental model to the learning of clinical skills. Academic Medicine, 83(8), 761-767. doi: 10.1097/ACM.0b013e31817eb632

Cheng, A., Kessler, D., Mackinnon, R., Chang, T. P., Nadkarni, V. M., Hunt, E. A., & Auerbach, M. (2016). Reporting guidelines for health care simulation research: Extensions to the CONSORT and STROBE statements. Clinical Simulation in Nursing, 12, A3-A13. doi: 10.1097/SIH.0000000000000150

Cheng, S. L., Chen, S. B., & Chang, J. C. (2021). Examining the multiplicative relationships between teachers’ competence, value and pedagogical beliefs about technology integration. British Journal of Educational Technology, 52(2), 734-750. doi: 10.1111/bjet.13052

Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319-340. doi: 10.2307/249008

Dennick, R. (2016). Constructivism: Reflections on twenty-five years teaching the constructivist approach in medical education. International Journal of Medical Education, 7, 200-205. doi: 10.5116/ijme.5763.de11

Diehl, L. A., Souza, R. M., Gordan, P. A., Esteves, R. Z., & Coelho, I. C. M. (2017). InsuOnline, an electronic game for medical education on insulin therapy: A randomized controlled trial with primary care physicians. Journal of Medical Internet Research, 19(3), e72. doi: 10.2196/jmir.6944

Duff, D., Miller, L., & Bruce, J. (2016). Online virtual simulation and diagnostic reasoning: A scoping review. Clinical Simulation in Nursing, 12(9), 377-384. doi: 10.1016/j.ecns.2016.04.001

Foronda, C. L., Fernandez-Burgos, M., Nadeau, C., Kelley, C. N., & Henry, M. N. (2020). Virtual simulation in nursing education: A systematic review spanning 1996 to 2018. Simulation in

22 Simulation & Gaming 0(0) Healthcare: The Journal of the Society for Simulation in Healthcare, 15(1), 46–54. doi: 10. 1097/SIH.0000000000000411 Foronda, C. L., Shubeck, K., Swoboda, S. M., Hudson, K. W., Budhathoki, C., Sullivan, N., & Hu, X. (2016). Impact of virtual simulation to teach concepts of disaster triage. Clinical Simulation in Nursing, 12, 137-144. doi:10.1016/j.ecns.2016.02.004 Forsberg, E., Georg, C., Ziegert, K., & Fors, U. (2011). Virtual patients for assessment of clinical reasoning in nursing—A pilot study. Nurse Education Today, 31(8), 757-762. doi: 10.1016/ j.nedt.2010.11.015 Furze, J., Black, L., Hoffman, J., Barr, J. B., Cochran, T. M., & Jensen, G. M. (2015). Exploration of students’ clinical reasoning development in professional physical therapy education. Journal of Physical Therapy Education, 29(3), 22–33. doi:10.1097/00001416-201529030- 00005 Gilliland, S., & Wainwright, S. F. (2017). Patterns of clinical reasoning in physical therapist students. Physical Therapy, 97(5), 499-511. doi:10.1093/ptj/pzx028 Hsieh, H. F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277-1288. doi: 10.1177/1049732305276687 Hsu, S. (2010). The relationship between teacher’s technology integration ability and usage. Journal of Educational Computing Research, 43(3), 309-325. doi: 10.2190/EC.43.3.c Huang, H. M., & Liaw, S. S. (2018). An analysis of learners’ intentions toward virtual reality learning based on constructivist and technology acceptance approaches. International Review of Research in Open and Distributed Learning, 19(1). Doi:10.19173/irrodl.v19i1. 2503 Huhn, K., Black, L., Christensen, N., Furze, J., Vendrely, A., & Wainwright, S. (2018). Clinical reasoning: Survey of teaching methods and assessment in entry-level physical therapist clinical education. Journal of Physical Therapy Education, 32(3), 241–247. doi:10.1097/jte. 0000000000000043 Jensen, G. M., Nordstrom, T., Mostrom, E., Hack, L. M., & Gwyer, J. (2017). National study of excellence and innovation in physical therapist education: Part 1 - Design, method, and results. Physical Therapy, 97(9), 857–874. doi:10.1093/ptj/pzx061 Kardong-Edgren, S., Farra, S. L., Alinier, G., & Young, H. M. (2019). A call to unify definitions of virtual reality. Clinical Simulation in Nursing, 31, 28–34. doi: 10.1016/j.ecns.2019.02.006 Kleinert, R., Heiermann, N., Plum, P. S., Wahba, R., Chang, D. H., Maus, M., & Stippel, D. L. (2015). Web-Based immersive virtual patient simulators: Positive effect on clinical rea- soning in medical education. Journal of Medical Internet Research, 17(11), e263. doi:10. 2196/jmir.5035 Kolb, D. A. (1984). Experience as the source of learning and development. Upper Saddle River, NJ: Prentice Hall Inc. Kononowicz, A. A., Woodham, L. A., Edelbring, S., Stathakarou, N., Davies, D., Saxena, N., & Zary, N. (2019). Virtual patient simulations in health professions education: Systematic review and meta-analysis by the digital health education collaboration. Journal of Medical Internet Research, 21(7), e14676. doi: 10.2196/14676 Lehmann, R., Thiessen, C., Frick, B., Bosse, H. M., Nikendei, C., Hoffmann, G. F., & Hu- wendiek, S. (2015). Improving pediatric basic life support performance through blended

Hartstein et al. 23 learning with web-based virtual patients: Randomized controlled trial. Journal of Medical Internet Research, 17(7), e162. doi:10.2196/jmir.4141 Leung, L. (2015). Validity, reliability, and generalizability in qualitative research. Journal of Family Medicine and Primary Care, 4(3), 324-327. doi:10.4103/2249-4863.161306 Lioce, L., Lopreiato, J., Downing, D., Chang, T.P., Robertson, J. M., Anderson, M., & Ter- minology, Concepts Working Group. (2020). Healthcare Simulation Dictionary – Second Edition. Retrieved from http://www.ssih.org/dictionary Lynch-Sauer, J., VandenBosch, T. M., Kron, F., Gjerde, C. L., Arato, N., Sen, A., & Fetters, M. D. (2011). Nursing students’ attitudes toward video games and related new media technologies. Journal of Nursing Education, 50(9), 513-523. doi: 10.3928/01484834-20110531-04 Makransky, G., Borre-Gude, S., & Mayer, R. E. (2019). Motivational and cognitive benefits of training in immersive virtual reality based on multiple assessments. Journal of Computer Assisted Learning, 35(6), 691–707. doi:10.1111/jcal.12375 Mardani, M., Zarifsanaiey, N., Cheraghian, S., & Naeeni, S. K. (2020). Effectiveness of virtual patients in teaching clinical decision-making skills to dental students. Journal of Dental Education, 84(5), 615-623. doi: 10.1002/jdd.12045 McBee, E., Ratcliffe, T., Schuwirth, L., O’Neill, D., Meyer, H., Madden, S. J., & Durning, S. J. (2018). Context and clinical reasoning: Understanding the medical student perspective. Perspectives on Medical Education, 7(4), 256–263. doi:10.1007/s40037- 018-0417-x Merriam, S. B., & Tisdell, E. J. (2016). Qualitative research: A guide to design and Im- plementation (4th ed.). San Francisco, CA: Jossey-Bass. doi:10.1177/0741713616671930 Middeke, A., Anders, S., Schuelper, M., Raupach, T., & Schuelper, N. (2018). Training of clinical reasoning with a Serious Game versus small-group problem-based learning: A prospective study. PLoS One, 13(9), e0203851. doi: 10.1371/journal.pone.0203851 Nahm, E. S., Preece, J., Resnick, B., & Mills, M. E. (2004). Usability of health Web sites for older adults: A preliminary study. CIN: Computers, Informatics, Nursing, 22(6), 326-334. doi: 10. 1097/00024665-200411000-00007 Nielsen, J. (1994). Usability Engineering. San Diego, CA: Academic Press. Padilha, J. M., Machado, P. P., Ribeiro, A. L., & Ramos, J. L. (2018). Clinical virtual simulation in nursing education. Clinical Simulation in Nursing, 15(C), 13-18. doi: 10.1016/j.ecns. 2017.09.005 Pantelidis, V. S. (2009). Reasons to use virtual reality in education and training courses and a model to determine when to use virtual reality. Themes in Science and Technology Edu- cation, 2(1-2), 59-70. Retrieved from http://earthlab.uoi.gr/theste/index.php/theste Plotzky, C., Lindwedel, U., Sorber, M., Loessl, B., Konig, P., Kunze, C., & Meng, M. (2021). Virtual reality simulations in nursing education: A systematic mapping review. Nursing Education Today, 101, 1-11. doi:10.1016/j.nedt.2021.104868 Pottle, J. (2019). Virtual reality and the transformation of medical education. Future Healthcare Journal, 6(3), 181–185. doi:10.7861/fhj.2019-0036 Prokop, T. R. (2018). Use of the dual-processing theory to develop expert clinical reasoning in physical therapy students. Journal of Physical Therapy Education, 32(4), 355–359. doi: 10. 1097/jte.0000000000000062

24 Simulation & Gaming 0(0) Real, F. J., DeBlasio, D., Beck, A. F., Ollberding, N. J., Davis, D., Cruse, B., & Klein, M. D. (2017). A virtual reality curriculum for pediatric residents decreases rates of influenza vaccine refusal. Academic Pediatrics, 17(4), 431-435. doi:10.1016/j.acap.2017.01.010 Reijonen, P., & Tarkkanen, K. (2015). Artifacts, tools and generalizing usability test results [Paper presentation]. 6th Scandinavian Conference on Information Systems, Oulu, Finland. doi:10.1007/978-3-319-21783-3_9 Rubin, J., & Chisnell, D. (2008). Handbook of usability testing: How to plan, design and conduct effective tests.Indianapolis, IN: John Wiley & Sons. Schittek-Janda, M., Mattheos, N., Nattestad, A., Wagner, A., Nebel, D., Farbom, C., & Attstrom, R. (2004). Simulation of patient encounters using a virtual patient in periodontology instruction of dental students: Design, usability, and learning effect in history-taking skills. European Journal of Dental Education, 8(3), 111–119. doi: 10.1111/j.1600-0579.2004.00339.x Thomas, D. R. (2006). A general inductive approach for analyzing qualitative evaluation data. American Journal of Evaluation, 27(2), 237-246. doi:10.1177/1098214005283748 Tolarba, J. E. L. (2021). Virtual simulation in nursing education: A systematic review. Inter- national Journal of Nursing Education, 13(3), 48-54. doi:10.37506/ijone.v13i3.16310 Vendrely, A. (2005). Critical thinking skills during a physical therapist professional education program. Journal of Physical Therapy Education, 19, 55-59. doi:10.1097/00001416- 200501000-00007 Verkuyl, M., Atack, L., Mastrilli, P., & Romaniuk, D. (2016). Virtual gaming to develop students’ pediatric nursing skills: A usability test. Nurse Education Today, 46, 81-85. doi: 10.1016/j. nedt.2016.08.024 Verkuyl, M., Betts, L., & Sivaramalingam, S. (2019). Nursing students’ perceptions using an interactive digital simulation table: a usability study. Simulation & Gaming, 50(2), 202-213. doi:10.1177/1046878119844283 Verkuyl, M., Romaniuk, D., & Mastrilli, P. (2018). Virtual gaming simulation of a mental health assessment: A usability study. Nurse Education in Practice, 31, 83-87. doi: 10.1016/j.nepr. 2018.05.007 Wainwright, S. F., & Gwyer, J. (2017). (How) Can we understand the development of clinical reasoning? Journal of Physical Therapy Education, 31(1), 4–6. doi:10.1097/00001416- 201731010-00003 Wainwright, S. F., Shepard, K. F., Harman, L. B., & Stephens, J. (2011). Factors that influence the clinical decision making of novice and experienced physical therapists. Physical Therapy, 91(1), 87–101. doi:10.2522/ptj.20100161 Weiner, C. K., Ska˚le´n, M., Harju-Jeanty, D., Heymann, R., Rose´n, A., Fors, U., & Lund, B. (2016). Implementation of a web-based patient simulation program to teach dental students in oral surgery. Journal of Dental Education, 80(2), 133-140. doi: 10.1002/j.0022-0337. 2016.80.2.tb06068.x

