Enabling Military Virtual Reality and Mixed Reality Simulations with a Flexible Peripheral Framework and Automated Assessments
Presented at the International Conference on Applied Human Factors and Ergonomics (AHFE), Orlando, FL (July 2018).
The military virtual training and simulation market is forecast to exceed $6B through 2020, driven by the increasing immersiveness of virtual reality (VR), augmented reality (AR), and mixed reality (MR) training systems. AR/VR/MR simulations offer significant cost and safety benefits when conducting training or research on human behaviors in situations that would otherwise be costly to replicate or would pose significant safety risks to participants. However, to produce results that transfer to real-world scenarios, effective simulations must support natural human interactions with the virtual environment so that they faithfully represent the situations they seek to replicate. As a result, the success of these training systems often depends heavily on the resolution, immersiveness, and usability of human-simulation interactions. These dimensions are driven both by the simulation software and by the hardware that mediates how humans interact with and control the virtual environment.
For medical training and simulation, this interface is even more critical because mapping natural, real-world interactions (e.g., grasping an object) to a keyboard, mouse, or game controller interface creates a degree of separation from the skills being trained. This separation prevents physical reinforcement of trained skills and places the burden on the human to translate in-game actions to real-world skills, creating the potential for human error in situations with potentially severe consequences. Existing solutions to these challenges have historically involved specialized equipment; however, the cost, complexity, and fragility of these systems are prohibitive for the widespread deployment necessary to train large communities of medical personnel.
To enable effective virtual training and simulation, a system could be developed that affords natural user interface controls while leveraging the benefits of simulation-based training. This system should address three key requirements: (1) support natural human interactions within virtual environments; (2) provide an extensible framework that adapts to varying simulation constraints to maximize the return on investment (ROI); and (3) provide automated context-driven proficiency assessments based on virtual user actions to maximize the benefits of virtual training and simulations.
To address these requirements, Charles River Analytics Inc. is designing, developing, validating, and evaluating a prototype Virtual Interface for Real-Time User Control during Simulated Operations (VIRTUOSO). VIRTUOSO is a hardware and software toolkit that enables simulation developers to create natural human interactions in AR/MR/VR simulations. Built on an open-source framework, VIRTUOSO lets developers deploy hardware-agnostic solutions (e.g., end users can seamlessly switch between different viewing platforms or control peripherals without additional development), ensuring simulations can be deployed according to environmental constraints and user preferences. This capability keeps simulations built with VIRTUOSO extensible to future AR/MR/VR peripherals, extending their useful lifespan. Finally, VIRTUOSO incorporates behavior monitoring and performance assessment capabilities that automate skill proficiency assessment by extracting relevant motion data, and optionally physiological variables (e.g., heart rate, heart rate variability, respiration rate, galvanic skin response), based on user actions in the virtual environment. This work will summarize our engineering approach to supporting natural human interactions and interchangeable simulation peripherals. We will provide a brief demonstration of one of our exemplar simulations for treating tension pneumothorax, and will highlight the benefits of automated performance assessment in facilitating self-guided or remote-moderator training that leverages the potential of AR/VR/MR simulations.
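The hardware-agnostic design described above can be illustrated with a minimal sketch: simulation logic is written against an abstract peripheral interface, so concrete devices can be swapped without touching simulation code. All class and method names below are illustrative assumptions for this sketch, not VIRTUOSO's actual API.

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of a hardware-agnostic peripheral layer; names are
# illustrative assumptions, not VIRTUOSO's actual API.

class ControlPeripheral(ABC):
    """Common interface any control device driver must implement."""

    @abstractmethod
    def read_grasp_strength(self) -> float:
        """Return a normalized grasp input in [0.0, 1.0]."""


class TrackedGloveInput(ControlPeripheral):
    """Stand-in for a hand-tracking glove driver."""

    def __init__(self, raw_flex: float):
        self.raw_flex = raw_flex  # e.g., a finger-flex sensor reading

    def read_grasp_strength(self) -> float:
        return max(0.0, min(1.0, self.raw_flex))


class GamepadInput(ControlPeripheral):
    """Stand-in for a game-controller trigger driver."""

    def __init__(self, trigger: float):
        self.trigger = trigger

    def read_grasp_strength(self) -> float:
        return max(0.0, min(1.0, self.trigger))


class Simulation:
    """Simulation logic depends only on the abstract interface, so
    peripherals can be exchanged without additional development."""

    def __init__(self, peripheral: ControlPeripheral):
        self.peripheral = peripheral

    def object_grasped(self) -> bool:
        # A grasp registers once input crosses a threshold,
        # regardless of which physical device produced it.
        return self.peripheral.read_grasp_strength() > 0.5
```

Swapping `TrackedGloveInput` for `GamepadInput` (or a future device) requires no change to `Simulation`, which is the property that lets deployments adapt to environmental constraints and user preferences.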
For More Information
To learn more or request a copy of a paper (if available), contact Michael Jenkins.
(Please include your name, address, organization, and the paper reference. Requests without this information will not be honored.)