9 key VR terms L&D needs to know
Any new technology comes with its own slew of terms, concepts and abbreviations – and VR is no exception. In my last of three articles on VR I’ll try to explain what I think are the most important and interesting terms for L&D.
If you’re looking for a brief introduction to VR itself, consider reading my previous article, which challenges assumptions around VR in learning. And if you’re starting your journey in learning and VR and need some encouragement, consider looking at my first article, which explains why VR in learning isn't as hard as you think.
Almost all VR systems feature a head-mounted display (HMD) – or headset. In the past, the primary purpose of the HMD was to envelop the user’s sight and replace it with artificial visuals, placing a display right before the user’s eyes.
Recently this has extended to audio. Most modern HMDs also feature sensors which allow the wider system to track the movement of the user.
HMDs range in price and complexity from a phone held against the user’s face in a cardboard box all the way to dedicated, sensor-rich, high-resolution and expensive devices.
Picking the right HMD that supports the learning you plan on delivering will determine the broad opportunities and limitations that are available when designing experiences.
Augmented reality is a similar technology to VR with one key difference. Rather than attempting to remove the user from the current setting, AR takes the user’s current view of the world and layers additional data on top of it. This augmentation allows the users to gain context, see detail and receive information they otherwise couldn’t.
Some AR systems don’t require an HMD and can run on smartphones. AR on a smartphone, though less immersive, is cheaper, easier to understand and something a group of people can experience together. On that basis, AR is a great option for a classroom environment.
Mixed reality is a term used by many to describe the use of technology that mixes the real and digital world. Some refer to AR and VR as being on a “mixed reality spectrum” of experiences that are brought together by similar technologies, themes and goals.
Recently the term has been used by Microsoft to describe a subset of devices that can support both VR and AR.
When developing learning experiences, it’s important to take the entire MR spectrum into account, picking an approach that maximizes learning outcomes.
The refresh rate is how often the headset’s display can redraw its image, and it is supported by the frame rate of the software that runs the simulation.
Expressed in FPS, or frames per second, the frame rate refers to how often the software can generate an update for the screen. Higher frame rates improve the sense of immersion and reduce the incidence of simulator sickness.
The biggest factors that affect FPS are the complexity of the simulation and the quality of the graphics processing hardware hooked up to the HMD. A poor balance of complexity and hardware capability can introduce visual and audio slowdowns and stutters.
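As a rough illustration (my own back-of-the-envelope sketch, not from any particular headset's specification), the frame “budget” – the time the software has to produce each frame – is simply the reciprocal of the target frame rate:

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds the software has to produce each frame."""
    return 1000.0 / fps

# A 90 Hz headset leaves roughly 11.1 ms per frame,
# while a 60 Hz phone-based headset allows about 16.7 ms.
print(round(frame_budget_ms(90), 1))  # 11.1
print(round(frame_budget_ms(60), 1))  # 16.7
```

The higher the refresh rate of the HMD, the less time the simulation has to render each frame – which is why complex scenes demand more capable graphics hardware.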
Occasionally people come away from a VR experience feeling disoriented or unwell – known as simulator sickness. It can feel like motion sickness, and is typically more prominent in those who suffer from motion sickness generally.
The biggest trigger is when the simulation contradicts the user’s sense of place or motion.
An example is “moving” the user without input, such as panning the viewpoint without any prompt from the user. It can also be caused by more subtle contradictions, such as poor use of directional sound, high latency or low frame rates.
In the L&D space it’s worth considering accessibility and thinking about how else the user can learn what you’re trying to teach if motion sickness prevents them from seeing the experience through.
If engrossing visuals are the highest impact part of VR, directional sound comes a close second. Much like the surround sound you hear in the cinema, directional sound gives the user a greater sense of place and motion. When the user moves in the scene the “source” of the sound should as well.
In L&D, directional sound is important when familiarity with the sights and sounds of a situation is part of the learning outcomes, such as how people might respond to a dangerous situation.
A big breakthrough in modern VR is the ability to track the user’s motion in the experience.
By using tilt sensors and light tracking, high end VR systems allow the user to move naturally in the experience, removing the need for them to press buttons to change the point of view. Some systems even allow the user to stand up and walk around, delivering a “room scale” experience.
Where a sense of scale and perspective are important to learning, investing in good head tracking or even room-scale VR is a must.
Eye tracking is an extension of head tracking that lets the user focus on particular subjects in the experience using their gaze, rather than having to tilt their whole head.
Though it’s much less mature than head tracking (no commercial systems are currently on the market) it promises to take the benefits of head tracking to the next level.
Being able to track a user’s gaze could be useful for determining a learner’s performance in difficult situations, allowing you to analyse their levels of focus, composure or even eye contact.
It’s well documented that the more senses you engage convincingly in a VR experience, the greater the user’s level of immersion becomes.
Haptic, or touch, feedback systems are seen as the next great leap in VR, promising that the user will be able to “feel” the experience.
Convincing haptics are a long way off but will have a huge impact on L&D, allowing a learner to develop muscle memory as well as visual, spatial and audio familiarity with a situation.
There’s a lot to learn to make the most of VR, particularly in L&D which presents its own set of challenges.
Today we’ve covered some of the core concepts, some small but important technical factors and had a peek at what is coming in the future. Armed with this, having challenged some assumptions and knowing it’s not as hard as you might think, you should be ready to start your journey in VR in L&D.