VR/AR - More than just headsets

by Jeremy Dalton 
Head of VR/AR
@jeremycdalton
LinkedIn

 

When discussing innovation, we often hear the phrase: ‘the bigger picture’. It’s a cliché. More bumper-sticker slogan than business strategy. But it raises a key point about the importance of perspective. When we take a step back from something, we see things differently. We engage in joined-up thinking. We discover new connections, new contexts and new possibilities.

Now more than ever, against the backdrop of rapid technological development around virtual reality and augmented reality (VR/AR), it’s this broader, more encompassing, view of immersive technology that holds my interest as a jumping-off point for important conversations.

Too often, approaches to VR/AR are dangerously myopic. It’s tempting to focus only on its most visible feature: headsets, or head-mounted displays (HMDs), and their ability to immerse users in another environment. But, in reality, HMDs aren’t the be-all and end-all of immersive technology. As VR/AR has developed, so has the landscape that surrounds it. And to truly tap into the potential of immersive technology for business, innovators must get to know this wider web of interconnected technology.

We need to broaden our view of virtual reality and augmented reality technology (collectively known as ‘Extended Reality’ or ‘XR’) to a larger ecosystem that incorporates adjacent technologies. These technologies, including eye tracking, haptics, 3D audio, photogrammetry, locomotion, and more, are connected to XR in that they can enable or enhance immersive experiences, but they are not bound to it.

With this in mind, this article asks (and hopefully answers) two essential questions:

  1. What are some of the most significant elements of the extended reality ecosystem?

  2. How can they unlock new benefits for businesses?

Eye tracking

Eye tracking allows developers to monitor users’ exact eye movements. And, although the tech is still in its infancy, it’s easy to see its value as a tool for rapid prototyping and data collection in an enterprise setting.

Take the example of a retail store that is looking to optimise its product placement. By recording a user's gaze in VR, eye tracking technology would allow a retailer to collect powerful and actionable data on how test shoppers interact with different store merchandising layouts: What were they looking at? Which products stood out to them first? Did they look at some products longer than others?

Using specialist VR analytics software, the results could be output as simple automated visualisations like ‘heat maps’ or ‘gaze vectors’. These give organisations immediate access to a rich, reliable, and intuitive understanding of the user’s experience - everything from the most overt reactions to subconscious elements of their response that traditional feedback forms simply cannot capture.
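To make that idea concrete, here is a minimal sketch of how recorded gaze samples might be aggregated into a shelf ‘heat map’. The data format, grid resolution and field names are illustrative assumptions rather than any particular analytics product’s API.

```python
# Minimal sketch: aggregating recorded VR gaze samples into a shelf heat map.
# The sample format and grid resolution are illustrative assumptions.
import numpy as np

# Each gaze sample: (x, y) hit point on the shelf plane in metres, plus the
# time in seconds since the previous sample (its dwell-time contribution).
gaze_samples = [
    (0.42, 1.10, 0.011),
    (0.45, 1.12, 0.011),
    (1.30, 0.60, 0.011),
    # ... thousands more samples recorded during the session
]

SHELF_WIDTH, SHELF_HEIGHT = 2.0, 1.8   # metres
GRID_COLS, GRID_ROWS = 40, 36          # heat map resolution

heatmap = np.zeros((GRID_ROWS, GRID_COLS))

for x, y, dt in gaze_samples:
    col = min(int(x / SHELF_WIDTH * GRID_COLS), GRID_COLS - 1)
    row = min(int(y / SHELF_HEIGHT * GRID_ROWS), GRID_ROWS - 1)
    heatmap[row, col] += dt            # accumulate dwell time per cell

# Normalise so the hottest cell is 1.0, ready for colour-mapping in a report.
if heatmap.max() > 0:
    heatmap /= heatmap.max()

hottest = np.unravel_index(np.argmax(heatmap), heatmap.shape)
print(f"Most attention fell on grid cell (row, col) = {hottest}")
```

In practice, the same dwell-time totals could be grouped by product region rather than by grid cell, answering the ‘which products stood out’ question directly.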

A similar story can be told about the training sphere, where this kind of ‘gaze data’ could give important insights into how trainees react to visual cues in educational VR experiences. Are they looking at the right item at the right time? How often do they get distracted? Or even, how might we improve our simulations to make better use of our users’ attention? All valuable questions to address in a landscape where ineffective training tools have a high cost to both people and profits.

Eye tracking technology also has an important role to play in optimising VR graphics. In a process known as ‘foveated rendering’, devices like the Varjo VR headset have the potential to detect where the user’s eye is focused and concentrate high-quality rendering in that area, while reducing quality in their peripheral vision. The result? A crisp, clean visual experience, without the need for huge processing power - a game-changer in making high-resolution virtual reality more accessible and affordable.
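The principle is simple enough to show in a few lines. The sketch below assigns a shading quality to each screen tile based on its angular distance from the reported gaze point; the thresholds are illustrative, and real headsets implement this on the GPU through vendor-specific variable-rate shading rather than in application code like this.

```python
# Conceptual sketch of foveated rendering: choose a shading quality per screen
# tile based on angular distance from the tracked gaze point. The thresholds
# and resolution figures are illustrative assumptions.
import math

def quality_for_tile(tile_center, gaze_point, fov_degrees=100, resolution=(3840, 2160)):
    """Return a shading rate for a tile: 1.0 = full quality, lower = coarser."""
    px_per_degree = resolution[0] / fov_degrees
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    eccentricity = math.hypot(dx, dy) / px_per_degree  # degrees away from gaze

    if eccentricity < 5:      # foveal region: render at full resolution
        return 1.0
    elif eccentricity < 15:   # near periphery: half-rate shading
        return 0.5
    else:                     # far periphery: quarter-rate shading
        return 0.25

# Example: gaze near the screen centre, one tile near the edge
print(quality_for_tile((1920, 1080), (1900, 1060)))  # -> 1.0 (foveal)
print(quality_for_tile((3700, 200), (1900, 1060)))   # -> 0.25 (periphery)
```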

Haptics

In a workplace increasingly dominated by intangible technologies, haptics is about using the power of touch to enhance a user’s presence in the virtual world. It recreates the sensation of touch through mechanical means or innovative methods like ultrasound, helping us to ‘feel’ digital interactions. Most of us already encounter it every day via phone vibrations, when notifications are accompanied by a simple haptic buzz. But we’re often completely unaware of its broader uses and context.

In the business-to-consumer sphere, the technology has already demonstrated its ability to increase consumer engagement. When Bristol-based startup Ultrahaptics added mid-air haptics to interactive movie posters, the results were positive. Feedback included statements like: ‘I would pay more attention to it’ and ‘I felt more engaged and immersed with the haptic feedback’. Dr Hannah Limerick, a User Experience Researcher at Ultrahaptics, found that 73% of people would be more likely to make a purchase after experiencing a haptic poster, and that interaction time increased by a remarkable 50%.

In business-to-business organisations, haptics is equally promising. It has the potential to replace complicated user interfaces with intuitive gestural controls at workstations, which could lead to increased productivity, or to add an extra dimension to data visualisation.

As with eye tracking, it also has clear applications in employee training. Products like the Teslasuit are already using haptics to teach ‘complex repetitive movements’. The full-body suit acts like a ‘second skin’ for trainees, using a motion capture system to track and analyse their performance in real time, and providing tactile feedback to their muscles when they make an incorrect movement. Through these repeated physical cues, the tech incorporates motions into the user’s long-term muscle memory, improving retention and allowing the actions to be performed more naturally. This type of technology has applications ranging from athletics to firefighter training, giving users an advantage in situations where there can be no margin for error.
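To give a flavour of the logic involved, the sketch below compares captured joint angles against a reference movement and flags where a corrective haptic cue might be fired. It is not based on the Teslasuit SDK; the joint names, tolerance and feedback hook are all illustrative assumptions.

```python
# Illustrative sketch (not the Teslasuit SDK): compare motion-capture joint
# angles with a reference movement and flag joints that need a corrective
# haptic cue. Joint names, tolerances and the feedback hook are assumptions.

REFERENCE_POSE = {         # target joint angles in degrees for this frame
    "right_elbow": 95.0,
    "right_shoulder": 40.0,
    "right_wrist": 10.0,
}
TOLERANCE_DEG = 8.0        # how far off a joint can be before we intervene

def trigger_haptic_cue(joint: str, error_deg: float) -> None:
    # Placeholder for whatever tactile-feedback call a real suit exposes.
    print(f"Haptic cue on {joint}: off by {error_deg:.1f} degrees")

def check_frame(captured_pose: dict) -> None:
    """Compare one motion-capture frame against the reference pose."""
    for joint, target in REFERENCE_POSE.items():
        error = abs(captured_pose.get(joint, target) - target)
        if error > TOLERANCE_DEG:
            trigger_haptic_cue(joint, error)

# Example frame where the elbow has drifted too far from the target angle
check_frame({"right_elbow": 110.0, "right_shoulder": 42.0, "right_wrist": 9.0})
```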

Locomotion

Locomotion in the context of VR refers to travelling around a virtual environment. It can take many forms, but most commonly users explore virtual worlds in one of the following ways:

  • Real-life movement - simply walking around a real-world environment which is matched in the virtual world

  • ‘Teleportation’ - indicating an area users wish to travel to (by using controllers or gazing at it, for example), allowing them to appear instantaneously at the selected destination (a minimal sketch of this approach follows below).
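
In engine terms, teleportation usually boils down to casting a ray from the controller or the user’s gaze, checking that it lands on valid ground within a sensible range, and moving the player rig to that point. The sketch below shows that flow in plain Python with a simple floor-plane intersection; a real implementation would use the game engine’s own raycasting and walkable-area checks.

```python
# Minimal sketch of teleport locomotion: cast a ray from the controller,
# intersect it with the floor plane (y = 0) and, if the hit point is within
# a permitted range, move the player rig there. Engine-specific raycasting
# would replace the plane intersection in a real implementation.
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def teleport_target(origin: Vec3, direction: Vec3, max_range: float = 8.0):
    """Return the floor point the user is aiming at, or None if invalid."""
    if direction.y >= 0:                      # aiming level or upwards: no floor hit
        return None
    t = -origin.y / direction.y               # distance along the ray to y = 0
    hit = Vec3(origin.x + direction.x * t, 0.0, origin.z + direction.z * t)
    dist_sq = (hit.x - origin.x) ** 2 + (hit.z - origin.z) ** 2
    if dist_sq > max_range ** 2:              # too far away to teleport safely
        return None
    return hit

# Controller held at chest height, aimed slightly downwards and forwards
target = teleport_target(Vec3(0.0, 1.4, 0.0), Vec3(0.0, -0.5, 1.0))
if target is not None:
    print(f"Teleporting player rig to ({target.x:.2f}, {target.z:.2f})")
```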

Although in many cases both systems function perfectly well, they have their drawbacks. Some users find ‘teleportation’ tricky, inaccurate, and unintuitive. On the other hand, without access to expensive, large-scale demo areas, it’s often unsafe to encourage users to walk around freely. Even in combination, the user experience isn’t perfect, especially because consumers without VR experience generally find it difficult to switch seamlessly between the two types of locomotion.


As a result, companies have created a number of locomotion solutions, from multidirectional treadmills to sensor-embedded footrests and seats. The work of VRGO is one such example: seated users can use their feet and body weight to push, tilt and rotate a small seat to direct their virtual movement, with flexible control over speed and adjustable sensitivity.
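
As a rough illustration of how a tilt-based controller of this kind might translate body movement into motion, the sketch below maps seat pitch and roll angles to a movement vector, with an adjustable sensitivity and a dead zone. It is a generic approximation for illustration, not VRGO’s actual firmware or SDK.

```python
# Generic sketch of tilt-based locomotion (not VRGO's actual SDK): map the
# seat's pitch/roll angles to a forward/strafe velocity, with a dead zone so
# small, unintentional tilts don't move the user.
def tilt_to_velocity(pitch_deg: float, roll_deg: float,
                     sensitivity: float = 0.08, dead_zone_deg: float = 3.0):
    """Return (forward, strafe) speed in metres per second."""
    def axis(angle: float) -> float:
        if abs(angle) < dead_zone_deg:
            return 0.0                        # ignore tiny tilts
        return (abs(angle) - dead_zone_deg) * sensitivity * (1 if angle > 0 else -1)

    forward = axis(pitch_deg)                 # lean forwards/backwards to move
    strafe = axis(roll_deg)                   # lean left/right to sidestep
    return forward, strafe

print(tilt_to_velocity(10.0, -1.5))  # -> (0.56, 0.0): gentle forward movement
```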

When integrated with VR experiences, this form of locomotion offers many benefits:

  • It allows users to sit comfortably (rather than walking), reducing fatigue.
  • It limits the space required for effective demos, minimising costs whilst maximising virtual mobility.
  • It removes any reliance on handheld controllers, freeing up users’ hands for other activities (e.g., 3D design in VR).
  • It makes complex trajectories more intuitive, triggering locomotion through the natural movement of the user’s feet.

Final thoughts

Understanding the full potential of immersive technology as a tool for enterprise is something of a paradox. To get closer to what immersive tech can really do, you have to take a step back, exploring how peripheral XR technologies can refine and enrich virtual experiences. To combine very new tech with a very old metaphor, we could say that XR headsets are just the tip of the iceberg, and that, under the surface, lies a vast and important ecosystem of tools to enhance experiences.

In this article, we talked specifically about eye tracking, haptics and locomotion. These are just a few examples; many other adjacent technologies, from 3D audio to photogrammetry, are connected to the XR ecosystem.

Want to start your XR journey? We combine the right people, technology and experience to help you unlock the benefits of virtual reality and augmented reality. Contact me today for more information.

Contact us

Jeremy Dalton

Head of Metaverse Technologies, PwC United Kingdom

Tel: +44 (0)7701 295956
