It seems trying to predict the future is a very human trait. It speaks to both our curious and creative nature. However, the popular William Gibson quote, “The future is already here — it’s just not very evenly distributed,” suggests we don’t need to look too far. I felt this when seeing the Webby Awards winner, the DOTA2 Championship, in 2017.
DOTA2 is an online multiplayer battle game that has spawned a large community, resulting in a highly competitive and lucrative e-sports franchise. The live experience is a place where gaming fans experience AR and VR in a truly epic way, pointing clearly to how our physical reality may be transformed in the near future.
Is this our future and when will it arrive for more of us?
Mary Meeker of Kleiner Perkins proposes that a major technology paradigm shift occurs every 15 years. Looking at her chart, we’re due the next evolution in or around 2020 (the previous paradigm shift was the smartphone era, starting in 2005). The particular technology in question is Extended Reality (XR), with XR being the umbrella term that includes Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR). The latter is the term used with the Microsoft HoloLens and Magic Leap headsets.
As the market matures, a growing body of neural research demonstrates how XR lights up our brains in ways other communication channels don’t. Compare this with voice interfaces, which are more natural but also more passive in terms of neural activity. The increased brain activity of XR matters for brands and marketers, as research by the IPA shows that successful advertising depends on eliciting a powerful emotional response — and we know that comes with increased neural activity.
While good XR experiences turn our brains on, they require good “quality” content to do so. Quality is a relative measure, and might not always mean super-high-fidelity visuals and high frame rates.
What this content should be, and how you experience it, is what the Isobar Neurolab set out to find out.
By combining the expertise of Isobar’s Market Intelligence team, our VR measurement tool, and our global accelerator, Isobar Nowlab, we were able to create an acclaimed test unit: the Neurolab.
It uses deep brain activity analysis coupled with eye tracking, facial expression, galvanic skin response and heart activity to see how test subjects respond to these new XR experiences. By recording all the relevant inputs into a time-locked database, deep analysis can be done on the effectiveness and efficiency of this new content.
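The “time-locked database” idea — aligning sensor streams that sample at different rates onto one master clock so they can be analysed together — can be sketched with a simple nearest-timestamp join. This is a minimal illustration with made-up timestamps and values, not Isobar’s actual pipeline:

```python
from bisect import bisect_left

def align_to_timeline(timeline, stream):
    """For each master-clock timestamp, pick the sensor sample whose
    timestamp is nearest (a simple nearest-neighbour join)."""
    times = [t for t, _ in stream]
    aligned = []
    for t in timeline:
        i = bisect_left(times, t)
        # choose the closer of the two neighbouring samples
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        j = min(candidates, key=lambda k: abs(times[k] - t))
        aligned.append(stream[j][1])
    return aligned

# Hypothetical streams sampled at different rates: (timestamp, value)
gsr = [(0.00, 0.41), (0.25, 0.43), (0.50, 0.52), (0.75, 0.61)]
heart = [(0.0, 72), (0.4, 74), (0.8, 77)]

timeline = [0.0, 0.25, 0.5, 0.75]  # master clock ticks (seconds)
record = list(zip(timeline,
                  align_to_timeline(timeline, gsr),
                  align_to_timeline(timeline, heart)))
print(record)
```

Once every stream is resampled against the same timeline, each row of `record` is a snapshot of the subject at one instant — which is what makes cross-sensor analysis of a moment in the experience possible.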
A recent example is a comparative test for Lionsgate’s Jigsaw film trailer, pitting a standard 16:9 edit against a fully immersive, VR-based moral dilemma. Not surprisingly, the VR experience trumped the humble film trailer across five key measures, with a staggering 336% increase in emotional valence.
While these new inputs need careful calibration, and the responses need decoding into emotional states, the results so far have been enlightening. From the many tests we have completed so far, we’ve been able to ascertain:
- Too many AR/VR experiences are being rushed out without proper usability testing, which diminishes their impact and effectiveness. We consistently see a set of unintuitive, confusing, and clunky features and approaches that need reworking with customer feedback. In particular, developers need to do more in the way of visual and auditory cueing to lead users along the intended path in room-scale experiences.
- Motion sickness is still a pervasive problem in any XR experience that moves the subject through virtual space. Fortunately, the same biometric sensors we use to detect emotion can be used to predict the early onset of motion sickness. This is something developers need to take seriously, incorporate into early testing, and then modify the experience to accommodate.
- Men and women are similar in terms of their emotional response to triggers. In Isobar’s Mountain Dew experience, for example, we always, without exception, see large GSR peaks indicating emotional arousal as soon as the subject realizes they are going to skydive from the back of an airplane.
- It’s important to use multiple indicators of emotional arousal and valence, and to build them into predictive algorithms that summarize emotional states; without this, you’ve just got a big pile of data. Certain sensors are better in different situations: EEG, for example, is prone to movement artefacts in room-scale experiences, while facial EMG is less sensitive for detecting emotions but less prone to artefacts.
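The last point — fusing several indicators into a single emotional-state summary rather than staring at raw traces — can be sketched as a weighted combination of standardised signals. The readings, weights, and the `arousal_index` function below are all illustrative assumptions, not the Neurolab’s actual model:

```python
from statistics import mean, stdev

def zscores(xs):
    """Standardise one signal so different sensors share a common scale."""
    m, s = mean(xs), stdev(xs)
    return [(x - m) / s for x in xs]

def arousal_index(signals, weights):
    """Weighted sum of per-sensor z-scores, one value per time step.
    Weights let less artefact-prone sensors count for more."""
    standardised = {name: zscores(vals) for name, vals in signals.items()}
    n = len(next(iter(signals.values())))
    return [sum(weights[name] * standardised[name][t] for name in signals)
            for t in range(n)]

# Hypothetical readings over five time steps
signals = {
    "gsr":   [0.40, 0.42, 0.41, 0.70, 0.72],  # microsiemens
    "heart": [71, 72, 73, 88, 90],            # beats per minute
}
# Equal weights here; in a room-scale test an EEG channel,
# being prone to movement artefacts, might get a lower weight.
weights = {"gsr": 0.5, "heart": 0.5}

index = arousal_index(signals, weights)
peak = index.index(max(index))
print(f"arousal peaks at time step {peak}")
```

The design point is the one the bullet makes: a single fused index (with sensor-appropriate weights) is what turns a big pile of data into something you can act on, such as spotting the GSR-plus-heart-rate spike at the skydive moment.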
With a more scientific approach we’ve been able to better understand where to put future effort, where to experiment and what to watch for when creating successful experiences. We’ve consistently seen positive and better performing results from VR/AR, which will undoubtedly drive transformation in the marketing of brands, in their evolving ecosystems, and help develop new participatory narratives.
While the human desire to tell a story is still strong, it’s important to realise that VR/AR is about evolving experiences, where the story is framed for the participant to involve themselves in. Go play, go measure, go learn, go improve.