OnQ Blog

How do we explain VR to someone who hasn’t tried it?

Nov. 30, 2016

Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.

Alex Schwartz is the founder and CEO of VR game-development studio Owlchemy Labs. The views expressed are the author’s own, and do not necessarily represent the views of Qualcomm.

“If I had asked people what they wanted, they would have said faster horses.”

–Henry Ford

VR promises to transport viewers into worlds they’ve only seen on movie screens — where they can fight alongside superheroes, explore ancient civilizations, and jump into the action of their favorite video games. But sky-high expectations belie the reality of the medium. For instance, you might say you want to be fully immersed in a “Call of Duty” game, without realizing that the sights, sounds, and calamity of a battlefield are way more intense than you bargained for.

Herein lies one of the biggest challenges facing VR: People don’t yet understand it. That’s been true of every new medium, from radio and television to video games, mobile apps, and web video. Though the platforms may seem similar, simply adapting existing content from other media just won’t work.

Think of it this way: There’s a reason you don’t play the 1985 version of “Super Mario Bros.” on your smartphone. The game was designed to be played with a controller. Virtual buttons on mobile-device screens were a lame attempt to bring the old world into a medium where it didn’t belong. Ultimately, mobile developers had to invent a new lexicon — a new toolkit — for what works on a touchscreen. (Don’t forget, we lived with and developed for touchscreens for years before “Tiny Wings” and “Angry Birds.”) Now, pinching, zooming, and dragging are second nature to smartphone users.

The same will hold for the VR transition: we’ll have to help people understand the medium. The level of immersion VR delivers is inconceivable to anyone who hasn’t experienced it, and until high-quality VR is available in every living room (or on every mobile phone, for that matter) we need to figure out how to make it demonstrable — something people can experience from the outside, too. Trouble is, we’re still in the very early stages of VR adoption, so the need for workarounds that promote a more complete understanding of the medium is particularly acute. And building that understanding takes exposure and time.

So, how can we share VR experiences with a broader audience? What we have right now are only stopgaps. Watching someone else fumble and flail around in an immersive VR experience might be funny (for a moment), but it’s not the answer. Nor is streaming the wearer’s point of view to a flat screen — the sudden and sometimes erratic movements of another person’s head can trigger nausea. Instead, we need to figure out how a spectator can see the player and their environment in context all at once.

At Owlchemy, we’re exploring a couple of different avenues to solve this problem. One way is to insert another point of view, essentially a virtual camera, into the VR environment. Through that virtual lens, the spectator watching on a laptop or TV screen would see the entire scene, complete with the virtual representation of the player — almost as if they were watching an actor in a movie or a suspect on a security camera. Even though this technique does not immerse the third party, it allows them to become a true spectator in the virtual environment.
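
As a rough, engine-agnostic illustration of that idea (not Owlchemy’s actual implementation), the NumPy sketch below builds the view matrix for a fixed virtual spectator camera aimed at the player’s tracked head position; a game engine would then render an extra pass from this matrix to the flat screen. The positions used here are hypothetical.

```python
import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Build a right-handed view matrix for a camera at `eye` looking at `target`."""
    forward = target - eye
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, forward)

    view = np.eye(4)
    # Rotation part: camera axes become the rows of the matrix.
    view[0, :3] = right
    view[1, :3] = true_up
    view[2, :3] = -forward
    # Translation part: move the world so the camera sits at the origin.
    view[:3, 3] = -view[:3, :3] @ eye
    return view

# Hypothetical poses: a spectator camera "bolted" to a corner of the room,
# always framing the player's tracked head position.
spectator_position = np.array([2.5, 2.0, 2.5])
player_head_position = np.array([0.0, 1.6, 0.0])

spectator_view = look_at(spectator_position, player_head_position)
print(spectator_view)
```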

Another promising method is something we’re playing with called “mixed reality.” In this setup, the spectator sees both the real-world human player and the virtual world they’re interacting with, merged into a single image. Of course, this is challenging from both a hardware and software perspective. Software must constantly track the positions of both a real-world camera and a virtual-reality player, and then composite the scene in real time for the spectator. This kind of tech needs to be simplified to make it both affordable and approachable for the average gamer or streamer — a problem we are in the midst of tackling.
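
Here is a deliberately simplified sketch of just the compositing step, assuming the player is filmed against a green screen and the virtual scene has already been rendered from a camera matched to the physical one. A real mixed-reality pipeline also needs camera calibration, latency alignment, and depth ordering, none of which is shown here; the frames below are synthetic stand-ins.

```python
import numpy as np

def composite_mixed_reality(camera_frame, virtual_frame, green_threshold=100):
    """Replace green-screen pixels in the real camera frame with the rendered
    virtual scene, keeping the player wherever a pixel is not 'green enough'."""
    r = camera_frame[..., 0].astype(int)
    g = camera_frame[..., 1].astype(int)
    b = camera_frame[..., 2].astype(int)
    # Crude chroma key: a pixel counts as background if green clearly dominates.
    is_background = (g > green_threshold) & (g > r + 40) & (g > b + 40)
    out = camera_frame.copy()
    out[is_background] = virtual_frame[is_background]
    return out

# Synthetic stand-ins for one 720p frame from each source.
h, w = 720, 1280
camera_frame = np.zeros((h, w, 3), dtype=np.uint8)
camera_frame[..., 1] = 220                        # green screen everywhere...
camera_frame[200:600, 500:780] = (180, 140, 120)  # ...except a "player" blob
virtual_frame = np.zeros((h, w, 3), dtype=np.uint8)
virtual_frame[...] = (40, 40, 90)                 # rendered virtual scene

mixed = composite_mixed_reality(camera_frame, virtual_frame)
```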

Of course, there is no replacement for experiencing VR firsthand. In these early days of VR, high-quality experiences on tethered VR systems, such as the HTC Vive or Oculus Rift, or mobile VR systems, such as the Samsung Gear VR, have been the best way to really sell the medium to the uninitiated. The high price tag of tethered VR is still a barrier to entry for mainstream consumers. Mobile VR, on the other hand, is more affordable and offers the advantage of being untethered. In addition, early designs of fully untethered standalone headsets prove that delivering high-quality immersive mobile VR experiences is possible. And year by year, mobile processors will advance, making it easier and more convenient to show off premium VR content on smartphones and standalone VR headsets. It’s all about giving players their first taste of fully immersive VR — something that goes beyond looking around an environment, and allows the player to use their hands, to pick things up, to truly interact with the virtual world.

Only then can we expect to have a collective “ah-hah” moment, when the purpose and promise of VR becomes apparent to anyone and everyone. Until that time, we need to build experiences that give soon-to-be-converted VR fans an idea of the fantastic experience happening inside the headset.

Related News

Making virtual reality truly immersive

Virtual reality has been touted for the past several years as the next big thing – and its history goes back even further than many of us realize (the first prediction of VR appeared in a science fiction story from the 1930s!) – but now we may have reached an inflection point for VR.

With the advances in technology fueled by the mobile industry, much of what was considered sci-fi (even in previous incarnations of VR) is now becoming reality. Life-like visual and audio processing, movement and positional tracking, and haptic and integrated sensory feedback are realities today, making VR immersive in ways only imagined before.

Creating immersive VR experiences means bringing these interactive technologies together in ways that feel intuitive to the user – so it feels like you are there, practically reaching out and grabbing the controls of that virtual vehicle. And perhaps not surprisingly, we expect that the best VR experiences will be built on mobile technologies to offer people a truly untethered experience. This means that the devices we are using become a part of the world we’re immersed in instead of distracting from it.

During CES, Qualcomm Technologies, Inc. demonstrated “Power Rangers: Zords Rising,” an immersive mobile VR experience that allowed users to gear up and experience what it’s like to be part of the Power Rangers team. This demo highlighted the power of the new Qualcomm Snapdragon 835 processor, which is designed to deliver immersive VR and augmented reality (AR) experiences.

For instance, the new Snapdragon 835 processor is engineered to support six degrees of freedom (6DoF) of movement – the ability to translate through the virtual environment forward/backward, up/down, and left/right, and to rotate in pitch, roll, and yaw – which is crucial for creating a realistic sense of being inside the virtual world. And with real-world movement, you need life-like visual processing to deliver the smooth, visually rich experiences similar to our own natural vision. This is why we built sub-18 millisecond latency and 4K display support at 60 frames per second into the Snapdragon 835. Likewise, in the real world, sound has a three-dimensional profile that we use to orient ourselves, which is why the Snapdragon 835 supports 3D audio – VR has never sounded so good.
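
To put those numbers in perspective, here is some quick back-of-the-envelope arithmetic (my own figures, not Qualcomm’s, and assuming “4K” means 3840 x 2160): at 60 frames per second each frame gets roughly 16.7 ms to be simulated, rendered, and displayed, which is why a sub-18 ms motion-to-photon target is so tight, and a 4K panel at that rate works out to roughly half a billion pixels per second.

```python
# Back-of-the-envelope frame and pixel budgets for the figures quoted above.
fps = 60
frame_budget_ms = 1000 / fps           # ~16.7 ms to produce each frame
motion_to_photon_target_ms = 18        # "sub-18 millisecond latency" goal

width, height = 3840, 2160             # assuming 4K means UHD resolution
pixels_per_second = width * height * fps

print(f"Frame budget at {fps} fps: {frame_budget_ms:.1f} ms")
print(f"Motion-to-photon target: < {motion_to_photon_target_ms} ms")
print(f"Pixel throughput for 4K @ {fps} fps: {pixels_per_second / 1e6:.0f} million pixels/s")
```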

These are no small computational tasks. You might imagine (or have seen elsewhere) that achieving this degree of processing performance requires large, power-hungry processors. The new Snapdragon is designed to deliver superior GPU and CPU performance per watt while also being 35% smaller than its predecessor. The smaller size and better performance mean better immersion inside the virtual world and less distraction caused by the VR hardware in the real world – both are important when delivering compelling virtual experiences.

All of this is good news for developers, and even better news for people who will be trying VR for the first time, as it means they won’t experience distractions like jerky movement, lag, or low resolution. And when built into an untethered, mobile experience – such as a headset – you can minimize the potential for device discomfort caused by excess weight, heat, or protruding wires.

Snapdragon processors and toolsets are designed to provide multi-processor computation coordination and energy management, so that you can offer both engaging VR performance and optimal device comfort.

When we put on our VR headsets, we gaze out at a world of opportunity for you to start developing your own VR experiences, using the right immersive technologies to draw in your users and leave them craving more.

Are you ready to dive in and develop your own VR experience? Download our white paper, “Making Immersive Virtual Reality Possible in Mobile,” to learn more, and be sure to have a look at the Snapdragon VR SDK.

For more ideas, take a look at Qualcomm’s announcements from CES.

 

Qualcomm Snapdragon is a product of Qualcomm Technologies, Inc.

 

Jan. 30, 2017