UX Planet

UX Planet is a one-stop resource for everything related to user experience.


Virtual Reality — A Medium For Reimagining Human-Machine Communication

Pressing Buttons, Turning Dials and Learning New Languages

One of the first pieces of interactive technology that the average consumer owned was the rotary-dial telephone. Its users — many of whom had never encountered such a machine — had to learn to speak its language in order to unleash its magic.

This was a somewhat bizarre language, but it worked nonetheless: pick up the phone, put your finger on a digit, rotate the dial, and repeat. Learn this language and earn voice access to anyone who owns the same technology. Learn the machine’s dialect so you can talk to your grandma.

Soon after, another popular consumer product arrived with a language of its own — the television. This time, the user looked at a small flat surface and controlled what appeared on it by pressing buttons or rotating dials.

The dominance of TVs and telephones in households helped set the standard: screens and buttons serve as the middleman that translates between human and machine.

For years to come, humans would continue working to make their intentions understood by their devices. We press, rotate and toggle, relying on visual, haptic or auditory feedback to know “we did good”. This was true for dishwashers, cameras, fax machines, microwaves, mobile phones and more.

While some technologies, like joysticks and the PC mouse, defined languages of their own, most did not stray from the essence of these original concepts until the emergence of the smartphone.

Less Learning, More Intuition

The smartphone revolution marked a shift, a renaissance of interactions. For the first time, technology tried to speak our language. We were introduced to new concepts that felt more natural and intuitive, less foreign and contrived. We can break these down into three important changes:

  1. Technology behaves more like us. Touch screens were a refreshing change from buttons and dials. Gestures like swiping (which resembles physically moving something aside) and pinching (which resembles stretching an elastic material) replaced their button counterparts, connecting the interaction to the intention.
    Pinch to zoom in on a picture. Swipe to say “no thanks” to a poor lad on Tinder. It just feels right.
  2. Technology understands the world we live in. With their Inertial Measurement Units (IMUs) and GPS receivers, smartphones know our geolocation, acceleration, and angular velocity.
    Rooms magically light up when you enter them; gadgets show the correct time and weather no matter where you are; smartwatches light up in response to the same motion one makes when checking an ordinary watch.
    Technology begins to react to the world and understand our presence in it.
  3. Technology understands us. Devices learn our step count, location history, activity hours, even our heartbeat, all without our explicit intent.
    In other words, not only is the tech familiar with the human dialect — it actually understands it and can give us insights about ourselves.
    As sensors get more accurate and devices get smarter, these insights can be used to better understand our bodies (and maybe even our minds).
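The pinch gesture above, for instance, reduces to a simple ratio. Here is a minimal TypeScript sketch (the types and function names are my own, not any platform's API): the zoom factor is the current distance between the two fingers divided by their distance when the gesture began.

```typescript
type Point = { x: number; y: number };

function distance(a: Point, b: Point): number {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Scale to apply to the content this frame: how far apart the fingers
// are now, relative to how far apart they were when the pinch started.
function pinchScale(start: [Point, Point], current: [Point, Point]): number {
  return distance(current[0], current[1]) / distance(start[0], start[1]);
}

// Fingers start 100px apart and spread to 200px: the picture doubles in size.
console.log(pinchScale(
  [{ x: 0, y: 0 }, { x: 100, y: 0 }],
  [{ x: 0, y: 0 }, { x: 200, y: 0 }],
)); // → 2
```

The mapping is what makes the gesture feel natural: the hands do to the image exactly what they would do to a stretchable object.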

But many of the more interesting applications, specifically those that employ these novel sensors, remain experimental. While the tools exist, we have yet to fully harness their potential. We still adjust our language to our technology. Hitting buttons remains the core interaction for performing trivial tasks like calling grandma.

Enter Virtual Reality

Virtual Reality (VR), and more generally, mixed reality, comes with its own attitude.

For the first time, visuals and audio are spatial — that is, they have an actual location in space. Headsets detect our hands (or spatially track controllers as their proxy) even when we don’t touch a flat surface or smash a button. Our head is constantly tracked — which means our position in space and the direction in which we look are also tracked.
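To make that head tracking concrete: given the orientation quaternion a headset reports, the direction of gaze is simply the rotated forward axis. A minimal sketch, assuming a right-handed coordinate system where the user faces -Z at the identity orientation (the convention WebXR uses); the function name is my own:

```typescript
type Quaternion = { x: number; y: number; z: number; w: number };

// Rotate the forward axis (0, 0, -1) by the head orientation.
// This is q * v * q⁻¹, expanded for the special case v = (0, 0, -1).
function gazeDirection(q: Quaternion): [number, number, number] {
  const { x, y, z, w } = q;
  return [
    -2 * (x * z + w * y),
    -2 * (y * z - w * x),
    -(1 - 2 * (x * x + y * y)),
  ];
}

// Identity orientation: the user looks straight ahead, down -Z.
const ahead = gazeDirection({ x: 0, y: 0, z: 0, w: 1 });

// Head turned 90° to the left (yaw about +Y): the user now looks down -X.
const s = Math.SQRT1_2;
const left = gazeDirection({ x: 0, y: s, z: 0, w: s });
```

Interactions like gaze-based selection, look-to-highlight menus, and spatial audio all build on vectors like these, rather than on any button press.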

Sensor technology is advancing rapidly, and it’s just the beginning!

When I worked as a prototype engineer at Magic Leap, we often explored these uncharted territories. A task as simple as designing a UI menu became tricky in the absence of best practices — let alone designing more complex interactions like dragging items, removing objects from the virtual space, or scrolling through large bodies of information.

While it may have been tempting to retreat to using dials and buttons, we tried to imagine novel and more intuitive approaches that harness the full potential of the headset sensors.

In fact, “forgetting” 2D paradigms and best practices actually improved our chances of coming up with creative, fresh solutions. By ignoring the existing language, we sometimes stumbled upon more intuitive, engaging approaches. It left me with one very clear insight:

With VR, we have a chance to eliminate the middlemen between humans and machines, bringing machines closer than ever to the real, authentic human experience.

Just as smartphones added touch and inertial sensing that made better use of our human faculties, VR can do the same — but at a much larger scale. By reimagining and redefining human-machine interactions, we better respect our physicality as human beings.

From Concept To Toolkit

This discussion has already begun, and I believe these interactions will be better defined in the coming years. The developer community will continue to publish best practices and build toolkits for our headsets.

But what will they look like? What principles will guide them? To best answer these questions and bridge the gap between the human and virtual experience, we should leave our comfort zone and boldly explore.

In future articles, I hope to do just that. I will delve more deeply into the ideas introduced in this article, exploring how VR developers can make their experiences more authentic in practice.

I will suggest new approaches that redesign movement, replace button-pressing with body language, utilize the user’s physiology as input and more. By doing so, I hope we can fuel this discussion and start defining those new principles together.
