Brave NUI World: Rise of touch-less gesture control

Let's talk about the graphical user interface (GUI). It is a form of user interface that uses graphical icons and visual representations to present elements and their controls on a display. By elements, we mean buttons, icons, lists, and so on. Interaction with the interface happens via a keyboard, a mouse, or another pointing device, collectively known as input devices or manipulators.
A minute of history
The GUI is a relatively young phenomenon; the one we know today is about 50 years old. In 1968, the innovator Douglas Engelbart and his team developed the foundations of the world's first GUI. It went down in history as the oN-Line System (known as NLS), which introduced raster images on a display, interaction with a mouse, and hypertext.
Since then, science and technology have made huge strides forward, but the basic understanding of interfaces has remained at the same level. We are still limited to a display device (for example, a projector), graphic elements (folders), and a pointing device (a mouse) for manipulation.
Nevertheless, we cannot ignore the technological achievements that have significantly influenced the modern perception of the graphical interface.
One of the main milestones was the introduction of the touchscreen and its worldwide adoption in everyday life: a device that serves for both input and output of information. Its advantages are clear, among them interface simplicity and scalability (a small device with a large screen is a winning combination). Its significant drawback is high power consumption.
The iPhone, presented back in 2007, was the first mass-market device to support multitouch. “An iPod with a touch screen, a phone, and a device for surfing the Internet, all in one.” That is how Steve Jobs introduced the iPhone, crushing all the competitors who were still relying on button-based solutions:
Buttons & Controls can’t change. They are there even if you don’t need them.
— Steve Jobs
The development of artificial intelligence played an equally important role. Progress in voice recognition and the appearance of voice assistants created a new dimension of working with the graphical interface. What is noteworthy about assistants such as Siri and Alexa is that their interface stays “out of sight”: despite the UI tricks that simulate a conversation with an AI, all the “magic” happens inside the device. But unlike touchscreens, which all but killed the market for push-button devices, voice interfaces became only an additional option for interacting with systems. The reason for this is well described here.
The enormous variability in sentence construction, the nuances of context, the hundreds of meanings of individual words, the many ways of pronouncing them, accents, speech impediments, and the intonations that completely change the meaning of a phrase in some languages all prevent large volumes of speech from being recognized correctly.
No doubt, voice control has great prospects. But there is still a lot to do in this field to win users' absolute loyalty.
Either keep it or forget about it
The traditional GUI was developed at a time when no one could even think about scrolling. Now it is hopelessly outdated and drags a crowd of manipulators along with it. Imagine that you have a free evening and decide to catch up on the missed episodes of your favorite series. So here you are with your TV, the TV remote, the Apple TV, your iPhone (it's practically a law of the 21st century: you must always be reachable), and your home tablet (to check email or browse Instagram). On the one hand, you are used to operating many devices at once; you are the king among your subjects. On the other hand, you are tired and sick of them all.
Black Mirror, with its futuristic yet graceful devices, comes to mind. They look so convincing that you begin to suspect the creators of the series know more about the future than they let on. And at every new Apple presentation, we hope to see at least something that closely resembles the show. But the future does not arrive as fast as we would like.

The Future of the User Interface
The solutions shown in Black Mirror are usually elegant and simple. They go beyond the graphical interface we are familiar with; they are closer to the so-called natural user interface (NUI), an intuitive interface that completely eliminates the need for mechanical devices such as a keyboard or mouse.
The advantage of such an interface is that it is intuitive: the user does not have to specifically learn how to work with it.
In theory, by transferring our experience from the real world to the virtual one, we should already be able to perform certain tasks. We know how to move objects in the real world, so we can use the same familiar movements in virtual space.
Augmented reality brings us one step closer to the ideal NUI, but contactless gesture control is still far from widespread. Designers today are revising the library of UI elements, trying to move away from the usual solutions and give new ways of inputting information a chance. By focusing on gesture control, we can invent new principles of interaction with systems, test them in practice, find the weak points, and improve them. Iteration after iteration, we will keep moving away from the idea of a pointing device and towards contactless interaction.

Is this real?
The short answer is yes. Here are some ways of tracking hands to recognize gestures.
1. With the help of a special glove
The first glove intended for wide use appeared almost thirty years ago: the Power Glove by Mattel Inc., a control device for Nintendo. Although gloves for such purposes have become lighter and better over the years, this technology is hard to bring to the mass market, since users rarely want to wear such a burdensome accessory every day.
2. Using Cameras
The distinctive feature here is low consumption of computing and energy resources. The software understands the skeletal structure of the hand, the relative distance of its parts, its relation to other objects, and dynamic gestures (for example, clicking, tapping, squeezing and releasing). You might want to read more about tracking here.
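To give a sense of how accessible camera-based tracking has become, here is a minimal sketch that streams webcam frames through the open-source MediaPipe Hands model and draws the 21 skeletal landmarks it detects. MediaPipe and OpenCV are just one common combination, not necessarily the tooling referenced above, and the thresholds below are illustrative defaults.

```python
# A minimal hand-tracking sketch, assuming `mediapipe` and `opencv-python` are installed.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

capture = cv2.VideoCapture(0)  # default webcam

with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while capture.isOpened():
        ok, frame = capture.read()
        if not ok:
            break
        # MediaPipe expects RGB frames, while OpenCV delivers BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # 21 landmarks per hand: wrist, knuckles, fingertips.
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("Hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
            break

capture.release()
cv2.destroyAllWindows()
```

From the landmark coordinates, higher-level gestures such as a pinch can then be derived, for example by measuring the distance between the thumb tip and the index fingertip.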

One bad rotten apple
While searching for new solutions, designers often forget about the fundamental principles of interaction design (from an article by Don Norman and Jakob Nielsen). These include the following:
- Visibility
- Feedback
- Consistency (Standards)
- Non-destructive operations
- Discoverability
- Scalability
- Reliability
Ignoring these standards can end in a sad story for the business: users will notice an innovative solution, but it will not pass usability testing.
Don Norman of the Nielsen Norman Group drew attention to this problem back in 2010.
Because gestures are ephemeral, they do not leave behind any record of their path, which means that if one makes a gesture and either gets no response or the wrong response, there is little information available.
– Don Norman
It is worth pointing out that this article came out right after the iPad appeared on the market and criticized contact gestures; as we see eight years later, that technology not only proved itself but became entrenched.
The second point that deserves attention is ergonomics. Steve Jobs once explained why MacBook screens will never be touchscreens.
Touch surfaces don’t want to be vertical. It doesn’t work. It isn’t ergonomic.
– Steve Jobs
Ergonomic problems are inherent in gesture control technology, but there is nothing here that cannot be improved. Perhaps the problem lies not in the technology but in how it is applied.
Where to apply it?
A new way of inputting information requires rethinking the overall user experience. Soon, alongside the concept of “touch-friendly”, another expression will enter common usage: “gesture-friendly”. Advanced companies are already bringing new technologies into their daily flow, adding gestures to their processes. The benefits of this technology in education are indisputable.
Hand tracking and gesture recognition “break the fourth wall”, taking the experience of practical work to a completely different level. Illustrations in anatomy textbooks now feel like relics of the last century. 3D visualization of the skeleton and internal organs, and the students' interaction with them, gives a true sense of the size and proportions of objects that flat pictures on paper will never achieve.
This technology is also indispensable where voice control is not enough and the hands are busy with other essential movements. Gesture tracking in the operating room is not a dream of a distant future but a practice already in use today.
Large corporations sense the importance of this technology and are gradually building it into the UX of their products. The BMW 7 Series, for example, understands a set of basic gesture commands and lets the driver add custom gestures of their own.


Future Now
The idea of a natural user interface is not new, but it has not yet become widespread. There is no better time than now to implement it: technology has become more affordable, and human potential is at its peak. For the first time in a thousand years, our capabilities have come as close as possible to covering all our needs. We believe that in the near future it will not be the person adapting to the limitations of interfaces, but the other way around.
Interfaces will be personalized as much as possible to cover each person's needs and tasks.
Designers and developers face the challenge of developing a new language of interaction. It comes with usability problems and a possible violation of the interaction design principles. However, the most radical decision when creating new means of interaction would be not to create them at all. There are two types of people: those who do, and those who wait for others to do it. We at Chapps consider ourselves business people, and we do it.
Having taken an interest in contactless interaction, we have already anticipated possible pain points. Here are some common mistakes that may come up while developing natural user interfaces:
1. Focusing on GUI principles
New interfaces cannot blindly inherit the old principles of interaction. Previous experience can be taken into account, but it is the NUI that should set the direction of development going forward.
2. Creating an excessive number of gestures
The last thing you want is to make users feel too stupid for your product. Focus on basic gestures and do not overwhelm people with extra information.
3. Disregarding device limitations
An unclear tracking range and limited tracking and detection speed can lead to poor, unstable recognition and, as a result, to user frustration (a simple way to soften this is sketched right after this list).
4. Skipping proper user testing
When developing a NUI, user testing plays a primary role. What is natural for one person may be incomprehensible or strange for another. In testing, the most natural option is the one preferred by the largest share of participants.
5. Overloading the interface with complex actions that kill the fun and make the user lose interest
Everything that has ever been or will be developed in technology exists to make people's lives easier and better and to add value to the world around us. If an interaction seems too complicated, users will not waste precious time on it, no matter how loyal they are to the idea or to the technology as a whole.
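As a small illustration of the point about unstable recognition, here is a minimal sketch of temporal filtering: a gesture is acted on only after a recognizer has reported it for several consecutive frames. The per-frame recognizer, the class name, and the eight-frame threshold are all our own hypothetical choices, not part of any particular library.

```python
from collections import deque
from typing import Optional

class GestureDebouncer:
    """Commit to a gesture only after it has been reported for
    `hold_frames` consecutive frames, filtering out recognition jitter."""

    def __init__(self, hold_frames: int = 8):
        self.hold_frames = hold_frames
        self.history = deque(maxlen=hold_frames)  # most recent raw labels
        self.active: Optional[str] = None         # last confirmed gesture

    def update(self, raw_label: Optional[str]) -> Optional[str]:
        """Feed one per-frame prediction; return the gesture name on the
        frame where it becomes stable, otherwise None."""
        self.history.append(raw_label)
        if len(self.history) < self.hold_frames:
            return None                           # not enough evidence yet
        if raw_label and all(label == raw_label for label in self.history):
            if raw_label != self.active:          # newly confirmed gesture
                self.active = raw_label
                return raw_label
        elif not any(self.history):               # hand lost / no gesture seen
            self.active = None
        return None

# Hypothetical usage with a recognizer that emits "pinch", "swipe_left" or None each frame:
debouncer = GestureDebouncer(hold_frames=8)
for label in ["pinch"] * 10 + [None] * 10 + ["swipe_left"] * 10:
    confirmed = debouncer.update(label)
    if confirmed:
        print(f"confirmed gesture: {confirmed}")  # fires once per stable gesture
```

Tuning the threshold is a trade-off: too few frames lets jitter through, while too many makes the interface feel sluggish.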
Three main rules of NUI development:
- Proper metaphors for gestures and animations
- Well-thought-out error handling
- Product guidelines
And there is a fourth rule that we at the Chapps team would like to add: the client's active involvement and interest in developing alternative solutions is very important to us. Chapps clients don't wait to see what becomes a trend next year, because together we can create and set trends today.
If you like this article, please share it or at least click 👏👏👏👏; that will be the best sign for us to keep writing.
By the way, do you have any thoughts on what could be better than hand gesture control? Perhaps neural interfaces, which are already actively used to control prosthetics for people with disabilities. We would be glad to dive into this topic, come up with ideas, and prepare an article if you are interested.