What are Natural User Interfaces?

This article covers the basics of Natural User Interfaces (NUI). More articles about NUI are available here.

There is a quiet revolution happening in the world of computing, and it centers on how we interface with our machines. With the advent of more powerful CPUs, we now have the capability to reinvent how we interact with them. This trend will make our daily interactions with machines dramatically more efficient, and it has the potential to completely tear down the dichotomy between what we consider the real world and what we call the digital world. It will come to redefine, on a fundamental level, how we interact with our machines.

As an example, here I am using the Leap Motion controller to manipulate a block in 3D.

The History of User Interfaces

A high-level overview of the development of user interaction with computers, from command-line interfaces to natural user interfaces.

The evolution of user interfaces. Photo courtesy of Wikipedia.

The interface between humans and general computing machines has gone through three main stages. With each stage comes a shift to a more intuitive method of interacting with machines.

The CLI

The first paradigm was known as the Command-Line Interface, or CLI. The CLI was centered around a strict set of commands predetermined by the program’s creator. It had a steep learning curve because users had to memorize arbitrary keywords and parameters. According to Wikipedia:

In the CLI, users had to learn an artificial means of input, the keyboard, and a series of codified inputs, that had a limited range of responses, where the syntax of those commands was strict.

The GUI

With the advent of the mouse, the screen became more than a terminal that simply displayed data. It became something a little more intuitive: a navigable 2-dimensional plane. Because our brains exist in a physical world, they understand navigating 2- and 3-dimensional spaces on a much more fundamental level.

This led to the shift toward Graphical User Interfaces, or GUI. Again, Wikipedia:

The GUI relied on metaphors for interacting with on-screen content or objects. The ‘desktop’ and ‘drag’ for example, being metaphors for a visual interface that ultimately was translated back into the strict codified language of the computer.

The NUI

Finally, we come to natural user interfaces. The goal of NUI is to render the user interface effectively invisible. This is done by teaching machines to see us and understand our gestures and intent, which flattens the learning curve: the interaction simply feels like the right thing to do. For example, in the video I posted above, if you want to move an object, you simply pinch it and move it. If you want to scale in a direction, you pinch the handle and lift it up.

Examples of Natural User Interface Hardware

To conclude this article, I thought I might share with you some of the exciting new NUI technologies that are available on the market today. Note that this list is by no means exhaustive; please feel free to reach out to me with any I might have missed.

Leap Motion

The Leap Motion controller, showing that small does not mean weak.

The Leap Motion is an incredible hand-tracking hardware/software package. It offers an astonishing level of accuracy and an intuitive API, all packed into a device no bigger than a pack of gum. While playing with it myself, I was impressed with how intuitive it felt to interact with the computer with my hands. There was almost no learning curve: you simply reach your hands in and start manipulating the content.
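
To give a sense of how little code this takes, here is a minimal sketch of a pinch-to-move interaction using the leapjs JavaScript library. The #block element and the coordinate mapping are my own illustrative assumptions, not anything shipped with the device:

```javascript
// Minimal pinch-to-move sketch, assuming leap.min.js is loaded via a <script>
// tag and the page contains an absolutely positioned element with id="block".
Leap.loop(function (frame) {
  if (frame.hands.length === 0) return;

  var hand = frame.hands[0];

  // pinchStrength runs from 0 (open hand) to 1 (full pinch).
  if (hand.pinchStrength > 0.8) {
    // palmPosition is [x, y, z] in millimeters, relative to the controller.
    var x = hand.palmPosition[0];
    var y = hand.palmPosition[1];

    // Map the physical hand position onto the screen
    // (a rough, purely illustrative mapping).
    var block = document.getElementById('block');
    block.style.left = (window.innerWidth / 2 + x * 2) + 'px';
    block.style.top = (window.innerHeight - y * 2) + 'px';
  }
});
```

A higher pinchStrength threshold makes grabs more deliberate; 0.8 is just a reasonable starting point.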

I was so fascinated with this device that I made my own JavaScript library to allow for websites to be navigated by Leap Motion. The code can also be used to make a web browser extension to allow surfing any site with the Leap Motion. Feel free to check it out and contribute here.
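
To give a flavor of what gesture-driven browsing looks like, here is a minimal swipe-to-scroll sketch in the same spirit. This is an illustrative example built on the leapjs gesture API, not the library’s actual code:

```javascript
// Minimal swipe-to-scroll sketch, again assuming leap.min.js is on the page.
// Gesture recognition must be enabled explicitly in the loop options.
Leap.loop({ enableGestures: true }, function (frame) {
  frame.gestures.forEach(function (gesture) {
    if (gesture.type !== 'swipe' || gesture.state !== 'stop') return;

    // direction is a unit vector; index 1 is the vertical component.
    // Swipe up scrolls down, swipe down scrolls up (an illustrative choice).
    var vertical = gesture.direction[1];
    window.scrollBy(0, vertical > 0 ? 400 : -400);
  });
});
```

Triggering only on the 'stop' state fires the scroll once per completed swipe, rather than continuously while the gesture is in progress.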

Microsoft Kinect

Probably the most well-known NUI device on the market, the Microsoft Kinect is capable of both facial tracking and full-body tracking, and it can track up to 6 bodies simultaneously. Microsoft also gives you access to the low-level APIs, which has given rise to some awesome new uses for the device. One app, for example, used changes in facial temperature, detected via the Kinect’s infrared sensor, to infer the user’s heart rate.

Intel RealSense

The Intel RealSense is not so much a single device as a technology stack offered by Intel. It comprises several NUI devices that are a cross between the Kinect and the Leap Motion: they can do facial tracking, body tracking, and hand tracking. Additionally, the technology can be used for object detection and digital 3D scanning, and drones can use it for obstacle avoidance.

Conclusion

This nascent field is just at the inflection point of becoming a major industry. Heavy hitters like Microsoft and Intel are already in the game, and even Elon Musk is getting in. Look for natural user interfaces to become a huge influence on our lives.

About the Author: James Stephens

James is the founder of Code Vanguard and one of its developers. He is an applied mathematician turned computer programmer. His focuses are security, DevOps, automation, and Microsoft Azure.
