NUI First Thoughts
Jeff Han set the user-interface world on fire with his FTIR-based multi-touch technology presentation at the 2006 TED conference. The ease of use of a multi-touch interface was partly responsible for the unprecedented success of Apple’s iPad tablet. It was clear that “touch” was the next big thing in user interfaces. A few years later, there are plenty of clips on YouTube showing three-year-olds frustrated when they cannot “resize” images in a paper magazine. The next generation of computer users is declaring that the days of the mouse and keyboard as the primary means of interacting with computers are numbered.
Multi-touch-enabled devices are just one part of a class of human-computer interactions known as natural user interfaces (NUI). The RainyDayScience folks have followed the progress of various types of NUI research over the past few years with great interest (SixthSense, Kinect, NeuroSky). Some of that research has begun to move from the labs out to the marketplace. When Microsoft released the Kinect camera for its Xbox 360 platform, the device created a huge swell of interest in the NUI community. All kinds of devices quickly became Kinect-enabled.
Some companies have even started demonstrating gear that enables users to interact with the computer using their thoughts alone (NeuroSky, Emotiv). While these devices are intriguing, they still have quite a ways to go before they will be widely adopted. Closer to being useful are two devices (MYO, Leap Motion) scheduled for release in 2013.
The MYO armband detects the electrical signals from the wearer’s forearm muscles and uses them to control whatever the wearer wants: a computer, a phone, anything that can communicate wirelessly. From MYO’s demo, the armband seems to be an intuitive way for the wearer to effect remote actions. Our unanswered question is: how fine-grained is the control, and what does it actually take to map the gestures to the desired actions?
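To make that mapping question concrete, here is a minimal sketch of what a gesture-to-action layer might look like. Everything in it is hypothetical: MYO had not published an SDK at the time of writing, so the gesture names, the confidence threshold, and the dispatch scheme are purely our own illustration of the shape of the problem.

```python
# Hypothetical sketch of a gesture-to-action layer for an EMG armband
# like the MYO. None of these names come from a real SDK; they only
# illustrate what "mapping gestures to actions" might involve.

from enum import Enum


class Gesture(Enum):
    """Coarse gestures an EMG classifier might plausibly report."""
    FIST = "fist"
    SPREAD_FINGERS = "spread_fingers"
    WAVE_LEFT = "wave_left"
    WAVE_RIGHT = "wave_right"


def next_slide():
    print("advancing to the next slide")


def previous_slide():
    print("returning to the previous slide")


def pause_media():
    print("pausing playback")


# The interesting design work is in this table: every gesture the
# classifier can distinguish needs a deliberate, memorable binding.
ACTIONS = {
    Gesture.WAVE_RIGHT: next_slide,
    Gesture.WAVE_LEFT: previous_slide,
    Gesture.FIST: pause_media,
}


def dispatch(gesture, confidence, threshold=0.8):
    """Fire the bound action only when the classifier is confident;
    spurious muscle activity would otherwise trigger stray actions."""
    action = ACTIONS.get(gesture)
    if action is not None and confidence >= threshold:
        action()


# Simulated classifier output: (gesture, confidence) pairs.
for event in [(Gesture.WAVE_RIGHT, 0.93), (Gesture.FIST, 0.55)]:
    dispatch(*event)
```

The sketch also hints at why the fine-grain question matters: a lookup table like this only works if the classifier can reliably tell the gestures apart, and the number of distinguishable gestures caps the number of actions a wearer can trigger.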
The MYO band is innovative, but what really got us excited was the device from Leap Motion. The Leap Motion controller was announced more than a year ago, and developers have been waiting eagerly to get their hands on this intriguing little box. Those who have not seen the Leap Motion controller demo should definitely go here and take a look.
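For readers curious what programming against the controller might look like, below is a minimal polling loop using the Python bindings from Leap Motion’s developer SDK. The SDK is still pre-release, so treat this as a sketch based on our reading of the documentation rather than a definitive example; names such as `Controller`, `frame()`, and `palm_position` could change before the device ships.

```python
# Minimal sketch of polling the Leap Motion controller for hand
# positions, based on the pre-release Python SDK. Names may change
# before the final release.
import sys
import time

import Leap  # Python bindings shipped with the Leap Motion SDK


def main():
    controller = Leap.Controller()

    # Give the controller a moment to connect to the tracking service.
    time.sleep(1)
    if not controller.is_connected:
        sys.exit("no Leap Motion controller found")

    # Poll the most recent tracking frame ten times a second.
    for _ in range(50):
        frame = controller.frame()
        for hand in frame.hands:
            pos = hand.palm_position  # millimeters, origin at the device
            print("hand %d palm at (%.0f, %.0f, %.0f)"
                  % (hand.id, pos.x, pos.y, pos.z))
        time.sleep(0.1)


if __name__ == "__main__":
    main()
```

Even this trivial loop shows what is new here: the controller reports full 3D positions in millimeters, not 2D cursor coordinates, which is what makes the “any surface as a display” scenarios below worth thinking about.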
Some people have pooh-poohed the practicality of waving their hands around as a way to interact with the computer instead of moving a mouse. We equate them with those who questioned the need to advance from a command-line interface to a GUI. In thinking about natural-user-interface designs, we should not limit ourselves to considering only how the interaction would work in front of a monitor. What if there were no dedicated monitor? How would we interact when ANY surface might be a potential display (mirror, desk, wall, window, etc.)?
We see the Leap Motion controller as a definite threat to the niche currently dominated by Wacom. The Cintiq graphics tablet is Wacom’s flagship product; it enables a graphic artist to work directly on the screen, a feel that is much more natural and direct. The only drawback is the cost: at $2,500, the Cintiq is a tool affordable only to professionals. The Leap Motion controller may completely change that market, but only if certain current limitations are addressed. How should Wacom respond to Leap Motion’s entry into the market? We have some ideas, but that is an article for another day.