Controlling our computing devices with the wave of a hand or the wink of an eye is rapidly moving from science fiction to science fact, says Intel's human-machine interface guru Mooly Eden.
Eden is senior vice president and general manager of Intel's Perceptual Computing Group. Perceptual computing is the term the silicon giant uses to describe its pursuit of what is sometimes called the Natural User Interface: breaking down the barriers between users and their devices, and advancing the human-machine interface beyond keyboards and touchscreens to fully embrace natural voice and gesture controls. Eden's job is to envision new ways for us to interact with the technology around us, and then bring those visions to life.
At the heart of Eden's Perceptual Computing work at Intel is a new generation of 3D cameras with two lenses side-by-side to offer accurate depth perception. These allow the computer to scan the three-dimensional space in front of it, letting users reach into that space and interact with virtual objects rather than simply wave their hands at the camera. The camera also uses lasers to track and model real-world objects.
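The depth trick behind a two-lens camera is classic stereo vision: the same point appears shifted between the left and right images (the "disparity"), and distance falls out of similar triangles. A minimal sketch of that relationship, using illustrative numbers rather than the actual specifications of Intel's cameras:

```python
# Stereo depth from disparity: depth = focal length x baseline / disparity.
# The focal length, baseline and disparity below are made-up illustrative
# values, not figures from Intel's RealSense hardware.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return depth in metres for a point shifted disparity_px between views."""
    return focal_px * baseline_m / disparity_px

# A point shifted 60 px between lenses 5 cm apart, with a 600 px focal length,
# sits half a metre from the camera:
print(depth_from_disparity(600, 0.05, 60))  # 0.5
```

The nearer an object is, the larger its disparity, which is why this approach works best at the arm's-length ranges Perceptual Computing targets.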
The other key component of Intel's Perceptual Computing platform is improved speech recognition, supplied by the next-generation Nuance Dragon Assistant. The aim is to improve accuracy and natural language comprehension, which Eden says is particularly challenging with his own thick Israeli accent.
While Moore's Law has seen computing power continue to grow, advances in human-machine interfaces haven't kept pace, says Eden, speaking at the recent Consumer Electronics Show in Las Vegas.
"When you come over to speak to me, you don't need an instruction book. You naturally know what to do. But if you want to interact with any computing device, it's not that simple," Eden says.
"For the human-machine interface to really work it needs to be natural. I want it to be multimodal so I can use my different senses. When I speak to you, you listen to me but you don't close your eyes. You're using more than one sense to interact with me and this should be the same with the computer. The goal is for it to be immersive, for that line between the computer and the user to become blurred."
Making PCs more human-friendly
One of Eden's heroes is renowned science fiction author Isaac Asimov, but Perceptual Computing is much more than a pipe dream. Intel unveiled a $100 million Perceptual Computing fund at Computex in June last year and recently announced the winners of a $1 million developer challenge to create applications for the technology.
Intel's Perceptual Computing platform, dubbed "RealSense", will ship in the second half of this year. The technology can't work with existing traditional webcams, but new three-dimensional cameras will be embedded in tablets, ultrabooks, notebooks and desktop all-in-ones from partners such as Acer, Asus, Dell, Fujitsu, HP, Lenovo and NEC. Intel also has long-term plans to extend the RealSense technology to Android devices, Eden says.
Even a company the size of Intel can't carry the entire platform on its own shoulders. It's keen to build up the RealSense ecosystem, releasing a Software Development Kit and collaborating with the likes of 3D Systems, Autodesk, DreamWorks, Metaio, Microsoft's Skype and Lync, Scholastic and Tencent.
A hands-on demonstration of RealSense offers a taste of its potential, even if many of the early applications are games designed to showcase the technology. One game shows the RealSense camera's view of the player in front of the computer and lets you juggle virtual balls on the screen. Because the camera perceives depth, the computer monitors the three-dimensional space in front of it: to interact with the balls you must reach forward into that space.
On the screen a blue glow appears around your hands as you reach through the edge of the field into the space where you can interact with the balls, making it easier to judge your distance from the virtual objects supposedly in front of you.
"It's one of the potential feedback mechanisms that we're working with, to improve your interaction with the computer," Eden says.
"There are also haptic feedback options, things you can feel to tell you what's happening. For example, air blowing on your fingers as you touch an object. The whole idea is to engage your various senses and make the interaction with the computer as natural and intuitive as possible."
Other demonstrations include interactive storybooks which use voice recognition to listen to people reading aloud and automatically turn the pages on the screen. Readers can also interact with characters on the screen using voice and virtual touch. Other applications include playing virtual musical instruments by striking and strumming them, once again within a three-dimensional space which places some virtual objects closer to the user than others.
Consumer demand rather than business demand will drive early adoption of Perceptual Computing, Eden says, as with many other new technologies in recent years. In terms of business use, he doesn't envision RealSense being baked into Microsoft Windows, but he does see it being a ubiquitous background application with APIs open to all.
Another demonstration involves navigating Windows 8's Modern UI tablet-style menus using gestures, grasping and moving tiles and then physically pressing them to launch applications. Partners such as AutoCAD developer Autodesk are working on ways for users to interact with their designs in three-dimensional space.
The RealSense camera's depth perception also allows it to easily separate people from backgrounds during video conferences. Eden sees this as a key selling point for business users relying on video conferencing and other collaboration tools. The ability to strip away backgrounds reduces bandwidth requirements and also reduces the amount of room taken up on the screen, leaving more room for desktop or document sharing.
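Once every pixel carries a distance, separating a person from the background can be as simple as keeping only the pixels nearer than a cutoff. The sketch below illustrates that idea with a synthetic frame and NumPy; the function name, threshold and frame values are assumptions for illustration, not Intel's SDK, which exposes depth streams through its own APIs.

```python
import numpy as np

def remove_background(color, depth, max_depth_mm=1000):
    """Keep only pixels closer than max_depth_mm; black out the rest."""
    mask = depth < max_depth_mm          # True where the nearby subject is
    result = color.copy()
    result[~mask] = 0                    # zero out distant background pixels
    return result, mask

# Synthetic 4x4 frame: a "person" at ~600 mm against a wall at ~2500 mm.
depth = np.full((4, 4), 2500, dtype=np.uint16)
depth[1:3, 1:3] = 600
color = np.full((4, 4, 3), 200, dtype=np.uint8)

foreground, mask = remove_background(color, depth)
print(mask.sum())                # 4 foreground pixels survive
print(foreground[0, 0].tolist()) # a background pixel is now [0, 0, 0]
```

Only the surviving foreground pixels need to be encoded and transmitted, which is where the bandwidth and screen-space savings Eden describes come from.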
"I believe that this technology will be pervasive," Eden says.
"Today you cannot imagine a computing device without a 2D camera, but a few years from now, when the human-machine interface has improved, you'll look back at today's old computers as primitive. Did I really use a keyboard? I believe we're on the verge of very big changes."