It's probably not surprising to any close follower of technology that the Ultrabook form factor snagged center stage at Intel's press briefing earlier today. Taking a page from Apple's MacBook Air, which is based on Intel CPUs, and hoping to steal the thunder from Apple and others' tablets, most of which are not currently based on Intel CPUs, Intel plans to put its money where its mouth is with a substantial marketing campaign (akin to past "Intel Inside" and "Centrino" branding programs) set to begin later this spring. And the always-entertaining spokesperson Mooly Eden (who's the VP and general manager of the PC Client Group) apparently did a good job with the pitch, judging from the plentiful post-event reports that I perused (for the first time in nearly two decades, I skipped Las Vegas this year).
Among other things, Eden showcased the diversity of new user interface options that Intel's systems partners may choose to include in the dozens of Ultrabook makes and models scheduled to hit the market this year. These emerging UI candidates included:
- Touchscreens
- Siri-reminiscent voice recognition
- Accelerometer- and gyroscope-enabled motion sensing, and
- Most relevant to this audience, webcam-supported gesture control (along with, presumably, facial-recognition unlock, user account login, etc.)
That someone demo'd gesture interfaces for laptop computers at CES isn't terribly surprising to me. After all, just two months back I pointed out the first production cell phone implementation of the concept, and more generally I've mentioned on numerous occasions that developers would inevitably find lots of additional interesting uses for the front-facing image sensors initially included in systems for videoconferencing functions. And I agree with Eden that applications like Google Earth (along with, say, PowerPoint in slide show mode) beg for gesture-implemented manipulation.
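To make the idea concrete, here's a minimal sketch of the sort of webcam-based swipe detection a presentation app could use. To be clear, this is not Intel's, So Touch's, or any vendor's actual implementation; it's an illustrative example assuming Python with OpenCV, using simple frame differencing, and the threshold values are arbitrary assumptions you'd tune in practice.

```python
# Hypothetical sketch: detect left/right hand swipes from a webcam via frame
# differencing, the kind of gesture a slideshow could map to previous/next slide.
# Not any vendor's implementation; thresholds are illustrative assumptions.
import cv2

cap = cv2.VideoCapture(0)      # default webcam
prev_gray = None
prev_cx = None                 # previous x-coordinate of the motion centroid
SWIPE_THRESHOLD = 60           # pixels of horizontal travel between frames (assumption)
MIN_MOTION_MASS = 5000         # minimum amount of moving pixels to count as a gesture (assumption)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)

    if prev_gray is not None:
        # Difference against the previous frame to isolate moving pixels
        diff = cv2.absdiff(prev_gray, gray)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        moments = cv2.moments(mask)
        if moments["m00"] > MIN_MOTION_MASS:
            cx = int(moments["m10"] / moments["m00"])   # centroid x of the motion
            if prev_cx is not None:
                dx = cx - prev_cx
                if dx > SWIPE_THRESHOLD:
                    print("swipe right -> e.g., next slide")
                elif dx < -SWIPE_THRESHOLD:
                    print("swipe left -> e.g., previous slide")
            prev_cx = cx
        else:
            prev_cx = None

    prev_gray = gray
    cv2.imshow("gesture preview", frame)
    if cv2.waitKey(30) & 0xFF == 27:   # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

A production system would of course use something far more robust (hand segmentation, tracking, or a depth sensor a la Kinect), and would feed recognized gestures to the application as keystrokes or OS-level events rather than printing them; the point is simply that the front-facing camera already on these machines is enough raw material to get started.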
But here's what baffles me a bit: from the accounts I read, Eden apparently positioned computer-based gesture control as a revolutionary, never-before-seen concept. However, those of you who've followed the professional activities of EVA founder Jeff Bier might remember that back in early June 2011 at the Design Automation Conference, he memorably demonstrated gesture control on his laptop in conjunction with a hacked Kinect and a copy of So Touch Air Presenter.
Check out the video evidence for yourself here.