In mid-January, I published the most recent in a series of writeups on Tobii's eye tracking technology for controlling user interfaces, followed a few days later by the upload of a product demonstration video captured by EVA Founder Jeff Bier at the Consumer Electronics Show. Tobii's not the only company working on the technology, of course, and Samsung is taking a unique tack in open-sourcing its software's source code and accompanying documentation. From coverage at The Verge:
EyeCan was developed by five members of Samsung's Creativity Lab, and was built with the purpose of helping those who are paralyzed from a disease like ALS control a computer through eye tracking. They've been testing it and posting videos of it in use on YouTube over the last few months (not surprisingly, one of those videos showed off a game of Angry Birds) and now the team is ready to release the software and documentation behind it for anyone to develop their own solution. Additionally, the team plans to distribute "D.I.Y kits" to people in need in South Korea for people to assemble their own hardware.
Like the previous writeup I published today, this topic has personal relevance…even more so, actually. My father died of ALS several years ago after a lengthy decline, and as such, breakthroughs intended to improve the quality of life of those stricken with diseases such as his resonate strongly with me. Kudos to Samsung for its accomplishment, and for its empowerment of the masses to drive development forward.