Friday, September 28, 2012


eyeCan is a project to develop a low-cost, open-source eye-mouse for people with severe motor disabilities. The project was initiated based on the observation that no suitable technology was accessible in Korea, a critical gap preventing people with such disabilities from communicating with others. This was mainly due to high prices and the lack of a supply chain, meaning the problem was not merely a technical challenge. My team and I therefore set out to put both technical and social effort into the project, ultimately presenting realistic solutions to a long-standing quality-of-life problem for paralyzed people.

Not only did eyeCan make a significant contribution to the field of assistive technology, it also triggered social action. We gave public talks to inspire larger communities and to seek further support for eyeCan's development and distribution. Finally, in collaboration with the Korea Disabled People's Development Institute (KODDI), we established a full-scale service for distributing eyeCan and training users. The institute will build eyeCan units and install them for more than 200 patients in 2012 - 2013. As an example of the developer movement, a voluntary group of designers named Design Dive started a non-profit project to develop the next version of the eyeCan UX.


Goals

  • An open-source eye-input device that can be built on a budget of $50
  • Ease of use
  • A service ecosystem for people in need in South Korea

Design Principles

Engaging People Locked in Their Bodies in a Profound User Experience
Eye gaze has long been a popular input cue, but existing techniques have failed to encompass the full variety of possible interactions. Different eye gestures (rolling, blinking, and gaze-pointing) can be combined to produce a much more sophisticated interaction language than is currently available.

I pinpointed this possibility and created the Programmable Gaze Input System, which makes use of these virtually infinite variations, allowing users to perform complex tasks such as playing Super Mario Bros. or freely walking through Google Street View simply by looking around the computer screen. The users, whose only gestures in the real world were their eye movements, experienced a profound empowerment in the virtual world through the eyeCan device.

eyeCan's software consists of two parts: an eye-tracking layer (eye-tracker, action-decoder) and a controller layer (controller logic, controller UI). The eye-tracker and action-decoder analyze the user's eye gestures, then dispatch pre-defined events to the controller currently selected by the 'Controller Switch'. Controllers provide the logic that turns eye gestures into actual events such as keyboard and mouse input, or any other form of computer task.
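The two-layer split above can be sketched as follows. The class names, event names, and handler behavior here are illustrative assumptions, not eyeCan's actual API: the point is only that the decoding layer emits pre-defined events, and a switch routes them to whichever controller is active.

```python
# Sketch of eyeCan's two-layer design (all names are hypothetical).
# The action-decoder turns gaze gestures into pre-defined events and hands
# them to whichever controller the Controller Switch currently selects.

class MouseController:
    """Controller layer: maps decoded eye events to computer input."""
    def __init__(self):
        self.log = []  # stands in for real OS-level mouse calls

    def handle(self, event):
        if event == "GAZE_MOVE":
            self.log.append("move pointer")
        elif event == "BLINK":
            self.log.append("left click")

class ControllerSwitch:
    """Routes decoded events to the currently selected controller."""
    def __init__(self, controllers):
        self.controllers = controllers
        self.active = next(iter(controllers))

    def select(self, name):
        self.active = name

    def dispatch(self, event):
        self.controllers[self.active].handle(event)

# Eye-tracking layer (stubbed): decoded events flow into the switch.
switch = ControllerSwitch({"mouse": MouseController()})
for event in ["GAZE_MOVE", "BLINK"]:   # events from the action-decoder
    switch.dispatch(event)
```

The benefit of this split is that new controllers (an on-screen keyboard, a game pad, an alarm) can be added without touching the eye-tracking layer at all.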

What can I do with eyeCan?

(1) Control Computer Mouse
Users can move the mouse pointer just by looking at the computer screen, and can perform mouse clicks, dragging, and scrolling by blinking.
(2) Perform Simple Keyboard input or Mouse gestures
eyeCan detects simple eye gestures (e.g., blinking, rolling) and can accordingly trigger simple computer input. One of our users employs eyeCan as an emergency alarm: a long eye-blink is detected as an emergency signal and triggers a pre-loaded alarm sound to call someone for help.
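The emergency-alarm use above boils down to timing how long the eye stays closed. A minimal sketch, assuming per-frame eye states from a webcam; the frame rate and the 2-second threshold are made-up values, not eyeCan's actual settings:

```python
# Hypothetical long-blink detector: given per-frame eye states sampled at a
# fixed rate, flag blinks that last longer than a threshold.

LONG_BLINK_SECONDS = 2.0   # assumed threshold; in practice user-tuned
FRAME_RATE = 30            # assumed webcam frame rate (frames per second)

def detect_long_blinks(eye_closed_frames):
    """Yield the frame index where each long blink ends.

    eye_closed_frames: iterable of booleans, True while the eye is closed.
    """
    run = 0
    for i, closed in enumerate(eye_closed_frames):
        if closed:
            run += 1
        else:
            if run / FRAME_RATE >= LONG_BLINK_SECONDS:
                yield i          # long blink just ended -> raise the alarm
            run = 0

# A normal blink (6 frames, 0.2 s) followed by a long one (90 frames, 3 s):
frames = [False] * 10 + [True] * 6 + [False] * 10 + [True] * 90 + [False] * 5
alarms = list(detect_long_blinks(frames))   # only the 3 s blink qualifies
```

Distinguishing by duration is what keeps ordinary blinking from setting off the alarm.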
(3) Perform Complex Combination or Sequence of Computer Input
eyeCan utilizes combinations of eye gestures to generate complicated computer input. For example, with eyeCan, users can play Super Mario Bros. or walk through Google Street View just by looking around the computer screen. Based on the eye's rolling, blinking, and gaze-pointing, users can perform a detailed set of combinations or sequences of keyboard and mouse events.


Public Talks

Samsung's eyeCan + Eyewriter: Co-writing a Recipe for Collaboration, May 1 2013 @Seoul Digital Forum

Science Talk Show, Nov 10 2012, @Samsung Electronics

Soh-tong rak-seo - Passion Talk for Samsung Electronics Employees, June 27 2012, @Samsung Electronics

eyeCan: a story about a technology that changes lives, May 24 2012 @Busan tech+forum (Busan Lotte Hotel)

Dream to be a technology wizard in Korea, Apr 28 2012 @TEDxEonjuro (Severance Hospital)

eyeCan: the project story, Dec 23 2011 @Samsung Medical Center (SMC)

Developing a DIY eye-mouse for ALS patients, Aug 16 2011 @TEDxSamsung Conference (COEX)

Comment from a User's Family

"He repeatedly said that he wanted to die before he tried eyeCan, since he couldn't do anything without the help of another person. After he started using eyeCan, he stopped wishing to die, and he is now well trained enough to send e-mails, both for business and to his family."
The first letter he wrote with eyeCan, to his children.

Selected Media

The Verge, Samsung releases source code for eyeCan, an eye-controlled mouse for the disabled, Feb. 23, 2012
The Korea Herald (1st page), Samsung develops eye-controlled mouse, Feb. 24, 2012
Futura Science, EyeCan, souris à commande oculaire de Samsung, en open source, Mar. 1, 2012
TED Active 2012 Blog, Developments on Mick Ebeling’s EyeWriter, Mar. 6, 2012
Wall Street Journal Blog, Samsung Develops Low-Cost “Eye Mouse”, Mar. 22, 2012


Patents

[1] "Control apparatus connected with a plurality of display apparatus and Method for controlling a plurality of display apparatus, and Display apparatus control system thereof" KR, P2012-0078305
[3] "Method for providing contents and Display apparatus thereof" KR, P2012-0078309

How to make one?

#1. Opening up a webcam
#2. Removing the IR filter
#3. Inserting a sheet of film
#4. Connecting LEDs and adding a resistor
#5. Connecting the LED assembly (#4) to the cam
#6. Fixing the LEDs in place with a glue gun
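Step #4 needs a current-limiting resistor for the IR LEDs, which is a one-line Ohm's-law calculation. The figures below (USB 5 V supply, ~1.4 V forward drop, 50 mA target current) are typical assumptions for illustration; the right values depend on the LEDs actually used:

```python
# Sizing the current-limiting resistor: R = (Vs - Vf) / I  (Ohm's law).
# All values are illustrative assumptions, not eyeCan's spec.

V_SUPPLY = 5.0     # volts, USB power
V_FORWARD = 1.4    # volts, typical IR LED forward drop
I_TARGET = 0.05    # amps (50 mA)

resistance = (V_SUPPLY - V_FORWARD) / I_TARGET   # ohms needed in series
power = (V_SUPPLY - V_FORWARD) * I_TARGET        # watts the resistor dissipates
```

With these numbers the result is 72 Ω dissipating about 0.18 W, so the nearest standard value (e.g. 75 Ω, 1/4 W) would do; LEDs wired in series share the supply voltage and change the arithmetic accordingly.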

May. 2011 ~ Apr. 2012
@Creativity Lab, Samsung Electronics