Automatic Eye Finder & Tracking System

The intrinsic speed and precision of the human ocular-motor system make the eye an ideal pointing device for human-machine interfaces. Eye tracking has not yet been exploited as a robust computer interface because of several technical problems: viewing a human face under real-world conditions, robustly identifying the eyes across a diverse population, and the complexity of the real-time image processing involved. Conventional high-accuracy (<1° of visual angle) eye tracking systems require the presence of an operator and/or require the user to use a chin rest or wear an identifying marker so the eye tracking system can find the user's eyes.

We have developed an autonomous eye tracker that automatically finds and accurately tracks both eyes of computer users and calculates their precise point of gaze as they naturally view a computer screen. Our system permits users roughly a cubic foot of head motion. Point-of-gaze accuracy is measured at 0.25 to 0.5 degree of visual angle over the user's ±40° field of view.
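
To put that accuracy in concrete terms, an angular error of θ at a viewing distance d corresponds to an on-screen error of roughly d · tan(θ). The short sketch below, which assumes a mid-range 24" viewing distance purely for illustration, converts the quoted 0.25° and 0.5° figures into linear distances on the screen:

    // Rough conversion of angular gaze error to on-screen distance.
    // The 24" viewing distance is assumed for illustration; the system
    // supports users seated roughly 18" to 30" from the display.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double distance_in = 24.0;              // assumed viewing distance
        const double errors_deg[] = {0.25, 0.5};      // quoted accuracy range
        for (double err_deg : errors_deg) {
            double err_in = distance_in * std::tan(err_deg * M_PI / 180.0);
            std::printf("%.2f deg -> %.2f in (%.1f mm) on screen\n",
                        err_deg, err_in, err_in * 25.4);
        }
        return 0;
    }

At that distance, 0.5° works out to about 5 mm on the screen, roughly the scale of a small on-screen control.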

Our functioning prototype consists of a face and eye image acquisition assembly mounted in front of and below a standard computer monitor. This imaging unit is connected to real-time parallel hardware processors residing in a PC chassis. (See Figure 1.) The imager comprises a high-resolution sensor array, a wide-field-of-view infrared illuminator, and optics that deliver a clear, in-focus image of the user's face. The user can be seated anywhere from 18" to 30" from the display. (See Figure 2.)
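
A quick geometric check shows why a wide field of view is needed: covering the roughly one-foot-wide head box at the near 18" seating distance requires a horizontal field of view of about 37°. The sketch below uses the head-box and seating figures quoted above; placing the camera at the screen plane is a simplifying assumption for illustration.

    // Approximate horizontal field of view needed to keep a ~12"-wide
    // head box in frame at the near 18" working distance. Camera placement
    // at the screen plane is a simplifying assumption for illustration.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double head_box_width_in = 12.0;   // ~1 cubic foot of head motion
        const double near_distance_in  = 18.0;   // closest seating distance
        double half_angle = std::atan((head_box_width_in / 2.0) / near_distance_in);
        std::printf("Required horizontal FOV: about %.0f degrees\n",
                    2.0 * half_angle * 180.0 / M_PI);
        return 0;
    }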

The face image is input to real-time custom parallel image processing hardware that automatically detects the presence of the user through skin-tone and facial-feature analysis. The user's face position is calculated and used to direct high-resolution eye insets. The eye images are fed into real-time eye-tracking image-processing hardware, and control software coordinates the face- and eye-position data. The real-time eye-position data drives a mathematical model of the user's point of gaze, which is superimposed on a video representation of the computer screen being viewed by the user.
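
The control flow in the paragraph above can be summarized in a short C++ sketch. All of the types and functions below are hypothetical stand-ins for the custom parallel image-processing hardware and its software interfaces; the sketch only captures the sequence face detection → eye insets → gaze model → on-screen overlay.

    // Hypothetical per-frame control flow for the face -> eye-inset -> gaze
    // pipeline described above. All types and functions are illustrative
    // stand-ins, not the actual hardware or software interfaces.
    #include <cstdio>
    #include <vector>

    struct Frame { std::vector<unsigned char> pixels; int width = 0, height = 0; };
    enum class Eye { Left, Right };

    struct FacePosition { double x = 0, y = 0; bool found = false; };
    struct EyeInset     { Frame crop; };              // high-resolution crop around one eye
    struct GazePoint    { double screen_x = 0, screen_y = 0; };

    // Skin-tone and facial-feature analysis (placeholder).
    FacePosition detectFace(const Frame&) { return {320, 240, true}; }

    // Direct a high-resolution eye inset toward the computed face position (placeholder).
    EyeInset extractEyeInset(const Frame& f, const FacePosition&, Eye) { return {f}; }

    // Mathematical point-of-gaze model driven by both eye images (placeholder).
    GazePoint estimateGaze(const EyeInset&, const EyeInset&) { return {512, 384}; }

    void processFrame(const Frame& face_image) {
        FacePosition face = detectFace(face_image);   // is a user present?
        if (!face.found) return;
        EyeInset left  = extractEyeInset(face_image, face, Eye::Left);
        EyeInset right = extractEyeInset(face_image, face, Eye::Right);
        GazePoint gaze = estimateGaze(left, right);
        // Superimpose the point of gaze on a video representation of the screen.
        std::printf("gaze at screen coordinates (%.0f, %.0f)\n", gaze.screen_x, gaze.screen_y);
    }

    int main() {
        processFrame(Frame{});                        // single illustrative frame
        return 0;
    }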

The real-time point-of-gaze information can be used as a control input for "Look and Click" applications, to speed up conventional mouse data-entry tasks, as an input device for individuals with neuromuscular impairments, and for rapid target selection in moving environments. Many other applications are expected to emerge during beta testing of this input device.
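
As a concrete illustration of how "Look and Click" could behave on the host side, one common approach (not necessarily the one used in this system) is a dwell-time trigger: a click is issued once the gaze has stayed within a small radius of a point for a fixed interval. A minimal sketch, with the 30-pixel radius and 500 ms dwell time chosen arbitrarily:

    // Minimal dwell-click sketch for a "Look and Click" interaction.
    // The 30-pixel radius and 500 ms dwell time are illustrative values,
    // not parameters of the system described above.
    #include <chrono>
    #include <cmath>
    #include <cstdio>
    #include <optional>

    struct GazeSample { double x, y; std::chrono::steady_clock::time_point t; };

    class DwellClicker {
    public:
        // Returns true when the gaze has dwelled long enough to issue a click.
        bool update(const GazeSample& s) {
            if (!anchor_ || distance(*anchor_, s) > radius_px_) {
                anchor_ = s;                          // gaze moved: restart the dwell timer
                return false;
            }
            if (s.t - anchor_->t >= dwell_time_) {
                anchor_.reset();                      // issue one click, then re-arm
                return true;
            }
            return false;
        }
    private:
        static double distance(const GazeSample& a, const GazeSample& b) {
            return std::hypot(a.x - b.x, a.y - b.y);
        }
        std::optional<GazeSample> anchor_;
        const double radius_px_ = 30.0;               // assumed fixation radius
        const std::chrono::milliseconds dwell_time_{500};  // assumed dwell time
    };

    int main() {
        DwellClicker clicker;
        auto t0 = std::chrono::steady_clock::now();
        clicker.update({100, 100, t0});               // gaze lands on a target
        bool clicked = clicker.update({102, 101, t0 + std::chrono::milliseconds(600)});
        std::printf("click issued: %s\n", clicked ? "yes" : "no");
        return 0;
    }

Each real-time gaze sample would be fed through update(), with the return value driving the click event.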

The system works with users wearing eyeglasses and/or contact lenses. It has been tested successfully with users of light to dark skin tones.

The electronics hardware has been designed for high-volume manufacturability. Because the system has no moving mechanical components, it can be manufactured at low cost in high volume. All of the tracking and control electronics are designed to be fabricated on a custom ASIC that communicates with the host platform via USB 3.0.
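
To give a sense of what the ASIC-to-host USB 3.0 link might carry, the struct below sketches one plausible fixed-size gaze report. The field layout is purely hypothetical and is not taken from the actual design.

    // Purely hypothetical layout for a fixed-size gaze report streamed
    // from the tracking ASIC to the host over USB 3.0; none of these
    // fields are taken from the actual design.
    #include <cstdint>

    #pragma pack(push, 1)
    struct GazeReport {
        std::uint32_t frame_number;   // monotonically increasing frame counter
        std::uint16_t gaze_x;         // point of gaze, screen pixels
        std::uint16_t gaze_y;
        std::uint16_t face_x;         // face position in the wide-field image
        std::uint16_t face_y;
        std::uint8_t  flags;          // e.g. bit 0: user present, bit 1: both eyes tracked
        std::uint8_t  reserved;
    };
    #pragma pack(pop)

    static_assert(sizeof(GazeReport) == 14, "expected a fixed-size report on the wire");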

Awards

  • 2014 Machinery/Automation/Robotics Category Winner
  • 2014 Top 100 Entries

ABOUT THE ENTRANT

  • Name: Rikki Razdan
  • Type of entry: team
  • Team members: Rikki Razdan, Alan Kielar, Pat Stearns, Melissa White
  • Software used for this entry: Linux, C++
  • Patent status: pending