As cities grow smarter and services become automated, human-computer interaction is an indispensable part of everyday life. Because the most popular interaction media are the keyboard and mouse, modern software interfaces are designed to be operated by hand, which renders them unusable for people with hand disabilities. In this paper review, we discuss intrusive and non-intrusive real-time interaction technologies that rely on eye and face movements to control a mouse cursor. Using off-the-shelf hardware, open source libraries, and an ordinary processor, these papers achieve real-time eye-based cursor control. Although commercial eye trackers are available on the market, they are not practical for home use because of either high cost (x000$) or difficult setup procedures. We therefore need to explore systems built from cheap, readily available components such as webcams, IR LEDs, and goggles, with open source software that lets any user calibrate and implement their own system.
All images shown above are from [1].
Evaluation of a tracker
Many open source eye trackers are available, each with different specialities. For example, the ITU Gaze Tracker [1] can be run as a mobile, offline system that packs all processing equipment into a backpack, allowing the user to carry it anywhere. Such a system can be used to identify gaze patterns in scenarios such as driving, shopping, or painting, and hence in research aiming to classify how those patterns change with learning. For example, an experiment in [4], in which a novice and an experienced driver each drove a car, concluded that as driving skill improves, gaze shifts toward empty spaces on the road, whereas a novice driver looks at the cars. The ITU Gaze Tracker, an open source low-cost system developed at the IT University of Copenhagen, was released in 2009. The intrusive variant consists of a head-mounted webcam with built-in infrared emitters. In [2], the authors rebuilt the system and tested it on two typing applications (GazeTalk and StarGazer).
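Gaze-pattern analysis of the kind described above typically begins by grouping raw gaze samples into fixations, so that one can ask where a driver's gaze lands. As a rough illustration, here is a minimal dispersion-threshold (I-DT style) sketch; the threshold values, the simplified window handling, and the toy trace are assumptions for illustration, not details taken from the papers reviewed.

```python
def detect_fixations(samples, max_dispersion=0.05, min_count=4):
    """Group consecutive (x, y) gaze samples into fixations.

    A window of samples counts as a fixation while its x-range plus
    y-range stays within max_dispersion; windows of >= min_count
    samples are emitted as their centroid (simplified I-DT).
    """
    def centroid(pts):
        return (sum(p[0] for p in pts) / len(pts),
                sum(p[1] for p in pts) / len(pts))

    fixations, window = [], []
    for pt in samples:
        window.append(pt)
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            # Dispersion exceeded: close the window (minus the new point).
            if len(window) - 1 >= min_count:
                fixations.append(centroid(window[:-1]))
            window = [pt]  # start a new window from the current sample
    if len(window) >= min_count:
        fixations.append(centroid(window))
    return fixations

# Toy trace: a fixation near (0.5, 0.5), a saccade, then one near (0.9, 0.2).
trace = [(0.50, 0.50), (0.51, 0.50), (0.50, 0.51), (0.51, 0.51), (0.50, 0.50),
         (0.90, 0.20), (0.91, 0.21), (0.90, 0.20), (0.91, 0.21), (0.90, 0.20)]
fixations = detect_fixations(trace)
print(fixations)
```

With two such centroid streams (one per driver), the novice-versus-expert comparison reduces to checking which scene regions the fixations fall into.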
The aim of the experiment was to present a system:
API to interface trackers with software : Another obstacle to the adoption of gaze-tracking technology is the absence of standard APIs for communication between gaze trackers and software. Until now, most gaze trackers have shipped with their own custom-built software, or vice versa. In [3], a standard API is presented that abstracts the communication layer and provides sockets for control parameters such as left POG (point of gaze), right POG, screen size, and camera size, along with handles for calibration and configuration. Although this API does not speed up gaze tracking in any way, it makes it easier to write applications that use open gaze trackers, allowing faster adoption in developer communities. The API operates on a client-server model in which the tracker acts as the server and applications act as clients, so many applications can simultaneously consume the tracker's data stream.
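To illustrate the client-server model, here is a minimal sketch of a tracker "server" streaming gaze records to an application "client" over a loopback TCP socket. The line-delimited record format and the `LPOGX`/`LPOGY` field names are illustrative assumptions, not the actual wire format of the API in [3].

```python
import re
import socket
import threading

def tracker_server(srv_sock):
    # Accept one client and stream a few left-POG records, then close.
    conn, _ = srv_sock.accept()
    with conn:
        for x, y in [(0.25, 0.40), (0.26, 0.41), (0.27, 0.42)]:
            conn.sendall(f'<REC LPOGX="{x}" LPOGY="{y}"/>\r\n'.encode())

def parse_record(line):
    # Turn one attribute="value" record line into a dict of floats.
    return {k: float(v) for k, v in re.findall(r'(\w+)="([^"]+)"', line)}

# Server side: bind to an ephemeral loopback port and serve in a thread.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
threading.Thread(target=tracker_server, args=(srv,), daemon=True).start()

# Client side: connect, read line-delimited records, parse each one.
client = socket.create_connection(srv.getsockname())
buf, records = b"", []
while len(records) < 3:
    data = client.recv(1024)
    if not data:
        break
    buf += data
    while b"\r\n" in buf:
        line, buf = buf.split(b"\r\n", 1)
        records.append(parse_record(line.decode()))
client.close()
srv.close()
print(records)
```

Because the tracker only writes records to whichever sockets are connected, additional client applications can attach to the same server without interfering with one another, which is the property the client-server split buys.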
Conclusion