May 2014
I worked with the Arrington Research EyeFrame eyetracker with scene camera for about 9 months. The eyetracker was integrated into a Sony HMZ-T1 and used to operate on-screen menus and buttons via eye gaze and dwell duration. I wrote my own C/C++ code in Microsoft Visual Studio 2008 using the ViewPoint EyeTracker SDK and the OpenCV libraries. The program also drove a pan and tilt unit carrying a 1280×720 camera to provide a remote telepresence view in the HMD. The HMD was head-tracked using a Pololu MinIMU-9 v1 sensor board with an Arduino Nano running the head tracking firmware. I didn't use the EyeFrame scene camera much beyond playing with the ViewPoint EyeTracker software.
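To give a sense of the gaze-and-dwell interaction, here's a minimal sketch of the dwell-selection logic, in the spirit of what my program did. The GazeSample and Button types are placeholders of my own; in the real code the gaze coordinates came from the ViewPoint SDK each frame.

```cpp
#include <chrono>

// Hypothetical types for illustration; in my program the gaze coordinates
// came from the ViewPoint SDK each frame, normalized to the screen (0..1).
struct GazeSample { float x, y; bool valid; };
struct Button { float x0, y0, x1, y1; };

// Fires a button once the gaze has dwelled inside it for dwellMs.
class DwellSelector {
public:
    explicit DwellSelector(int dwellMs) : dwellMs_(dwellMs) {}

    // Call once per gaze sample; returns the button to trigger, or nullptr.
    const Button* update(const GazeSample& g, const Button* buttons, int n) {
        using clock = std::chrono::steady_clock;
        const Button* hit = nullptr;
        if (g.valid) {
            for (int i = 0; i < n; ++i) {
                const Button& b = buttons[i];
                if (g.x >= b.x0 && g.x <= b.x1 && g.y >= b.y0 && g.y <= b.y1) {
                    hit = &b;
                    break;
                }
            }
        }
        if (hit != active_) {            // gaze moved to a new region: restart timer
            active_ = hit;
            enter_ = clock::now();
            fired_ = false;
        } else if (active_ && !fired_) { // still dwelling on the same button
            auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                          clock::now() - enter_).count();
            if (ms >= dwellMs_) { fired_ = true; return active_; }
        }
        return nullptr;                  // fire at most once per fixation
    }

private:
    int dwellMs_;
    const Button* active_ = nullptr;
    std::chrono::steady_clock::time_point enter_;
    bool fired_ = false;
};
```

Looking away and back re-arms a button, so a steady fixation triggers it exactly once instead of repeatedly.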
The EyeFrame eyetracker is a binocular system running at 30Hz or 60Hz depending on the resolution you select. The system is basically two NTSC 640×480 cameras pointed at your left and right eyes, along with IR LED illumination. The eye cameras and the scene camera feed into a Sensoray model 2255 USB frame capture box (see below), which sends the data to the PC for processing. The head mounted gear is mounted on a pair of safety glasses by default, but you can also custom attach the system to an HMD like the Sony HMZ-T1. The cost of the system was ~$11,000. Yes, you read that correctly, $11k. You're paying mainly for their software and expertise.
Arrington Research's software, called ViewPoint EyeTracker, is an OpenCV-based application that analyzes the eye and scene camera data to determine where you're looking. Some initial adjustment is needed to point the cameras at the eyes and get a good image of the pupils. Calibration is then done by looking at green boxes placed at different locations on a screen; the software uses these known positions to calibrate your eye gaze vectors.
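ViewPoint's actual calibration model is internal to their software, but the general idea behind point-based calibration is to fit a mapping from measured pupil coordinates to the known on-screen target positions. Here's a minimal sketch of that idea using OpenCV, assuming a biquadratic model and at least six calibration points (both assumptions of mine, not ViewPoint's model):

```cpp
#include <opencv2/core.hpp>
#include <vector>

// Fit a biquadratic map from pupil coordinates (px, py) to screen
// coordinates (sx, sy) by least squares over the calibration pairs.
cv::Mat fitGazeMap(const std::vector<cv::Point2f>& pupil,
                   const std::vector<cv::Point2f>& screen) {
    const int n = (int)pupil.size();     // need n >= 6 for this model
    cv::Mat A(n, 6, CV_32F), B(n, 2, CV_32F);
    for (int i = 0; i < n; ++i) {
        const float x = pupil[i].x, y = pupil[i].y;
        const float f[6] = { 1.f, x, y, x * y, x * x, y * y };
        for (int j = 0; j < 6; ++j) A.at<float>(i, j) = f[j];
        B.at<float>(i, 0) = screen[i].x;
        B.at<float>(i, 1) = screen[i].y;
    }
    cv::Mat coeffs;                               // 6x2 coefficient matrix
    cv::solve(A, B, coeffs, cv::DECOMP_SVD);      // least-squares solution
    return coeffs;
}

// Map a new pupil measurement through the fitted model.
cv::Point2f applyGazeMap(const cv::Mat& coeffs, cv::Point2f p) {
    const float f[6] = { 1.f, p.x, p.y, p.x * p.y, p.x * p.x, p.y * p.y };
    cv::Point2f s(0.f, 0.f);
    for (int j = 0; j < 6; ++j) {
        s.x += f[j] * coeffs.at<float>(j, 0);
        s.y += f[j] * coeffs.at<float>(j, 1);
    }
    return s;
}
```

This is also why recalibration matters so much: if the headgear shifts, the pupil-to-screen mapping the fit was built on no longer holds.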
There are a ton of features in the ViewPoint EyeTracker software, as can be seen in the image above. You can measure eye gaze, dwell duration, pupil major and minor axes, and many other parameters. Blink detection is built in and can be used for things like triggering buttons. There's even a built-in demo program to drive the on-screen cursor. Shown below you can also see the scene camera image on the right, with blue and green dots marking my estimated gaze location.
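For a sense of where measurements like major/minor axis and blink detection come from, here's a rough sketch of dark-pupil measurement on a grayscale IR eye-camera frame using OpenCV. This is not ViewPoint's actual pipeline (that's proprietary); it just illustrates the idea, and the threshold and minimum-area values are guesses of mine:

```cpp
#include <opencv2/imgproc.hpp>
#include <vector>

// Threshold the dark pupil, keep the largest blob, and fit an ellipse to
// get the major and minor axes. A run of frames where this returns false
// (no pupil visible) can be treated as a blink candidate.
bool measurePupil(const cv::Mat& eyeGray, cv::RotatedRect& pupil) {
    cv::Mat bin;
    cv::threshold(eyeGray, bin, 40, 255, cv::THRESH_BINARY_INV); // pupil is dark under IR

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(bin, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    int best = -1;
    double bestArea = 50.0;              // ignore specks smaller than this
    for (int i = 0; i < (int)contours.size(); ++i) {
        double a = cv::contourArea(contours[i]);
        if (a > bestArea && contours[i].size() >= 5) { best = i; bestArea = a; }
    }
    if (best < 0) return false;          // no pupil found this frame

    pupil = cv::fitEllipse(contours[best]);   // size gives major/minor axes
    return true;
}
```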
As I mentioned before, I spent most of my time implementing custom code using the SDK. All of the software's functionality is available through the SDK, and it was relatively easy to use. Arrington Research customer support was also very good, replying within hours when I had issues integrating their library calls.
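The basic integration pattern was to poll the tracker for fresh gaze data once per video frame and hand it to the UI logic. The sketch below shows that shape only; the real ViewPoint entry points come from the SDK's own header and DLL, so getGazePoint() here is a stand-in stub of mine, not the actual API:

```cpp
#include <cstdio>

struct GazePoint { float x, y; };   // normalized screen coordinates, 0..1

// Placeholder standing in for the SDK's gaze query; the real entry point
// comes from the ViewPoint SDK, not from this name.
bool getGazePoint(GazePoint* out) {
    out->x = 0.5f; out->y = 0.5f;   // dummy sample so the sketch runs
    return true;
}

int main() {
    GazePoint gp;
    for (int frame = 0; frame < 60; ++frame) {  // one pass per video frame
        if (getGazePoint(&gp)) {
            // Hand the sample to the UI layer (dwell selector, cursor, ...).
            std::printf("frame %d gaze: %.3f %.3f\n", frame, gp.x, gp.y);
        }
    }
    return 0;
}
```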
Overall the system does what it's supposed to, and that is track eye gaze. I found it to work well enough in the lab, but it wasn't stable enough for any real field implementation. It constantly needs adjustment and recalibration to get reliable gaze vectors. Even then it can have an offset, which can be corrected, but if the user moves the system on their head it needs to be calibrated again. I chose this system because it was the smallest system I could get that could be HMD mounted. It served its purpose, but there were some rough demos with customers that I just squeezed by on. It wasn't usable on some people and worked perfectly on others.
If you intend to use the EyeFrame system from Arrington Research, just be prepared to do a lot of tweaking of both code and physical fit to get it working. Even then you may not get reliable enough operation. However, it's a great exercise in what current eye tracking technology can do. There's certainly tons of room for improvement in this field. You can get more information on this system on the Arrington Research website.