December 8, 2022


[Image: webcam vs. IR eye tracker comparison, from the iMotions article below]

I wanted to play with eye tracking some more because I think it's really interesting and you can do some really fun things with it. At work we have used Tobii eye trackers before, which are really awesome. During some more research I came across this iMotions article: Webcam-Based Eye Tracking vs. an Eye Tracker [Pros & Cons], which brought me to try out GazeRecorder.

Gazecloud API

GazeCloud has a great range of options for webcam-based eye tracking. The GazeCloud API offers real-time eye tracking in the browser. It's really easy to use, the GitHub repo and documentation are great, and you can try out a demo here.

Hello World

So let's have a go! Find the demo example here: index.html. It REALLY just requires including the script
<script src="" ></script>
To start calibration and eye tracking, call:
GazeCloudAPI.StartEyeTracking();
Then set the callbacks.
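A minimal sketch of wiring up those callbacks, based on the GazeCloudAPI examples. The handler bodies are my own illustration, and `gazePoint` is a helper name I made up; the callback names and the `docX`/`docY`/`state` fields are from the API, and I am assuming `state === 0` marks valid gaze data:

```javascript
// Pull the gaze point out of a GazeData object.
// Assumption: state === 0 means valid tracking data.
function gazePoint(gazeData) {
  if (gazeData.state !== 0) return null; // calibrating / face lost
  return { x: gazeData.docX, y: gazeData.docY };
}

// Wire up the callbacks (guarded so the snippet also parses outside a browser).
if (typeof GazeCloudAPI !== "undefined") {
  GazeCloudAPI.OnCalibrationComplete = function () { console.log("calibration done"); };
  GazeCloudAPI.OnCamDenied = function () { console.log("camera access denied"); };
  GazeCloudAPI.OnError = function (msg) { console.log("error:", msg); };
  GazeCloudAPI.OnResult = function (gazeData) {
    var p = gazePoint(gazeData);
    if (p) {
      // p.x / p.y are document coordinates of the estimated gaze point
    }
  };
}
```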

Gazecloud + p5

As I wanted to play a bit with interactions driven by this gaze data, I created a test that receives the gaze data in p5.js (Processing for the web). Check it out here: gaze_p5.html (the script is here: gaze_p5.js).

The GazeData object comes with a bunch of fields:

Gaze data
FrameNr: 1425
GazeX: 429.4
GazeY: 236
HeadPitch: 0.3
HeadRoll: -11.1
HeadX: 4
HeadY: 0.3
HeadYaw: 5.1
HeadZ: 47.8
Xview: 0.28411458333333334
Yview: -0.002037351443123939
docX: 436.4
docY: -1.2000000000000002
rh: 480
rw: 640
rx: 0
ry: 0
state: -1
time: 1600693643281

I am using [ docX, docY ] as the gaze coordinates to move the white dot to where you are (hopefully) looking. Additionally there is a yellow dot that follows the white one, which gives a less jittery, slower movement, something like a path of attention.
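A minimal p5.js sketch of that two-dot idea. This is my own reconstruction, not the actual gaze_p5.js: `gazeX`/`gazeY` are made-up globals that the OnResult callback would update, and `ease()` is my own little smoothing helper:

```javascript
// Latest gaze position, to be updated from GazeCloudAPI.OnResult elsewhere
// (gazeX/gazeY and ease() are my own names, not part of any API).
let gazeX = 0, gazeY = 0;
let smoothX = 0, smoothY = 0; // slower "attention" dot

// Move `current` a fraction of the way toward `target` each frame
// (simple exponential smoothing).
function ease(current, target, amount) {
  return current + (target - current) * amount;
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  noStroke();
}

function draw() {
  background(20);
  smoothX = ease(smoothX, gazeX, 0.08);
  smoothY = ease(smoothY, gazeY, 0.08);
  fill(255);                 // white dot: raw gaze
  circle(gazeX, gazeY, 16);
  fill(255, 220, 0);         // yellow dot: smoothed trail
  circle(smoothX, smoothY, 16);
}
```

A smaller easing amount makes the yellow dot lag further behind and look even calmer.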

doc is relative to the HTML page document, while view is a relative position from 0 to 1 on the page. The r… fields I am not quite sure about, though rx seems to correlate with head position.
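Playing with the sample values above, docX looks like it could simply be Xview scaled by the viewport width (0.2841… × a 1536 px wide window ≈ 436.4, and Yview by a ~589 px height ≈ −1.2). Those window dimensions are pure guesses inferred from the numbers, and so is this conversion:

```javascript
// Hypothetical conversion from relative view coordinates to pixel coordinates,
// inferred from one sample GazeData object, not from any documentation.
function viewToPixels(xview, yview, width, height) {
  return { x: xview * width, y: yview * height };
}
```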

For my immediate purposes docX & docY were perfect, as I just wanted to play a little with interactions depending on where you are looking. What if things always happen when you are looking away? What if the person in a portrait was always giving you strange looks? What if there is something always in the way of where you are trying to look?

With that in mind I made my first two demos, and it's really fun! Number 1: the Annoying Blob – you're trying to look at all of the detailed image, but some annoying blob keeps on following your vision. Version 2 could maybe be that guy in the cinema who keeps on getting in your way.

Number 2 – Awkward Mona

The first test just makes Mona's eyes look wherever you are looking. Look at her and she meets your eye; look away and she looks away. Version 2 will be that she is always staring at you, but quickly looks away whenever you look at her eyes.
I really love the lo-fi paper-cut look of her eyes; I wanted something more silly than realistic.


Eye tracking is awesome, and thanks to Szymon Deja's GazeRecorder it is super easy to try out and implement!
However, the calibration phase seems a bit longer than necessary, at least for simple interactions like this that require less precision. Especially while developing, I had to redo it EACH TIME after changing or updating the code, which was a real test for my patience. That said, I'm told that with GazeRecorder's Desktop options I have more influence on this and can also save calibration settings, so I will probably try that next.
