
OWLET - Infant Webcam Linked Eye Tracker

I recently developed a new open-source methodology for automatically analyzing infant eye-tracking data collected on laptops or smartphones. The tool combines algorithms from computer vision, machine learning, and ecological psychology to robustly estimate infants' point-of-gaze from pre-recorded webcam and smartphone videos.
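To make the approach concrete, below is a minimal sketch of the kind of computer-vision step such a pipeline relies on: detecting the face and eye regions in each video frame and localizing the pupils. It uses OpenCV's stock Haar cascades and a hypothetical video filename, and is a generic illustration of webcam-based gaze tracking rather than OWLET's actual algorithm.

```python
# Generic illustration of the computer-vision step in webcam gaze
# estimation: detect the face and eyes in each frame, then take the
# darkest blob in each eye region as an approximate pupil center.
# This is NOT OWLET's actual algorithm.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def pupil_centers(frame):
    """Return approximate (x, y) pupil centers found in one BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    centers = []
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face):
            eye = face[ey:ey + eh, ex:ex + ew]
            # The pupil is roughly the darkest point in the eye region.
            _, _, min_loc, _ = cv2.minMaxLoc(cv2.GaussianBlur(eye, (7, 7), 0))
            centers.append((fx + ex + min_loc[0], fy + ey + min_loc[1]))
    return centers

cap = cv2.VideoCapture("infant_session.mp4")  # hypothetical recording
while True:
    ok, frame = cap.read()
    if not ok:
        break
    print(pupil_centers(frame))
cap.release()
```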

Validation and Technical Details

We validated OWLET using data from a large sample of 6- to 8-month-old infants tested on smartphones and laptops in the home. Key findings included:

  • The x/y spatial accuracy was 3.36°/2.67° (SD = 1.89°/1.55°) across subjects. While this is poorer than lab-based eye trackers, it is a substantial improvement over existing methods for analyzing infant gaze in remote recordings (a worked example of converting on-screen error to degrees of visual angle follows this list).

[Figure: Heatmaps of the point-of-gaze estimated by OWLET relative to each cued calibration point, split by spatial-accuracy percentile groups.]

  • Eye-tracking data quality did not differ between infants tested with laptop/computer webcams and those tested with smartphones.


[Figure: OWLET output gaze metrics for infants tested using laptop/computer webcams versus smartphones.]

  • We observed significantly greater diversity in socioeconomic and racial/ethnic backgrounds when families were given the option to participate in remote studies using smartphones.
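For context on the accuracy figures above: spatial accuracy in degrees of visual angle is the angular distance between the estimated point-of-gaze and the cued calibration point, which depends on screen geometry and viewing distance. Below is a minimal sketch of the conversion, using assumed pixel density and viewing distance rather than the values from our validation setup.

```python
# Sketch of converting on-screen gaze error (pixels) to degrees of
# visual angle. Pixel density and viewing distance are assumed values
# for illustration only.
import math

VIEW_DIST_CM = 50.0   # assumed infant-to-screen distance
PX_PER_CM = 38.0      # assumed screen pixel density

def angular_error_deg(est_px, true_px, view_dist_cm=VIEW_DIST_CM):
    """Angular distance (degrees) between two on-screen points."""
    d_cm = math.hypot(est_px[0] - true_px[0],
                      est_px[1] - true_px[1]) / PX_PER_CM
    return math.degrees(2 * math.atan2(d_cm / 2, view_dist_cm))

# Example: gaze estimated 60 px from a cued calibration point.
print(round(angular_error_deg((960, 540), (1020, 540)), 2))  # ~1.81 deg
```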

Open Source Code

The open-source Python code for OWLET, along with instructions for downloading the modules, can be found on GitHub at https://github.com/denisemw/OWLET.

User Guide

For instructions on how to use the beta version of the OWLET app, please download the user guide here.

MacOS App Download

Please fill out the form below to download the MacOS app for OWLET. The registration information we collect will only be used to help us track where OWLET is being used.

