eyeNotate: interactive annotation of mobile eye tracking data based on few-shot image classification

dc.contributor.author: Barz, Michael
dc.contributor.author: Bhatti, Omair Shahzad
dc.contributor.author: Alam, Hasan Md Tusfiqur
dc.contributor.author: Nguyen, Duy Minh Ho
dc.contributor.author: Altmeyer, Kristin
dc.contributor.author: Malone, Sarah
dc.contributor.author: Sonntag, Daniel
dc.date.accessioned: 2026-01-14T10:18:24Z
dc.date.issued: 2025
dc.date.updated: 2025-08-12T14:30:51Z
dc.description.abstract: Mobile eye tracking is an important tool in psychology and human-centered interaction design for understanding how people process visual scenes and user interfaces. However, analyzing recordings from head-mounted eye trackers, which typically include an egocentric video of the scene and a gaze signal, is a time-consuming and largely manual process. To address this challenge, we develop eyeNotate, a web-based annotation tool that enables semi-automatic data annotation and learns to improve from corrective user feedback. Users can manually map fixation events to areas of interest (AOIs) in a video-editing-style interface (baseline version). Further, our tool can generate fixation-to-AOI mapping suggestions based on a few-shot image classification model (IML-support version). We conduct an expert study with trained annotators (n = 3) to compare the baseline and IML-support versions. We measure the perceived usability, the annotations' validity and reliability, and efficiency during a data annotation task. We asked our participants to re-annotate data from a single individual of an existing dataset (n = 48). Further, we conducted a semi-structured interview to understand how participants used the provided IML features and assessed our design decisions. In a post hoc experiment, we investigate the performance of three image classification models in annotating data of the remaining 47 individuals.
dc.description.sponsorship: European Union
dc.description.sponsorship: German Federal Ministry of Education and Research (BMBF)
dc.identifier.issn: 1995-8692
dc.identifier.other: 1950244784
dc.identifier.uri: http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-170050de
dc.identifier.uri: https://elib.uni-stuttgart.de/handle/11682/17005
dc.identifier.uri: https://doi.org/10.18419/opus-16986
dc.language.iso: en
dc.relation: info:eu-repo/grantAgreement/EC/HE/101093079
dc.relation.uri: doi:10.3390/jemr18040027
dc.rights: CC BY
dc.rights: info:eu-repo/semantics/openAccess
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject.ddc: 004
dc.title: eyeNotate: interactive annotation of mobile eye tracking data based on few-shot image classification
dc.type: article
dc.type.version: publishedVersion
ubs.fakultaet: Informatik, Elektrotechnik und Informationstechnik
ubs.fakultaet: Externe wissenschaftliche Einrichtungen
ubs.fakultaet: Fakultätsübergreifend / Sonstige Einrichtung
ubs.institut: Institut für Künstliche Intelligenz
ubs.institut: Max-Planck-Institut für Intelligente Systeme
ubs.institut: Fakultätsübergreifend / Sonstige Einrichtung
ubs.publikation.seiten: 35
ubs.publikation.source: Journal of eye movement research 18 (2025), No. 27
ubs.publikation.typ: Zeitschriftenartikel

Files

Original bundle

jemr-18-00027.pdf (2.8 MB, Adobe Portable Document Format)

License bundle

license.txt (3.3 KB, item-specific license agreed upon at submission)