eyeNotate: interactive annotation of mobile eye tracking data based on few-shot image classification
| dc.contributor.author | Barz, Michael | |
| dc.contributor.author | Bhatti, Omair Shahzad | |
| dc.contributor.author | Alam, Hasan Md Tusfiqur | |
| dc.contributor.author | Nguyen, Duy Minh Ho | |
| dc.contributor.author | Altmeyer, Kristin | |
| dc.contributor.author | Malone, Sarah | |
| dc.contributor.author | Sonntag, Daniel | |
| dc.date.accessioned | 2026-01-14T10:18:24Z | |
| dc.date.issued | 2025 | |
| dc.date.updated | 2025-08-12T14:30:51Z | |
| dc.description.abstract | Mobile eye tracking is an important tool in psychology and human-centered interaction design for understanding how people process visual scenes and user interfaces. However, analyzing recordings from head-mounted eye trackers, which typically include an egocentric video of the scene and a gaze signal, is a time-consuming and largely manual process. To address this challenge, we develop eyeNotate, a web-based annotation tool that enables semi-automatic data annotation and learns to improve from corrective user feedback. Users can manually map fixation events to areas of interest (AOIs) in a video-editing-style interface (baseline version). Further, our tool can generate fixation-to-AOI mapping suggestions based on a few-shot image classification model (IML-support version; IML: interactive machine learning). We conduct an expert study with trained annotators (n = 3) to compare the baseline and IML-support versions. We measure perceived usability, the annotations’ validity and reliability, and efficiency during a data annotation task. We ask our participants to re-annotate data from a single individual of an existing dataset (n = 48). Further, we conduct a semi-structured interview to understand how participants used the provided IML features and to assess our design decisions. In a post hoc experiment, we investigate the performance of three image classification models in annotating the data of the remaining 47 individuals. | en |
| dc.description.sponsorship | European Union | |
| dc.description.sponsorship | German Federal Ministry of Education and Research (BMBF) | |
| dc.identifier.issn | 1995-8692 | |
| dc.identifier.other | 1950244784 | |
| dc.identifier.uri | http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-170050 | de |
| dc.identifier.uri | https://elib.uni-stuttgart.de/handle/11682/17005 | |
| dc.identifier.uri | https://doi.org/10.18419/opus-16986 | |
| dc.language.iso | en | |
| dc.relation | info:eu-repo/grantAgreement/EC/HE/101093079 | |
| dc.relation.uri | doi:10.3390/jemr18040027 | |
| dc.rights | CC BY | |
| dc.rights | info:eu-repo/semantics/openAccess | |
| dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | |
| dc.subject.ddc | 004 | |
| dc.title | eyeNotate: interactive annotation of mobile eye tracking data based on few-shot image classification | en |
| dc.type | article | |
| dc.type.version | publishedVersion | |
| ubs.fakultaet | Informatik, Elektrotechnik und Informationstechnik | |
| ubs.fakultaet | Externe wissenschaftliche Einrichtungen | |
| ubs.fakultaet | Fakultätsübergreifend / Sonstige Einrichtung | |
| ubs.institut | Institut für Künstliche Intelligenz | |
| ubs.institut | Max-Planck-Institut für Intelligente Systeme | |
| ubs.institut | Fakultätsübergreifend / Sonstige Einrichtung | |
| ubs.publikation.seiten | 35 | |
| ubs.publikation.source | Journal of eye movement research 18 (2025), No. 27 | |
| ubs.publikation.typ | Zeitschriftenartikel |
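
The abstract above mentions fixation-to-AOI mapping suggestions generated by a few-shot image classification model that improves from corrective user feedback. As a minimal sketch of how such a suggestion step can work (an illustration under assumptions, not the authors' implementation), the snippet below builds one prototype per AOI by averaging the embeddings of the few user-labeled fixation patches and suggests the AOI whose prototype is closest in cosine similarity; patch embeddings are assumed to come from some pretrained vision backbone, which is out of scope here.

```python
# Prototype-based few-shot classification sketch (hypothetical, not the
# paper's implementation). Corrective feedback would simply add the
# corrected patch embedding to the support set before re-fitting.
import numpy as np

def fit_prototypes(support: dict[str, list[np.ndarray]]) -> dict[str, np.ndarray]:
    """Average the few labeled patch embeddings per AOI into one prototype."""
    return {aoi: np.stack(vecs).mean(axis=0) for aoi, vecs in support.items()}

def suggest_aoi(patch: np.ndarray, prototypes: dict[str, np.ndarray]) -> str:
    """Suggest the AOI whose prototype has the highest cosine similarity."""
    def cos(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)
    return max(prototypes, key=lambda aoi: cos(patch, prototypes[aoi]))

# Toy usage with random stand-in embeddings (real ones would come from a
# pretrained image encoder applied to the fixation's cropped video frame).
rng = np.random.default_rng(0)
support = {"smartphone": [rng.normal(size=512) for _ in range(5)],
           "textbook":   [rng.normal(size=512) for _ in range(5)]}
prototypes = fit_prototypes(support)
print(suggest_aoi(rng.normal(size=512), prototypes))
```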