Show simple item record

dc.contributor.author: Sen, Argha
dc.contributor.author: Bandara, Nuwan
dc.contributor.author: Gokarn, Ila
dc.contributor.author: Kandappu, Thivya
dc.contributor.author: Misra, Archan
dc.date.accessioned: 2024-12-19T21:37:08Z
dc.date.available: 2024-12-19T21:37:08Z
dc.date.issued: 2024-11-21
dc.identifier.issn: 2474-9567
dc.identifier.uri: https://hdl.handle.net/1721.1/157898
dc.description.abstract: Eye-tracking technology has gained significant attention in recent years due to its wide range of applications in human-computer interaction, virtual and augmented reality, and wearable health. Traditional RGB camera-based eye-tracking systems often struggle with poor temporal resolution and computational constraints, limiting their effectiveness in capturing rapid eye movements. To address these limitations, we propose EyeTrAES, a novel approach using neuromorphic event cameras for high-fidelity tracking of natural pupillary movement that shows significant kinematic variance. One of EyeTrAES's highlights is the use of a novel adaptive windowing/slicing algorithm that ensures just the right amount of descriptive asynchronous event data accumulation within an event frame, across a wide range of eye movement patterns. EyeTrAES then applies lightweight image processing functions over accumulated event frames from just a single eye to perform pupil segmentation and tracking (as opposed to gaze-based techniques that require simultaneous tracking of both eyes). We show that these two techniques boost pupil tracking fidelity by 6+%, achieving IoU ≈ 92%, while incurring at least 3x lower latency than competing pure event-based eye tracking alternatives. We additionally demonstrate that the microscopic pupillary motion captured by EyeTrAES exhibits distinctive variations across individuals and can thus serve as a biometric fingerprint. For robust user authentication, we train a lightweight per-user Random Forest classifier using a novel feature vector of short-term pupillary kinematics, comprising a sliding window of pupil (location, velocity, acceleration) triples. Experimental studies with two different datasets (capturing eye movement across a range of environmental contexts) demonstrate that the EyeTrAES-based authentication technique can simultaneously achieve high authentication accuracy (≈0.82) and low processing latency (≈12 ms), and significantly outperform multiple state-of-the-art competitive baselines. [en_US]
dc.publisher: ACM [en_US]
dc.relation.isversionof: https://doi.org/10.1145/3699745 [en_US]
dc.rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. [en_US]
dc.source: Association for Computing Machinery [en_US]
dc.title: EyeTrAES: Fine-grained, Low-Latency Eye Tracking via Adaptive Event Slicing [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Sen, Argha, Bandara, Nuwan, Gokarn, Ila, Kandappu, Thivya and Misra, Archan. 2024. "EyeTrAES: Fine-grained, Low-Latency Eye Tracking via Adaptive Event Slicing." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8 (4).
dc.contributor.department: Singapore-MIT Alliance in Research and Technology (SMART) [en_US]
dc.relation.journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies [en_US]
dc.identifier.mitlicense: PUBLISHER_POLICY
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/JournalArticle [en_US]
eprint.status: http://purl.org/eprint/status/PeerReviewed [en_US]
dc.date.updated: 2024-12-01T08:54:39Z
dc.language.rfc3066: en
dc.rights.holder: The author(s)
dspace.date.submission: 2024-12-01T08:54:40Z
mit.journal.volume: 8 [en_US]
mit.journal.issue: 4 [en_US]
mit.license: PUBLISHER_CC
mit.metadata.status: Authority Work and Publication Information Needed [en_US]
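
The abstract above describes two technical ideas: adaptive windowing/slicing of the asynchronous event stream into event frames, and a sliding-window feature vector of pupil (location, velocity, acceleration) triples fed to a per-user Random Forest classifier. The Python sketch below only illustrates those ideas under assumed parameters and hypothetical helper names (adaptive_slices, kinematic_features, target_count, max_window_s); it is not the authors' implementation, whose actual slicing criterion and feature construction are given in the paper at the DOI above.

```python
# Illustrative sketch only: count/duration-bounded event accumulation and a
# kinematic feature window, loosely following the ideas in the abstract.
from typing import Iterable, Iterator, List, Tuple

import numpy as np

# One neuromorphic-camera event: pixel location, timestamp (seconds), polarity.
Event = Tuple[int, int, float, int]  # (x, y, t, polarity)


def adaptive_slices(
    events: Iterable[Event],
    target_count: int = 5000,     # assumed target number of events per frame
    max_window_s: float = 0.010,  # assumed cap on window duration
) -> Iterator[List[Event]]:
    """Accumulate asynchronous events into adaptively sized windows:
    fast eye movement -> high event rate -> short windows; fixation ->
    low event rate -> windows capped at max_window_s."""
    window: List[Event] = []
    t_start = None
    for ev in events:
        if t_start is None:
            t_start = ev[2]
        window.append(ev)
        if len(window) >= target_count or ev[2] - t_start >= max_window_s:
            yield window
            window, t_start = [], None
    if window:
        yield window


def kinematic_features(centers: np.ndarray, ts: np.ndarray) -> np.ndarray:
    """Build a sliding-window feature vector of pupil (location, velocity,
    acceleration) triples from per-frame pupil centers (N x 2) and their
    timestamps (N,), in the spirit of the authentication features described
    in the abstract."""
    vel = np.gradient(centers, ts, axis=0)   # first derivative of position
    acc = np.gradient(vel, ts, axis=0)       # second derivative of position
    return np.hstack([centers, vel, acc]).ravel()
```

A per-user authentication model in the style described by the abstract could then be trained by fitting a scikit-learn RandomForestClassifier on such feature vectors computed over successive sliding windows of pupil-tracking output.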

