dc.contributor.author | Sen, Argha | |
dc.contributor.author | Bandara, Nuwan | |
dc.contributor.author | Gokarn, Ila | |
dc.contributor.author | Kandappu, Thivya | |
dc.contributor.author | Misra, Archan | |
dc.date.accessioned | 2024-12-19T21:37:08Z | |
dc.date.available | 2024-12-19T21:37:08Z | |
dc.date.issued | 2024-11-21 | |
dc.identifier.issn | 2474-9567 | |
dc.identifier.uri | https://hdl.handle.net/1721.1/157898 | |
dc.description.abstract | Eye-tracking technology has gained significant attention in recent years due to its wide range of applications in human-computer interaction, virtual and augmented reality, and wearable health. Traditional RGB camera-based eye-tracking systems often struggle with poor temporal resolution and computational constraints, limiting their effectiveness in capturing rapid eye movements. To address these limitations, we propose EyeTrAES, a novel approach using neuromorphic event cameras for high-fidelity tracking of natural pupillary movement that shows significant kinematic variance. One of EyeTrAES's highlights is the use of a novel adaptive windowing/slicing algorithm that ensures just the right amount of descriptive asynchronous event data accumulation within an event frame, across a wide range of eye movement patterns. EyeTrAES then applies lightweight image processing functions over accumulated event frames from just a single eye to perform pupil segmentation and tracking (as opposed to gaze-based techniques that require simultaneous tracking of both eyes). We show that these two techniques boost pupil tracking fidelity by 6+%, achieving IoU ≈ 92%, while incurring at least 3x lower latency than competing pure event-based eye tracking alternatives. We additionally demonstrate that the microscopic pupillary motion captured by EyeTrAES exhibits distinctive variations across individuals and can thus serve as a biometric fingerprint. For robust user authentication, we train a lightweight per-user Random Forest classifier using a novel feature vector of short-term pupillary kinematics, comprising a sliding window of pupil (location, velocity, acceleration) triples. Experimental studies with two different datasets (capturing eye movement across a range of environmental contexts) demonstrate that the EyeTrAES-based authentication technique can simultaneously achieve high authentication accuracy (≈0.82) and low processing latency (≈12ms), and significantly outperform multiple state-of-the-art competitive baselines. | en_US |
dc.publisher | ACM | en_US |
dc.relation.isversionof | https://doi.org/10.1145/3699745 | en_US |
dc.rights | Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. | en_US |
dc.source | Association for Computing Machinery | en_US |
dc.title | EyeTrAES: Fine-grained, Low-Latency Eye Tracking via Adaptive Event Slicing | en_US |
dc.type | Article | en_US |
dc.identifier.citation | Sen, Argha, Bandara, Nuwan, Gokarn, Ila, Kandappu, Thivya and Misra, Archan. 2024. "EyeTrAES: Fine-grained, Low-Latency Eye Tracking via Adaptive Event Slicing." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8 (4). | |
dc.contributor.department | Singapore-MIT Alliance in Research and Technology (SMART) | en_US |
dc.relation.journal | Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies | en_US |
dc.identifier.mitlicense | PUBLISHER_POLICY | |
dc.eprint.version | Final published version | en_US |
dc.type.uri | http://purl.org/eprint/type/JournalArticle | en_US |
eprint.status | http://purl.org/eprint/status/PeerReviewed | en_US |
dc.date.updated | 2024-12-01T08:54:39Z | |
dc.language.rfc3066 | en | |
dc.rights.holder | The author(s) | |
dspace.date.submission | 2024-12-01T08:54:40Z | |
mit.journal.volume | 8 | en_US |
mit.journal.issue | 4 | en_US |
mit.license | PUBLISHER_CC | |
mit.metadata.status | Authority Work and Publication Information Needed | en_US |
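Note: the abstract above describes an authentication feature vector built from a sliding window of pupil (location, velocity, acceleration) triples, fed to a lightweight per-user Random Forest classifier. The following Python sketch illustrates one plausible construction of that pipeline; the window length, sampling interval, forest hyperparameters, and all function and variable names are assumptions for illustration and are not taken from the paper or its code.

# Illustrative sketch only; window size, dt, and hyperparameters are assumed.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def kinematic_features(pupil_xy, dt=0.01, window=10):
    """Build sliding-window feature vectors from pupil-centre positions.

    pupil_xy: (N, 2) array of pupil centre positions over time.
    Returns an array where each row stacks (x, y, vx, vy, ax, ay) for
    `window` consecutive samples.
    """
    pupil_xy = np.asarray(pupil_xy, dtype=float)
    vel = np.diff(pupil_xy, axis=0) / dt            # (N-1, 2) velocity
    acc = np.diff(vel, axis=0) / dt                 # (N-2, 2) acceleration
    kin = np.hstack([pupil_xy[2:], vel[1:], acc])   # (N-2, 6) per-sample triple
    feats = [kin[i:i + window].ravel()
             for i in range(len(kin) - window + 1)]
    return np.array(feats)

def train_user_model(genuine_xy, impostor_xy):
    """Per-user authentication: separate the enrolled user's windows (label 1)
    from impostor windows (label 0) with a small Random Forest."""
    X_pos = kinematic_features(genuine_xy)
    X_neg = kinematic_features(impostor_xy)
    X = np.vstack([X_pos, X_neg])
    y = np.concatenate([np.ones(len(X_pos)), np.zeros(len(X_neg))])
    clf = RandomForestClassifier(n_estimators=50, max_depth=8, random_state=0)
    clf.fit(X, y)
    return clf

At inference time, the same kinematic_features transform would be applied to a short burst of tracked pupil positions and the per-user classifier's decision averaged over the resulting windows; this is a sketch of the general approach, not the authors' implementation.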