SaccadeX: Directed Acyclic Graph-based Semi-Supervised Learning of Continuous Ocular Dynamics from Sparse Neuromorphic Streams
Abstract
Continuous eye tracking is critical for human-computer interaction applications, including biometric authentication, gaze-based interfaces, and affective-cognitive modeling. Neuromorphic event cameras have attracted growing interest for capturing eye movement dynamics owing to their sub-microsecond latency. However, existing event-based eye-tracking methods suffer from limited labeled data, sub-optimal event accumulation, and a lack of frameworks that capture fine-grained temporal relationships within event volumes. To address these challenges, we propose SaccadeX, a directed acyclic graph-based semi-supervised framework that adapts across multiple closely related tasks, including gaze tracking, pupil tracking, and eye-based emotion recognition. Our approach enables efficient spatiotemporal learning with a 95.5% parameter reduction compared to existing methods, while achieving significant performance gains: a 38.75% improvement in pupil tracking accuracy, 68% and 63% reductions in gaze angle error on the EV-Eye and EBV-Eye datasets respectively, and a 3.3% improvement in emotion recognition across all evaluated datasets.