Detecting Social Engagement of the Elderly From Lifelog Image-Streams to Identify Effective Cues for Autobiographic Recall
Abstract
Lifelog images captured automatically by wearable cameras serve as effective cues for inducing Autobiographic Memory Recall (AMR) during personalized memory interventions. However, manually selecting images for such therapy imposes a significant burden on caregivers. Reducing this burden requires automated tools that identify moments of significant social engagement by the camera wearer. To this end, we re-annotate images from public lifelog datasets for the presence of non-verbal social signals and for the perceived engagement of the life-logger during interactions. Using this data, we develop deep learning models and explore how social signals and the detected intensity of social engagement influence AMR prediction from lifelogs. We show that understanding \textit{visual social engagement} can enhance AMR prediction, demonstrating the models' potential to reduce caregivers' effort.