Exploring autobiographical memories for real-life sequences of event episodes with a wearable camera
A central goal in Autobiographical Memory (AM) research is to unravel how individual sequences of real-life event episodes are encoded into and retrieved from long-term memory. To address this question experimentally, we recorded electroencephalographic (EEG) activity while participants retrieved individual AMs cued by pictures taken automatically by a wearable camera during the preceding week of daily life. Because experience is continuous, we segmented these real-life recordings into context-based episode units identified automatically by a semantically regularized clustering algorithm (SR-clustering), which groups together temporally adjacent images sharing contextual and semantic attributes (extracted with a convolutional neural network-based approach). This approach allowed a simplified and unbiased identification of the possible underlying sequential structure of autobiographical experience.
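To make the segmentation idea concrete, the sketch below shows one way a temporally constrained clustering of per-image features could be set up. It is not the authors' SR-clustering implementation: the use of scikit-learn's agglomerative clustering, the `segment_events` helper, the distance threshold, and the feature dimensionality are all illustrative assumptions; the only property it shares with the described method is that only temporally adjacent images can be merged into the same event unit.

```python
# Minimal sketch (assumed, not the paper's SR-clustering code): temporally
# constrained agglomerative clustering of per-image CNN descriptors, so that
# only neighbouring images in the recording sequence can join the same segment.
import numpy as np
from scipy.sparse import diags
from sklearn.cluster import AgglomerativeClustering


def segment_events(features: np.ndarray, distance_threshold: float) -> np.ndarray:
    """features: (n_images, n_dims) CNN descriptors in temporal order.
    Returns one segment label per image."""
    n = features.shape[0]
    # Connectivity matrix permitting merges only between temporal neighbours
    connectivity = diags([np.ones(n - 1), np.ones(n - 1)], offsets=[-1, 1])
    clustering = AgglomerativeClustering(
        n_clusters=None,                      # number of events not fixed a priori
        distance_threshold=distance_threshold,
        connectivity=connectivity,
        linkage="ward",
    )
    return clustering.fit_predict(features)


if __name__ == "__main__":
    # Toy example: 200 images with 512-dimensional (hypothetical) features
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(200, 512))
    labels = segment_events(feats, distance_threshold=40.0)
    print("number of event segments:", len(np.unique(labels)))
```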