Publications

PaGER-Sync ADICVIDEO: Player affective gaming experience & responses - synchronized ADICVIDEO dataset.

Authors

Civit-Masot, J., Luna-Perejón, F., Muñoz-Saavedra, L., Civit-Masot, M., Dominguez-Morales, M., Miró-Amarante, L.

External publication

No

Venue

Data in Brief

Scope

Article

Nature

Scientific

JCR Quartile

SJR Quartile

Publication date

01/12/2025

Abstract

The PaGER-Sync ADICVIDEO dataset is a multimodal, temporally synchronized repository of physiological and facial expression data recorded during controlled, immersive video game sessions designed to simulate realistic home gaming environments. It integrates biosignals from the Empatica E4 wristband (Electrodermal Activity (EDA), Blood Volume Pulse (BVP), and Skin Temperature (TEMP)) with facial expression features extracted from video recordings using FaceReader software. Additionally, the dataset includes scores from pre-session psychometric questionnaires (Gaming Addiction Scale, Scale of Positive and Negative Experience, Emotion Regulation Questionnaire) and demographic gender data, providing psychological and individual-difference context. A summary file detailing the two strongest emotions expressed by each participant, with their respective percentages, is included. A total of 25 participants played three commercial video games (Tetris, Sonic Racing, and Fall Guys) under controlled conditions while their physiological responses were continuously recorded and their facial expressions captured on video for subsequent analysis. All data streams were precisely aligned using a common video-based timestamp, enabling frame-level synchronization across modalities, and the data were segmented by game. The dataset supports a wide range of research applications in affective computing, human-computer interaction, and behavioral analysis. It is particularly well suited to the development and evaluation of multimodal affect detection models, and to exploring the interplay between psychological traits and real-time emotional responses.
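The abstract notes that all streams are aligned on a common video-based timestamp, enabling frame-level synchronization. As a minimal sketch of how a user of such a dataset might align a biosignal stream with facial-expression frames (the column names `t`, `eda_uS`, and `dominant_emotion` are illustrative assumptions, not the dataset's actual schema), one could use a nearest-timestamp join:

```python
# Hypothetical alignment sketch: join EDA samples (the Empatica E4
# records EDA at 4 Hz) to the nearest facial-expression frame on a
# shared timestamp axis. Column names are illustrative only.
import pandas as pd


def align_streams(eda: pd.DataFrame, faces: pd.DataFrame,
                  tolerance_s: float = 0.5) -> pd.DataFrame:
    """Attach the nearest facial-expression frame (within
    tolerance_s seconds) to each EDA sample, using the common
    timestamp column ``t``."""
    eda = eda.sort_values("t").reset_index(drop=True)
    faces = faces.sort_values("t").reset_index(drop=True)
    return pd.merge_asof(eda, faces, on="t",
                         direction="nearest",
                         tolerance=tolerance_s)


# Illustrative streams: EDA at 4 Hz, facial frames at 1 Hz.
eda = pd.DataFrame({
    "t": [i * 0.25 for i in range(8)],           # seconds on video clock
    "eda_uS": [0.31, 0.32, 0.35, 0.34, 0.36, 0.40, 0.41, 0.39],
})
faces = pd.DataFrame({
    "t": [0.0, 1.0],
    "dominant_emotion": ["neutral", "happy"],
})

aligned = align_streams(eda, faces)
```

Each EDA row now carries the emotion label of the temporally closest video frame; a tighter `tolerance_s` would instead leave out-of-window samples unlabeled, which may be preferable when gaps between modalities matter.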

Keywords

Affective computing; Empatica E4; FaceReader; Facial action units; Physiological signals; Psychometric scales; Synchronized dataset; Video games

Universidad Loyola members