
Quantifying the Rating Performance of Ambiguous and Unambiguous Facial Expression Perceptions Under Conditions of Stress by Using Wearable Sensors

Publication at Faculty of Science, Faculty of Humanities | 2022

Abstract

Background: In real-world scenarios, humans perceive the world contextually, relying on prior information to shape their responses. During interactions with a machine, missing context may decrease the accuracy of judgements.

In the realm of human-computer interaction (HCI), relatively easy tasks may therefore not serve as relevant controls.

To evaluate the impact of stress, we elevated cortisol levels using the safe yet reliable Cold Pressor Task. We used five stimuli represented by facial expressions: 'neutral', 'laughter', 'fear', 'pain', and 'pleasure'.

Aim: We intend to determine how responses to the stimuli are altered by stress and to statistically quantify the BVP (Blood Volume Pulse) signals.

Materials: Twenty-seven raters rated the five stimuli, each presented by five actors and five actresses, while BVP was being recorded.

Methods: Each physiological response was a six-second time series recorded after the rater rated the stimulus. A nontrivial model includes lag dependencies on either previous states or previous noise. The simplest such models are ARMA(p, q) models with to-be-determined parameters φ1, …, φp and θ1, …, θq.
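As a sketch of the ARMA structure mentioned above (not taken from the paper itself), the following NumPy snippet simulates an ARMA(1,1) process and recovers the AR coefficient from the autocorrelation function using the identity ρ(2) = φ·ρ(1); the coefficients and series length are illustrative assumptions:

```python
import numpy as np

def simulate_arma11(phi, theta, n, seed=0):
    """Simulate x_t = phi*x_{t-1} + e_t + theta*e_{t-1} with Gaussian noise."""
    rng = np.random.default_rng(seed)
    e = rng.standard_normal(n + 1)
    x = np.zeros(n)
    x[0] = e[1] + theta * e[0]
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t + 1] + theta * e[t]
    return x

def acf(x, lag):
    """Sample autocorrelation at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Illustrative parameters; a real BVP series would be far shorter (six seconds).
x = simulate_arma11(phi=0.6, theta=0.3, n=20000)

# For ARMA(1,1), rho(k) = phi * rho(k-1) for k >= 2, so phi ~ rho(2)/rho(1).
phi_hat = acf(x, 2) / acf(x, 1)
```

With only six seconds of wearable data per response, such moment estimates would be far noisier than in this long-series illustration, which is consistent with the paper's difficulty in separating signal from noise.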

Inferences: In this study, we find that six seconds of wearable sampling cannot significantly separate signal from noise. Only one response was significantly affected by the stress condition: the perception of fear.