Simultaneously Presented Facial and Contextual Information Influences Observers' Facial Expressions and Self-reports of Emotion, 2018-2019

We receive emotional signals from different sources, including the face and its surrounding context. Previous research has shown the effect that facial expressions and contextual affective information have on people’s brain responses. This study measured physiological responses and ratings of affect to face-context compounds of varied emotional content. Forty-two participants freely viewed face-context and context-only natural threat, mutilation, happy, erotic and neutral scenes whilst corrugator, zygomatic and startle eyeblink responses were recorded. Participants’ corrugator, zygomatic and startle eyeblink responses, as well as their valence and arousal ratings, varied with the valence and arousal of the presented stimuli. Face-context threat and mutilation scenes elicited more negative emotional experiences and larger corrugator responses than context-only scenes. In contrast, happy face-context scenes elicited more positive emotional experiences and a decreased corrugator response. The zygomatic showed an enhanced response to face-context scenes, regardless of the valence of the scenes. Our results show that the simultaneous perception of emotional signals from faces and contextual information induces enhanced facial reactions and affective responses.

An increasing volume of violent and distressing imagery is being shared online, and this presents a challenge to any organisation that moderates its online content. The problem is particularly acute in public policing and social services organisations that must analyse such imagery in the course of investigation or child protection. Such organisations have a duty of care to protect their employees and ensure their welfare. In support of this goal, this project will perform research to facilitate the development of novel digital tools that assist users and reduce the mental health burden created by viewing this imagery.
This objective will be attained by working in collaboration with the Child Exploitation Online Protection command of the National Crime Agency and the artificial intelligence technology firm Qumodo. Three strands of research will be performed: 1) Evaluating emotional image recognition in the context of image manipulations guided by artificial intelligence to reduce emotional impact while still retaining scene information. 2) Experiments in social neuroscience that evaluate the effectiveness of the image manipulations from Strand 1 and help to better understand how the emotional processing of distressing images might compete with cognitive processing. 3) Determining the potential risks of implementing this new technology and how these risks can be dealt with effectively to maximise benefit to both employees and their organisations.

Forty-three participants took part in this study. One participant was excluded from all analyses due to equipment malfunction. In total, 42 participants were included in the data (23 females, mean age = 25.42 years, SD = 7.12, age range = 18–49). Participants reported no neurological or psychiatric history, were right-handed and had normal or corrected-to-normal vision. Data collection consisted of physiological recording of participants' corrugator, zygomatic and orbicularis oculi muscles, and the participants' ratings of valence and arousal using the Self-Assessment Manikin (Bradley & Lang, 1994).

Identifier
DOI https://doi.org/10.5255/UKDA-SN-855096
Metadata Access https://datacatalogue.cessda.eu/oai-pmh/v0/oai?verb=GetRecord&metadataPrefix=oai_ddi25&identifier=468c5025519f60123c8ddad79781780c3bcb0478e43bf8642b5859a921a8f248
Provenance
Creator Denk-Florea, C, The University of Glasgow
Publisher UK Data Service
Publication Year 2021
Funding Reference ESRC
Rights Cristina-Bianca Denk-Florea, The University of Glasgow. Frank Pollick, The University of Glasgow; The Data Collection is available for download to users registered with the UK Data Service.
OpenAccess true
Representation
Language English
Resource Type Numeric
Discipline Psychology; Social and Behavioural Sciences
Spatial Coverage Glasgow; United Kingdom