The Multisensory Perception of Emotion in Real and Virtual Humans
Joanna E McHugh, Rachel McDonnell, Jason S Chan, Fiona N Newell
Last modified: 2008-05-13
The perception of real and virtual humans activates different neural networks (Han et al., 2004; Mar et al., 2007). The current study investigates this difference at the behavioural level, focusing on differences in the audiovisual perception of emotional cues portrayed by real and virtual humans. Real and virtual versions of the same actors portraying emotional body language (De Gelder, 2006) were created using motion capture technology, and all stimuli were tested for recognisability in a series of pilot studies. Experiment 1 used a six-alternative forced-choice task in which participants identified the emotion displayed; virtual humans were found to portray emotional body language as effectively as real humans. Experiment 2 used a crossmodal priming paradigm with auditory primes (emotional utterances) and visual target stimuli (emotional body language). We expected congruent audiovisual pairs to be recognised more efficiently than incongruent pairs (see Van Den Stock et al., 2007), for both real and virtual humans. The results will help elucidate the processes involved in the perception of emotion in real humans and whether similar processes underlie the perception of socially relevant information from virtual humans.