Affective Computing: Detecting Emotion from Motion
Mon, September 10, 2012 • 3:00 AM - 4:15 AM • SEA 3.250
IRC/Cognitive Systems Seminar
Speaker: Takashi Yamauchi, Ph.D., Department of Psychology, Texas A&M University
Abstract: An adaptive computer system that can detect users' emotional states in real time and tailor its output dynamically will revolutionize the nature of human-computer interaction. In my talk, I will introduce a robust method that analyzes users' cursor movement (trajectories of a computer mouse) and detects their emotions. Recent revolutionary research in cognitive science and cognitive neuroscience has shown that the neural states underlying cognitive decision making are well reflected in the way people move their hand to reach their choice. Hand-reaching behavior occurs with the coordinated dynamic neural activation of the prefrontal cortex, which involves executive evaluation of stimulus information, and sensory-motor activities arising from cortical-striatal feedback loops. My hypothesis is that this complex coordination process of choice-reaching behavior is influenced by the affective state (e.g., mood) of a person, and that subtle changes in the coordination process can be captured by the way people move their computer cursor to make an intended choice (pressing a button). The results of my preliminary study (N=133) showed that extracted cursor trajectory properties can account for more than 40% of the variance in participants' self-reported anxiety scores. In my talk, I will extend this finding by establishing a causal link between people's emotional states (e.g., anxiety) and their mouse trajectory patterns.
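To give a concrete sense of what "extracted cursor trajectory properties" might mean, the sketch below computes a few common trajectory features from a sequence of cursor samples. The specific features shown (maximum deviation from the straight start-to-end line, path-length ratio, and mean step size) are illustrative assumptions, not the speaker's actual feature set.

```python
import numpy as np

def trajectory_features(points):
    """Compute illustrative cursor-trajectory features.

    points: sequence of (x, y) cursor samples from movement start
    to button press. The features below are common choices in
    mouse-tracking research, assumed here for illustration only.
    """
    pts = np.asarray(points, dtype=float)
    start, end = pts[0], pts[-1]
    chord = end - start                       # straight start-to-end vector
    chord_len = np.linalg.norm(chord)

    # Perpendicular distance of each sample from the start-end line.
    d = pts - start
    if chord_len > 0:
        dev = np.abs(chord[0] * d[:, 1] - chord[1] * d[:, 0]) / chord_len
    else:
        dev = np.linalg.norm(d, axis=1)

    steps = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    path_len = steps.sum()

    return {
        "max_deviation": dev.max(),           # how far the cursor bowed away
        "path_ratio": path_len / chord_len if chord_len > 0 else np.inf,
        "mean_step": steps.mean(),            # average per-sample movement
    }

# Example: a trajectory that bows away from the direct path.
feats = trajectory_features([(0, 0), (3, 2), (6, 2), (10, 0)])
```

In a study like the one described, such features would be extracted per trial and then regressed against self-reported anxiety scores; the feature definitions above are one plausible instantiation, not the method presented in the talk.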