Affective Social Computing Laboratory

Emotion Recognition from Physiological Signals



In this project, we investigate how human physiological signals associated with emotional arousal (e.g. heart rate, body temperature, galvanic skin response) can be captured and interpreted to provide insights into drivers' emotional states, so that drivers' safety can be enhanced.
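As a rough illustration of this idea, the sketch below summarizes a window of physiological samples into simple features and maps them to a coarse arousal level. The signal names, sampling rate, baselines, and thresholds are illustrative assumptions, not the lab's actual processing pipeline.

```python
import numpy as np

SAMPLING_RATE_HZ = 4   # assumed sensor sampling rate
WINDOW_SECONDS = 10    # assumed analysis window

def window_features(heart_rate, skin_conductance, skin_temp):
    """Summarize one window of physiological samples into simple features."""
    return {
        "hr_mean": float(np.mean(heart_rate)),         # beats per minute
        "hr_std": float(np.std(heart_rate)),
        "gsr_mean": float(np.mean(skin_conductance)),  # microsiemens
        "gsr_slope": float(np.polyfit(np.arange(len(skin_conductance)),
                                      skin_conductance, 1)[0]),
        "temp_mean": float(np.mean(skin_temp)),        # degrees Celsius
    }

def estimate_arousal(features, hr_baseline, gsr_baseline):
    """Very rough arousal estimate: rises in heart rate and skin conductance
    relative to a resting baseline are treated as signs of higher arousal."""
    hr_rise = (features["hr_mean"] - hr_baseline) / hr_baseline
    gsr_rise = (features["gsr_mean"] - gsr_baseline) / gsr_baseline
    score = 0.5 * hr_rise + 0.5 * gsr_rise
    if score > 0.15:
        return "high arousal (e.g. anger, fear)"
    if score < 0.02:
        return "low arousal (e.g. boredom)"
    return "moderate arousal"

if __name__ == "__main__":
    n = SAMPLING_RATE_HZ * WINDOW_SECONDS
    rng = np.random.default_rng(0)
    # Synthetic window simulating an agitated driver (elevated HR and GSR).
    hr = 95 + rng.normal(0, 2, n)        # bpm
    gsr = 8.0 + rng.normal(0, 0.3, n)    # microsiemens
    temp = 33.5 + rng.normal(0, 0.05, n)
    feats = window_features(hr, gsr, temp)
    print(feats)
    print(estimate_arousal(feats, hr_baseline=70.0, gsr_baseline=5.0))
```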

We simulate driving scenes and scenarios in an immersive 3D virtual reality environment to sense and interpret drivers' affective states (e.g. anger, fear, boredom).

Drivers wear non-invasive bio-sensors while operating vehicles with haptic feedback (e.g. brake failure, shaking from a flat tire) and experience emotionally loaded 3D interactive driving events (e.g. driving in frustrating, delayed New York traffic, or arriving at full speed at a blocked accident scene with failed brakes and pedestrians crossing).
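A sketch of how such simulator events could be time-stamped and later aligned with sensor windows is shown below; the event names and logger are hypothetical illustrations, not the lab's actual simulation software.

```python
import time
from dataclasses import dataclass, field

@dataclass
class EventLog:
    events: list = field(default_factory=list)

    def mark(self, name: str) -> None:
        """Record a simulator event (e.g. 'brake_failure') with a timestamp."""
        self.events.append((time.time(), name))

    def events_between(self, start: float, end: float):
        """Return events that fall inside one physiological analysis window."""
        return [(t, name) for t, name in self.events if start <= t <= end]

if __name__ == "__main__":
    log = EventLog()
    log.mark("traffic_jam_start")
    time.sleep(0.1)
    log.mark("brake_failure")
    now = time.time()
    # Later, each sensor window can be labeled with the driving events
    # that occurred during it.
    print(log.events_between(now - 1.0, now))
```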

Our results are discussed in our article on drivers' safety: F. Nasoz, C. Lisetti, A. Vasilakos. Affectively Intelligent and Adaptive Car Interfaces. Information Sciences, Vol. 180: 3817-3836 [PDF].
