From Physiological Signals
In this project, we simulate driving scenes and scenarios in an immersive 3D virtual-reality environment to sense and interpret drivers' affective states (e.g., anger, fear, boredom).
Drivers wear non-invasive bio-sensors while operating vehicles with haptic feedback (e.g., brake failure, shaking from a flat tire) and experience emotionally loaded 3D interactive driving events (e.g., driving in frustrating, delayed New York traffic, or arriving at full speed at a blocked accident scene with failed brakes and a pedestrian crossing).
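As a rough illustration of how such bio-sensor data might be turned into classifier inputs, the sketch below computes standard heart-rate-variability features from inter-beat (RR) intervals. The function name, the specific features, and the sample values are illustrative assumptions, not the project's actual pipeline.

```python
import statistics

def hrv_features(rr_intervals_ms):
    """Basic heart-rate-variability features from inter-beat (RR)
    intervals in milliseconds; features like these are commonly fed
    to an affect classifier (illustrative, not the project's code)."""
    mean_rr = statistics.mean(rr_intervals_ms)
    successive_diffs = [
        (a - b) ** 2 for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])
    ]
    return {
        "mean_hr_bpm": 60000.0 / mean_rr,               # average heart rate
        "sdnn_ms": statistics.stdev(rr_intervals_ms),   # overall variability
        "rmssd_ms": statistics.mean(successive_diffs) ** 0.5,  # beat-to-beat
    }

# Simulated RR series: shortening intervals suggest rising arousal.
features = hrv_features([850, 840, 820, 790, 760, 730])
print(round(features["mean_hr_bpm"], 1))
```

In practice such features would be computed over sliding windows and combined with other channels (skin conductance, respiration) before classification.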
Sentiment Analysis from Text
We compared the performance of three rule-based approaches and two supervised approaches (Naive Bayes and Maximum Entropy). We trained and tested our systems on the SemEval-2007 affective text dataset, which contains news headlines extracted from news websites.
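To make the supervised setup concrete, here is a minimal multinomial Naive Bayes classifier with Laplace smoothing, one of the two supervised approaches mentioned above. The toy headlines and their labels are invented for illustration; they are not drawn from the SemEval-2007 data.

```python
import math
from collections import Counter, defaultdict

def train_nb(examples):
    """examples: list of (tokens, label) pairs. Returns log priors and
    Laplace-smoothed log-likelihoods for multinomial Naive Bayes."""
    label_counts = Counter(label for _, label in examples)
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in examples:
        word_counts[label].update(tokens)
        vocab.update(tokens)
    model = {}
    for label, count in label_counts.items():
        total = sum(word_counts[label].values())
        model[label] = {
            "prior": math.log(count / len(examples)),
            "loglik": {w: math.log((word_counts[label][w] + 1)
                                   / (total + len(vocab)))
                       for w in vocab},
        }
    return model

def classify(model, tokens):
    """Pick the label maximizing log prior + summed log-likelihoods
    (unseen words are ignored)."""
    def score(label):
        m = model[label]
        return m["prior"] + sum(m["loglik"].get(w, 0.0) for w in tokens)
    return max(model, key=score)

# Toy headlines standing in for the real training data (labels invented).
train = [
    ("stocks surge on record profits".split(), "positive"),
    ("team celebrates championship win".split(), "positive"),
    ("earthquake kills dozens in region".split(), "negative"),
    ("markets crash amid fraud fears".split(), "negative"),
]
model = train_nb(train)
print(classify(model, "profits surge after win".split()))  # → positive
```

A Maximum Entropy (logistic regression) classifier would replace the generative word counts with discriminatively trained feature weights, but consumes the same bag-of-words representation.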
Our results show that our systems outperform the systems that participated in SemEval-2007.