Behavioral Information & Interaction Computation Lab, NTHU 2025.04 - Present
Postdoctoral Researcher working with Professor Chi-Chun Lee (Jeremy)
Research in Modeling Emotions in Interactive and Multimodal Contexts
- Lead the development of a large-scale multimodal interaction corpus capturing co-speech gestures, full-body motion, and expressive speech during dyadic conversations, in collaboration with experts in acting and behavioral sciences.
- Design affective computing frameworks that move beyond static, unimodal processing by modeling dynamic emotion co-regulation, turn-taking cues, and gestural-prosodic synchrony in real-time interactions.
Research in Trustworthy AI on Emotion Recognition and Healthcare
- Advance fairness-aware learning across heterogeneous signals, designing and evaluating models for both speech and physiological data that tackle demographic imbalance and individual variability.
- Improve model reliability under real-world data constraints, handling incomplete or noisy inputs through adaptive model design.
- Investigate how fairness and uncertainty modeling contribute to the trustworthiness of affective systems deployed in sensitive domains such as healthcare and mental wellbeing.