Emotion Recognition Using Mobile Sensors for Mental Health Telemonitoring: A Deep Learning Approach
Keywords:
Emotion Recognition, Mobile Sensors, Deep Learning, Telemonitoring, Mental Health, mHealth, CNN-LSTM, Stress Detection, Behavioral Analytics, Digital Biomarkers

Abstract
Advancements in mobile sensing technologies have made it possible to monitor emotional states in real time, offering significant opportunities for mental health telemonitoring. This research investigates a deep learning–based framework for emotion recognition using multimodal mobile sensor data, including accelerometer signals, gyroscope patterns, heart rate variability, ambient audio features, and device usage behavior. The objective is to identify stress, anxiety, depression-related indicators, and mood fluctuations through passive, unobtrusive sensing. A hybrid deep learning model combining Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM) networks, and attention mechanisms is proposed to classify time-varying emotional states with high accuracy. A real-world case study conducted with 135 participants demonstrates the feasibility and clinical relevance of mobile sensor–based emotional monitoring. Data analysis indicates that multimodal fusion significantly improves performance, reaching 91.7% accuracy in distinguishing among relaxed, neutral, stressed, and anxious emotional states. The study also identifies implementation barriers, including sensor noise, privacy concerns, inconsistent sampling rates, and limited user adherence. Recommendations highlight the need for ethically compliant systems, robust on-device models, transparent data-processing workflows, and stronger clinical validation. Overall, the findings support the integration of deep learning–powered mHealth systems into mental health telemonitoring frameworks to enable early detection, continuous assessment, and personalized interventions.
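The abstract names a hybrid CNN-LSTM architecture with attention but gives no implementation details. The following is a minimal sketch, in PyTorch, of how such a model might be assembled for fused sensor windows: the channel count, window length, filter widths, and hidden sizes are illustrative assumptions, not values taken from the paper.

```python
import torch
import torch.nn as nn

class CNNLSTMAttention(nn.Module):
    """Sketch of a hybrid CNN-LSTM-attention classifier for multimodal
    sensor windows. Input shape (batch, channels, time), where channels
    are fused sensor streams (e.g. accelerometer, gyroscope, HRV, audio
    features); output is logits over four emotional states (relaxed,
    neutral, stressed, anxious). All hyperparameters are assumptions."""

    def __init__(self, in_channels: int = 8, n_classes: int = 4,
                 cnn_filters: int = 64, lstm_hidden: int = 128):
        super().__init__()
        # 1-D convolutions extract short-range temporal patterns per window.
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, cnn_filters, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(cnn_filters, cnn_filters, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # LSTM models longer-range dependencies across the window.
        self.lstm = nn.LSTM(cnn_filters, lstm_hidden, batch_first=True)
        # Additive attention pools the LSTM outputs into a single vector.
        self.attn = nn.Linear(lstm_hidden, 1)
        self.classifier = nn.Linear(lstm_hidden, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.cnn(x)                             # (B, F, T')
        seq, _ = self.lstm(feats.transpose(1, 2))       # (B, T', H)
        weights = torch.softmax(self.attn(seq), dim=1)  # (B, T', 1)
        pooled = (weights * seq).sum(dim=1)             # (B, H)
        return self.classifier(pooled)

# Example: a batch of 16 ten-second windows, 8 fused channels at 10 Hz.
model = CNNLSTMAttention(in_channels=8, n_classes=4)
logits = model(torch.randn(16, 8, 100))
print(logits.shape)  # torch.Size([16, 4])
```

The design follows the division of labor the abstract implies: convolutions capture local signal morphology, the LSTM tracks how features evolve over the window, and attention weights the time steps most indicative of the emotional state before classification.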