Moods and Metrics

AI-Powered Mental Health Analysis Tool

🏆 Winner - Otsuka Valuenex Award at TreeHacks 2025

The Problem

In today's world, stress and anxiety are increasingly prevalent, yet many people struggle to understand and quantify their emotional states. Traditional mental health monitoring often relies on subjective self-reporting, making it difficult to get accurate, real-time insights into one's emotional well-being.

Limited Self-Awareness

People often struggle to accurately assess their own emotional states and stress levels, leading to poor mental health management and delayed intervention when needed.

AI-Powered Analysis

Moods and Metrics provides objective, multi-modal analysis of emotional states through audio, video, and transcription sentiment analysis, helping users gain deeper insights into their mental well-being.

Analysis Modalities

Audio Analysis
Uses a fine-tuned wav2vec2 transformer to estimate arousal, valence, and dominance values from speech, then maps those values to a stress score
Video Analysis
Sends video clips to Google Gemini AI for facial expression analysis, evaluating calm/stress levels with detailed reasoning
Transcription Analysis
Runs sentiment analysis on the video transcript, adding a text-based signal to the overall emotional picture
2D & 3D Visualization
Interactive visualizations built with Three.js (3D) and D3.js (2D) that plot valence, arousal, and dominance
AI-Driven Insights
Advanced AI models provide real-time emotional analysis and personalized insights for better mental health understanding
User-Friendly Interface
Clean React frontend with intuitive UI design, making complex emotional analysis accessible to everyone
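The three modality scores described above can be combined into a single estimate. A minimal sketch, assuming a simple weighted average (the `fuse_stress` helper and its weights are illustrative, not the project's actual fusion method):

```python
def fuse_stress(audio: float, video: float, text: float,
                weights: tuple = (0.4, 0.4, 0.2)) -> float:
    """Combine per-modality stress scores (each in [0, 1]) into one estimate.

    Weights are hypothetical; a real system would tune them on labeled data.
    """
    wa, wv, wt = weights
    score = (wa * audio + wv * video + wt * text) / (wa + wv + wt)
    return max(0.0, min(1.0, score))  # clamp to [0, 1]

# Example: stressed voice, calm face, neutral transcript
print(round(fuse_stress(0.8, 0.3, 0.5), 3))  # 0.54
```

A weighted average keeps each modality's contribution interpretable; disagreement between channels (e.g., tense voice but relaxed face) simply pulls the score toward the middle.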

How It Works

1
Multi-Modal Input
Users can provide audio recordings, video clips, or text transcriptions for analysis, allowing flexibility in how they want to assess their emotional state.
2
AI Processing
Advanced AI models (wav2vec2 for audio, Gemini for video/text) analyze the input to extract emotional indicators like arousal, valence, and dominance.
3
Emotional Mapping
Using a tri-dimensional model of core affect, the system maps the extracted emotional data to stress scores and emotional states.
4
Interactive Visualization
Results are presented through interactive 2D and 3D visualizations, helping users understand their emotional patterns and trends over time.
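The emotional-mapping step above can be sketched with a toy formula. Assuming stress rises with arousal and falls with valence and dominance (the actual mapping in Moods and Metrics isn't specified here, so the coefficients below are illustrative):

```python
def vad_to_stress(valence: float, arousal: float, dominance: float) -> float:
    """Map core-affect dimensions (each in [-1, 1]) to a stress score in [0, 1].

    Illustrative heuristic: high arousal plus low valence and low
    dominance reads as stress; the coefficients are assumptions.
    """
    raw = 0.5 * arousal - 0.3 * valence - 0.2 * dominance  # raw in [-1, 1]
    return (raw + 1.0) / 2.0  # rescale to [0, 1]

# Anxious state: negative valence, high arousal, low dominance
print(round(vad_to_stress(-0.6, 0.9, -0.4), 3))  # 0.855
```

Keeping the mapping a pure function of the three affect dimensions means any modality (audio, video, or text) that produces VAD values can reuse the same scoring step.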

Technologies Used

React: Frontend framework for interactive UI components
Python: Backend development and AI model integration
Hugging Face: Fine-tuned wav2vec2 transformer for audio analysis
PyTorch: Deep learning framework for model training and inference
FastAPI: High-performance backend API framework
Three.js: 3D visualization library for interactive emotion mapping
Tailwind CSS: Utility-first CSS framework for modern styling
D3.js: Data visualization library for 2D charts and graphs

Future Plans

Enhanced AI Models: Integrate additional modalities such as heartbeat monitoring, EKG signals, and other biometric data for more comprehensive emotional analysis.
Custom Model Training: Collect paired breathing-audio and stress-score data and train models on it for more accurate, fully local emotional tracking.
Real-Time Analysis: Implement live video analysis without requiring pre-recorded clips for immediate emotional feedback.
Personalized Recommendations: Add AI-powered stress relief suggestions and personalized wellness recommendations based on emotional patterns.
Mobile Application: Develop a mobile version for on-the-go emotional tracking with fully local processing capabilities.