Your private, AI-powered voice diary: record, reflect, and rediscover yourself through sound.
Resonate is a full-stack web application designed as a modern, intelligent journaling experience. Users record audio diary entries that are transcribed and analyzed by a Hybrid AI Engine. The application uses a microservice-inspired architecture: a Node.js backend handles business logic, API security, and automated maintenance, while a separate Python FastAPI service handles heavy ML computation asynchronously.
The system is built for flexibility and data protection. It supports both Local AI (Ollama + local Whisper) for offline, privacy-focused users and Cloud AI (Google Gemini + Groq) for high-speed analysis. The API layer is guarded by Upstash Serverless Redis-backed rate limiting to prevent abuse.
- 🎙️ Voice Recording: Intuitive interface to record, preview, and upload audio entries.
- ⚡ Real-Time Architecture:
- Fire-and-Forget Processing: The user is never blocked waiting for AI. Uploads return immediately while analysis runs in the background using FastAPI's BackgroundTasks and asyncio.
- Live Notifications: Integrated Socket.io pushes real-time updates to the client when analysis completes, updating the UI instantly without page reloads.
- 🧠 Hybrid AI Analysis & Transcription:
- Flexible Backend: Seamlessly switch between Local LLMs (Ollama) or Cloud AI (Google Gemini) via environment variables.
- High-Speed Transcription: Toggle between local Whisper processing or ultra-fast external transcription using Groq's API (whisper-large-v3).
- Adaptive Prompting: Uses "One-Shot" prompting for Gemini and "Chain of Thought" (4 separate calls) for local models to ensure high accuracy.
- 🛡️ API Protection & Rate Limiting:
- Integrated Upstash Serverless Redis with Node.js to strictly govern API usage without consuming local server memory.
- Tiered rate limiting tied to Clerk User IDs:
- Standard DB Limits: 100 requests per minute.
- AI Burst Limits: 3 AI requests per minute.
- AI Daily Quota: 20 AI processing requests per 24 hours.
- 🎯 Smart Goal Detection: The AI intelligently identifies potential life goals mentioned in your audio and suggests adding them to your tracker.
- 📊 Analytics Dashboard: Server-side aggregated visualizations using SQL functions for maximum performance:
- Mood Trend Line: Track emotional changes over time.
- Emotion Heatmap: Calendar view of daily dominant emotions.
- Topic Frequency: Analysis of most discussed themes.
- 🔐 Enterprise-Grade Security:
- Encryption at Rest: All sensitive text (transcripts, summaries, reflections) is encrypted at the application layer before storage.
- Row Level Security (RLS): Supabase policies ensure strict data isolation; users can only access their own data.
- 🚀 Performance:
- TanStack Query: All API calls utilize useQuery and useMutation for aggressive caching, optimistic updates, and background re-fetching.
- Server-Side Aggregation: Heavy analytics calculations are offloaded to Postgres functions via schema_logic.sql, keeping the API lightweight.
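The tiered rate limits described above amount to a sliding-window check per user and tier. Here is a minimal in-memory sketch of that logic; the production middleware backs these counters with Upstash Redis so they survive restarts and scale across instances, and the names `checkLimit` and `tiers` are illustrative, not the actual API:

```javascript
// Tier definitions mirroring the limits in the feature list above.
const tiers = {
  db:      { max: 100, windowMs: 60_000 },           // standard DB limit
  aiBurst: { max: 3,   windowMs: 60_000 },           // AI burst limit
  aiDaily: { max: 20,  windowMs: 24 * 60 * 60_000 }, // AI daily quota
};

const hits = new Map(); // key: `${userId}:${tier}` -> array of hit timestamps

function checkLimit(userId, tier, now = Date.now()) {
  const { max, windowMs } = tiers[tier];
  const key = `${userId}:${tier}`;
  // Keep only hits still inside the window, then test against the cap.
  const recent = (hits.get(key) || []).filter((t) => now - t < windowMs);
  if (recent.length >= max) return false; // over the limit: reject
  recent.push(now);
  hits.set(key, recent);
  return true;
}
```

In production the same idea is expressed as Redis counters keyed by Clerk User ID, so no per-user state lives in local server memory.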
Resonate uses an Event-Driven, Asynchronous Architecture to handle heavy AI workloads without compromising user experience.
```mermaid
sequenceDiagram
    participant User as Frontend (Next.js)
    participant Node as Backend (Express)
    participant DB as Supabase (Postgres)
    participant Python as ML Service (FastAPI)
    User->>Node: 1. Upload Audio
    Node->>DB: 2. Save File & Create Entry (Status: Processing)
    Node->>Python: 3. Dispatch Analysis Job (Fire & Forget)
    Node-->>User: 4. Return 200 OK (Immediate)
    Note over Python: Background Tasks (Async)
    Python->>Python: 5. Transcribe (Whisper)
    Python->>Python: 6. Analyze (Gemini/Ollama)
    Python->>Node: 7. Webhook POST /ai-result
    Node->>DB: 8. Update Entry (Encrypted Data)
    Node->>User: 9. Socket Emit (Real-time Update)
```
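The fire-and-forget portion of the flow (steps 3-7) can be sketched with plain `asyncio`. The real service uses FastAPI's `BackgroundTasks`, and the function and field names below are illustrative stand-ins, not the actual endpoints:

```python
import asyncio

async def analyze(entry_id: str, results: dict) -> None:
    """Stand-in for transcription + LLM analysis running in the background."""
    await asyncio.sleep(0)  # heavy ML work would happen here
    # In production this step POSTs the result back to Node via webhook.
    results[entry_id] = {"status": "analyzed"}

async def upload(entry_id: str, results: dict) -> dict:
    # Fire-and-forget: schedule the analysis job without awaiting it,
    # then return to the user immediately with a "processing" status.
    asyncio.create_task(analyze(entry_id, results))
    return {"status": "processing", "entry_id": entry_id}
```

The key property is that `upload` returns before `analyze` finishes; the client learns about completion through the Socket.io push, not the upload response.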
```
Resonate
├── LICENSE
├── README.md
├── resonate-backend
│   ├── Backend-ML                # Python FastAPI Service
│   │   ├── main.py               # Entry point & Endpoints
│   │   ├── requirements.txt
│   │   └── utils
│   │       ├── ai_service.py     # LLM Logic (Gemini/Ollama)
│   │       └── helperFunction.py
│   └── Backend-Node              # Node.js Express Service
│       ├── controllers/          # Business Logic
│       │   ├── entryController.js
│       │   ├── goalController.js
│       │   ├── insightController.js
│       │   ├── quoteController.js
│       │   └── webhookController.js
│       ├── jobs/
│       │   └── storageCleanUp.js # Cron Job
│       ├── middleware/
│       │   └── rateLimiter.js    # API Rate Limiters
│       ├── routes/               # API Routes
│       │   ├── entryRoutes.js
│       │   ├── goalRoutes.js
│       │   ├── insightRoutes.js
│       │   ├── quoteRoutes.js
│       │   └── webhookRoutes.js
│       ├── server.js
│       └── utils
│           ├── config.js
│           └── encryption.js     # AES Encryption Logic
├── resonate-frontend             # Next.js Application
│   └── src
│       ├── app                   # App Router
│       ├── components            # Shadcn UI & Custom Components
│       ├── hooks                 # Custom React Query Hooks
│       ├── lib                   # Utilities & Socket Client
│       └── ...
└── schema_logic.sql              # Database Triggers & Functions
```
Resonate uses a combination of server-side Postgres functions and Node.js scheduled tasks to maintain performance and data integrity.
- Server-Side Analytics (`get_insights`):
  - Instead of fetching thousands of rows into Node.js to calculate averages, we call a single SQL RPC function.
  - It computes Heatmaps, Mood Charts, and Topic frequencies directly within the Postgres engine and returns a single, pre-calculated JSON object.
- Automated Storage Cleanup (Orphan Sweeper):
  - Note: Due to Supabase policy restrictions on executing direct SQL deletes on storage buckets via triggers, the previous database-level cleanup triggers were dropped.
  - Node.js Cron Job: A dedicated scheduled task (`jobs/storageCleanUp.js`) runs via the `cron` library on the Express backend.
  - Execution: Runs every Sunday at 3:00 AM (`0 3 * * 0`).
  - Logic: It fetches all file paths currently sitting in the Supabase storage bucket and cross-references them against a `Set` of `audio_path` values actively linked to user Diary Entries in the database. Any file in storage that does not exist in the database records is identified as an orphan and permanently deleted, ensuring zero wasted cloud storage costs.
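The orphan-detection step reduces to a set-difference over file paths. A minimal sketch (the function name `findOrphans` is illustrative; the real job fetches both lists from Supabase and then deletes whatever this returns):

```javascript
// Given every path in the storage bucket and every audio_path still
// referenced by a Diary Entry, return the unreferenced (orphan) files.
// The surrounding job is scheduled with the cron pattern '0 3 * * 0'.
function findOrphans(storagePaths, dbAudioPaths) {
  const linked = new Set(dbAudioPaths); // O(1) membership checks
  return storagePaths.filter((path) => !linked.has(path));
}
```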
- Bun (v1.0+)
- Python (v3.10+)
- Supabase Project
- Clerk Account
- Ollama (Optional, for local AI)
Clone the repo and configure environment variables (refer to `.env.example`):

```bash
git clone https://github.com/CodeDevvv/Resonate.git
cd resonate
```

Then set up the database:

- Go to your Supabase SQL Editor.
- Run the contents of `schema_logic.sql`. This creates the Tables, Enums, Triggers, and Analytics Functions required for the app to function.
The application can toggle seamlessly between Cloud and Local processing simply by changing the `USE_LOCAL_LLM` flag in your environment variables.
Option A: Cloud AI (Google Gemini + Groq Transcription)

This is the recommended setup for the fastest processing times and lowest server memory usage.
- Get an API Key from Google AI Studio and the Groq Console.
- Set the following in your `Backend-ML/.env`:

```
USE_LOCAL_LLM=False
RESONATE_GEMINI_KEY=your_gemini_api_key
GROQ_WHISPER_KEY=your_groq_api_key
LLM_MODEL_ID=gemini-1.5-pro-latest # Or your preferred Gemini model
# LLM_API_URL=
```

- Create a free Serverless Redis database on Upstash.
- Copy the Redis URL and add it to `Backend-Node/.env`:

```
UPSTASH_REDIS_URL="rediss://default:your_password@your_url.upstash.io:6379"
```

Step 1: Start ML Backend (Python). Handles Transcription & Intelligence.
```bash
cd resonate-backend/Backend-ML
pip install -r requirements.txt
uvicorn main:app --reload --port 8000
```

Step 2: Start API Backend (Node.js). Handles Database, Auth, and Webhooks.
```bash
cd resonate-backend/Backend-Node
bun install
bun run server
```

Step 3: Start Frontend (Next.js). The User Interface.
```bash
cd resonate-frontend
bun install
bun run dev
```

Visit http://localhost:3000 to start recording.
| Feature | Tech Stack |
|---|---|
| Frontend Caching | TanStack Query (Stale-while-revalidate strategy) |
| Real-time Status | Socket.io (Event-driven updates) |
| DB, Storage | Supabase (PostgreSQL + Triggers) |
| Rate Limiting | Upstash Serverless Redis + Express Rate Limit |
| Transcription | Groq API (whisper-large-v3) or Local OpenAI Whisper |
| LLM Orchestration | FastAPI (Background Tasks) |
| Analytics | Postgres RPC functions (server-side aggregation) |
| Scheduled Jobs | node-cron (storage bucket cleanup) |
This project is licensed under the MIT License. See the LICENSE file for the full text.