YES - YouTube Summarizer is a web application that automatically summarizes YouTube videos using AI and allows users to interact with video content through intelligent conversations.
Instead of watching long videos, users can quickly understand key points and ask questions about the content.
- Summarize any YouTube video
- User authentication system
- AI-generated concise summaries
- Discuss with AI (ask questions about video content)
- Context-aware conversation memory
- Modern responsive web interface
- Fully Dockerized deployment
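The "context-aware conversation memory" feature can be illustrated with a minimal sketch: keep the last N exchanges and prepend them to each new question before sending it to the model. This is a hypothetical illustration only; the class and method names are invented and the app's actual implementation may differ.

```javascript
// Hypothetical sketch of context-aware conversation memory:
// retain the last `maxTurns` user/assistant exchanges and
// prepend them to each new question as context.
class ConversationMemory {
  constructor(maxTurns = 5) {
    this.maxTurns = maxTurns;
    this.turns = [];
  }

  add(role, content) {
    this.turns.push({ role, content });
    // Each turn is one user + one assistant message, so cap at maxTurns * 2.
    while (this.turns.length > this.maxTurns * 2) this.turns.shift();
  }

  asPrompt(question) {
    const history = this.turns.map((t) => `${t.role}: ${t.content}`).join("\n");
    return `${history}\nuser: ${question}`;
  }
}

const memory = new ConversationMemory(2);
memory.add("user", "What is the video about?");
memory.add("assistant", "It covers Docker basics.");
console.log(memory.asPrompt("Can you summarize the key points?"));
```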
```
Frontend (Next.js)
        ↓
Backend API (Express.js)
        ↓
External Services
 ├── Ollama (AI Processing)
 └── MySQL Database
```
Both Ollama and MySQL currently run as external services.
The backend connects to them via environment configuration.
⚠️ Note: Currently, the database service is external and must be started manually. Future versions may include MySQL as part of Docker Compose for fully automated setup.
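Since the backend reaches both external services through environment variables, a `backend/.env` for this layout might look like the following. The variable names are taken from the setup steps later in this document; check `backend/.env.example` for the authoritative list.

```
# External MySQL instance (started manually)
DB_HOST=localhost
DB_PORT=3306
DB_USER=root
DB_PASSWORD=password
DB_NAME=youtube_summarizer

# External Ollama instance
OLLAMA_HOST=http://localhost:11434
```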
- Next.js
- React
- Tailwind CSS
- Node.js
- Express.js
- Sequelize ORM
- MySQL
- Ollama (Local LLM)
- YouTube Transcript Extraction
- Docker
- Docker Compose
```
yes-youtube-summarizer/
│
├── frontend/                 # Next.js client application
├── backend/                  # Express.js API server
├── docker-compose.dev.yml
└── docker-compose.yml
```
To get a local copy of this project up and running, follow these steps.
- Docker (v20.x or higher) and Docker Compose
- Ollama (required as the external AI service)
- MySQL (or another compatible SQL database)
- Node.js and npm (optional)
Clone the repository

```
git clone https://github.com/HmizR/yes-youtube-summarizer.git
cd yes-youtube-summarizer
```
Environment Setup

Each folder already provides a template:

```
frontend/.env.example
backend/.env.example
```

Create your environment files by copying them:

Backend

```
cp backend/.env.example backend/.env
```

Frontend

```
cp frontend/.env.example frontend/.env.local
```

Then edit the `.env` and `.env.local` files according to your local configuration.
Ollama Setup (Required)

This project uses Ollama as an external AI service.
You must install and run Ollama locally before starting the application.

Install Ollama

Download it from the official Ollama website, then verify the installation:

```
ollama --version
```
Pull Required Model

Example:

```
ollama pull llama3
```

(Adjust the model name according to your `.env` configuration.)
Start Ollama Service

```
ollama serve
```

By default, Ollama runs at `http://localhost:11434`. Make sure your backend `.env` matches this URL:

```
OLLAMA_HOST=http://localhost:11434
```
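To confirm Ollama is actually reachable before starting the backend, you can query its HTTP API. The `/api/tags` endpoint is part of Ollama's standard REST API; a JSON response listing your pulled models confirms the service is up.

```shell
# List locally available models; a JSON response confirms Ollama is running.
curl http://localhost:11434/api/tags
```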
Database Setup (MySQL)
The MySQL database is not yet managed by Docker Compose.
You must provide your own running MySQL instance.
This may be containerized in future releases.
Install MySQL

Install MySQL locally or run it with Docker manually.

Example (Docker):

```
docker run -d \
  --name youtube-mysql \
  -e MYSQL_ROOT_PASSWORD=password \
  -e MYSQL_DATABASE=youtube_summarizer \
  -p 3306:3306 \
  mysql:8
```
Configure Backend Environment

Update `backend/.env`. Example:

```
DB_HOST=localhost
DB_PORT=3306
DB_USER=root
DB_PASSWORD=password
DB_NAME=youtube_summarizer
```
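As a sketch of how the backend might assemble its Sequelize connection options from these `DB_*` variables (the function name and defaults are illustrative, not the project's actual code):

```javascript
// Hypothetical sketch: build Sequelize connection options from the DB_*
// environment variables shown above. Defaults mirror the example .env;
// the project's real configuration code may differ.
function dbConfigFromEnv(env) {
  return {
    host: env.DB_HOST || "localhost",
    port: Number(env.DB_PORT || 3306),
    username: env.DB_USER || "root",
    password: env.DB_PASSWORD || "",
    database: env.DB_NAME || "youtube_summarizer",
    dialect: "mysql",
  };
}

console.log(dbConfigFromEnv(process.env));
```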
Start the development server

```
docker compose -f docker-compose.dev.yml up --build
```
Build the images

Development mode:

```
docker compose -f docker-compose.dev.yml build
```

Production mode:

```
docker compose build
```
Run the containers

Development mode:

```
docker compose -f docker-compose.dev.yml up
```

Production mode:

```
docker compose up -d
```

Now you can view the app at http://localhost:3000 in your browser.
Stop the containers

Development mode:

```
docker compose -f docker-compose.dev.yml down
```

Production mode:

```
docker compose down
```
The API documentation can be accessed at http://localhost:5000/api/v1/docs.
Developed by Hamizan Rifqi Afandi.