A demonstration implementation of Google's NotebookLM concept, built with modern web technologies. This project showcases how to create an AI-powered note-taking application that can understand, analyze, and interact with your notes in a conversational way.
This project is a demonstration of the NotebookLM concept, which combines document understanding with conversational AI. It allows users to:
- Upload and manage documents
- Have natural conversations about the content
- Get AI-powered insights and summaries
- Search through documents using semantic understanding

Tech stack:

Frontend:
- Next.js 15 with React 19
- assistant-ui for the AI chat interface
- Tailwind CSS for styling
- Radix UI for accessible components

AI & Processing:
- OpenAI's GPT models for conversation and understanding
- LangChain for AI workflow orchestration
- Pinecone for vector storage and semantic search (see the ingestion sketch below)

Backend:
- FastAPI for the API layer
- PostgreSQL for data storage
- Drizzle ORM for database management
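
To make the AI and processing flow concrete, here is a rough sketch of the ingestion side: an uploaded document is split into chunks, each chunk is embedded with OpenAI, and the resulting vectors are upserted into Pinecone. This is a hedged, standalone example, not the project's actual pipeline; the real service orchestrates these steps with LangChain, and the chunk sizes, embedding model, and metadata fields shown here are assumptions.

```python
# Hypothetical document-ingestion sketch; the api/ service implements this differently.
import os
from uuid import uuid4

from langchain_text_splitters import RecursiveCharacterTextSplitter
from openai import OpenAI
from pinecone import Pinecone

openai_client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
index = Pinecone(api_key=os.environ["PINECONE_API_KEY"]).Index(
    os.environ["PINECONE_INDEX_NAME"]
)


def ingest_document(doc_id: str, text: str) -> int:
    """Chunk, embed, and upsert one document; returns the number of chunks stored."""
    # Split the document into overlapping chunks so each one fits an embedding call.
    splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
    chunks = splitter.split_text(text)

    # Embed all chunks in a single batch request (assumed embedding model).
    embeddings = openai_client.embeddings.create(
        model="text-embedding-3-small", input=chunks
    ).data

    # Store each chunk's vector in Pinecone, keeping the source text as metadata
    # so it can be shown back to the chat model at query time.
    index.upsert(
        vectors=[
            {
                "id": f"{doc_id}-{uuid4()}",
                "values": item.embedding,
                "metadata": {"doc_id": doc_id, "text": chunk},
            }
            for chunk, item in zip(chunks, embeddings)
        ]
    )
    return len(chunks)
```

The query side of this flow is sketched in the API keys section further down.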
The project is organized into two main components:
- `web/`: Frontend application built with Next.js
- `api/`: Backend API service built with Python FastAPI

Features:
- Modern, responsive UI with AI chat interface
- Document upload and management
- Real-time AI-powered conversations about your notes
- Semantic search capabilities
- Vector storage for efficient document retrieval
- Docker support for easy deployment

Prerequisites:
- Node.js (v18 or higher)
- Python 3.11+
- PostgreSQL
- Docker and Docker Compose (for containerized deployment)

The application expects the following environment variables.

`web/.env`:

```env
# Database Configuration
DATABASE_URL=postgresql://postgres:postgres@db:5432/lmnotes

# API Configuration
AGENT_API_URL=http://localhost:8000

# Node Environment
NODE_ENV=development
```

`api/.env`:

```env
# Database Configuration
DATABASE_URL=postgresql://postgres:postgres@db:5432/lmnotes

# OpenAI Configuration
OPENAI_API_KEY=your_openai_api_key

# Pinecone Configuration
PINECONE_API_KEY=your_pinecone_api_key
PINECONE_INDEX_NAME=your_pinecone_index_name
```
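
For context, a FastAPI backend like the one in `api/` could load these values with the `pydantic-settings` package. This is only a sketch under that assumption; the project's actual configuration code may look different.

```python
# settings.py: hypothetical example of reading api/.env with pydantic-settings.
# Class and field names are illustrative, not the project's actual code.
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    # Each field is populated from the environment variable of the same name
    # (case-insensitive), falling back to the values in the .env file.
    model_config = SettingsConfigDict(env_file=".env")

    database_url: str
    openai_api_key: str
    pinecone_api_key: str
    pinecone_index_name: str


settings = Settings()  # raises a validation error if a required variable is missing
```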

Frontend setup:

1. Navigate to the frontend directory:

   ```bash
   cd web
   ```

2. Install dependencies:

   ```bash
   pnpm install
   ```

3. Set up environment variables: Create a `.env` file in the `web` directory with the necessary environment variables.

4. Start the development server:

   ```bash
   pnpm dev
   ```

Backend (API) setup:

1. Navigate to the API directory:

   ```bash
   cd api
   ```

2. Create and activate a virtual environment:

   ```bash
   python -m venv .venv
   source .venv/bin/activate  # On Windows: .venv\Scripts\activate
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Start the API server (a minimal sketch of the served app follows these steps):

   ```bash
   uvicorn app.main:app --reload
   ```
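
The `uvicorn` command above serves the FastAPI application object exported from `app/main.py` inside `api/`. The real module has more routes and wiring; the following is only a minimal, hypothetical sketch of that shape.

```python
# app/main.py (minimal sketch; the real module in api/ is more involved)
from fastapi import FastAPI

app = FastAPI(title="LM Notes API")


@app.get("/health")
def health() -> dict[str, str]:
    # Hypothetical liveness endpoint, the kind of route a container
    # health check could poll.
    return {"status": "ok"}
```

Once the server is running, FastAPI serves interactive OpenAPI docs at http://localhost:8000/docs.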

Docker deployment:

1. Create the required `.env` files:
   - Create `web/.env` with frontend environment variables
   - Create `api/.env` with backend environment variables

2. Build and start the containers:

   ```bash
   docker-compose up --build
   ```

3. Access the services:
   - Frontend: http://localhost:3000
   - Backend API: http://localhost:8000
   - Database: localhost:5432

4. To stop the services:

   ```bash
   docker-compose down
   ```

5. To view logs:

   ```bash
   docker-compose logs -f
   ```

If you want to run the application locally but use the Docker database:

1. Start only the database service:

   ```bash
   docker-compose up db
   ```

2. The database will be available at:
   - Host: localhost
   - Port: 5432
   - User: postgres
   - Password: postgres
   - Database: lmnotes

3. Update your local `.env` files to use this database (a quick connectivity check is sketched after these steps):

   ```env
   # In web/.env and api/.env
   DATABASE_URL=postgresql://postgres:postgres@localhost:5432/lmnotes
   ```

4. To stop the database:

   ```bash
   docker-compose down
   ```
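
As referenced above, here is a quick connectivity check against the Dockerized database. It is a standalone sketch that assumes the `psycopg` (version 3) driver is installed, which may not match the project's actual database dependencies.

```python
# check_db.py: hypothetical one-off script to verify the Docker Postgres is reachable.
import os

import psycopg

DATABASE_URL = os.environ.get(
    "DATABASE_URL", "postgresql://postgres:postgres@localhost:5432/lmnotes"
)

with psycopg.connect(DATABASE_URL) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT version();")
        print(cur.fetchone()[0])  # e.g. "PostgreSQL 16.x ..."
```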

The project uses Drizzle ORM for database management. Available commands:

```bash
# Generate database migrations
pnpm db:generate

# Push migrations to database
pnpm db:push

# Open Drizzle Studio
pnpm db:studio
```

The project includes Docker configuration for both frontend and backend services. The setup includes:
- Multi-stage builds for optimized images
- Automatic database migrations on startup
- Environment variable management
- Health checks for service dependencies

Technologies used:
- Next.js 15
- React 19
- Tailwind CSS
- Radix UI Components
- Drizzle ORM
- TypeScript
- FastAPI
- PostgreSQL
- Python 3.11+
- OpenAI API
- Pinecone Vector Database

To run the application, you'll need to obtain the following:

1. OpenAI API Key
   - Visit OpenAI Platform
   - Create an account and generate an API key
   - Add the key to `api/.env` as `OPENAI_API_KEY`

2. Pinecone Account and API Key
   - Visit Pinecone
   - Create an account and create a new project
   - Generate an API key
   - Create an index for vector storage
   - Add the following to `api/.env`: `PINECONE_API_KEY`, `PINECONE_ENVIRONMENT`, `PINECONE_INDEX_NAME` (these are the variables used in the retrieval sketch below)
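
To show how these keys fit together, here is a rough, self-contained sketch of the retrieval-and-answer step behind the chat experience. It calls the OpenAI and Pinecone Python clients directly, whereas the real `api/` service orchestrates this through LangChain; the model names and the `text` metadata field are assumptions that pair with the ingestion sketch near the top of this README.

```python
# Hypothetical retrieval-and-answer sketch; not the project's actual code.
import os

from openai import OpenAI
from pinecone import Pinecone

openai_client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
index = Pinecone(api_key=os.environ["PINECONE_API_KEY"]).Index(
    os.environ["PINECONE_INDEX_NAME"]
)


def answer_question(question: str, top_k: int = 4) -> str:
    # 1. Embed the question with the same (assumed) model used at ingestion time.
    embedding = openai_client.embeddings.create(
        model="text-embedding-3-small", input=question
    ).data[0].embedding

    # 2. Retrieve the nearest note chunks from Pinecone
    #    (assumes each vector was upserted with a "text" metadata field).
    matches = index.query(
        vector=embedding, top_k=top_k, include_metadata=True
    ).matches
    context = "\n\n".join(m.metadata.get("text", "") for m in matches)

    # 3. Ask the chat model to answer using only the retrieved context.
    response = openai_client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; use whichever GPT model you prefer
        messages=[
            {"role": "system", "content": "Answer using only the provided notes."},
            {"role": "user", "content": f"Notes:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content
```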
This is a demonstration project, but contributions are welcome! If you'd like to contribute:
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Inspired by Google's NotebookLM
- Built with assistant-ui
- Uses LangChain for AI workflows

