Welcome to your own PDF-powered AI Chatbot! This project helps you build a smart, document-aware chatbot using Flask, LangChain, and modern LLMs. Upload a PDF, ask questions, and get instant, context-aware answers—all through a beautiful web interface.
- Chat with your PDFs: Upload any PDF and ask questions about its content.
- Modern UI: Clean, responsive frontend with light/dark mode.
- Flexible LLM Backend: Use IBM WatsonX or Hugging Face models.
- Fast & Private: All processing happens locally or in your own cloud.
- Easy Deployment: Run locally or with Docker in minutes.
- Getting Started
- How It Works
- Running the Application
- Docker Deployment
- Switching to Hugging Face
- Project Structure
- License
- Python 3.x
- pip and virtualenv
- git
# Clone the repository
$ git clone https://github.com/Garbii1/personal-data-assistant-chatbot.git
$ cd personal-data-assistant-chatbot/build_chatbot_for_your_data
# Create and activate a virtual environment
$ pip install virtualenv
$ virtualenv my_env
# On Windows:
$ my_env\Scripts\activate
# On macOS/Linux:
$ source my_env/bin/activate
# Install dependencies
$ pip install -r requirements.txt
$ pip install langchain-community
- `templates/index.html`: Chat interface layout
- `static/style.css`: Modern, responsive styles (light/dark mode)
- `static/script.js`: Handles chat, file upload, and UI events
- `worker.py`: Handles PDF processing, embeddings, and LLM queries
- `server.py`: Flask server, API endpoints, and file handling
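As a rough sketch, a chat endpoint in such a Flask server might look like the following. The route name `/process-message` and the `userMessage`/`botResponse` field names are assumptions for illustration, not taken from the actual `server.py`:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/process-message", methods=["POST"])
def process_message():
    # The real server would hand the message to worker.py here;
    # this sketch simply echoes it back to show the endpoint shape.
    user_message = request.json.get("userMessage", "")
    bot_response = f"Received: {user_message}"
    return jsonify({"botResponse": bot_response})
```

The frontend's `script.js` would POST JSON to this endpoint and render the `botResponse` field in the chat window.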
- Upload PDF → Processed & split into chunks
- Embeddings → Chunks stored in a vector database (Chroma)
- Ask Questions → LLM retrieves relevant info and answers
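The chunking step above can be illustrated with a small pure-Python stand-in. The real project uses a LangChain text splitter; the fixed-size overlapping-window logic and the `chunk_size`/`overlap` values below are illustrative only:

```python
def split_into_chunks(text, chunk_size=200, overlap=40):
    """Split text into overlapping fixed-size chunks,
    mimicking what a document text splitter does before
    the chunks are embedded and stored in the vector DB."""
    chunks = []
    start = 0
    step = chunk_size - overlap  # each window slides forward by this much
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += step
    return chunks
```

Overlap between consecutive chunks helps the retriever find passages whose answer spans a chunk boundary.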
$ python server.py
Visit http://127.0.0.1:8000 in your browser. Upload a PDF and start chatting!
# Build the Docker image
$ docker build . -t build_chatbot_for_your_data
# Run the container
$ docker run -p 8000:8000 build_chatbot_for_your_data
Access at http://127.0.0.1:8000.
- Install extra dependencies: `pip install langchain==0.1.17 huggingface-hub==0.23.4`
- Get your Hugging Face API key from huggingface.co
- Edit `worker.py`:
  - Replace the `init_llm()` function as shown in the code comments
  - Set your API key and desired model (e.g., `mistralai/Mistral-7B-Instruct-v0.3`)
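A Hugging Face-backed `init_llm()` might look roughly like this sketch. The exact body shown in `worker.py`'s code comments may differ; the `model_kwargs` values here are illustrative, and the import is deferred so the module loads even before the extra dependencies are installed:

```python
import os

def init_llm(repo_id="mistralai/Mistral-7B-Instruct-v0.3"):
    """Sketch of a Hugging Face replacement for worker.init_llm().
    Assumes HUGGINGFACEHUB_API_TOKEN is set in the environment."""
    from langchain_community.llms import HuggingFaceHub

    if "HUGGINGFACEHUB_API_TOKEN" not in os.environ:
        raise RuntimeError("Set HUGGINGFACEHUB_API_TOKEN before calling init_llm()")
    return HuggingFaceHub(
        repo_id=repo_id,
        model_kwargs={"temperature": 0.1, "max_new_tokens": 600},
    )
```

With this in place, the rest of `worker.py` (embeddings, Chroma retrieval) can stay unchanged, since LangChain LLM wrappers share a common interface.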
personal-data-assistant-chatbot/
├── build_chatbot_for_your_data/
│ ├── Dockerfile
│ ├── requirements.txt
│ ├── server.py
│ ├── worker.py
│ ├── static/
│ │ ├── script.js
│ │ └── style.css
│ └── templates/
│ └── index.html
├── my_env/ (virtual environment)
└── README.md
This project is licensed under the MIT License. See the LICENSE file for details.
Made with ❤️ by Garbii1