
AI Chat Application

A full-featured AI chat application built with FastAPI, PostgreSQL, and Google Gemini AI, featuring real-time WebSocket communication and a Streamlit frontend.

🚀 Features

  • Real-time Chat: WebSocket-based real-time messaging
  • AI Integration: Google Gemini AI for intelligent responses
  • User Management: JWT-based authentication and user registration
  • Conversation Management: Create, manage, and search conversations
  • Chat Summaries: AI-generated conversation summaries
  • RESTful API: Comprehensive REST API with FastAPI
  • Web Interface: Modern Streamlit-based frontend
  • Database: PostgreSQL with SQLAlchemy ORM
  • Testing: Comprehensive test suite with pytest

🛠️ Tech Stack

  • Backend: FastAPI, Python 3.13+
  • Database: PostgreSQL with SQLAlchemy
  • AI: Google Gemini API
  • Frontend: Streamlit
  • Authentication: JWT tokens
  • WebSockets: Real-time communication
  • Testing: pytest, httpx

📋 Prerequisites

  • Python 3.13+
  • PostgreSQL
  • Google Gemini API key

🔧 Installation

Option 1: Docker (Recommended)

  1. Clone and navigate to the project:

    git clone https://github.com/rishiraj-dev/sm-assignment-llm.git
    cd sm-assignment-llm

  2. Set up environment variables:

    # Create .env file with your Google API key
    echo "GOOGLE_API_KEY=your-google-gemini-api-key-here" > .env
  3. Run with Docker Compose:

    # Build and start all services
    docker-compose up -d
    
    # View logs
    docker-compose logs -f
    
    # Stop all services
    docker-compose down
  4. Access the application:

    • API: http://localhost:8000
    • Interactive API docs (Swagger UI): http://localhost:8000/docs
    • Streamlit UI: http://localhost:8501

Option 2: Local Development

  1. Clone and navigate to the project

  2. Create and activate virtual environment:

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
  3. Install dependencies:

    pip install -r requirements.txt
  4. Set up environment variables:

    cp .env.example .env
    # Edit .env with your configuration
  5. Set up PostgreSQL database:

    CREATE DATABASE chat;
    CREATE DATABASE chat_test;
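
With the databases created, the backend connects through the DATABASE_URL described in the Configuration section below. As a rough sketch of what app/database/database.py might look like (assuming SQLAlchemy 2.x with the psycopg driver; the names here are illustrative, not the project's actual code):

import os

from sqlalchemy import create_engine
from sqlalchemy.orm import DeclarativeBase, Session, sessionmaker

# Assumption: DATABASE_URL uses the postgresql+psycopg dialect shown in this README.
DATABASE_URL = os.environ.get(
    "DATABASE_URL", "postgresql+psycopg://postgres:postgres@localhost:5432/chat"
)

engine = create_engine(DATABASE_URL, pool_pre_ping=True)
SessionLocal = sessionmaker(bind=engine, autoflush=False)

class Base(DeclarativeBase):
    """Declarative base for the ORM models (users, conversations, messages)."""

def get_db():
    """FastAPI dependency that yields a database session and closes it afterwards."""
    db: Session = SessionLocal()
    try:
        yield db
    finally:
        db.close()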

⚙️ Configuration

For Docker (docker-compose.yml)

The Docker setup uses environment variables defined in docker-compose.yml. Update the GOOGLE_API_KEY in your .env file:

GOOGLE_API_KEY=your-google-gemini-api-key

For Local Development

Update the .env file with your settings:

# Database
DATABASE_URL=postgresql+psycopg://username:password@localhost:5432/chat
TEST_DATABASE_URL=postgresql+psycopg://username:password@localhost:5432/chat_test

# Security
SECRET_KEY=your-super-secret-key-at-least-32-characters-long
ALGORITHM=HS256
ACCESS_TOKEN_EXPIRE_MINUTES=30

# Google Gemini API
GOOGLE_API_KEY=your-google-gemini-api-key

# Server Configuration
HOST=0.0.0.0
PORT=8000
DEBUG=true
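
A minimal sketch of how app/core/config.py might load these variables, assuming pydantic-settings (the class and field names below are illustrative):

from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    # Reads values from the environment and from the .env file shown above.
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    database_url: str
    test_database_url: str | None = None
    secret_key: str
    algorithm: str = "HS256"
    access_token_expire_minutes: int = 30
    google_api_key: str
    host: str = "0.0.0.0"
    port: int = 8000
    debug: bool = False

settings = Settings()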

🚀 Running the Application

Start the FastAPI Backend:

uvicorn main:app --reload --host 0.0.0.0 --port 8000

Start the Streamlit Frontend:

streamlit run streamlit_ui/main.py

API Documentation:

Once the backend is running, interactive API docs are available at http://localhost:8000/docs (Swagger UI) and http://localhost:8000/redoc (ReDoc).

📁 Project Structure

ai/
├── main.py                 # FastAPI application entry point
├── requirements.txt        # Python dependencies
├── .env                   # Environment configuration
├── README.md              # This file
├── app/
│   ├── __init__.py
│   ├── api/
│   │   ├── __init__.py
│   │   └── chat.py        # Chat API endpoints
│   ├── core/
│   │   ├── __init__.py
│   │   └── config.py      # Application configuration
│   ├── database/
│   │   └── database.py    # Database models and connection
│   ├── schemas/
│   │   └── schemas.py     # Pydantic models
│   └── services/
│       ├── auth.py        # Authentication service
│       ├── llm.py         # AI/LLM service
│       └── websocket.py   # WebSocket management
├── streamlit_ui/
│   └── main.py           # Streamlit frontend
└── tests/
    └── test_api.py       # API tests
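
As a rough illustration of the AI integration, app/services/llm.py might wrap the Gemini client along these lines using the google-generativeai package; the model name and function names below are assumptions, not the project's actual code:

import os

import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
_model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model name

def generate_reply(prompt: str) -> str:
    """Send the user's message to Gemini and return the response text."""
    response = _model.generate_content(prompt)
    return response.text

def summarize_conversation(messages: list[str]) -> str:
    """Ask Gemini for a short summary of a conversation (used for chat summaries)."""
    joined = "\n".join(messages)
    response = _model.generate_content(f"Summarize this conversation briefly:\n{joined}")
    return response.text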

🔌 API Endpoints

Authentication

  • POST /auth/register - Register new user
  • POST /auth/login - User login
  • GET /auth/me - Get current user info
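
A quick, hedged example of the authentication flow using httpx (the request payload fields and the access_token response field are assumptions):

import httpx

BASE_URL = "http://localhost:8000"

with httpx.Client(base_url=BASE_URL, timeout=10) as client:
    # Register a new user (this may fail harmlessly if the user already exists).
    client.post("/auth/register", json={"username": "alice", "password": "s3cret"})

    # Log in and grab the JWT access token.
    login = client.post("/auth/login", json={"username": "alice", "password": "s3cret"})
    login.raise_for_status()
    token = login.json()["access_token"]  # assumed response field

    # Call a protected endpoint with the Bearer token.
    me = client.get("/auth/me", headers={"Authorization": f"Bearer {token}"})
    print(me.json())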

Chat

  • GET /chat/conversations - List user conversations
  • POST /chat/conversations - Create new conversation
  • GET /chat/conversations/{id}/messages - Get conversation messages
  • POST /chat/conversations/{id}/messages - Send message
  • GET /chat/conversations/{id}/summary - Get conversation summary
  • WebSocket /ws/{conversation_id} - Real-time chat
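
A hedged sketch of a client for the real-time endpoint using the websockets library; the message format and passing the JWT as a query parameter are assumptions and may not match the server:

import asyncio

import websockets

async def chat(conversation_id: int, token: str) -> None:
    # Assumption: the token is passed as a query parameter; adjust to match the server.
    url = f"ws://localhost:8000/ws/{conversation_id}?token={token}"
    async with websockets.connect(url) as ws:
        await ws.send("Hello over WebSocket!")
        reply = await ws.recv()  # server / AI response
        print(reply)

asyncio.run(chat(1, "your-jwt-token"))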

🧪 Testing

Run the test suite:

pytest tests/ -v

Run with coverage:

pytest tests/ --cov=app --cov-report=html
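
For orientation, a minimal example of what a test might look like with FastAPI's TestClient (built on httpx); the registration payload shape is an assumption:

# tests/test_register_example.py
from fastapi.testclient import TestClient

from main import app  # FastAPI application entry point at the project root

client = TestClient(app)

def test_register_returns_success_or_conflict():
    response = client.post(
        "/auth/register",
        json={"username": "testuser", "password": "testpass123"},
    )
    # 200/201 on first run, 400/409 if the user already exists -- both acceptable here.
    assert response.status_code in (200, 201, 400, 409)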

🔧 Development

Database Migrations

# Create migration
alembic revision --autogenerate -m "description"

# Apply migrations
alembic upgrade head

Code Quality

# Format code
black app/ tests/

# Lint code
flake8 app/ tests/

# Type checking
mypy app/

🚀 Deployment

Using Docker

Build and run individual containers:

# Build the API container
docker build -t ai-chat-api .

# Build the Streamlit container  
docker build -f Dockerfile.streamlit -t ai-chat-streamlit .

# Run PostgreSQL
docker run -d --name postgres \
  -e POSTGRES_DB=chat \
  -e POSTGRES_USER=postgres \
  -e POSTGRES_PASSWORD=postgres \
  -p 5432:5432 \
  postgres:15-alpine

# Run API (after PostgreSQL is ready)
docker run -d --name api \
  --link postgres:postgres \
  -e DATABASE_URL=postgresql+psycopg://postgres:postgres@postgres:5432/chat \
  -e GOOGLE_API_KEY=your-api-key \
  -p 8000:8000 \
  ai-chat-api

# Run Streamlit
docker run -d --name streamlit \
  --link api:api \
  -e API_BASE_URL=http://api:8000/api/v1 \
  -p 8501:8501 \
  ai-chat-streamlit

Using Docker Compose (Recommended):

# Start all services in detached mode
docker-compose up -d

# Start with build (if you made changes)
docker-compose up -d --build

# View logs for all services
docker-compose logs -f

# View logs for specific service
docker-compose logs -f api
docker-compose logs -f streamlit
docker-compose logs -f postgres

# Stop all services
docker-compose down

# Stop and remove volumes (clears database)
docker-compose down -v

# Scale services (run multiple instances)
docker-compose up -d --scale api=2

Docker Commands Cheatsheet

# Development workflow
docker-compose up -d --build    # Build and start
docker-compose logs -f api      # Watch API logs
docker-compose exec api bash    # Enter API container
docker-compose exec postgres psql -U postgres -d chat  # Connect to DB

# Production deployment
docker-compose -f docker-compose.prod.yml up -d

# Cleanup
docker-compose down -v          # Stop and remove volumes
docker system prune -f          # Clean up unused containers/images

Environment Variables for Docker

Create a .env file in the project root:

# Required
GOOGLE_API_KEY=your-google-gemini-api-key-here

# Optional (defaults are set in docker-compose.yml)
SECRET_KEY=your-secret-key-change-in-production
POSTGRES_PASSWORD=postgres

Production Considerations

  1. Security:

    • Change default passwords
    • Use secrets management
    • Enable SSL/TLS
    • Set proper firewall rules
  2. Scaling:

    docker-compose up -d --scale api=3 --scale streamlit=2
  3. Monitoring:

    • Add health checks (see the sketch after this list)
    • Set up logging aggregation
    • Monitor resource usage
  4. Backup:

    # Backup database
    docker-compose exec postgres pg_dump -U postgres chat > backup.sql
    
    # Restore database
    docker-compose exec -T postgres psql -U postgres chat < backup.sql
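
For the health checks mentioned above, a minimal sketch of an endpoint the API could expose; the path and response shape are assumptions, not the project's actual implementation:

from fastapi import FastAPI

app = FastAPI()

@app.get("/health")
def health() -> dict[str, str]:
    """Lightweight liveness probe for Docker or load-balancer health checks."""
    return {"status": "ok"}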
