Administrative documentation assistant that collects patient, encounter, and medication data through conversational chat, stores it in Supabase, and generates PDF visit packs.
- UV - Python package manager
- Bun - JavaScript runtime (or Node.js)
- Docker (optional, for containerized deployment)
- DeepSeek API key (default) OR Ollama server (fallback)
**Docker**

```bash
# Copy and configure environment
cp .env.example .env

# Build and start all services
docker compose up --build
```

**Backend**

```bash
cd backend

# Install dependencies with UV
uv sync

# Copy environment file and configure
cp ../.env.example .env
# Edit .env with your Supabase and Ollama settings

# Start development server
uv run uvicorn app.main:app --reload --port 8000
```

**Frontend**

```bash
cd frontend

# Install dependencies with Bun
bun install

# Start development server
bun run dev
```

Open http://localhost:3000 to access the application.
```
   Frontend (Next.js)              Backend (FastAPI)
React Query + TailwindCSS         LangGraph Workflow
          │                               │
          └─────────── HTTP API ──────────┘
                          │
     ┌────────────────────┼────────────────────┐
     │                    │                    │
DeepSeek (Cloud)    Ollama (Local)          Supabase
DEFAULT Provider    Fallback + MedGemma     PostgreSQL
deepseek-chat       qwen3:30b               + Storage
     │                    │
     │             MedGemma (Local)
     │             alibayram/medgemma:4b
     │             Medication normalization
     │                    │
     └── Automatic Fallback ──┘
         (DeepSeek → Ollama)
```
| Priority | Provider | Endpoint | Model | Purpose |
|---|---|---|---|---|
| Default | DeepSeek (Cloud) | https://api.deepseek.com/v1 | deepseek-chat | Intent parsing, question generation |
| Fallback | Ollama (Local) | http://localhost:11434 | qwen3:30b | Fallback when DeepSeek unavailable |
| Medical | MedGemma (Local) | http://localhost:11434 | alibayram/medgemma:4b | Medication name normalization |
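Medication normalization talks to MedGemma through Ollama's standard generate endpoint. A minimal sketch of building that request payload — the prompt wording and the helper name are illustrative assumptions, not taken from this project's code:

```python
import json

def build_normalization_request(raw_name: str) -> bytes:
    """Build an Ollama /api/generate payload asking MedGemma to
    normalize a free-text medication name (prompt is illustrative)."""
    payload = {
        "model": "alibayram/medgemma:4b",
        "prompt": f"Normalize this medication name to its generic form: {raw_name}",
        "stream": False,   # single JSON response instead of a token stream
    }
    return json.dumps(payload).encode("utf-8")

body = build_normalization_request("tylenol 500 mg")
```

The encoded body would be POSTed to `http://localhost:11434/api/generate`; setting `"stream": False` keeps the response a single JSON object, which is simpler to parse for a one-shot normalization.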
**Fallback behavior:** when DeepSeek fails or no API key is configured, requests automatically fall back to Ollama.
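The fallback chain amounts to trying providers in priority order and returning the first success. A generic sketch — the stub clients below are stand-ins, not the project's actual service classes:

```python
class ProviderError(Exception):
    pass

def complete(prompt: str, providers: list) -> str:
    """Try each (name, call) pair in priority order; return the first
    successful completion, mirroring the DeepSeek -> Ollama fallback."""
    errors = []
    for name, call in providers:
        try:
            return call(prompt)
        except ProviderError as exc:
            errors.append(f"{name}: {exc}")
    raise ProviderError("; ".join(errors))

# Stubs standing in for real clients: DeepSeek is down, Ollama answers.
def deepseek(prompt):  raise ProviderError("API key not configured")
def ollama(prompt):    return f"[qwen3:30b] {prompt}"

answer = complete("Parse intent: 'add lisinopril 10mg'",
                  [("deepseek", deepseek), ("ollama", ollama)])
```

Collecting per-provider errors and raising them together only when every provider fails makes a total outage easy to diagnose from a single log line.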
| Layer | Technology |
|---|---|
| Backend | Python, FastAPI, LangGraph |
| LLM | DeepSeek (default) + Ollama (fallback) |
| Medical Text | MedGemma via Ollama |
| Database | Supabase (PostgreSQL + Storage) |
| Frontend | Next.js, React Query, TailwindCSS |
| Packaging | Docker + docker compose |
Interactive API Documentation:
- Swagger UI: /api/v1/docs
- ReDoc: /api/v1/redoc
- OpenAPI JSON: /api/v1/openapi.json
| Endpoint | Method | Description |
|---|---|---|
| /api/v1/sessions | POST | Create new clinic visit session |
| /api/v1/sessions/{id} | GET | Get session state |
| /api/v1/chat | POST | Process user message |
| /api/v1/encounters | POST | Finalize and persist to DB |
| /api/v1/documents/{id} | GET | Retrieve PDF document |
| /api/v1/sessions/{id} | DELETE | Delete/abandon session |
| /api/v1/models | GET | List available LLM models |
| /api/v1/health | GET | Health check |
| /api/v1/health/live | GET | Kubernetes liveness probe |
| /api/v1/health/ready | GET | Kubernetes readiness probe |
```
backend/
  app/
    agent/          # LangGraph workflow, prompts, state
    services/       # Ollama, Supabase, PDF, session management
    routers/        # FastAPI route handlers
    main.py         # Application entry point
  tests/            # pytest test suite
  pyproject.toml    # UV project configuration
frontend/
  app/              # Next.js App Router pages
  components/       # React components
  lib/              # API client, React Query hooks
  types/            # TypeScript definitions
supabase/
  migrations/       # Database schema migrations
docs/
  epics/            # Feature specifications
  api-design.md     # API documentation
```
**Backend**

```bash
uv sync                               # Install dependencies
uv run uvicorn app.main:app --reload  # Dev server
uv run pytest tests/ -v               # Run tests
uv run ruff check .                   # Lint code
```

**Frontend**

```bash
bun install     # Install dependencies
bun run dev     # Dev server (port 3000)
bun run build   # Production build
bun run lint    # Run ESLint
```

**Docker**

```bash
docker compose up --build       # Build and start
docker compose logs -f backend  # Follow backend logs
docker compose down             # Stop all services
```

See .env.example for all configuration options. Key variables:
```bash
# Supabase (Required)
SUPABASE_URL=https://xxx.supabase.co
SUPABASE_SERVICE_ROLE_KEY=eyJ...

# DeepSeek (Default LLM Provider)
DEEPSEEK_API_KEY=sk-...
DEEPSEEK_MODEL=deepseek-chat
DEFAULT_LLM_PROVIDER=deepseek   # "deepseek" or "ollama"

# Ollama (Fallback LLM)
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=qwen3:30b

# MedGemma (Medical Text - via Ollama)
MEDGEMMA_BASE_URL=http://localhost:11434
MEDGEMMA_MODEL=alibayram/medgemma:4b
MEDGEMMA_ENABLED=true

# Security (Optional - disabled by default)
API_KEYS=key1,key2        # Comma-separated
AUTH_ENABLED=false        # Set true in production
RATE_LIMIT_ENABLED=true
```

When AUTH_ENABLED=true, all endpoints require the X-API-Key header:
```bash
# Generate a secure key
python -c "import secrets; print(secrets.token_urlsafe(32))"

# Use it in requests
curl -H "X-API-Key: your-key" http://localhost:8000/api/v1/sessions
```

Rate limiting is enabled by default. Per-endpoint limits:
| Endpoint | Limit |
|---|---|
| Sessions | 10/min |
| Chat | 30/min |
| Finalize | 10/min |
| Documents | 60/min |
```bash
# Backend tests
cd backend
uv run pytest tests/ -v

# Test Ollama connectivity
curl https://ollama2.tprun.deinfra.net/api/version
curl http://localhost:11434/api/version
```

- Administrative clinic visit data collection
- Structured patient/encounter/medication data
- PDF visit pack generation
- Conversational chat interface
- Multi-model LLM support (DeepSeek, Ollama, MedGemma)
Note: This is an administrative tool, not a clinical decision system. It does not provide medical advice, diagnosis, or EHR integration.