
Installation

WASP runs entirely in Docker. The installation process takes approximately 10 minutes.

Prerequisites

  • Docker 24.0+ and Docker Compose v2.20+
  • Ubuntu 22.04 / 24.04 (or any Linux distro with Docker support)
  • 4 GB RAM minimum (8 GB recommended for browser skills)
  • 20 GB disk minimum
  • A Telegram bot token (from @BotFather)
  • At least one AI API key: OpenAI, Anthropic, Google Gemini, or xAI
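The version requirements above can be checked up front. The helper below is a sketch, not part of the project: `version_ge` compares versions with `sort -V`, and the `grep` patterns assume the usual `docker --version` output format.

```shell
# Sketch: verify Docker 24.0+ and Compose v2.20+ before proceeding.
# version_ge A B -> succeeds if version A >= version B (via GNU sort -V)
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

docker_ver=$(docker --version 2>/dev/null | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n1)
compose_ver=$(docker compose version 2>/dev/null | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n1)

version_ge "$docker_ver" "24.0"  && echo "Docker $docker_ver OK"   || echo "Docker too old or missing"
version_ge "$compose_ver" "2.20" && echo "Compose $compose_ver OK" || echo "Compose too old or missing"
```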

Step 1: Install Docker

# Install Docker (Ubuntu)
curl -fsSL https://get.docker.com | bash
sudo usermod -aG docker $USER
newgrp docker

# Verify
docker --version
docker compose version

Step 2: Clone the Repository

git clone https://github.com/agentwasp/wasp.git /home/agent
cd /home/agent

Step 3: Create Environment File

cp .env.example .env
nano .env

Minimum required values:

# Database
POSTGRES_PASSWORD=your_secure_password_here

# Telegram
TELEGRAM_BOT_TOKEN=123456789:ABCdefGHI...
TELEGRAM_ALLOWED_USERS=your_telegram_user_id

# Dashboard (min 16 chars)
DASHBOARD_SECRET=your_secure_dashboard_secret

# At least one AI provider
OPENAI_API_KEY=sk-...
# or
ANTHROPIC_API_KEY=sk-ant-...
# or
GOOGLE_API_KEY=AIza...

# Optional: timezone
TIMEZONE=America/New_York

Find your Telegram user ID

Send a message to @userinfobot on Telegram to get your numeric user ID.
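Before starting the stack, it can help to confirm the required keys are actually set. The check below is a hypothetical helper, not part of the project; it only verifies that each key is present and non-empty in the file, not that the values are valid.

```shell
# Sketch: verify the required keys are present and non-empty in an env file.
check_env() {  # $1: path to env file
  ok=1
  for key in POSTGRES_PASSWORD TELEGRAM_BOT_TOKEN TELEGRAM_ALLOWED_USERS DASHBOARD_SECRET; do
    grep -qE "^${key}=.+" "$1" || { echo "missing or empty: $key"; ok=0; }
  done
  # At least one AI provider key must be set (variable names as shown above).
  grep -qE '^(OPENAI_API_KEY|ANTHROPIC_API_KEY|GOOGLE_API_KEY)=.+' "$1" \
    || { echo "no AI provider key set"; ok=0; }
  [ "$ok" = 1 ]
}

check_env .env && echo ".env looks complete" || echo ".env incomplete"
```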

Step 4: Create Data Directories

mkdir -p /home/agent/data/{redis,postgres,memory,logs,backups,shared,screenshots,chat-uploads,browser-sessions,skills,ollama,config}
chmod 777 /home/agent/data/browser-sessions
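A quick sanity check (not from the project itself) that every data directory was created:

```shell
# Sketch: confirm each expected data directory exists under the base path.
check_dirs() {  # $1: base data directory
  for d in redis postgres memory logs backups shared screenshots chat-uploads browser-sessions skills ollama config; do
    [ -d "$1/$d" ] && echo "ok: $d" || echo "MISSING: $d"
  done
}

check_dirs /home/agent/data
```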

Step 5: Build and Start

cd /home/agent
docker compose build
docker compose up -d

This starts six services: redis, postgres, core, telegram, broker, and nginx.
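The service list can be checked programmatically. The helper below is a sketch that assumes the compose service names match the list above; the project's compose file may use prefixed names (the logs command in Step 6 refers to `agent-core`), so adjust the list to your `docker compose ps --services` output.

```shell
# Sketch: report any expected compose services that are not running.
check_services() {  # $1: newline-separated list of running services
  missing=""
  for s in redis postgres core telegram broker nginx; do
    printf '%s\n' "$1" | grep -qx "$s" || missing="$missing $s"
  done
  [ -z "$missing" ] && echo "all services running" || echo "missing:$missing"
}

# Usage (after the stack is up):
# check_services "$(docker compose ps --services --status running)"
```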

Step 6: Verify Installation

# Check all containers are running
docker compose ps

# Check core logs
docker compose logs agent-core --tail=50

# Test health endpoint
curl http://localhost:8080/health

You should see every container with Up status and the health endpoint returning HTTP 200.
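On slower machines the core service may need a moment before the health endpoint responds, so a single curl can give a false negative. A small retry loop (a sketch, not part of the project) avoids that:

```shell
# Sketch: poll a health endpoint until it returns HTTP 200 or attempts run out.
wait_healthy() {  # $1: URL, $2: max attempts (2 s apart)
  i=0
  while [ "$i" -lt "$2" ]; do
    # curl prints the status code; "000" on connection failure
    code=$(curl -s -o /dev/null -w '%{http_code}' "$1" 2>/dev/null)
    [ "$code" = 200 ] && { echo "healthy after $i retries"; return 0; }
    i=$((i + 1)); sleep 2
  done
  echo "health check failed after $2 attempt(s)"
  return 1
}

# Usage:
# wait_healthy http://localhost:8080/health 30
```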

Optional: Enable Local LLM (Ollama)

# Start with Ollama profile
docker compose --profile local-llm up -d

# Pull a model
docker exec agent-ollama ollama pull qwen2.5:7b
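To confirm the pull succeeded, you can check that the model appears in Ollama's local model list. The parsing helper below is a sketch; `agent-ollama` is the container name used in the command above.

```shell
# Sketch: check whether a model name appears in `ollama list` output
# (first column of each line is the model name).
model_present() {  # $1: `ollama list` output, $2: model name
  printf '%s\n' "$1" | awk '{print $1}' | grep -qxF "$2"
}

# Usage:
# model_present "$(docker exec agent-ollama ollama list)" "qwen2.5:7b" && echo "model ready"
```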

Next Steps