
Self-Hosted Quick Start

This guide will help you deploy Flexus on your own infrastructure using Docker Compose. In about 5 minutes, you’ll have a fully functional Flexus instance running locally.

Prerequisites

Before you begin, make sure you have:

  • Docker (version 20.10 or later)
  • Docker Compose (version 2.0 or later)
  • Git for cloning the repository
  • At least 4GB RAM available for Docker
  • An LLM API key (OpenAI, Anthropic, or compatible provider)

LLM Provider

Flexus is model-agnostic. You can use OpenAI, Anthropic, or any LiteLLM-compatible provider. You’ll configure this in the .env file.

Quick Start

  1. Clone the Repository

    Terminal window
    git clone https://github.com/smallcloudai/flexus.git
    cd flexus
  2. Configure Environment

    Copy the example environment file and edit it with your settings:

    Terminal window
    cp .env.example .env

    Open .env and set the required variables:

    .env
    # Required: Your LLM provider API key
    LITELLM_API_KEY=sk-your-openai-key
    # Optional: Change default model (defaults to gpt-4o)
    LITELLM_MODEL=gpt-4o
    # Optional: Set a custom secret key for sessions
    SESSION_SECRET=your-random-secret-here
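
A quick way to generate a random value for SESSION_SECRET (assuming openssl is installed):

```shell
# Print a 64-character hex string suitable for SESSION_SECRET
openssl rand -hex 32
```

Paste the output into .env as the SESSION_SECRET value.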
  3. Start Flexus

    Terminal window
    docker-compose up -d

    This will start all required services:

    • Frontend (Nuxt.js) on port 3000
    • Backend (Python/GraphQL) on port 8000
    • PostgreSQL database
    • Redis for caching and pub/sub
    • Background services (scheduler, advancer, etc.)
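
The exact service definitions live in the repository's docker-compose.yml; the topology roughly corresponds to a sketch like the following (image versions and dependency edges here are assumptions, so check the real file):

```yaml
services:
  frontend:
    ports: ["3000:3000"]   # Nuxt.js UI
    depends_on: [backend]
  backend:
    ports: ["8000:8000"]   # Python/GraphQL API
    depends_on: [postgres, redis]
  postgres:
    image: postgres         # database; pinned version is in the real compose file
  redis:
    image: redis            # caching and pub/sub
  advancer:                 # background service driving bot progress
    depends_on: [backend]
  scheduler:                # periodic background jobs
    depends_on: [backend]
```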
  4. Open Flexus

    Navigate to http://localhost:3000 in your browser.

  5. Create Your Account

    • Click “Sign Up” to create your first account
    • This account will automatically become the admin
    • Create your first workspace

Verify Installation

After starting, verify all services are running:

Terminal window
docker-compose ps

You should see all services in “Up” state:

NAME               STATUS
flexus-frontend    Up
flexus-backend     Up
flexus-postgres    Up
flexus-redis       Up
flexus-advancer    Up
flexus-scheduler   Up

Check logs if something isn’t working:

Terminal window
# All services
docker-compose logs -f
# Specific service
docker-compose logs -f backend

Your First Bot

Once logged in, let’s hire your first bot:

  1. Navigate to Marketplace

    Click “Marketplace” in the sidebar to see available bots.

  2. Hire Frog Bot

    Find “Frog” — it’s a simple demo bot designed for learning. Click “Hire” to add it to your workspace.

  3. Configure the Bot

    After hiring, you’ll see the bot’s setup dialog. For Frog, the defaults work fine. Click “Save”.

  4. Start a Conversation

    Click on the Frog bot in your sidebar and send a message:

    Hello! Can you ribbit for me?
  5. Watch It Work

    The bot will respond using its tools. You can see the Kanban board showing task progress.

Congratulations!

You now have a working Flexus installation. Explore the marketplace to find more bots, or continue to learn how to build your own.

Configuration Options

Database

By default, Flexus uses a PostgreSQL container. For production, you may want to use an external database:

.env
DATABASE_URL=postgresql://user:password@your-db-host:5432/flexus
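
A connection string in this format can be sanity-checked before deploying; a minimal sketch using Python's standard library (the host and database names below are the placeholders from the example above):

```python
from urllib.parse import urlsplit

def check_database_url(url: str) -> dict:
    """Split a postgresql:// URL into its parts and flag an unexpected scheme."""
    parts = urlsplit(url)
    assert parts.scheme in ("postgresql", "postgres"), "unexpected scheme"
    return {
        "user": parts.username,
        "host": parts.hostname,
        "port": parts.port or 5432,          # PostgreSQL's default port
        "database": parts.path.lstrip("/"),  # path component minus the slash
    }

print(check_database_url("postgresql://user:password@your-db-host:5432/flexus"))
```

A typo in the URL (missing port, wrong scheme, empty database name) shows up here immediately instead of as a connection error at startup.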

Redis

Similarly, you can configure an external Redis:

.env
REDIS_URL=redis://your-redis-host:6379

LLM Providers

Flexus uses LiteLLM for model routing. You can configure multiple providers:

.env
# OpenAI
LITELLM_API_KEY=sk-...
LITELLM_MODEL=gpt-4o
# Or Anthropic
LITELLM_API_KEY=sk-ant-...
LITELLM_MODEL=claude-3-5-sonnet-20241022
# Or Azure OpenAI
AZURE_API_KEY=...
AZURE_API_BASE=https://your-resource.openai.azure.com/
LITELLM_MODEL=azure/gpt-4o
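
Note the azure/ prefix in the last example: LiteLLM-style model strings can carry a provider prefix before a slash. A rough illustration of that convention (this is not LiteLLM's actual routing code):

```python
def split_model_string(model: str) -> tuple:
    """Split an optional provider prefix off a LiteLLM-style model string."""
    if "/" in model:
        provider, name = model.split("/", 1)
        return provider, name
    return None, model  # no prefix: provider is inferred from the model name

print(split_model_string("azure/gpt-4o"))  # → ('azure', 'gpt-4o')
print(split_model_string("gpt-4o"))        # → (None, 'gpt-4o')
```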

Storage

Configure object storage for file uploads:

.env
# Local storage (default)
STORAGE_TYPE=local
STORAGE_PATH=/data/uploads
# Or S3-compatible
STORAGE_TYPE=s3
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...
S3_BUCKET=flexus-uploads
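
Conceptually, STORAGE_TYPE selects which of the two backends the other variables configure. An illustrative sketch of that selection (not Flexus's actual resolution logic; the defaults mirror the example above):

```python
def resolve_storage(env: dict) -> str:
    """Pick a storage target from env vars, mirroring the options above."""
    if env.get("STORAGE_TYPE", "local") == "s3":
        return "s3://" + env["S3_BUCKET"]          # S3-compatible bucket
    return env.get("STORAGE_PATH", "/data/uploads")  # local filesystem path

print(resolve_storage({"STORAGE_TYPE": "s3", "S3_BUCKET": "flexus-uploads"}))
print(resolve_storage({}))  # → /data/uploads
```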

Updating Flexus

To update to the latest version:

Terminal window
# Pull latest images
docker-compose pull
# Restart with new images
docker-compose up -d
# Run database migrations
docker-compose exec backend python -m prisma migrate deploy

Stopping Flexus

Terminal window
# Stop all services
docker-compose down
# Stop and remove volumes (WARNING: deletes all data)
docker-compose down -v

Troubleshooting

Services Won’t Start

Check if ports are already in use:

Terminal window
# Check port 3000
lsof -i :3000
# Check port 8000
lsof -i :8000

Database Connection Errors

Ensure PostgreSQL is fully started before the backend:

Terminal window
docker-compose logs postgres

Bot Not Responding

Check the advancer service logs:

Terminal window
docker-compose logs advancer

Verify your LLM API key is correct:

Terminal window
docker-compose exec backend python -c "import os; print(os.environ.get('LITELLM_API_KEY', 'NOT SET')[:10] + '...')"

Next Steps

Production Considerations

For production deployments, consider:

  1. Use external databases — PostgreSQL and Redis with proper backups
  2. Configure HTTPS — Set up a reverse proxy (nginx, Traefik) with TLS
  3. Set up monitoring — Prometheus metrics are exposed on /metrics
  4. Configure log aggregation — Services log to stdout in JSON format
  5. Scale horizontally — Use Kubernetes for multi-node deployments
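
For item 2, a reverse proxy terminates TLS in front of the frontend. A minimal nginx sketch (the domain, certificate paths, and WebSocket settings here are placeholders and assumptions, not a tested Flexus configuration):

```nginx
server {
    listen 443 ssl;
    server_name flexus.example.com;

    ssl_certificate     /etc/letsencrypt/live/flexus.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/flexus.example.com/privkey.pem;

    location / {
        proxy_pass http://localhost:3000;   # Flexus frontend
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
        # Allow WebSocket upgrades for live updates
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```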

See the Kubernetes Deployment Guide for production-ready setup.