# Installation Guide
This guide will help you install and set up AI Dev Local on your system.
## Prerequisites
Before installing AI Dev Local, ensure you have the following prerequisites:
### Required
- Python 3.10+ - AI Dev Local requires Python 3.10 or higher
- Docker & Docker Compose - For running the containerized services
- pipx - For isolated Python package installation
### Recommended
- Git - For version control integration
- curl - For API testing and health checks
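You can confirm these tools are available before you start; the version checks below are standard commands and assume nothing about AI Dev Local itself:

```bash
python3 --version        # should report 3.10 or higher
docker --version
docker compose version
pipx --version
git --version
curl --version
```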
## Installation Methods

### Method 1: pipx (Recommended)
pipx installs Python packages in isolated environments, preventing dependency conflicts:
```bash
# Install pipx if you haven't already
python3 -m pip install --user pipx
python3 -m pipx ensurepath

# Install AI Dev Local
pipx install ai-dev-local

# Verify installation
ai-dev-local --version
```
### Method 2: pip (Global Installation)

**Not recommended:** global pip installation can cause dependency conflicts. Use pipx instead.
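For completeness, a plain pip install would look like the following; the package name is the same one used with pipx above:

```bash
python3 -m pip install ai-dev-local
ai-dev-local --version
```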
### Method 3: Development Installation
For development or contributing to the project:
```bash
# Clone the repository
git clone https://github.com/brunseba/ai-dev-local.git
cd ai-dev-local

# Install in development mode
pip install -e .

# Or using uv (faster)
uv pip install -e .
```
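If you prefer to keep the development install isolated rather than using your system interpreter, a standard virtual-environment workflow (not specific to this project) works as well:

```bash
# Create and activate a virtual environment, then install in editable mode
python3 -m venv .venv
source .venv/bin/activate
pip install -e .
```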
## System Setup

### Docker Installation

AI Dev Local requires Docker and Docker Compose. On Windows:

- Download and install Docker Desktop for Windows
- Ensure WSL2 is enabled
- Restart your system after installation
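On Linux, one option is Docker's official convenience script (shown as a sketch; your distribution's packages work too). Review the script before running it:

```bash
# Download and run Docker's convenience install script
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

# Allow your user to run docker without sudo (log out and back in afterwards)
sudo usermod -aG docker $USER
```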
### Verify Docker Installation

```bash
# Check Docker version
docker --version

# Check Docker Compose version
docker compose version

# Test Docker installation
docker run hello-world
```
## Initial Setup

### 1. Initialize Configuration

This step creates a `.env` file from the template with default values.
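A sketch of this step, assuming the CLI exposes a `config init` subcommand alongside the `config set` and `config validate` commands used later in this guide:

```bash
# Hypothetical subcommand -- consult the CLI help or Configuration Guide for the actual name
ai-dev-local config init
```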
### 2. Configure API Keys
Set your OpenAI API key (required):
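Following the same `config set` pattern used for the optional providers below (the variable name `OPENAI_API_KEY` is the conventional one and an assumption here):

```bash
ai-dev-local config set OPENAI_API_KEY your-openai-key
```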
Optional: Configure additional LLM providers:
```bash
ai-dev-local config set ANTHROPIC_API_KEY your-anthropic-key
ai-dev-local config set GEMINI_API_KEY your-gemini-key
ai-dev-local config set COHERE_API_KEY your-cohere-key
```
### 3. Validate Configuration
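Use the validation command referenced later under Getting Help to confirm the required keys are set:

```bash
ai-dev-local config validate
```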
### 4. Start Services

```bash
# Start all core services
ai-dev-local start

# Or start with Ollama for local models
ai-dev-local start --ollama
```
### 5. Verify Installation
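A quick sanity check, reusing commands from earlier in this guide plus a curl request against the Dashboard URL listed in the next section (only the base URL is assumed, no specific health endpoint):

```bash
# CLI is installed and on your PATH
ai-dev-local --version

# Dashboard responds (see the Access Services table below for other URLs)
curl -I http://localhost:3002
```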
## Post-Installation
### Access Services
After successful installation, you can access:

| Service | URL | Purpose |
|---|---|---|
| Dashboard | http://localhost:3002 | Main control panel |
| Open WebUI | http://localhost:8081 | Chat interface |
| FlowiseAI | http://localhost:3001 | Visual workflows |
| Langfuse | http://localhost:3000 | LLM observability |
| LiteLLM | http://localhost:4000 | API gateway |
| Documentation | http://localhost:8000 | This documentation |
### Default Credentials
Some services require login credentials:

| Service | Username | Password |
|---|---|---|
| FlowiseAI | admin | admin123 |
| LiteLLM UI | admin | admin123 |
**Security:** change default passwords in production environments using the Configuration Guide.
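If the credentials are driven by environment variables, they could be overridden with the same `config set` mechanism; the variable names below are hypothetical placeholders, so check the Configuration Guide for the actual keys:

```bash
# Hypothetical variable names -- see the Configuration Guide for the real keys
ai-dev-local config set FLOWISE_PASSWORD 'a-strong-password'
ai-dev-local config set LITELLM_UI_PASSWORD 'another-strong-password'
```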
## Troubleshooting

### Common Installation Issues

#### Permission Denied Errors
```bash
# Fix Docker permissions (Linux)
sudo usermod -aG docker $USER
newgrp docker

# Fix file permissions
sudo chown -R $USER:$USER ~/.local/share/pipx
```
#### Port Conflicts

```bash
# Check what's using a port
lsof -i :3000

# Change conflicting ports
ai-dev-local config set LANGFUSE_PORT 3030
```
#### Docker Issues

```bash
# Restart Docker daemon (Linux)
sudo systemctl restart docker
# Or restart Docker Desktop on macOS/Windows

# Clean Docker system
docker system prune -a
```
#### Python Version Issues

```bash
# Check Python version
python3 --version

# Install Python 3.10+ using pyenv
curl https://pyenv.run | bash
# Restart your shell (or follow pyenv's setup instructions) so pyenv is on your PATH
pyenv install 3.11.0
pyenv global 3.11.0
```
### Getting Help
If you encounter issues:
- Check the logs: `ai-dev-local logs`
- Validate configuration: `ai-dev-local config validate`
- Review documentation: Configuration Guide
- Search existing issues: GitHub Issues
- Report new issues: Create Issue
## Next Steps
After successful installation:
- Quick Start Tutorial - Learn the basics
- Configuration Guide - Customize your setup
- IDE MCP Setup - Integrate with your editor
## Uninstallation
To completely remove AI Dev Local:
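A minimal removal sketch for a pipx-based install; the Docker cleanup assumes you run it where the project's compose file lives (the cloned repository, for a development install):

```bash
# Stop and remove the project's containers, networks, and volumes
docker compose down -v

# Remove the CLI installed via pipx
pipx uninstall ai-dev-local

# Remove the generated configuration (location is an assumption; adjust to where your .env was created)
rm -f .env
```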