How to Run Aider AI Chat in Docker on Linux

Aider is a lightweight AI pair-programming assistant that runs in your terminal and edits your code in place. Running it in Docker keeps your environment clean and makes it easy to switch between models. This guide shows how to run Aider with:

  • OpenRouter (cloud API for GPT, Claude, Mistral, etc.)
  • Ollama (local models like LLaMA, Mistral, CodeLlama, etc.)

1. Prerequisites

  • A Linux host with Docker installed
  • Git, so Aider can track and commit its edits (recommended)
  • For Option A: an OpenRouter account and API key
  • For Option B: Ollama installed on the host (covered in section 5)
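
A quick check that the basics are in place:

docker --version
git --version
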
2. Project structure

Create a working folder:

mkdir ~/aider-docker
cd ~/aider-docker

Suggested structure:


aider-docker/
├─ Dockerfile
├─ docker-compose.yml   (optional)
├─ .aider.conf.yml      (optional)
└─ .env                 (optional - contains API key)
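
If you use the optional .env file, it only needs to hold the key; a minimal sketch for the OpenRouter option (placeholder value):

OPENROUTER_API_KEY=sk-or-xxxxxxxx

Keep .env out of version control, e.g. by adding it to .gitignore.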

3. Dockerfile (minimal)

FROM python:3.11-slim

# git is required for Aider's repo features; curl is handy for quick checks
RUN apt-get update && apt-get install -y git curl \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app

# install Aider itself
RUN pip install --no-cache-dir aider-chat

# keep user-level script installs on PATH, in case pip puts them there
ENV PATH=$PATH:/root/.local/bin

ENTRYPOINT ["aider"]

Build it:

docker build -t aider .
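
A quick smoke test confirms the image builds and the entrypoint works:

docker run --rm aider --version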

4. Option A: Run with OpenRouter

Export your OpenRouter key:

export OPENROUTER_API_KEY="sk-or-xxxxxxxx"
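
If you stored the key in the .env file from section 2 instead, you can load it into the current shell (assuming plain KEY=value lines):

set -a; . ./.env; set +a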

Run Aider:

docker run -it --rm \
  -v "$(pwd)":/workspace \
  -w /workspace \
  -e OPENROUTER_API_KEY="$OPENROUTER_API_KEY" \
  aider --model openrouter/anthropic/claude-3.5-sonnet

Pick any model from OpenRouter’s model list, for example:

  • openrouter/openai/gpt-4.1
  • openrouter/anthropic/claude-3.5-sonnet
  • openrouter/meta-llama/llama-3.1-70b-instruct
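
Aider can also search its known model list for you; assuming your version has the --list-models flag:

docker run --rm aider --list-models openrouter/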

Docker Compose example:

version: "3.9"

services:
  aider:
    build: .
    volumes:
      - .:/workspace
    working_dir: /workspace
    environment:
      - OPENROUTER_API_KEY=${OPENROUTER_API_KEY}
    tty: true
    stdin_open: true
    command: ["--model", "openrouter/anthropic/claude-3.5-sonnet"]
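
Build and start an interactive session with:

docker compose build
docker compose run --rm aider

docker compose run allocates a TTY and uses the command from the service definition, so this drops you straight into the Aider chat.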

5. Option B: Run with Ollama (local models)

Install Ollama on your Linux host:

curl -fsSL https://ollama.com/install.sh | sh

Start a model:

ollama run llama3

Be patient while models download: this llama3 pull alone is about 4.7 GB, and other models can be considerably larger.
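
Once the download finishes, you can confirm Ollama is serving and see which models it holds locally:

curl http://localhost:11434/api/tags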

Run Aider and point it at Ollama’s API on the host:

docker run -it --rm \
  --add-host=host.docker.internal:host-gateway \
  -v "$(pwd)":/workspace \
  -w /workspace \
  -e OLLAMA_API_BASE=http://host.docker.internal:11434 \
  aider --model ollama/llama3

Notes:

  • ollama/llama3 tells Aider which local model to use
  • OLLAMA_API_BASE points Aider at Ollama’s native endpoint on port 11434 (no /v1 suffix needed)
  • host.docker.internal does not resolve on Linux by default; --add-host=host.docker.internal:host-gateway maps it to the host’s gateway IP
  • Ollama listens only on 127.0.0.1 out of the box, so either start it with OLLAMA_HOST=0.0.0.0 or use the host-network variant shown below
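
If the host-gateway route gives you trouble, a simpler variant shares the host’s network stack with the container, so Ollama’s default loopback-only listener is reachable directly (a sketch, assuming Ollama on its default port):

docker run -it --rm \
  --network host \
  -v "$(pwd)":/workspace \
  -w /workspace \
  -e OLLAMA_API_BASE=http://127.0.0.1:11434 \
  aider --model ollama/llama3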

Other Ollama models you can try:

  • ollama/mistral
  • ollama/codellama
  • ollama/phi3
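
You can pull these ahead of time so your first Aider session does not stall on a download:

ollama pull mistral
ollama pull codellama
ollama pull phi3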

6. Optional: .aider.conf.yml

You can avoid passing --model on every run by creating .aider.conf.yml in the project root:

For OpenRouter:

model: openrouter/anthropic/claude-3.5-sonnet
editor: nano

For Ollama:

model: ollama/llama3
editor: nano

Aider reads the Ollama endpoint from the OLLAMA_API_BASE environment variable, so keep passing that with -e as in section 5.
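
With the config file in the project root (which is mounted at /workspace), the OpenRouter run shrinks to:

docker run -it --rm \
  -v "$(pwd)":/workspace \
  -w /workspace \
  -e OPENROUTER_API_KEY="$OPENROUTER_API_KEY" \
  aider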

7. Updating Aider

To update to the latest Aider version, rebuild the image without the Docker build cache so pip fetches the newest aider-chat:

docker build --no-cache -t aider .
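
If you would rather control exactly when upgrades happen, pin the release in the Dockerfile instead; the version below is only a placeholder, so check PyPI for a real one:

# pin a specific aider-chat release (placeholder version)
RUN pip install --no-cache-dir "aider-chat==0.85.0"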

8. Conclusion

With this Docker setup you can run Aider either:

  • Remotely via OpenRouter for access to cutting-edge models like GPT-4.1 or Claude 3.5
  • Locally via Ollama for private, offline coding sessions with models like LLaMA 3 or Mistral

This gives you a flexible workflow: test locally with Ollama, then switch to OpenRouter when you need stronger models.

9. Bonus

I have created a Git repo so you can get started playing with the setup: https://github.com/larsmw/aider-docker
