How to install Ollama and Open WebUI using Docker Compose

Install Ollama and Open WebUI

Stop and remove all containers that are already running.

docker stop $(docker ps -q); docker rm $(docker ps -q -a)
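
Note that this removes every container on the machine. If you only want to clear out previous Ollama and Open WebUI containers, a narrower alternative (assuming the container names used in this guide) is:

docker rm -f ollama open-webui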

Run the Ollama Docker container with NVIDIA GPU support (--gpus=all).

docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

You might see errors like these.

docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
Unable to find image 'ollama/ollama:latest' locally
latest: Pulling from ollama/ollama
857cc8cb19c0: Pull complete
2c3df3d580d0: Pull complete
6014224340ef: Pull complete
fc86a784b436: Pull complete
Digest: sha256:bf65f12eb64051f9429d6348e65c2d4c5415981de10eefe3faa45f1078abb729
Status: Downloaded newer image for ollama/ollama:latest
fba9c1ad05b2deeace40580f5b347244930cae5c2d882a92704bab84ca040494
docker: Error response from daemon: driver failed programming external connectivity on endpoint ollama (747deb6a8acb6ef69f7e287c3719f379c94042da90f25a2397b3cc4286113e70): Error starting userland proxy: listen tcp4 0.0.0.0:11434: bind: address already in use.
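
The bind error means something is already listening on port 11434, usually a previous Ollama container or a native ollama serve process on the host. You can check what is holding the port:

docker ps --filter "publish=11434"
sudo lsof -i :11434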

If you see the errors above, remove all containers as shown below and run the command again.

docker stop $(docker ps -q); docker rm $(docker ps -q -a)
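
Once the container is up, you can confirm Ollama is responding. Its root URL returns a short status message:

curl http://localhost:11434/
Ollama is running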

Open WebUI Docker

Run the Open WebUI Docker container.

docker run -d -p 3000:8080 --env WEBUI_AUTH=False --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
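
Check that both containers came up:

docker ps --filter "name=ollama" --filter "name=open-webui"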

Open this URL and add a model such as “phi3”.

http://localhost:3000/

Choose a model from this page depending on your PC’s RAM size.

https://www.ollama.com/library
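
You can also pull and list models from the command line inside the Ollama container instead of using the web UI:

docker exec -it ollama ollama pull phi3
docker exec -it ollama ollama list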

Docker Compose

Docker Compose lets you define ollama and open-webui in a single file (docker-compose.yml) and run the whole application with a single command, instead of starting the containers separately as explained above.

Create a docker-compose.yml file anywhere you like, such as “~/ollama”. This Docker Compose file creates a separate named volume for each service.
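
For example, create the directory and open the file in an editor such as nano:

mkdir -p ~/ollama
cd ~/ollama
nano docker-compose.yml

Then paste the following contents: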

services:
  ollama:
    volumes:
      - ollama:/root/.ollama
    container_name: ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    image: ollama/ollama:${OLLAMA_DOCKER_TAG-latest}
    environment:
      - 'OLLAMA_HOST=0.0.0.0:11434'
    # GPU support
    deploy:
      resources:
        reservations:
          devices:
            - driver: ${OLLAMA_GPU_DRIVER-nvidia}
              count: ${OLLAMA_GPU_COUNT-1}
              capabilities:
                - gpu
  open-webui:
    image: ghcr.io/open-webui/open-webui:${WEBUI_DOCKER_TAG-main}
    container_name: open-webui
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - ${OPEN_WEBUI_PORT-3000}:8080
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434'
      - 'WEBUI_AUTH=False'
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped

volumes:
  ollama: {}
  open-webui: {}

These lines enable NVIDIA GPU support (the host needs the NVIDIA Container Toolkit installed).

    # GPU support
    deploy:
      resources:
        reservations:
          devices:
            - driver: ${OLLAMA_GPU_DRIVER-nvidia}
              count: ${OLLAMA_GPU_COUNT-1}
              capabilities:
                - gpu
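
Once the stack is running, you can confirm the GPU is visible inside the Ollama container:

docker compose exec ollama nvidia-smi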

This line skips the user registration/login screen.

      - 'WEBUI_AUTH=False'

To pull the images and start the containers

docker compose up -d
[+] Running 1/1
 ✔ ollama Pulled                                                                                                   2.7s
[+] Running 3/3
 ✔ Network test_default  Created                                                                                   0.1s
 ✔ Container ollama      Started                                                                                   0.5s
 ✔ Container open-webui  Started                                                                                   0.8s
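
You can check the state of both services at any time:

docker compose ps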

To stop and remove the containers

docker compose down
[+] Running 3/3
 ✔ Container open-webui  Removed                                                                                   1.7s
 ✔ Container ollama      Removed                                                                                   0.6s
 ✔ Network test_default  Removed    
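
The named volumes (ollama and open-webui) survive docker compose down, so your downloaded models and chat history are kept. To delete them as well (this removes all models and data), add the -v flag:

docker compose down -v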

If you are running on WSL2, Ollama and Open WebUI will start again after your PC boots and a WSL2 Ubuntu terminal is opened, because of the restart policies set above.
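
If you do not want them to autostart, you can change the restart policy without editing the compose file:

docker update --restart=no ollama open-webui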