Docker + Ollama Setup

Use Resume Matcher with Ollama and Docker

Default Docker + Ollama Flow

The checked-in Docker setup still serves Resume Matcher on a single public origin at http://localhost:3000. To use Ollama with Docker, run Ollama on the host machine and point Resume Matcher at it.

Host Ollama Setup

  1. Install Ollama from ollama.com.
  2. Pull a model on the host, for example:
ollama pull qwen3:8b
  3. Start Resume Matcher from the app repo root with Ollama wired in:
LLM_API_BASE=http://host.docker.internal:11434 docker compose up -d
  4. Open http://localhost:3000/settings and set:
    • Provider: Ollama
    • Model: qwen3:8b, or another model you already pulled
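
If you would rather not export LLM_API_BASE on every invocation, a compose override file can pin it. The sketch below is an assumption, not part of the checked-in setup: the service name app and the file name docker-compose.override.yml must match your actual docker-compose.yml.

```yaml
# docker-compose.override.yml — illustrative sketch; the service name "app"
# is an assumption, match it to the checked-in docker-compose.yml.
services:
  app:
    environment:
      - LLM_API_BASE=http://host.docker.internal:11434
    extra_hosts:
      # Needed on Linux, where host.docker.internal is not defined by default.
      - "host.docker.internal:host-gateway"
```

With this file in place, a plain docker compose up -d picks up the variable automatically.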

URLs

Surface     URL
App         http://localhost:3000
Settings    http://localhost:3000/settings
Health      http://localhost:3000/api/v1/health
API docs    http://localhost:3000/docs

What This Page Covers

This page documents the supported host-Ollama pattern for Docker.

  • Resume Matcher stays on http://localhost:3000
  • Ollama runs on the host machine
  • LLM_API_BASE points the containerized app at the host Ollama server

If you decide to run Ollama in a separate container, treat that as a custom compose extension. It is not the checked-in default Docker flow.
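
For reference, such an extension might look like the following sketch. The service and volume names are illustrative assumptions, and the app service name must match your compose file:

```yaml
# Custom extension, not the default flow: run Ollama as its own service.
# Service and volume names here are illustrative.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama
  app:
    environment:
      # Containers reach each other by service name on the compose network,
      # so host.docker.internal is no longer involved.
      - LLM_API_BASE=http://ollama:11434

volumes:
  ollama-data:
```

You would also need to pull models inside that container (for example via docker compose exec), which is part of why this path is left as a custom extension.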

Troubleshooting

Ollama is not responding

Check that Ollama is actually running on the host:

curl http://localhost:11434/api/tags

Resume Matcher cannot reach Ollama

Make sure you started the compose stack with LLM_API_BASE set:

LLM_API_BASE=http://host.docker.internal:11434 docker compose up -d
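
A mistyped value is easy to catch before bringing the stack up. This little check is our own sketch, not part of Resume Matcher; the expected value is the one used throughout this guide:

```shell
# Sanity-check LLM_API_BASE before running docker compose.
# The expected value comes from this guide; the check itself is a sketch.
LLM_API_BASE="http://host.docker.internal:11434"
case "$LLM_API_BASE" in
  http*://host.docker.internal:11434)
    echo "LLM_API_BASE looks right: $LLM_API_BASE" ;;
  *)
    echo "Unexpected LLM_API_BASE: $LLM_API_BASE" >&2 ;;
esac
```

Note that host.docker.internal resolves out of the box on Docker Desktop (macOS and Windows); on Linux it must be mapped explicitly, for example with an extra_hosts entry pointing at host-gateway.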

Responses are slow

Use a smaller local model or free up memory on the host.

Next Steps