PAUL'S BLOG

Learn. Build. Share. Repeat.

Running Ollama locally with Open WebUI

2025-10-08 · 8 min read · Tutorial · AI · Ollama · Open WebUI · Docker

Ollama is a popular tool for running large language models (LLMs) locally on your machine. It provides a simple CLI and a local HTTP API for interacting with various models; once a model has been downloaded, inference runs entirely offline with no internet connection required. Open WebUI is a web-based user interface that lets you interact with those models through a browser.
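To give a flavour of the workflow: once Ollama is installed, pulling and chatting with a model is a one-liner, and the same model is reachable over the local API. The model name below (llama3.2) is just an example from Ollama's library; pick whatever fits your hardware.

```bash
# Download a model from the Ollama library (needs internet once)
ollama pull llama3.2

# Start an interactive chat session in the terminal; inference runs offline
ollama run llama3.2

# Ollama also exposes a local HTTP API on port 11434
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```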

I am working on an Ubuntu 24.04.3 LTS Desktop machine with hardware decent enough to run models locally, so my preference is to run Ollama as a native system service rather than confining it to a container. This way, Ollama can take full advantage of my machine’s capabilities, especially the GPU.
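A minimal sketch of that setup, using Ollama's official install script (which registers a systemd service) and the documented Open WebUI Docker image. The host port 3000 and the volume name open-webui are defaults you can change.

```bash
# Install Ollama natively; the script sets up an "ollama" systemd service
curl -fsSL https://ollama.com/install.sh | sh

# Verify the service is up before wiring in the UI
systemctl status ollama

# Run Open WebUI in Docker, pointing it at the Ollama service on the host
docker run -d \
  --name open-webui \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

Open WebUI is then reachable at http://localhost:3000, while Ollama itself stays on the host with direct access to the GPU.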
