Open Web UI & Ollama Integration

There are multiple modes of deployment, but since we have installed the Ollama LLMs locally on the host, we will use the installation method that runs Open WebUI in Docker against the host-based LLMs:
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

After the Docker instance is successfully launched, we will have to edit the Ollama service configuration to make Ollama available in "everywhere" mode (listening on all interfaces).
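One quick way to confirm the container actually came up is to check its status; the container name open-webui below simply matches the --name flag used in the command above:

docker ps --filter name=open-webui
docker logs open-webui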

We will add two lines to the Ollama service configuration file so that Ollama listens on the "0.0.0.0:11434" interface and can be reached from outside the host, including from the Docker container.
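As a sketch, assuming Ollama was installed as a systemd service (the default for the Linux install script, with a unit named ollama.service), the two lines are placed in a [Service] override, for example via:

sudo systemctl edit ollama.service

and then, in the editor that opens:

[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"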
Save the file and exit, and now we will reload the service so the changes take effect.
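With a systemd-managed install, reloading typically means re-reading the unit files and restarting the Ollama service:

sudo systemctl daemon-reload
sudo systemctl restart ollama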
To verify the changes have been applied successfully, we will open the Ollama endpoint on port 11434.
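For example, from a browser or with curl (the <server-ip> placeholder below is illustrative; substitute the machine's actual address):

curl http://<server-ip>:11434
# Ollama should respond with "Ollama is running" when the endpoint is reachable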
Setting UFW Firewall
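If UFW is active on the host, the relevant ports need to be opened; as a sketch, assuming the ports used in this guide (3000 for the Open WebUI container and 11434 for Ollama):

sudo ufw allow 3000/tcp
sudo ufw allow 11434/tcp
sudo ufw status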
Accessing the web UI
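Since the docker run command above maps the UI to port 3000 on the host, the interface is reachable in a browser at:

http://localhost:3000 (or http://<server-ip>:3000 from another machine on the network)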


And here we go, our local LLM with the Open WebUI interface is running.
For other modes of installation, refer to the Ollama and Open WebUI documentation.