Open Web UI & Ollama Integration

Open Web UI Interface

There are multiple deployment modes, but since we have already installed the Ollama LLMs locally, we will use the method that runs Open WebUI in Docker and connects it to the host-based Ollama instance.

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
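
This maps the container's port 8080 to port 3000 on the host, persists application data in the open-webui volume, and makes the host reachable from inside the container as host.docker.internal. As a minimal sanity check that the container actually came up (the name open-webui matches the --name flag above):

docker ps --filter name=open-webui
docker logs --tail 20 open-webui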

After the Docker container is successfully launched, we will have to edit the Ollama service configuration so that Ollama listens on all interfaces ("everywhere" mode) and is reachable from inside the container.

$> sudo systemctl edit ollama.service

We will add the following two lines to the override file so that Ollama binds to "0.0.0.0:11434" and accepts connections on every interface:

[Service]
Environment="OLLAMA_HOST=0.0.0.0"

Save the file and exit, then reload systemd and restart the service so the change takes effect:

$> sudo systemctl daemon-reload
$> sudo systemctl restart ollama.service
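
To double-check that Ollama is now bound to all interfaces instead of only 127.0.0.1, we can inspect the listening sockets (ss ships with most modern distributions); the local address column should show *:11434 or [::]:11434 rather than 127.0.0.1:11434:

$> sudo ss -tlnp | grep 11434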

To verify the change has been applied successfully, we query the following URL:

curl http://localhost:11434

// THE OUTPUT SHOULD BE //
Ollama is running
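
Keep in mind that localhost answers even with the default loopback-only binding. To confirm the 0.0.0.0 binding is really in effect, query the machine's LAN address instead (on most Linux systems hostname -I prints the host's IP addresses, with the primary one first):

curl http://$(hostname -I | awk '{print $1}'):11434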

Setting Up the UFW Firewall

sudo ufw allow 3000
sudo ufw allow 11434
sudo ufw allow 80
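
Port 3000 serves the Open WebUI front end, 11434 is the Ollama API, and 80 is only needed if a reverse proxy will later sit in front of the UI. To verify the rules are active:

sudo ufw status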

Accessing the Web UI

http://localhost:3000/auth
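
The first account registered on this page becomes the admin account. If the model dropdown in the UI shows up empty, a quick sanity check is to list the models Ollama has pulled locally via its tags endpoint:

curl http://localhost:11434/api/tags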

& Here we go , Our local LLM with Open WebUI interface is running .....


For other installation modes, refer to the Ollama and Open WebUI documentation.
