# Open Web UI & Ollama Integration

{% hint style="info" %}
<https://github.com/open-webui/open-webui>
{% endhint %}

<figure><img src="https://2332860236-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2Fq6mjlFfyDOi3mV0lemKE%2Fuploads%2FcyDpsSNISrJq2bLDTw3f%2Fimage.png?alt=media&#x26;token=ab180122-ec90-40db-8bb8-785b5b93c4c5" alt=""><figcaption><p>Open Web UI Interface</p></figcaption></figure>

There are multiple deployment modes, but since we have already installed the Ollama LLMs locally, we will run Open WebUI in Docker and connect it to the host-based Ollama instance:

```
# -d: run detached; -p 3000:8080 publishes the UI on host port 3000
# --add-host maps host.docker.internal to the host so the container can reach Ollama
# -v open-webui:/app/backend/data persists UI data in a named Docker volume
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```
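To confirm the UI actually came up, we can probe host port 3000 with a bash built-in (a minimal sketch; it assumes the `docker run` above succeeded and will report `closed` while the container is still starting):

```shell
# Probe host port 3000 using bash's /dev/tcp pseudo-device (no curl needed)
if timeout 1 bash -c 'exec 3<>/dev/tcp/127.0.0.1/3000' 2>/dev/null; then
  status="open"
else
  status="closed"
fi
echo "port 3000 is $status"
```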

After the Docker container is successfully launched, we need to edit the Ollama service configuration so that Ollama listens on all network interfaces; by default it binds only to localhost, so the container could not reach it through `host.docker.internal`.

```
$> sudo systemctl edit ollama.service
```

<figure><img src="https://2332860236-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2Fq6mjlFfyDOi3mV0lemKE%2Fuploads%2FfTLS0CFrf2k9Li5xoOIe%2Fimage.png?alt=media&#x26;token=80cfba4f-75dc-4461-a2fb-ac89d23550a7" alt=""><figcaption></figcaption></figure>

We will add two lines to the override file so that Ollama binds to `0.0.0.0:11434` (all interfaces):

```
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```
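For reference, `systemctl edit` stores this snippet as a drop-in override; on most systemd distributions the file lands at the path below (this is the systemd default drop-in location, not anything Ollama-specific):

```
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```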

Save the file and exit, then reload systemd and restart the service for the changes to take effect:

```
$> sudo systemctl daemon-reload
$> sudo systemctl restart ollama.service
```

To verify the change has taken effect, query Ollama at the following URL:

```
curl http://localhost:11434

# Expected output:
# Ollama is running
```
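With Ollama listening on `0.0.0.0`, its HTTP API also answers on the machine's LAN address. A small sketch using the `/api/tags` endpoint, which lists the locally pulled models (substitute your server's LAN IP for `127.0.0.1` when testing from another host):

```shell
# /api/tags returns a JSON listing of locally pulled models
url="http://127.0.0.1:11434/api/tags"
result=$(curl -s --max-time 2 "$url" || echo "ollama not reachable at $url")
echo "$result"
```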

***

#### Setting Up the UFW Firewall

```
sudo ufw allow 3000     # Open WebUI
sudo ufw allow 11434    # Ollama API
sudo ufw allow 80       # HTTP
```

***

#### Accessing the Web UI

```
http://localhost:3000/auth
```

<figure><img src="https://2332860236-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2Fq6mjlFfyDOi3mV0lemKE%2Fuploads%2F8GgYT9HUjPkOgMK3P8Bh%2Fimage.png?alt=media&#x26;token=59e70442-ffe6-4a9f-9086-d0144b37bb13" alt=""><figcaption></figcaption></figure>

<figure><img src="https://2332860236-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2Fq6mjlFfyDOi3mV0lemKE%2Fuploads%2FsVKU1GmMm9pVkfNiGORA%2Fimage.png?alt=media&#x26;token=5435b405-2cce-4993-a7ed-98a2ef28e871" alt=""><figcaption></figcaption></figure>

And here we go: our local LLM, served through the Open WebUI interface, is up and running.

***

For other installation modes, refer to the Ollama and Open WebUI documentation:

{% hint style="info" %}
<https://docs.openwebui.com/features/>
{% endhint %}
