Open Web UI & Ollama Integration
There are multiple deployment modes, but since we have the Ollama LLMs installed locally on the host, we will use the method that pairs the Docker-based Open WebUI with the host-based LLMs.
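A common way to launch this setup, following the pattern in the Open WebUI documentation, is the command below; the host port (3000) and volume name (open-webui) are conventional choices you can adjust:

```bash
# Run Open WebUI in Docker; host.docker.internal lets the container
# reach the Ollama server running on the host machine
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

Once the container is up, the UI is served at http://localhost:3000.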
After the Docker instance is successfully launched, we have to edit the Ollama service configuration to make Ollama available in the "everywhere" mode, i.e. listening on all interfaces rather than only on localhost.
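On a systemd-based Linux install (assuming the default ollama.service unit name), the service configuration can be opened for editing with an override:

```bash
# Open an override file for the Ollama systemd unit
sudo systemctl edit ollama.service
```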
We will add two lines to the service config file so that Ollama listens on the "0.0.0.0:11434" interface.
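The two lines set the OLLAMA_HOST environment variable under the [Service] section, which tells Ollama to bind to all interfaces on its default port 11434:

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
```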
Save the file and exit, and then reload the service so the changes take effect.
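With systemd, that means reloading the unit files and restarting Ollama:

```bash
# Pick up the new override and restart Ollama with it
sudo systemctl daemon-reload
sudo systemctl restart ollama
```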
To verify the changes were applied successfully, we can open the Ollama endpoint URL in a browser (or query it with curl).
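Ollama's root endpoint returns a plain-text health message; substitute the machine's IP for localhost if checking from another host:

```bash
# A healthy, reachable server replies "Ollama is running"
curl http://localhost:11434
```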
And here we go, our local LLM with the Open WebUI interface is running!
For other installation modes, refer to the Ollama and Open WebUI documentation.