
Ollama with open-webui with a fix


Majed Ali
4 min read · Feb 10, 2024


DALL·E 3 generated image

NOTE: Edited on 11 May 2024 to reflect the naming change from ollama-webui to open-webui

Before delving into the solution, let's first understand the problem, since it will come up again whenever we go deeper into using Docker.

Lately, I have started playing with Ollama and some tasty LLMs such as Llama 2, Mistral, and TinyLlama. Nothing is easier than installing Ollama; it's only one line:

curl -fsSL https://ollama.com/install.sh | sh
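
Once the installer finishes, the Ollama server should already be running as a background service. A quick sanity check, assuming the default port 11434:

curl http://localhost:11434

If everything is fine, it replies with a plain "Ollama is running".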

Downloading the language models is even easier: choose a model from their library and run the following command:

ollama run llama2
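
As a side note, if you only want to download a model without opening an interactive chat, ollama pull does the job, and ollama list shows what is installed locally:

ollama pull llama2
ollama list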


But I got bored with the command-line interface; I wanted to work with a richer UI.

A while on GitHub's Ollama page landed me on open-webui, which gives a ChatGPT-like interface.

A good thing about it is the Docker image I can use for installation, an approach I prefer over cloning the repository and installing all the libraries, contaminating my OS with cache files everywhere.

After trying multiple times to run the open-webui Docker container using the command available on its GitHub page, it failed to connect to the Ollama API server on my Linux host. The problem arose from the fact that Open WebUI tries to connect to Ollama on http://localhost:11434 from inside the Docker container, assuming Ollama exists inside the container itself. But that is not true, because Ollama is installed on the host OS itself.
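
From the host itself, the connection works fine, which hints that the problem is the container's network, not Ollama. For example, the endpoint that lists locally installed models responds with JSON when called from the host:

curl http://localhost:11434/api/tags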

So I needed a way to tell Docker to redirect the connection outside the container, and after some research, I created a very simple setup to test the connection between my OS and the container.


Created a simple web server using Python:

python -m http.server 8000

It is arguably the fastest web server anyone can spin up with a single line: it serves a page listing the files and folders in the current directory and offers them for download.
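
If you want the test to be unambiguous, you can start the server from a folder containing a recognizable marker file. This is just a sketch; any folder works:

mkdir /tmp/webtest && cd /tmp/webtest
echo "hello from the host" > hello.txt
python -m http.server 8000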

Then, I created a temporary Ubuntu container, which gives me a command line running inside the container.

docker run -it --rm ubuntu /bin/bash

Inside the container, I executed the following command to install curl to test the connection:


apt update && apt install curl

Tested the connection using the command:

curl http://localhost:8000

But it gave the following error:

curl: (7) Failed to connect to localhost port 8000 after 0 ms: Connection refused
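
That is expected: inside Docker's default bridge network, localhost points at the container's own loopback interface, not the host's. As an aside, besides the --network="host" fix used below, Docker 20.10+ offers another way out: mapping a special hostname to the host's gateway.

docker run -it --rm --add-host=host.docker.internal:host-gateway ubuntu /bin/bash
# then, inside the container:
curl http://host.docker.internal:8000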

I made a small change to the command by adding the option --network="host", so it looks like the following:


docker run -it --rm --network="host" ubuntu /bin/bash

Then I ran the curl command again:

curl http://localhost:8000

It successfully connected to localhost, showing the server's HTML content:

<!DOCTYPE HTML>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Directory listing for /</title>
</head>
<body>
<h1>Directory listing for /</h1>
<hr>
<ul>


...

So from the above findings and trials, I've distilled the following command:

docker run -d --network="host" -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://localhost:11434/api --name open-webui ghcr.io/open-webui/open-webui:main

It prints a long string of characters and numbers representing the container ID, meaning our container is now running in the background.

You can access the open-webui on the URL:

http://localhost:8080/

Regarding the command above, here is a breakdown of the command options:


• -d runs the container in the background.

• --network="host" lets the container access the host's localhost network.

• -v open-webui:/app/backend/data mounts a volume folder inside the container, so we don't lose our data whenever the container shuts down.

• -e OLLAMA_API_BASE_URL=http://localhost:11434/api defines an environment variable inside the container, which the open-webui app uses to connect to the Ollama server.

• --name open-webui is the name we want to give to the container.

• ghcr.io/open-webui/open-webui:main is the image on the registry we want to pull and create the container from.
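
To verify the container actually came up, and to watch its startup logs, the usual Docker commands work:

docker ps --filter "name=open-webui"
docker logs -f open-webui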

One last thing: if you intend to use open-webui on a daily basis and want the container to run automatically without starting it manually every time, you may want to add --restart always to the command, so Docker restarts it whenever it stops for any reason. The command then becomes:


docker run -d --network="host" -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://localhost:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main
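
And if the container is already running, there is no need to recreate it; the restart policy can be changed in place:

docker update --restart always open-webui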

You can now explore Ollama's LLMs through a rich web UI. Ollama is a powerful platform, but you want some convenience with that power, and that's what I've tried to accomplish in this minimal tutorial.

Tags: Ollama · Docker · LLM · Llama 2 · AI
