Ollama with open-webui with a fix | by Majed Ali
NOTE: Edited on 11 May 2024 to reflect the naming change from ollama-webui to open-webui.
Before delving into the solution, let's understand the problem first, since it will come up again whenever we go deeper into using Docker.
Lately, I have started playing with Ollama and some tasty LLMs such as Llama 2, Mistral, and TinyLlama, and nothing is easier than installing Ollama; it's only one line:
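That one line is the official install script from ollama.com (shown here for Linux, which is what this tutorial runs on):

```bash
# Downloads and runs Ollama's official Linux install script
curl -fsSL https://ollama.com/install.sh | sh
```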
Downloading the language models is even easier: choose a model from their library and run the following command:
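For example, with Llama 2 (any model name from the library works the same way):

```bash
# Download a model from the Ollama library
ollama pull llama2

# Or download it and start chatting in one step
ollama run llama2
```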
But I got bored with the command-line interface; I wanted to work with a richer UI.
A good thing about Open WebUI is the Docker image I can use for installation, an approach I prefer over cloning the repository and installing all its libraries, contaminating my OS with cache files everywhere.
After trying multiple times to run the open-webui Docker container using the command available on its GitHub page, it failed to connect to the Ollama API server on my Linux host. The problem arose from the fact that, inside the Open-WebUI container, localhost refers to the container itself, not to the host machine where Ollama is listening. To see this in action, I started a tiny web server on the host.
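The one-liner in question is presumably Python's built-in server (the description and the port 8000 in the curl calls below match it exactly):

```bash
# Python's built-in static file server; serves the current directory on port 8000
python3 -m http.server 8000
```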
It is considered the fastest web server anyone can make with only one line: it simply serves a web page listing the files and folders in the current directory and offers them for download.
From inside a container on Docker's default bridge network, the host's port 8000 is unreachable through localhost:

curl http://localhost:8000
curl: (7) Failed to connect to localhost port 8000 after 0 ms: Connection refused
But when the container shares the host's network stack, the same request succeeds:

curl http://localhost:8000
<!DOCTYPE HTML>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Directory listing for /</title>
</head>
<body>
<h1>Directory listing for /</h1>
<hr>
<ul>
...
So, from the above pieces of information and trials, I've distilled the following command:
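It is along the lines of the host-networking variant from the Open WebUI README; the --network=host flag is what lets localhost inside the container reach Ollama on the host, and it moves the UI from port 3000 to 8080:

```bash
# Run Open WebUI sharing the host's network, so it can reach Ollama at 127.0.0.1:11434
docker run -d --network=host \
  -v open-webui:/app/backend/data \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```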
The UI is then available at http://localhost:8080/
One last thing: if you intend to use Open WebUI on a daily basis and want the container to start automatically without launching it manually every time, you may want to add --restart always to the command, so that Docker brings it back up whenever it stops for any reason. The command then becomes:
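That is, the same command as above (still assuming the README's host-networking variant) with the restart policy added:

```bash
# Same as before, plus a restart policy so Docker relaunches it after stops or reboots
docker run -d --network=host \
  -v open-webui:/app/backend/data \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```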
You can now explore Ollama's LLMs through a rich web UI. Ollama is a powerful platform, but you want some convenience to go with its power, and that's what I've tried to accomplish in this minimal tutorial.