Ollama + Open WebUI: Deploy Your Own Local Large Language Model Knowledge Base and Create a Custom Ollama Model

Written by Iris Vance
Updated on: July 17, 2025
Recommendation

Create a personal AI knowledge base and customize the model experience.

Core content:
1. Ollama installation and custom startup settings
2. Open WebUI installation and environment configuration
3. Create an administrator account and upload private files to build a knowledge base


Install and set up Ollama

https://ollama.com/

After downloading and installing, Ollama starts with the system by default and stores its models on the system disk. Neither default is ideal: there is no need for it to start automatically (you can launch it whenever you want to use it), and you will likely want to move the model installation path off the system disk.

To stop it from starting automatically, delete the Ollama shortcut in the C:\Users\xxx\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Startup directory.
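If you prefer the command line, something like the following should also work, assuming the shortcut is named Ollama.lnk (check the Startup folder to confirm the exact file name):

del "%APPDATA%\Microsoft\Windows\Start Menu\Programs\Startup\Ollama.lnk"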

Next, open the system environment variables and create a new variable named OLLAMA_MODELS, with its value set to your custom model installation path.
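For example, a persistent user-level variable can be set from a cmd window with setx (the path below is only an example; point it at whichever drive and folder you prefer):

setx OLLAMA_MODELS "D:\OllamaModels"

Only newly started processes pick up the variable; anything already running will not see it.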

Then restart Ollama so the new path takes effect.

Install Open WebUI

  • Install miniconda:

https://docs.anaconda.com/miniconda/install/

  • Install Open WebUI:

https://github.com/open-webui/open-webui

Create a virtual environment for Open WebUI in a folder named openwebui:

python -m venv openwebui

Activate the environment:

openwebui\Scripts\activate
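Since Miniconda was installed above, a conda environment works just as well as venv. A sketch (Open WebUI's pip package targets Python 3.11, so that version is pinned here):

conda create -n openwebui python=3.11
conda activate openwebui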

Install Open WebUI using the Tsinghua mirror (faster from mainland China):

pip install -i https://mirrors.tuna.tsinghua.edu.cn/pypi/web/simple open-webui
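If you are outside mainland China, the mirror is optional and the default PyPI index works the same way:

pip install open-webui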

Start Open WebUI:

open-webui serve

Once startup completes, open http://localhost:8080 in your browser.
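If port 8080 is already taken, the serve command accepts host/port options (an assumption based on the open-webui CLI; run open-webui serve --help to confirm on your version):

open-webui serve --port 3000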

Create an administrator account (it is stored locally and is never uploaded to the Internet):

If you do not need the OpenAI API, turn it off; otherwise an error is reported at startup (it is harmless, but noisy):

You can download various open-source models from https://ollama.com/search and play with them locally, then upload your own private files to build your own knowledge base.
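For example, to pull a model from the library into your local installation (llama3.1 is just one choice from the search page):

ollama pull llama3.1

Once pulled, the model shows up in Open WebUI's model selector.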

What if I want to play with models that are not in the Ollama model library (uncensored, NSFW)?

Create your own Ollama model

Official tutorial:

https://github.com/ollama/ollama/blob/main/docs/import.md

Here we take a GGUF model as an example:

  • Download a model you like from https://hf-mirror.com/models and put it in the Ollama model installation directory.

  • In the same directory, create a new file named Modelfile.txt containing a single line: FROM ./<model name>.gguf (see the sketch after this list).

  • In the same directory, open cmd and run ollama create my_model -f Modelfile.txt.

  • The new my_model model can then be used in Open WebUI.
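As a concrete sketch, assuming the downloaded file is named my-model-Q4_K_M.gguf (the filename is only an example; use the name of your own download), Modelfile.txt contains just:

FROM ./my-model-Q4_K_M.gguf

The model is then built and test-run with:

ollama create my_model -f Modelfile.txt
ollama run my_model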

Quick Start

Create a new run.bat file with the following content, put it in the openwebui environment directory created earlier, and double-click it to start Open WebUI:

@echo off
chcp 65001 >nul

REM (The original prints a multi-line ASCII-art banner here; it is purely decorative and optional.)

REM 1. Activate the virtual environment (run.bat sits in the openwebui folder)
call "%~dp0Scripts\activate.bat"

REM 2. Start Open WebUI
open-webui serve

REM 3. Optional: keep the window open so any error messages stay visible
pause





