NixOS - Ollama AI tool with Oterm and Open WebUI Setup
After my adventure with Serge AI Chat it was time to take a look at other AI tools, but now in combination with NixOS. This time I would also like to use the GPU (CUDA in my case), whereas Serge only used the CPU.
After some research, I came across Ollama, easy-to-set-up software that can serve as a backend for many integrations and that has the much-desired GPU support.
As a frontend, I wanted to try out the following software:
- oterm: a text-based terminal client. After installing Ollama, all you have to do is type oterm in the terminal.
- Open WebUI: mimics ChatGPT’s frontend and has many features, including image generation with e.g. AUTOMATIC1111 or ComfyUI. According to the developers it also works completely offline, and privacy and data security are given the highest priority. That’s something I can always appreciate!
Ollama is available as a service within NixOS; oterm I have only found in the unstable channel at the moment. There is an open request to add Open WebUI as a package to NixOS, but it already works fine via a Podman container, as I’ll describe below.
Configuration
Open configuration.nix:
sudo nano /etc/nixos/configuration.nix
First add the Ollama service. I chose to use the service from the unstable channel to have more options, such as environmentVariables (I think this will soon be available via the stable channel):
# Ollama
services.ollama = {
  #package = pkgs.unstable.ollama; # Uncomment if you want to use the unstable channel, see https://fictionbecomesfact.com/nixos-unstable-channel
  enable = true;
  acceleration = "cuda"; # Or "rocm"
  #environmentVariables = { # I haven't been able to get this to work myself yet, but I'm sharing it for the sake of completeness
  # HOME = "/home/ollama";
  # OLLAMA_MODELS = "/home/ollama/models";
  # OLLAMA_HOST = "0.0.0.0:11434"; # Make Ollama accessible outside of localhost
  # OLLAMA_ORIGINS = "http://localhost:8080,http://192.168.0.10:*"; # Allow access, otherwise Ollama returns 403 Forbidden due to CORS
  #};
};
And then add oterm to the system packages (optional):
environment.systemPackages = with pkgs; [
  unstable.oterm
];
Again I use the unstable channel, because oterm is not available in the stable channel at the moment.
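For completeness, a minimal sketch of one way to expose the unstable channel as pkgs.unstable, assuming the nixos-unstable channel has already been added with nix-channel and that your configuration.nix takes config as an argument (see the linked note for the full setup):
# Make the unstable channel available as pkgs.unstable
nixpkgs.config.packageOverrides = pkgs: {
  unstable = import <nixos-unstable> {
    config = config.nixpkgs.config; # reuse settings such as allowUnfree
  };
};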
Save the changes to configuration.nix.
Now you can switch to the new configuration:
sudo nix-collect-garbage # optional: clean up
sudo nixos-rebuild switch
You can check the logs with the command journalctl -u ollama.service to see if Ollama is running properly. The address http://127.0.0.1:11434/ should also show the message "Ollama is running" in the browser.
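From the command line you can check the same thing with, for example:
sudo systemctl status ollama.service   # should show active (running)
curl http://127.0.0.1:11434/           # should print: Ollama is running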
With the command oterm you can then open the client.
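You can also talk to Ollama directly via its own CLI, for example to pull a model before opening a client (llama3 is just an example model name):
ollama pull llama3   # download a model from the Ollama library
ollama list          # show the locally available models
ollama run llama3    # start an interactive chat in the terminal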
It is good to know that all files are stored in /var/lib/ollama by default. For example, downloaded models can be found in /var/lib/ollama/models. The folders can be accessed by root. Ollama also uses its own user, which I haven’t looked at yet.
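A quick way to have a look at that state directory (root is needed because of the permissions mentioned above):
sudo ls -la /var/lib/ollama          # the Ollama state directory
sudo ls -la /var/lib/ollama/models   # the downloaded models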
Open WebUI
To make it even better, you can install the Open WebUI via a Podman container.
Open configuration.nix:
sudo nano /etc/nixos/configuration.nix
Add the folder that is needed by the Podman container to configuration.nix:
# Create directories and run scripts for the containers
system.activationScripts = {
  script.text = ''
    install -d -m 755 /home/USER/open-webui/data -o root -g root
  '';
};
IMPORTANT: make sure the folders are created with the root user and group, otherwise the container will not work properly.
Adjust the following:
- USER: replace USER with your NixOS user.
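After the next nixos-rebuild switch you can check that the activation script did its job, for example (again with USER replaced):
ls -ld /home/USER/open-webui/data   # should show root as owner and group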
Add virtualisation with the following settings:
virtualisation = {
  podman = {
    enable = true;
    dockerCompat = true;
    #defaultNetwork.settings.dns_enabled = true;
  };
  oci-containers = {
    backend = "podman";
    containers = {
      open-webui = import ./containers/open-webui.nix;
    };
  };
};
Exit Nano (CTRL-X) and save the changes to configuration.nix.
Create and edit open-webui.nix:
sudo mkdir -p /etc/nixos/containers # make sure the directory exists
sudo nano /etc/nixos/containers/open-webui.nix
Copy the below to open-webui.nix:
{
  image = "ghcr.io/open-webui/open-webui:main";
  environment = {
    "TZ" = "Europe/Amsterdam";
    "OLLAMA_API_BASE_URL" = "http://127.0.0.1:11434/api";
    "OLLAMA_BASE_URL" = "http://127.0.0.1:11434";
  };
  volumes = [
    "/home/USER/open-webui/data:/app/backend/data"
  ];
  ports = [
    "127.0.0.1:3000:8080" # Ensures we listen only on localhost
  ];
  extraOptions = [
    "--pull=newer" # Pull if the image on the registry is newer
    "--name=open-webui"
    "--hostname=open-webui"
    "--network=host"
    "--add-host=host.containers.internal:host-gateway"
  ];
}
Adjust the following if needed:
- "TZ" = "Europe/Amsterdam": pick the right timezone.
- volumes: replace USER with your username.
- "--pull=newer": disable this option if you do not want the image to be automatically replaced by newer versions.
- "--network=host": it is also possible to use net_macvlan; for an example please see the note about MariaDB. A bridged alternative is also sketched below.
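For reference, a rough sketch of that bridged alternative (I stick with --network=host myself): drop "--network=host", let the port mapping do the publishing and reach Ollama through the host gateway instead of 127.0.0.1. Note that this only works if Ollama listens on more than localhost (the OLLAMA_HOST variable shown commented out earlier), and the UI would then be on port 3000.
# Bridged sketch: replaces the corresponding parts of open-webui.nix above
environment = {
  "OLLAMA_BASE_URL" = "http://host.containers.internal:11434";
};
ports = [
  "127.0.0.1:3000:8080" # publish the UI on localhost:3000
];
extraOptions = [
  "--pull=newer"
  "--name=open-webui"
  "--hostname=open-webui"
  # no "--network=host" here
  "--add-host=host.containers.internal:host-gateway"
];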
Exit Nano (CTRL-X) and save the changes to open-webui.nix.
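If you want, you can first try the image manually with Podman before switching to the new configuration; this one-off run uses the same settings as the declarative config above (replace USER again):
sudo podman run --rm --network=host \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  -v /home/USER/open-webui/data:/app/backend/data \
  ghcr.io/open-webui/open-webui:main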
Now you can switch to the new configuration:
sudo nix-collect-garbage # optional: clean up
sudo nixos-rebuild switch
You can check the logs with the command journalctl -u podman-open-webui.service to see if the container is running properly.
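Podman itself can of course also be used to have a look at the container:
sudo podman ps                # list the running containers
sudo podman logs open-webui   # show the container output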
The Open WebUI can be reached via the URL:
http://127.0.0.1:8080
You will be asked to create an account. This is an account that is only used locally!
If Ollama has not already downloaded a model, you can also go to Settings > Model and pull one there.