How to install Ollama AI Tool with Oterm and Open WebUI on NixOS
Introduction
After my adventure with Serge AI Chat, it was time to explore other AI tools, this time in combination with NixOS. I also wanted to utilize the GPU, specifically CUDA, whereas Serge only used the CPU.
After some research, I came across Ollama, easy-to-set-up software that can serve as a backend for many integrations, and it has the much-desired GPU support.
For the frontend, I wanted to try out the following software:
- oterm, a text-based terminal client. After installing Ollama, all you have to do is type `oterm` in the terminal.
- Open WebUI, which mimics ChatGPT's frontend and has many features, including image generation with tools like AUTOMATIC1111 or ComfyUI. According to the developers, it works completely offline and prioritizes privacy and data security, something I can always appreciate!
Both Ollama and Open WebUI are available as services within NixOS. A package is available for oterm.
How To
NixOS Configuration
- Add the `ollama` and `open-webui` services and the `oterm` package to `/etc/nixos/configuration.nix`:
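  A minimal sketch of what that could look like, assuming the modules from a recent stable channel (option names may differ per channel):

  ```nix
  { config, pkgs, ... }:

  {
    # Ollama backend; "cuda" enables the NVIDIA GPU acceleration mentioned earlier
    services.ollama = {
      enable = true;
      acceleration = "cuda";
    };

    # Open WebUI frontend (listens on http://127.0.0.1:8080 by default)
    services.open-webui.enable = true;

    # Text-based terminal client for Ollama
    environment.systemPackages = with pkgs; [ oterm ];
  }
  ```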
  In this case I am using the default packages from the stable channel, but you can also use newer versions from the unstable channel (if available).
- Switch NixOS configuration

  Now you can switch to the new configuration within NixOS; the packages will be downloaded and the services will be started. Run the following command:
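  ```bash
  sudo nixos-rebuild switch
  ```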
  You can check the logs with the command `journalctl -u ollama.service` to see if Ollama is running properly. The address http://127.0.0.1:11434/ should also show the message `Ollama is running` in the browser. With the command `oterm`, you can then open the client. If the open-webui service doesn't work properly, the instructions below explain how to install the container.
Open WebUI container (optional)
Since the open-webui service didn’t work for me, I installed the container, and that worked well.
- Add virtualisation to `configuration.nix`

  Add `virtualisation` and the import of a separate nix file for the container to `configuration.nix`:
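  A sketch, assuming podman as the container backend and the container definition in a file `open-webui.nix` next to your `configuration.nix` (created in a later step):

  ```nix
  imports = [
    ./open-webui.nix
  ];

  virtualisation = {
    podman.enable = true;
    oci-containers.backend = "podman";
  };
  ```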
- Add the macvlan network to `configuration.nix`

  The container will use a macvlan network (`net_macvlan`) with a dedicated IP address. Add the following to `configuration.nix`:
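  One way to create the network declaratively is a oneshot systemd service; this is a sketch, and the service name and layout are my own:

  ```nix
  # Create the macvlan network for containers (if it does not exist yet)
  systemd.services.create-net-macvlan = {
    description = "Create the net_macvlan podman network";
    after = [ "network-online.target" ];
    wantedBy = [ "multi-user.target" ];
    serviceConfig.Type = "oneshot";
    script = ''
      ${pkgs.podman}/bin/podman network exists net_macvlan || \
        ${pkgs.podman}/bin/podman network create \
          --driver macvlan \
          --opt parent=ens18 \
          --gateway 192.168.1.1 \
          --subnet 192.168.1.0/24 \
          net_macvlan
    '';
  };
  ```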
  Instructions:

  - Required: replace `192.168.1.1` with your gateway IP address
  - Required: replace `192.168.1.0` with your subnet
  - Required: replace `ens18` with the name of your own network interface
- Add a script to create folders to `configuration.nix`

  Make sure the folders for use with the container are created by adding the following to `configuration.nix`:
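  A sketch using an activation script; the exact path is an assumption and should match the volume used in `open-webui.nix` below:

  ```nix
  # Create the data folder for the Open WebUI container during activation
  system.activationScripts.openWebuiDirs = ''
    mkdir -p /home/<username>/containers/open-webui
    chown -R <username>:users /home/<username>/containers/open-webui
  '';
  ```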
  Instructions:

  - Required: replace `<username>` with your NixOS username
- Create the containers folder

  Run the following command:
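  For example (the exact location is an assumption; use the path referenced above):

  ```bash
  mkdir -p ~/containers
  ```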
- Add the container configuration to `open-webui.nix`

  Add the following to `open-webui.nix`:
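  A sketch of the container definition using the oci-containers module; the image tag and volume path are assumptions, and the placeholders match the instructions below:

  ```nix
  { ... }:

  {
    virtualisation.oci-containers.containers.open-webui = {
      image = "ghcr.io/open-webui/open-webui:main";
      environment = {
        TZ = "Europe/Amsterdam";
        # Point the frontend at the Ollama backend
        OLLAMA_BASE_URL = "http://<ollama IP address>:11434";
      };
      volumes = [
        "/home/<username>/containers/open-webui:/app/backend/data"
      ];
      extraOptions = [
        "--pull=newer"
        "--network=net_macvlan"
        "--ip=<IP address>"
        "--mac-address=<MAC address>"
      ];
    };
  }
  ```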
  Instructions:

  - Required: replace `Europe/Amsterdam` with your own timezone
  - Required: replace `<ollama IP address>` with the IP address of Ollama
  - Required: replace `<username>` with your NixOS username
  - Optional: replace `--pull=newer` with `--pull=never` if you do not want the image to be automatically replaced by new versions
  - Optional: replace `net_macvlan` with the name of your macvlan network if needed
  - Required: replace `<IP address>` with the IP address of this container; make sure it is within the range of the macvlan network
  - Required: replace `<MAC address>` with a (randomly generated) MAC address. Otherwise a new MAC address will be used every time the container is started, which will, for example, show up as a new device within the UniFi Network Application. Alternatively, temporarily disable this option and add the MAC address that is generated the first time the container is started. Use inspect to get the MAC address if needed: `sudo podman inspect <container name> | grep MacAddress | tr -d ' ,"' | sort -u`
- Switch NixOS configuration

  Now you can switch to the new configuration within NixOS; the image will be downloaded and the container will be created. Run the following command:
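  ```bash
  sudo nixos-rebuild switch
  ```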
- Check the results

  Run the following command to check if the container is working properly:
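  For example, list the running containers (the oci-containers module also creates a `podman-open-webui.service` unit whose logs you can follow with `journalctl`):

  ```bash
  sudo podman ps
  ```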
  Now you can browse to Open WebUI by opening a web browser and going to http://localhost:8080. Replace localhost with the relevant IP address or FQDN if needed, and adjust the port if you changed it earlier. You will be prompted to create an account, which is only used locally. If Ollama has not already downloaded a model, you can also select and download one via `Settings` > `Model`.