How to install Ollama AI Tool with Oterm and Open WebUI on NixOS

Introduction

After my adventure with Serge AI Chat, it was time to explore other AI tools, this time in combination with NixOS. I also wanted to utilize the GPU, specifically CUDA, whereas Serge only used the CPU.

After some research, I came across Ollama, an easy-to-set-up tool that can serve as a backend for many integrations, and it has the much-desired GPU support.

For the frontend, I wanted to try out the following software:

  • oterm, a text-based terminal client. After installing Ollama, all you have to do is type oterm in the terminal.
  • Open WebUI, which mimics ChatGPT’s frontend and has many features, including image generation with tools like AUTOMATIC1111 or ComfyUI. According to the developers, it works completely offline and prioritizes privacy and data security—something I can always appreciate!

Both Ollama and Open WebUI are available as services within NixOS. A package is available for oterm.

How To

NixOS Configuration

  1. Add the ollama and open-webui services and oterm package to /etc/nixos/configuration.nix:

    /etc/nixos/configuration.nix
    # To edit use your text editor application, for example Nano
    services.ollama = {
      # package = pkgs.unstable.ollama; # If you want to use the unstable channel package, for example
      enable = true;
      acceleration = "cuda"; # Or "rocm"
      # environmentVariables = { # I haven't been able to get this to work myself yet, but I'm sharing it for the sake of completeness
      #   HOME = "/home/ollama";
      #   OLLAMA_MODELS = "/home/ollama/models";
      #   OLLAMA_HOST = "0.0.0.0:11434"; # Make Ollama accessible outside of localhost
      #   OLLAMA_ORIGINS = "http://localhost:8080,http://192.168.0.10:*"; # Allow access, otherwise Ollama returns 403 Forbidden due to CORS
      # };
    };
    services.open-webui = {
      enable = true;
      environment = {
        ANONYMIZED_TELEMETRY = "False";
        DO_NOT_TRACK = "True";
        SCARF_NO_ANALYTICS = "True";
        OLLAMA_API_BASE_URL = "http://127.0.0.1:11434/api";
        OLLAMA_BASE_URL = "http://127.0.0.1:11434";
      };
    };
    # Add oterm to the systemPackages
    environment.systemPackages = with pkgs; [
      oterm
    ];

    In this case I am using the default packages from the stable channel, but you can also use newer versions from the unstable channel (if available).
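The commented-out `pkgs.unstable.ollama` line above assumes an `unstable` attribute exists in `pkgs`. One way to provide it is an overlay; this is a sketch, assuming you have added the unstable channel under the name `unstable`:

```nix
# /etc/nixos/configuration.nix — sketch, assumes the unstable channel was added first with:
#   sudo nix-channel --add https://channels.nixos.org/nixos-unstable unstable
#   sudo nix-channel --update
nixpkgs.overlays = [
  (final: prev: {
    # Expose the unstable channel's package set as pkgs.unstable
    unstable = import <unstable> { config = prev.config; };
  })
];
```

With this in place, `package = pkgs.unstable.ollama;` in the service definition picks the newer package while the rest of the system stays on stable.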

  2. Switch NixOS configuration

    Now you can switch to the new configuration within NixOS; the services will be built and started:

    Run the following command:

    # Open your terminal application
    sudo nix-collect-garbage # Optional: clean up
    sudo nixos-rebuild switch

    You can check the logs with the command journalctl -u ollama.service to see if Ollama is running properly. The address http://127.0.0.1:11434/ should also show the message Ollama is running in the browser. With the command oterm, you can then open the client.
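By default the open-webui service only listens on localhost. If you want to reach it from other machines, a sketch like the following may work, assuming your nixpkgs version exposes the `host`, `port`, and `openFirewall` options (check the NixOS option search for your channel):

```nix
services.open-webui = {
  enable = true;
  host = "0.0.0.0";    # Listen on all interfaces instead of 127.0.0.1 only
  port = 8080;         # Default port
  openFirewall = true; # Open the port in the NixOS firewall
  # Keep the environment block from step 1 here
};
```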

    If the open-webui service doesn’t work properly, the instructions below explain how to install the container.

Open WebUI container (optional)

Since the open-webui service didn’t work for me, I installed the container, and that worked well.

  1. Add virtualisation to configuration.nix

    Add virtualisation and an import of a separate Nix file for the container to configuration.nix:

    /etc/nixos/configuration.nix
    # To edit use your text editor application, for example Nano
    virtualisation = {
      podman = {
        enable = true;
        dockerCompat = true; # Create a `docker` alias for podman, to use it as a drop-in replacement
        defaultNetwork.settings.dns_enabled = true; # release 23.05
      };
      oci-containers = {
        backend = "podman";
        containers = {
          open-webui = import ./containers/open-webui.nix;
        };
      };
    };
  2. Add the macvlan network to configuration.nix

    The container will use a macvlan network (net_macvlan) with a dedicated IP address. Add the following to configuration.nix:

    /etc/nixos/configuration.nix
    # To edit use your text editor application, for example Nano
    systemd.services.create-podman-network = with config.virtualisation.oci-containers; {
      serviceConfig.Type = "oneshot";
      wantedBy = [ "${backend}-open-webui.service" ];
      script = ''
        ${pkgs.podman}/bin/podman network exists net_macvlan || \
          ${pkgs.podman}/bin/podman network create --driver=macvlan --gateway=192.168.1.1 --subnet=192.168.1.0/24 -o parent=ens18 net_macvlan
      '';
    };
    # IMPORTANT: Please read the instructions below
    Instructions:
    • Required Replace 192.168.1.1 with your gateway IP address
    • Required Replace 192.168.1.0 with your subnet
    • Required Replace ens18 with the name of your own network interface
  3. Add a script to create folders to configuration.nix

    Make sure the folders for use with the container are created by adding the following to configuration.nix:

    /etc/nixos/configuration.nix
    # To edit use your text editor application, for example Nano
    system.activationScripts = {
      script.text = ''
        install -d -m 755 /home/<username>/open-webui/data -o root -g root
      '';
    };
    # IMPORTANT: Please read the instructions below
    Instructions:
    • Required Replace <username> with your NixOS username
  4. Create the containers folder

    Run the following command:

    # Open your terminal application
    mkdir -p /etc/nixos/containers # Make sure the directory exists
  5. Add the container configuration to open-webui.nix

    Add the following to open-webui.nix:

    /etc/nixos/containers/open-webui.nix
    # To edit use your text editor application, for example Nano
    {
      image = "ghcr.io/open-webui/open-webui:main";
      environment = {
        "TZ" = "Europe/Amsterdam";
        "OLLAMA_API_BASE_URL" = "http://<ollama IP address>:11434/api";
        "OLLAMA_BASE_URL" = "http://<ollama IP address>:11434";
      };
      volumes = [
        "/home/<username>/open-webui/data:/app/backend/data"
      ];
      extraOptions = [
        "--pull=newer" # Pull if the image on the registry is newer
        "--name=open-webui"
        "--hostname=open-webui"
        "--network=net_macvlan"
        "--ip=<IP address>"
        "--mac-address=<MAC address>"
      ];
    }
    # IMPORTANT: Please read the instructions below
    Instructions:
    • Required Replace Europe/Amsterdam with your own timezone
    • Required Replace <ollama IP address> with the IP address of Ollama
    • Required Replace <username> with your NixOS username
    • Optional Replace --pull=newer with --pull=never if you do not want the image to be automatically replaced by new versions
    • Optional Replace net_macvlan with the name of your macvlan network if needed
    • Required Replace <IP address> with the IP address of this container. Make sure it is within the range of the macvlan network
    • Required Replace <MAC address> with a (randomly generated) MAC address. Otherwise a new MAC address will be used every time the container is started, which will, for example, show up as a new device within the UniFi Network Application. Alternatively, temporarily disable this option and add the MAC address that is generated the first time the container is started. Use inspect to get the MAC address if needed: sudo podman inspect <container name> |grep MacAddress|tr -d ' ,"'|sort -u
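If you want to generate a random MAC address for the container yourself, a short sketch (assuming bash and its `$RANDOM` variable): the leading octet 02 marks the address as locally administered and unicast, so it will not collide with vendor-assigned addresses.

```shell
# Generate a random locally administered, unicast MAC address (02:xx:xx:xx:xx:xx)
# for use as <MAC address> above
mac=$(printf '02:%02x:%02x:%02x:%02x:%02x' $((RANDOM%256)) $((RANDOM%256)) $((RANDOM%256)) $((RANDOM%256)) $((RANDOM%256)))
echo "$mac"
```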
  6. Switch NixOS configuration

    Now you can switch to the new configuration within NixOS, the image will be downloaded and the container will be created:

    Run the following command:

    # Open your terminal application
    sudo nix-collect-garbage # Optional: clean up
    sudo nixos-rebuild switch
  7. Check the results

    Run the following command to check if the container is working properly:

    # Open your terminal application
    journalctl -u podman-open-webui.service

    Now you can browse to Open WebUI by opening a web browser and going to: http://localhost:8080. Replace localhost with the relevant IP address or FQDN if needed, and adjust the port if you changed it earlier.

    You will be prompted to create an account, which is only used locally. If Ollama has not already downloaded a model, you can also select and download one via Settings > Model.

    Copyright 2021- Fiction Becomes Fact