
open-webui: python3.12-torch is marked as broken, refusing to evaluate. #379354

MROvaiz commented Feb 4, 2025

Nixpkgs version

  • Unstable (25.05)

Describe the bug

Hi,
I have been trying to set up open-webui for ollama.
I updated nixpkgs with `nix flake update`.
After this, the rebuild throws an error.

Steps to reproduce

Add `services.open-webui.enable = true;`; this should start the open-webui service.
With the latest version, it instead throws the error: "is marked as broken, refusing to evaluate."
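
For reference, a minimal configuration sketch that reproduces the failure. This is a hypothetical fragment, not the reporter's actual config; the `rocmSupport` line is inferred from later comments in the thread, which identify it as the trigger:

```nix
# Hypothetical minimal configuration.nix fragment.
# Assumption (from later comments): rocmSupport = true is what marks torch broken.
{
  services.open-webui.enable = true;
  nixpkgs.config.rocmSupport = true; # causes python3.12-torch to be marked broken
}
```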

Expected behaviour

The rebuild should succeed and open-webui should be accessible in the browser.

Screenshots

error:
       … while calling the 'head' builtin
         at /nix/store/hjb1rqv2mfs5ny47amj2gsc8xk05x5g6-source/lib/attrsets.nix:1574:11:
         1573|         || pred here (elemAt values 1) (head values) then
         1574|           head values
             |           ^
         1575|         else

       … while evaluating the attribute 'value'
         at /nix/store/hjb1rqv2mfs5ny47amj2gsc8xk05x5g6-source/lib/modules.nix:846:9:
          845|     in warnDeprecation opt //
          846|       { value = addErrorContext "while evaluating the option `${showOption loc}':" value;
             |         ^
          847|         inherit (res.defsFinal') highestPrio;

       … while evaluating the option `system.build.toplevel':

       … while evaluating definitions from `/nix/store/hjb1rqv2mfs5ny47amj2gsc8xk05x5g6-source/nixos/modules/system/activation/top-level.nix':

       … while evaluating the option `warnings':

       … while evaluating definitions from `/nix/store/hjb1rqv2mfs5ny47amj2gsc8xk05x5g6-source/nixos/modules/system/boot/systemd.nix':

       … while evaluating the option `systemd.services.open-webui.serviceConfig':

       … while evaluating definitions from `/nix/store/hjb1rqv2mfs5ny47amj2gsc8xk05x5g6-source/nixos/modules/services/misc/open-webui.nix':

       (stack trace truncated; use '--show-trace' to show the full, detailed trace)

       error: Package ‘python3.12-torch-2.5.1’ in /nix/store/hjb1rqv2mfs5ny47amj2gsc8xk05x5g6-source/pkgs/development/python-modules/torch/default.nix:664 is marked as broken, refusing to evaluate.

       a) To temporarily allow broken packages, you can use an environment variable
          for a single invocation of the nix tools.

            $ export NIXPKGS_ALLOW_BROKEN=1

          Note: When using `nix shell`, `nix build`, `nix develop`, etc with a flake,
                then pass `--impure` in order to allow use of environment variables.

       b) For `nixos-rebuild` you can set
         { nixpkgs.config.allowBroken = true; }
       in configuration.nix to override this.

       c) For `nix-env`, `nix-build`, `nix-shell` or any other Nix command you can add
         { allowBroken = true; }
       to ~/.config/nixpkgs/config.nix.

Relevant log output

Additional context

No response

System metadata

  • system: "x86_64-linux"
  • host os: Linux 6.12.11, NixOS, 25.05 (Warbler), 25.05.20250201.3a22805
  • multi-user?: yes
  • sandbox: yes
  • version: nix-env (Nix) 2.24.12
  • nixpkgs: /nix/store/hjb1rqv2mfs5ny47amj2gsc8xk05x5g6-source

Notify maintainers


Note for maintainers: Please tag this issue in your pull request description. (i.e. Resolves #ISSUE.)

I assert that this issue is relevant for Nixpkgs

Is this issue important to you?

Add a 👍 reaction to issues you find important.

MROvaiz added the `0.kind: bug` (Something is broken) label on Feb 4, 2025
khaneliman (Contributor) commented:

  brokenConditions = attrsets.filterAttrs (_: cond: cond) {
    "CUDA and ROCm are mutually exclusive" = cudaSupport && rocmSupport;
    "CUDA is not targeting Linux" = cudaSupport && !stdenv.hostPlatform.isLinux;
    "Unsupported CUDA version" =
      cudaSupport
      && !(builtins.elem cudaPackages.cudaMajorVersion [
        "11"
        "12"
      ]);
    "MPI cudatoolkit does not match cudaPackages.cudatoolkit" =
      MPISupport && cudaSupport && (mpi.cudatoolkit != cudaPackages.cudatoolkit);
    # This used to be a deep package set comparison between cudaPackages and
    # effectiveMagma.cudaPackages, making torch too strict in cudaPackages.
    # In particular, this triggered warnings from cuda's `aliases.nix`
    "Magma cudaPackages does not match cudaPackages" =
      cudaSupport && (effectiveMagma.cudaPackages.cudaVersion != cudaPackages.cudaVersion);
    "Rocm support is currently broken because `rocmPackages.hipblaslt` is unpackaged. (2024-06-09)" =
      rocmSupport;
  };
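
The last entry in `brokenConditions` above is the one firing here: with `nixpkgs.config.rocmSupport = true`, that condition holds, so torch is marked broken. A simplified sketch of the mechanism (hypothetical names, not the actual torch expression — the real code filters the attribute set and feeds it into `meta.broken`):

```nix
# Simplified sketch (hypothetical): any true condition marks the package broken.
let
  rocmSupport = true;
  brokenConditions = {
    "Rocm support is currently broken" = rocmSupport;
  };
in {
  # If any condition is true, meta.broken is true and Nix refuses to
  # evaluate the package unless allowBroken / NIXPKGS_ALLOW_BROKEN is set.
  meta.broken = builtins.any (cond: cond) (builtins.attrValues brokenConditions);
}
```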

raj-magesh commented:

I'm trying to run this with nixpkgs.config.rocmSupport = true; and I'm getting the same error: Package ‘python3.12-torch-2.5.1’ in /nix/store/k8nkf470zpidpa5nh76lh2x6rxfzpwa4-source/pkgs/development/python-modules/torch/default.nix:667 is marked as broken, refusing to evaluate.

@khaneliman Do you expect that this draft PR which packages rocmPackages.hipblaslt will unbreak it?

MROvaiz (Author) commented Feb 6, 2025

Yes,
if I comment out `nixpkgs.config.rocmSupport = true;`, then it is able to rebuild,
and it's working completely fine.

@raj-magesh, please let me know the exact purpose of `nixpkgs.config.rocmSupport = true;`.
I have an i5-13600K, an RX 7800 XT, and 32 GB of RAM.

Is that config required?

If it's related to that PR, it's been going on for a long time.

raj-magesh commented:

I'm still a noob, so take all this with a grain of salt.

nixpkgs.config.rocmSupport = true seems to control whether many packages in nixpkgs are built with ROCm support (i.e. AMD's general GPU computation libraries, equivalent to CUDA on Nvidia).

From my lurking on various issue trackers here, ROCm packages seem to be a massive headache to package in general, and probably even more so with Nix/NixOS.

I haven't used open-webui before, but I want to try it out. I'm planning to use it mostly with ollama (whose ROCm support works after a recent commit), so I don't know whether I really need open-webui itself built with ROCm support. I suppose I could just disable global ROCm support (i.e. `nixpkgs.config.rocmSupport = false`) and enable it selectively for e.g. ollama, but I'd rather the broken packages (open-webui, tabby, etc.) get properly fixed.
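
The selective approach mentioned above could look roughly like this. This is a sketch, not a tested configuration; it assumes the NixOS `services.ollama.acceleration` option accepts `"rocm"` on the nixpkgs revision in use — verify against your revision before relying on it:

```nix
# Sketch: keep ROCm off globally, enable acceleration only for ollama.
{
  nixpkgs.config.rocmSupport = false; # avoids the broken torch pulled in by open-webui
  services.open-webui.enable = true;
  services.ollama = {
    enable = true;
    acceleration = "rocm";            # builds only ollama with ROCm support
  };
}
```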
