
Bug: Uncaught SIGABRT (SI_TKILL) running gpt-oss-20b #802

@gabrieltotene

Description


Contact Details

gabriel.totene@lnbio.cnpem.br

What happened?

Hello, I tried to run the following GGUF files for gpt-oss-20b:
https://huggingface.co/unsloth/gpt-oss-20b-GGUF/resolve/main/gpt-oss-20b-Q8_0.gguf
https://huggingface.co/unsloth/gpt-oss-20b-GGUF/resolve/main/gpt-oss-20b-F16.gguf
https://huggingface.co/unsloth/gpt-oss-20b-GGUF/resolve/main/gpt-oss-20b-UD-Q8_K_XL.gguf

I tried a lot of other GGUF versions, but I always get the same error:
"Uncaught SIGABRT (SI_TKILL) at 0x6c01b75500008f43 on b-cca05-l.abtlus.org.br pid 36675 tid 36675
/usr/local/bin/llamafile
No error information
Linux Cosmopolitan 3.9.7 MODE=x86_64; #202508231538~1757385336~22.04~8f363f2 SMP PREEMPT_DYNAMIC Tue S b-cca05-l.abtlus.org.br 6.16.3-76061603-generic"

I tested the same models with llama.cpp and they worked without any problem. Here is the command line I used: "llamafile -ngl 999 -m gpt-oss-20b.gguf --server --v2". My goal was to run the server first, just to see if it works, and then build a .llamafile. I also tried other CUDA and nvcc versions but didn't get far.
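The `GGML_ASSERT(0 <= info->type && info->type < GGML_TYPE_COUNT)` failure in the log below is a bounds check on GGUF tensor type ids: if the file contains a quantization type id the bundled ggml build doesn't know about, the loader aborts. A minimal sketch of that check (the `GGML_TYPE_COUNT` value and the unknown type id used here are assumptions for illustration, not llamafile's actual constants; `Q8_0`'s id of 8 is from the ggml type enum):

```python
# Hedged sketch of the failing bounds check, not llamafile's actual code.
GGML_TYPE_COUNT = 31  # assumed count for an older ggml build; newer builds define more types

def tensor_type_is_known(type_id: int) -> bool:
    """Mirror of the check: 0 <= info->type < GGML_TYPE_COUNT."""
    return 0 <= type_id < GGML_TYPE_COUNT

# A Q8_0 tensor (type id 8) passes the check:
print(tensor_type_is_known(8))   # True
# A type id introduced after this build (hypothetical id 39) fails it,
# which is what turns into the GGML_ASSERT abort / SIGABRT:
print(tensor_type_is_known(39))  # False
```

If this is what's happening, a llamafile built against a newer llama.cpp (one that knows gpt-oss's quantization types) would load the same files, which would be consistent with upstream llama.cpp working here.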

Version

OS: Pop!_OS 22.04 LTS (Ubuntu 22.04 based)
llamafile v0.9.3

What operating system are you seeing the problem on?

No response

Relevant log output

import_cuda_impl: initializing gpu module...
get_rocm_bin_path: note: hipcc not found on $PATH
get_rocm_bin_path: note: $HIP_PATH/bin/hipcc does not exist
get_rocm_bin_path: note: /opt/rocm/bin/hipcc does not exist
extract_cuda_dso: note: prebuilt binary /zip/ggml-rocm.so not found
import_cuda_impl: won't compile AMD GPU support because $HIP_PATH/bin/clang++ is missing
extract_cuda_dso: note: prebuilt binary /zip/ggml-rocm.so not found
link_cuda_dso: note: dynamically linking /home/ABTLUS/gabriel.totene/.llamafile/v/0.9.3/ggml-cuda.so
ggml_cuda_link: CUDA kernel version 13.0
ggml_cuda_link: CUDA runtime version is 13.0
ggml_cuda_link: welcome to CUDA SDK with cuBLAS
link_cuda_dso: GPU support loaded
llama.cpp/ggml.c:19663: GGML_ASSERT(0 <= info->type && info->type < GGML_TYPE_COUNT) failed

error: Uncaught SIGABRT (SI_TKILL) at 0x6c01b75500008f43 on b-cca05-l.abtlus.org.br pid 36675 tid 36675
  /usr/local/bin/llamafile
  No error information
  Linux Cosmopolitan 3.9.7 MODE=x86_64; #202508231538~1757385336~22.04~8f363f2 SMP PREEMPT_DYNAMIC Tue S b-cca05-l.abtlus.org.br 6.16.3-76061603-generic

RAX 0000000000000000 RBX 0000000000000006 RDI 0000000000008f43
RCX 00000000009ae101 RDX 0000000000000000 RSI 0000000000000006
RBP 00007ffdd595f8e0 RSP 00007ffdd595f8e0 RIP 00000000009ae101
 R8 0000000000000000  R9 0000000000000000 R10 00000000009ae101
R11 0000000000000296 R12 0000000000a1356e R13 0000000000004ccf
R14 0000000000c5cfb4 R15 0000700f8b24aae0
TLS 0000000000c4bf40

XMM0  00000000000000000000000000000000 XMM8  0000700fb4444bc00000700fb3cbf868
XMM1  00000000000000000000000000000000 XMM9  0000700fb44459400000700fb3cbf808
XMM2  00000000000000010000000000000001 XMM10 0000700fb44452800000700fb3cbf7a8
XMM3  00000000000000010000000000000001 XMM11 0000700fb44474400000700fb3cbf748
XMM4  c3b2c290c3a4c491c3a3c491c320aec4 XMM12 0000700fb4446d800000700fb3cbf6e8
XMM5  91c3bbc290c3b5c290c3a4c491c3b0c2 XMM13 0000700fb44445000000700fb3cbf688
XMM6  90c3b4c290c3bec290c3bdc290c3bec2 XMM14 0000700fb44460000000700fb3cbf628
XMM7  90c3bac290c3b0c290c3b7c290c3a0c4 XMM15 0000700fb44466c00000700fb3cbf5c8

cosmoaddr2line /usr/local/bin/llamafile 9ae101 99700b 4078c5 53282f 55869d 5c71db 5b43e9 42cede 41d566 404284 4015f4

0x00000000009ae101: ?? ??:0
0x000000000099700b: ?? ??:0
0x00000000004078c5: ?? ??:0
0x000000000053282f: ?? ??:0
0x000000000055869d: ?? ??:0
0x00000000005c71db: ?? ??:0
0x00000000005b43e9: ?? ??:0
0x000000000042cede: ?? ??:0
0x000000000041d566: ?? ??:0
0x0000000000404284: ?? ??:0
0x00000000004015f4: ?? ??:0

000000400000-000000b851e0 r-x-- 7700kb
000000b86000-000003306000 rw--- 40mb
0006fe000000-0006fe001000 rw-pa 4096b
700f8b200000-700f8e000000 rw-pa 46mb
700f98200000-700f98800000 rw-pa 6144kb
700fca636000-700fca64a000 rw-pa 80kb
700fca67a000-700fca68e000 rw-pa 80kb
700fca6db000-700fca826000 rw-pa 1324kb
700fca826000-700fca93479e r--s- 1082kb
700fca950000-700fca950650 rw-pa 1616b
700fca951000-700fca9f2000 rw-pa 644kb
7ffdd5165000-7ffdd5965000 rw--- 8192kb
# 115'519'488 bytes in 12 mappings


/usr/local/bin/llamafile -ngl 999 -m gpt-oss-20b.gguf --server --v2
