Eval bug: ApertusForCausalLM is not supported - gguf conversion for model swiss-ai/Apertus-8B-Instruct-2509 & swiss-ai/Apertus-70B-Instruct-2509 #15751

@ClaudeStabile

Description

Name and Version

version: 6357 (0a2a384)
built with cc (Ubuntu 14.2.0-19ubuntu2) 14.2.0 for x86_64-linux-gnu

When I try to convert files to .gguf format using convert_hf_to_gguf.py, I get this error:
INFO:hf-to-gguf:Model architecture: ApertusForCausalLM
ERROR:hf-to-gguf:Model ApertusForCausalLM is not supported
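
For context, the architecture name in that message is read from the model's config.json. The sketch below (Python, standard library only; the local model path is the one from my log further down) just shows where that value comes from:

import json
from pathlib import Path

# Path where the Hugging Face model files were downloaded (from the log below)
model_dir = Path("/data/LLAMA/models/Swiss-AI/apertus-70B")

# The converter appears to dispatch on the first entry of "architectures"
# in config.json; for these models that is "ApertusForCausalLM", which
# this llama.cpp version has no converter class registered for.
config = json.loads((model_dir / "config.json").read_text())
print(config["architectures"][0])  # prints: ApertusForCausalLM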

I expected to be able to convert the latest Apertus LLM files to .gguf format; I have done this before with several LLMs published on Hugging Face.

I attempted a conversion with this brand-new LLM:
https://huggingface.co/swiss-ai/Apertus-8B-Instruct-2509

I have updated transformers to 4.56.0 as required by ApertusForCausalLM.
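
To rule out a problem on the transformers side, here is a small check I can run (a hedged sketch; the repo id is the public Hugging Face model, and whether 4.56.0 already ships native Apertus support, rather than needing trust_remote_code, is an assumption to verify):

import transformers
from transformers import AutoConfig

print(transformers.__version__)  # expecting 4.56.0 or newer

# If transformers recognizes the architecture, this loads cleanly and the
# conversion failure is purely on the llama.cpp converter side.
cfg = AutoConfig.from_pretrained("swiss-ai/Apertus-8B-Instruct-2509")
print(cfg.architectures)  # expecting ['ApertusForCausalLM']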

To reproduce: download the LLM files and try to convert them to .gguf format using convert_hf_to_gguf.py.

Many thanks in advance for your effort and any clues.

Operating systems

Linux

GGML backends

CUDA

Hardware

13th Gen Intel(R) Core(TM) i9-13900K CPU @ 3.0GHz
2x NVIDIA GeForce RTX 4090

Models

swiss-ai/Apertus-8B-Instruct-2509 & swiss-ai/Apertus-70B-Instruct-2509

Problem description & steps to reproduce

To reproduce: download the LLM files for swiss-ai/Apertus-8B-Instruct-2509 and try to convert them to .gguf format using convert_hf_to_gguf.py.
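
As a sanity check that the local script simply lacks a registration for this architecture (rather than my setup being broken), a minimal sketch that scans the conversion script; the path is the one from the log below:

from pathlib import Path

# Does the local conversion script mention ApertusForCausalLM anywhere
# (e.g. in a register decorator)? Expected False here, given the error above.
script = Path("/data/LLAMA/llama.cpp/convert_hf_to_gguf.py").read_text()
print("ApertusForCausalLM" in script)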

First Bad Commit

No response

Relevant log output

(gguf_env) llama@aiborg:/data/LLAMA/llama.cpp$ /data/LLAMA/llama.cpp/convert_hf_to_gguf.py --outtype f16 --outfile /data/LLAMA/models/Swiss-AI/apertus-70B/swissai-70B.gguf /data/LLAMA/models/Swiss-AI/apertus-70B/
INFO:hf-to-gguf:Loading model: apertus-70B
INFO:hf-to-gguf:Model architecture: ApertusForCausalLM
ERROR:hf-to-gguf:Model ApertusForCausalLM is not supported
