Bump safetensors from 0.5.3 to 0.6.2 in /requirements #1216

Open

dependabot[bot] wants to merge 1 commit into main

Conversation

dependabot[bot] (Contributor) commented on behalf of GitHub on Aug 11, 2025

Bumps safetensors from 0.5.3 to 0.6.2.

Release notes

Sourced from safetensors' releases.

v0.6.2

What's Changed

Full Changelog: huggingface/safetensors@v0.6.1...v0.6.2

v0.6.1

What's Changed

New Contributors

Full Changelog: huggingface/safetensors@v0.6.0...v0.6.1

v0.6.1-rc0

What's Changed

New Contributors

Full Changelog: huggingface/safetensors@v0.6.0...v0.6.1-rc0

v0.6.0

Biggest changes

  • Added support for FP4/FP6 (https://www.opencompute.org/documents/ocp-microscaling-formats-mx-v1-0-spec-final-pdf). Support is still nascent in most frameworks (it will require torch 2.8, which isn't released yet, and even that will only support fp4, with caveats). However, as an open spec backed by hardware manufacturers (and therefore likely to get hardware support), it fits the bill for implementation in safetensors, rather than the many custom quantized formats existing in the wild across various frameworks.

    What FP4/FP6 mean is that an element of a tensor may now have a non-byte-aligned size/access. If you store a single fp4 value, 4 bits of that byte fall outside the spec. For now, the safetensors library will simply raise a MisalignedByte exception whenever an operation leads to unused/unaligned bits within a byte. Since most tensor sizes are larger powers of 2, this shouldn't come up too often in practice. Raising an exception now preserves the freedom to later implement a behavior that aligns with tensor libraries.

    In that regard, Dtype.size() is now deprecated, since it returns the size of the dtype in bytes; bitsize() is now favored, and for now it is up to users to compute something like len * bitsize() / 8 (and to verify that the division is exact).

    On that note, and for PyTorch users specifically: torch doesn't actually implement fp4. It has a dtype called float4_e2m1fn_x2, which represents 2 fp4 values packed together. This is why torch shouldn't have any alignment problems for now (though it cannot implement fp6). It also means that a [2, 2] tensor of that dtype actually contains 8 fp4 values. safetensors will silently cast a tensor of shape [x, y, ..., z] into [x, y, ..., z/2], using the last dimension to "swallow" the x2 contained within the type. Again, there is no definite behavior just yet, so this may be subject to change. (See the sketch below for the arithmetic involved.)
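
To make the bit-size arithmetic and the last-dimension halving concrete, here is a minimal Python sketch. It is not the safetensors API: the names MisalignedByte, byte_length, and torch_fp4_shape are hypothetical, modeled only on the behavior the release notes describe.

```python
# Hypothetical sketch of the arithmetic described above; none of these
# names are taken from the safetensors library itself.

class MisalignedByte(Exception):
    """Raised when a buffer would contain unused/unaligned bits in a byte."""


def byte_length(num_elements: int, bitsize: int) -> int:
    """The `len * bitsize() / 8` computation, verifying the division is exact."""
    total_bits = num_elements * bitsize
    if total_bits % 8 != 0:
        # e.g. a single fp4 leaves 4 unused bits in its byte
        raise MisalignedByte(f"{total_bits} bits is not byte-aligned")
    return total_bits // 8


def torch_fp4_shape(logical_shape: list[int]) -> list[int]:
    """Map a logical fp4 shape [x, y, ..., z] to a float4_e2m1fn_x2 shape
    [x, y, ..., z/2]: the last dimension "swallows" the x2 in the dtype."""
    *head, last = logical_shape
    if last % 2 != 0:
        raise MisalignedByte("last dimension must be even to pack fp4 pairs")
    return [*head, last // 2]


# 8 fp4 values (4 bits each) fit exactly into 4 bytes.
assert byte_length(8, 4) == 4

# A logical [2, 4] fp4 tensor maps to a [2, 2] float4_e2m1fn_x2 tensor,
# which is why a [2, 2] torch tensor of that dtype holds 8 fp4 values.
assert torch_fp4_shape([2, 4]) == [2, 2]
```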

What's Changed

... (truncated)

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [safetensors](https://github.com/huggingface/safetensors) from 0.5.3 to 0.6.2.
- [Release notes](https://github.com/huggingface/safetensors/releases)
- [Changelog](https://github.com/huggingface/safetensors/blob/main/RELEASE.md)
- [Commits](https://github.com/huggingface/safetensors/compare/v0.5.3...v0.6.2)

---
updated-dependencies:
- dependency-name: safetensors
  dependency-version: 0.6.2
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on Aug 11, 2025