fix: some tensors are in bf16 #82

Merged
merged 2 commits into leejet:master from Green-Sky:fix_bf16_loras on Nov 20, 2023

Conversation

Green-Sky (Contributor)

numpy does not support bf16, so we conservatively upcast to f32.

I encountered some loras with (some?) tensors in bf16.
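
For reference, a minimal sketch of the upcast described here, assuming a numpy-based conversion path; `bf16_to_f32` and its arguments are illustrative names, not this PR's actual code:

```python
import numpy as np

def bf16_to_f32(raw: bytes, shape) -> np.ndarray:
    """Upcast raw bf16 tensor bytes to float32.

    bf16 keeps the sign bit, the full 8-bit exponent and the top 7
    mantissa bits of an IEEE-754 float32, so widening is exact: place
    each 16-bit pattern in the high half of a 32-bit word and
    reinterpret the result as float32.
    """
    u16 = np.frombuffer(raw, dtype=np.uint16)
    u32 = u16.astype(np.uint32) << 16          # bf16 bits -> high half of the f32
    return u32.view(np.float32).reshape(shape)
```

Since numpy has no bfloat16 dtype, something along these lines is the usual workaround; the widened values are bit-for-bit the same numbers the bf16 tensor stored.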

@leejet (Owner) commented Nov 20, 2023

Yes, I forgot to do it for lora; I had already done it for the model weights.

Green-Sky and others added 2 commits November 20, 2023 22:33
numpy does not support bf16, so we conservatively upcast to f32
@leejet merged commit c874063 into leejet:master on Nov 20, 2023
@Green-Sky (Contributor, Author)

Ah right. Thanks for correcting :)

@Green-Sky (Contributor, Author)

Ah, you seem to downcast it to f16; have you seen any issues with that? I went with the safe route of upcasting to f32, since bf16 is effectively an f32 with the lower 16 mantissa bits chopped off.
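
To illustrate the difference, a hypothetical numpy example (not code from this repo): bf16 shares float32's 8-bit exponent, so widening to f32 never changes a value, whereas float16 has only a 5-bit exponent and tops out at 65504, so large bf16 weights would overflow when downcast:

```python
import numpy as np

# 0x47C3 is the bf16 bit pattern for 99840.0 (sign 0, exponent 143, mantissa 0x43)
u = np.array([0x47C3], dtype=np.uint32) << 16   # move the bf16 bits into an f32's high half
x = u.view(np.float32)

print(x)                      # [99840.]  upcast to f32 is exact
print(x.astype(np.float16))   # [inf]     exceeds float16's max finite value (65504)
```

Precision-wise, f16's 10 mantissa bits can hold bf16's 7, so the real risk of the f16 route is range overflow, which the f32 route avoids.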

@leejet (Owner) commented Nov 20, 2023

Nothing special, just maintaining consistency with the model. I've also forgotten why I initially converted to f16; it seems like f32 would be a better choice.

@Green-Sky deleted the fix_bf16_loras branch on November 23, 2024 12:47