Conversation

@jeffrey-fong (Contributor) commented Apr 26, 2024

The streaming implementation is currently in progress and will be included in a future PR.

@abetlen abetlen merged commit f178636 into abetlen:main Apr 28, 2024
@abetlen (Owner) commented Apr 28, 2024

@jeffrey-fong thank you!

xhedit pushed a commit to xhedit/llama-cpp-conv that referenced this pull request Apr 30, 2024
* fix completion tokens tracking, prompt forming

* fix 'function_call' and 'tool_calls' depending on 'functions' and 'tools', incompatibility with python 3.8

* Updated README

* fix for openai server compatibility

---------

Co-authored-by: Andrei <abetlen@gmail.com>
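
The commit message above mentions making 'function_call' and 'tool_calls' depend on whether the request passed 'functions' or 'tools', plus a Python 3.8 incompatibility fix. Below is a minimal, hypothetical sketch of that behavior, not the actual llama-cpp-python code: the helper name `build_assistant_message` and its signature are assumptions. It also sticks to `typing.Optional`/`typing.List` instead of the 3.9+/3.10+ builtin-generic and `X | None` syntax, which is the usual way a Python 3.8 incompatibility of this kind is resolved.

```python
# Hypothetical sketch: populate the legacy "function_call" field only when the
# request used "functions", and "tool_calls" only when it used "tools".
# Uses typing.Optional/typing.List (not list[...] or "X | None") for 3.8 compatibility.
from typing import Any, Dict, List, Optional


def build_assistant_message(
    name: str,
    arguments: str,
    functions: Optional[List[Dict[str, Any]]] = None,
    tools: Optional[List[Dict[str, Any]]] = None,
) -> Dict[str, Any]:
    message: Dict[str, Any] = {"role": "assistant", "content": None}
    if functions is not None:
        # Legacy OpenAI "functions" API: a single function_call object.
        message["function_call"] = {"name": name, "arguments": arguments}
    elif tools is not None:
        # Newer "tools" API: a list of tool_calls entries.
        message["tool_calls"] = [
            {
                "id": "call_0",
                "type": "function",
                "function": {"name": name, "arguments": arguments},
            }
        ]
    return message


if __name__ == "__main__":
    # Example: a request that used "tools" gets a tool_calls entry, not function_call.
    print(build_assistant_message("get_weather", '{"city": "Paris"}', tools=[{"type": "function"}]))
```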
Successfully merging this pull request may close these issues.

Functionary is not compatible with python 3.8
API server: 'completion_tokens' always 1 (running Functionary 2.4)