Description
Name and Version
version: 5415 (6a2bc8b)
built with cc (Ubuntu 9.4.0-1ubuntu1~20.04.2) 9.4.0 for x86_64-linux-gnu
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
llama-server
Command line
llama-server -m <any model>
Problem description & steps to reproduce
Here's what happens:
- I open a new chat
- Write a prompt (often by pasting something in and adding some surrounding instruction text)
- Press Enter to submit
When the bug occurs, the prompt I just entered never appears as a user message. All I see is the assistant message's pending animation, and the response is a hallucination (exactly what happens depends on the model, but it usually responds to the system message alone). So the prompt is lost.
This behaviour does not happen every time, and it only happens on the first user message of a chat. I don't know exactly what triggers it. Sometimes it happens several times in a row, and clicking "New conversation" repeatedly seems to make it more likely to occur.
Browser: Firefox
First Bad Commit
Began happening in the past few months (possibly with the last UI overhaul).
Relevant log output
console log:
Uncaught (in promise)
Object { name: "ConstraintError", message: "A mutation operation in the transaction failed because a constraint was not satisfied.\n ConstraintError: A mutation operation in the transaction failed because a constraint was not satisfied.", inner: DOMException, stack: "", … }
inner: DOMException: A mutation operation in the transaction failed because a constraint was not satisfied.
message: "A mutation operation in the transaction failed because a constraint was not satisfied.\n ConstraintError: A mutation operation in the transaction failed because a constraint was not satisfied."
name: "ConstraintError"
stack:
<get stack()>: function get()
<prototype>: Object { stack: "", … }
Object { messages: [] }
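For context, a ConstraintError with this exact message is what IndexedDB raises when a record is added with a primary key that already exists in an object store. The following is only a minimal browser-console sketch of how such an error arises; the store and key names are hypothetical and not taken from the webui's actual schema:

```javascript
// Hypothetical reproduction of an IndexedDB ConstraintError in a browser console.
// Store name "messages" and keyPath "id" are illustrative assumptions.
const req = indexedDB.open('demo-db', 1);
req.onupgradeneeded = () => {
  req.result.createObjectStore('messages', { keyPath: 'id' });
};
req.onsuccess = () => {
  const db = req.result;
  const tx = db.transaction('messages', 'readwrite');
  const store = tx.objectStore('messages');
  store.add({ id: 1, text: 'first' });
  // add() with a duplicate primary key violates the store's uniqueness
  // constraint; the request rejects with a DOMException named "ConstraintError".
  const dup = store.add({ id: 1, text: 'duplicate' });
  dup.onerror = (e) => {
    e.preventDefault(); // keep the failure from aborting the transaction
    console.log(dup.error.name); // "ConstraintError"
  };
};
```

If the webui's persistence layer hits this while saving the first user message of a new conversation, the unhandled rejection in the log would be consistent with the message being dropped before it is rendered.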
Video
ous2.webm
Note: the delay in the video is due to contention with another user on the server. Without contention the hallucinated response appears right away; there is no issue on the backend itself.