Eval bug: llama-server reproducibly enters degenerate per-slot generation state after contaminated prompt; returns 1-4 token completions until context reset

May 8, 2026 · #22828
cpp · Difficulty: Medium

Labels: bug-unconfirmed
