Eval bug: [XPU] Potential memory leak or KV Cache fragmentation in llama-server with oneAPI (Intel GPU) during long-running sessions
May 7, 2026 · #22781
Difficulty: Medium
Labels
bug-unconfirmed
Parent Repository
ggml-org/llama.cpp