[Bug] fast_inference crashes with vLLM 0.19 + bitsandbytes: "Tried to erase Node size_1"

April 3, 2026 · #4841

Labels

feature request, colab
