[Bug]: GPTQ INT4 quantized Qwen3-VL-30B-A3B cannot be deployed by vLLM with tensor-parallel-size=2

April 8, 2026 · #2582

Labels

bug
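A minimal sketch of the failing deployment described in the title, assuming the standard vLLM CLI. The checkpoint path is a placeholder, and the exact flags used by the reporter are not stated in the issue, so this is an illustrative reproduction, not the reporter's exact command:

```shell
# Hypothetical reproduction: serve a GPTQ INT4 quantized Qwen3-VL-30B-A3B
# checkpoint across two GPUs using vLLM's tensor parallelism.
# Substitute the actual quantized model path for the placeholder below.
vllm serve <path-to-gptq-int4-qwen3-vl-30b-a3b> \
    --tensor-parallel-size 2 \
    --quantization gptq
```

With `--tensor-parallel-size 2`, vLLM shards the model's weight matrices across two GPUs; per the title, this sharding is where the GPTQ INT4 checkpoint fails to load or serve.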
