[Bug]: vLLM + FlexAttention crashes with torch._dynamo.exc.InternalTorchDynamoError: AcceleratorError: CUDA error: misaligned address
April 29, 2026 · #41257
Python
Difficulty: Medium
Labels
bug
Parent Repository
vllm-project/vllm