[Bug]: vLLM + FlexAttention crashes with torch._dynamo.exc.InternalTorchDynamoError: AcceleratorError: CUDA error: misaligned address

April 29, 2026 · #41257

Labels

bug
