`torch.compile` produces different classification output for model using E8M0 bit-manipulation quantization (`view(int32) → bitshift → clamp → uint8`)

March 31, 2026 · #178880
Python · Difficulty: Easy

Labels

triage review · module: correctness (silent) · oncall: pt2 · module: inductor · topic: fuzzer
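The quantization pattern named in the title can be sketched without torch, using the standard library to show the same `view(int32) → bitshift → clamp → uint8` steps on a single float32 value. The function name and structure here are illustrative assumptions, not the reproducer from the issue:

```python
import struct

def e8m0_quantize(value: float) -> int:
    # Illustrative sketch (not the issue's reproducer): extract the
    # 8-bit exponent field of a float32, E8M0-style.
    # "view(int32)": reinterpret the float32 bits as an unsigned 32-bit int.
    bits = struct.unpack("<I", struct.pack("<f", value))[0]
    # "bitshift": shift out the 23 mantissa bits, keep the 8 exponent bits.
    exponent = (bits >> 23) & 0xFF
    # "clamp → uint8": restrict to the uint8 range [0, 255].
    return max(0, min(exponent, 255))

# IEEE-754 float32 stores the exponent with a bias of 127,
# so 1.0 (exponent 0) maps to the raw field value 127.
print(e8m0_quantize(1.0))  # → 127
print(e8m0_quantize(2.0))  # → 128
```

In PyTorch the same steps would typically be written with `Tensor.view(torch.int32)`, `>>`, `clamp`, and `.to(torch.uint8)`; the issue reports that `torch.compile` produces classification output differing from eager mode for a model containing this sequence.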
