[Chore]: TTS/omni: use `vllm.vllm_flash_attn` instead of `flash-attn` package for 5 models
April 25, 2026 · #3131
Python
Difficulty: Medium
Parent Repository
vllm-project/vllm-omni
Python repository
Stars: 4,517 · Forks: 844
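The issue title asks that the affected TTS/omni models import attention kernels from vLLM's vendored `vllm.vllm_flash_attn` module rather than the standalone `flash-attn` package. A minimal sketch of the usual migration pattern is below; the fallback-to-`flash_attn` branch and the `FLASH_ATTN_SOURCE` name are illustrative assumptions, not part of the issue itself.

```python
# Sketch: prefer vLLM's vendored flash-attn kernels so model code does not
# require the external `flash-attn` wheel to be installed separately.
try:
    # vLLM ships prebuilt kernels under `vllm.vllm_flash_attn`.
    from vllm.vllm_flash_attn import flash_attn_varlen_func
    FLASH_ATTN_SOURCE = "vllm.vllm_flash_attn"
except ImportError:
    try:
        # Legacy path: the standalone `flash-attn` package (to be removed).
        from flash_attn import flash_attn_varlen_func
        FLASH_ATTN_SOURCE = "flash_attn"
    except ImportError:
        # Neither source available; callers should fall back to a
        # non-flash attention backend.
        flash_attn_varlen_func = None
        FLASH_ATTN_SOURCE = None

print(FLASH_ATTN_SOURCE)
```

With this ordering, environments that have vLLM installed never touch the external package, which is the point of the chore: one fewer heavyweight build-time dependency for these models.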