HunyuanVideo1.5 uses attention masks with variable-length sequences. For best performance, we recommend an attention backend that handles padding efficiently:
- **H100/H800:** `_flash_3_hub` or `_flash_varlen_3`
- **A100/A800/RTX 4090:** `flash_hub` or `flash_varlen`
- **Other GPUs:** `sage_hub`
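The recommendations above can be sketched as a small helper that maps a detected GPU name to the suggested backend string. This function is hypothetical (not part of diffusers); the returned strings are the backend names listed above.

```python
def recommended_backend(gpu_name: str) -> str:
    """Return the attention backend suggested above for a given GPU.

    Hypothetical helper for illustration only; the strings match the
    backend names recommended in this section.
    """
    name = gpu_name.upper()
    if any(gpu in name for gpu in ("H100", "H800")):
        # Hopper GPUs: FlashAttention-3 kernels; the varlen variant
        # handles variable-length sequences without padding overhead.
        return "_flash_varlen_3"
    if any(gpu in name for gpu in ("A100", "A800", "4090")):
        # Ampere/Ada GPUs: FlashAttention-2 varlen kernels.
        return "flash_varlen"
    # All other GPUs: SageAttention kernels from the Hub.
    return "sage_hub"


print(recommended_backend("NVIDIA H100 80GB HBM3"))   # _flash_varlen_3
print(recommended_backend("NVIDIA GeForce RTX 4090")) # flash_varlen
print(recommended_backend("NVIDIA T4"))               # sage_hub
```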
Refer to the [Attention backends](../../optimization/attention_backends) guide for more details about using a different backend.