Fix split_attn description: not required for xformers

xformers handles variable-length sequences via BlockDiagonalMask
natively; split_attn is an optional alternative, not a requirement.
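To illustrate the mechanism the fix refers to: xformers' `xformers.ops.fmha.BlockDiagonalMask.from_seqlens(...)` restricts attention so tokens only attend within their own sequence. The sketch below builds the dense boolean equivalent of that mask with plain NumPy, purely as an illustration (the helper name and the example lengths are made up, not part of the patch):

```python
import numpy as np

def block_diagonal_mask(seqlens):
    """Dense boolean equivalent of a block-diagonal attention mask:
    token i may attend to token j only when both belong to the same
    packed sequence. xformers builds this implicitly (and efficiently)
    from the sequence lengths instead of materializing it."""
    total = sum(seqlens)
    mask = np.zeros((total, total), dtype=bool)
    start = 0
    for n in seqlens:
        # Allow attention only inside this sequence's block.
        mask[start:start + n, start:start + n] = True
        start += n
    return mask

# Two packed sequences of lengths 2 and 3 -> a 5x5 mask with two blocks.
m = block_diagonal_mask([2, 3])
```

Because xformers applies this masking natively, splitting the attention computation per sequence (what `split_attn` does) is an optional memory optimization rather than a correctness requirement.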

https://claude.ai/code/session_01FQWfefStwK4SL6Cf4rKK5m
pull/3485/head
Claude 2026-02-12 14:34:17 +00:00
parent 25f9df5fd5
commit 9d2eb48dc6


@@ -147,7 +147,7 @@ class animaTraining:
         self.anima_split_attn = gr.Checkbox(
             label="Split Attention",
             value=self.config.get("anima.anima_split_attn", False),
-            info="Split attention computation to reduce memory. Required when using xformers attn_mode.",
+            info="Split attention per-sequence to save memory. Optional with xformers (uses BlockDiagonalMask otherwise). Useful when xformers lacks mask support or for max VRAM savings.",
             interactive=True,
         )