Fix for CogVideoX-{2B,5B}

When loading CogVideoX-{2B,5B}, `patch_size_t` is None,
which causes an error in the `prepare_rotary_position_embeddings` function.
OleehyO 2024-12-12 14:18:55 +00:00
parent 36f1333788
commit 7b4c9db6d9


@@ -1375,7 +1375,7 @@ def main(args):
     num_frames=num_frames,
     vae_scale_factor_spatial=vae_scale_factor_spatial,
     patch_size=model_config.patch_size,
-    patch_size_t=model_config.patch_size_t,
+    patch_size_t=model_config.patch_size_t if model_config.patch_size_t is not None else 1,
     attention_head_dim=model_config.attention_head_dim,
     device=accelerator.device,
 )
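
For context, here is a minimal sketch of the failure mode and the fix, not the repository's actual code: it assumes the RoPE helper divides the frame count by the temporal patch size (as CogVideoX-1.5-style temporal patchification does), and the function and variable names are illustrative.

```python
# Hypothetical simplified stand-in for the RoPE preparation logic.
def rope_frame_count(num_frames, patch_size_t):
    # CogVideoX-{2B,5B} configs leave patch_size_t as None, so this
    # division raises:
    #   TypeError: unsupported operand type(s) for //: 'int' and 'NoneType'
    return num_frames // patch_size_t

# The commit's guard: treat a missing temporal patch size as 1,
# which leaves the temporal dimension unpatchified.
patch_size_t = None  # as loaded from a CogVideoX-{2B,5B} model config
safe_patch_size_t = patch_size_t if patch_size_t is not None else 1

print(rope_frame_count(49, safe_patch_size_t))  # 49 frames, unchanged
```

Defaulting to 1 is a no-op for the 2B/5B checkpoints, since a temporal patch size of 1 means every frame keeps its own position, while models that do set `patch_size_t` are unaffected.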