
Commit 7fe4759 (1 parent: 59d1caa)

Allow diffusers to load with Flax, w/o PyTorch (#6272)

File tree: 1 file changed (+3, -3 lines)


src/diffusers/utils/torch_utils.py

Lines changed: 3 additions & 3 deletions

```diff
@@ -89,7 +89,7 @@ def is_compiled_module(module) -> bool:
     return isinstance(module, torch._dynamo.eval_frame.OptimizedModule)
 
 
-def fourier_filter(x_in: torch.Tensor, threshold: int, scale: int) -> torch.Tensor:
+def fourier_filter(x_in: "torch.Tensor", threshold: int, scale: int) -> "torch.Tensor":
     """Fourier filter as introduced in FreeU (https://arxiv.org/abs/2309.11497).
 
     This version of the method comes from here:
@@ -121,8 +121,8 @@ def fourier_filter(x_in: torch.Tensor, threshold: int, scale: int) -> torch.Tens
 
 
 def apply_freeu(
-    resolution_idx: int, hidden_states: torch.Tensor, res_hidden_states: torch.Tensor, **freeu_kwargs
-) -> Tuple[torch.Tensor, torch.Tensor]:
+    resolution_idx: int, hidden_states: "torch.Tensor", res_hidden_states: "torch.Tensor", **freeu_kwargs
+) -> Tuple["torch.Tensor", "torch.Tensor"]:
     """Applies the FreeU mechanism as introduced in https:
     //arxiv.org/abs/2309.11497. Adapted from the official code repository: https://github.com/ChenyangSi/FreeU.
 
```
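The change works because Python stores a quoted annotation as a plain string instead of evaluating it when the `def` statement runs, so the module can be imported even when `torch` was never imported. A minimal sketch (not diffusers code; `missing_module` is a hypothetical stand-in for an uninstalled PyTorch):

```python
# On Python versions with eager annotation evaluation (before PEP 649),
# an unquoted annotation like `x: torch.Tensor` is evaluated at function
# definition time, so importing a module that uses it fails with NameError
# if torch is absent. A quoted annotation is kept verbatim as a string and
# never evaluated at definition time, so the `def` always succeeds.

def fourier_filter_stub(x_in: "missing_module.Tensor") -> "missing_module.Tensor":
    # The body only matters if the dependency is actually installed;
    # defining the function requires nothing.
    raise RuntimeError("missing_module is required to call this function")

# The definition above succeeded even though `missing_module` does not
# exist anywhere; the annotation is stored as an unevaluated string.
print(fourier_filter_stub.__annotations__["x_in"])  # missing_module.Tensor
```

This is the same reason the diff quotes `torch.Tensor` in `fourier_filter` and `apply_freeu`: a Flax-only environment can now import `torch_utils` without tripping over the type hints.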
0 commit comments