
Commit c33fd8b

universome authored and sayakpaul committed
Use parameters + buffers when deciding upscale_dtype (#9882)
Sometimes the decoder has no parameters, only buffers. This happens, for example, when we manually convert all of its parameters to buffers, e.g. to avoid packing fp16 and fp32 parameters together with FSDP.
1 parent e50b36f commit c33fd8b
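The commit message mentions manually converting a module's parameters to buffers. As a hedged illustration of how such a state can arise, here is a minimal, hypothetical helper (`params_to_buffers` is not part of diffusers; it is only a sketch of the conversion the message alludes to):

```python
import torch
import torch.nn as nn


def params_to_buffers(module: nn.Module) -> None:
    """Hypothetical helper: convert every parameter of `module` (recursively)
    into a buffer of the same name, in place. After this, the module has no
    parameters, only buffers, which is the situation the fix addresses."""
    for child in module.children():
        params_to_buffers(child)
    for name, param in list(module.named_parameters(recurse=False)):
        delattr(module, name)
        module.register_buffer(name, param.data)


lin = nn.Linear(2, 2)
params_to_buffers(lin)

# The module now exposes its weights only through buffers().
print(len(list(lin.parameters())))  # 0
print(len(list(lin.buffers())))     # 2 (weight and bias)
```

After such a conversion, any code that inspects only `parameters()` to infer a dtype will break, which is exactly what this commit fixes.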

File tree

1 file changed: +2 −1 lines changed

src/diffusers/models/autoencoders/autoencoder_kl_temporal_decoder.py

Lines changed: 2 additions & 1 deletion

@@ -11,6 +11,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+import itertools
 from typing import Dict, Optional, Tuple, Union

 import torch
@@ -94,7 +95,7 @@ def forward(

         sample = self.conv_in(sample)

-        upscale_dtype = next(iter(self.up_blocks.parameters())).dtype
+        upscale_dtype = next(itertools.chain(self.up_blocks.parameters(), self.up_blocks.buffers())).dtype
         if torch.is_grad_enabled() and self.gradient_checkpointing:

             def create_custom_forward(module):
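To see why the change is needed, here is a small self-contained sketch (the `BufferOnlyBlock` class is hypothetical, standing in for an `up_blocks` module whose weights were converted to buffers): `next(iter(module.parameters()))` raises `StopIteration` when the module has no parameters, while chaining in `buffers()` still recovers a dtype.

```python
import itertools

import torch
import torch.nn as nn


class BufferOnlyBlock(nn.Module):
    """Hypothetical module with buffers but no parameters."""

    def __init__(self):
        super().__init__()
        self.register_buffer("weight", torch.zeros(4, dtype=torch.float16))


block = BufferOnlyBlock()

# Old approach: fails because parameters() is empty.
try:
    old_dtype = next(iter(block.parameters())).dtype
except StopIteration:
    old_dtype = None
print(old_dtype)  # None

# Patched approach: falls through to buffers when no parameters exist.
upscale_dtype = next(itertools.chain(block.parameters(), block.buffers())).dtype
print(upscale_dtype)  # torch.float16
```

Note that `itertools.chain` still yields parameters first when they exist, so the behavior for ordinary modules is unchanged.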

0 commit comments