
New features on LORA break lora+dreambooth model loading #3174

Closed
@tcapelle

Description


Describe the bug

Unable to load the attention processors on a freshly trained DreamBooth + LoRA model, using just the supplied example. It was still working on commit 7b0ba48.

Reproduction

Train a model and save the new weights to "pytorch_lora_weights.bin" (this works fine), then try to run inference with that model:

import torch
from diffusers import DiffusionPipeline, DPMSolverMultistepScheduler

model_base = "runwayml/stable-diffusion-v1-5"

pipeline = DiffusionPipeline.from_pretrained(model_base, torch_dtype=torch.float16)
pipeline.scheduler = DPMSolverMultistepScheduler.from_config(pipeline.scheduler.config)

lora_model_path = "pytorch_lora_weights.bin"
pipeline.unet.load_attn_procs(lora_model_path)
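As a temporary workaround, pinning diffusers to the last known-good commit mentioned above avoids the regression (a sketch, assuming a source install from GitHub):

```shell
# Install diffusers at commit 7b0ba48, the last revision where loading worked.
pip install "git+https://github.com/huggingface/diffusers.git@7b0ba48"
```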

Logs

The following call no longer works:

pipeline.unet.load_attn_procs(lora_model_path)

The error:

KeyError Traceback (most recent call last)
Cell In[4], line 1
----> 1 pipeline.unet.load_attn_procs(lora_model_path)
2 pipeline.to("cuda")

File ~/Apps/diffusers/src/diffusers/loaders.py:279, in UNet2DConditionLoadersMixin.load_attn_procs(self, pretrained_model_name_or_path_or_dict, **kwargs)
276 attn_processors = {k: v.to(device=self.device, dtype=self.dtype) for k, v in attn_processors.items()}
278 # set layers
--> 279 self.set_attn_processor(attn_processors)

File ~/Apps/diffusers/src/diffusers/models/unet_2d_condition.py:513, in UNet2DConditionModel.set_attn_processor(self, processor)
510 fn_recursive_attn_processor(f"{name}.{sub_name}", child, processor)
512 for name, module in self.named_children():
--> 513 fn_recursive_attn_processor(name, module, processor)

File ~/Apps/diffusers/src/diffusers/models/unet_2d_condition.py:510, in UNet2DConditionModel.set_attn_processor.<locals>.fn_recursive_attn_processor(name, module, processor)
507 module.set_processor(processor.pop(f"{name}.processor"))
509 for sub_name, child in module.named_children():
--> 510 fn_recursive_attn_processor(f"{name}.{sub_name}", child, processor)

File ~/Apps/diffusers/src/diffusers/models/unet_2d_condition.py:510, in UNet2DConditionModel.set_attn_processor.<locals>.fn_recursive_attn_processor(name, module, processor)
507 module.set_processor(processor.pop(f"{name}.processor"))
509 for sub_name, child in module.named_children():
--> 510 fn_recursive_attn_processor(f"{name}.{sub_name}", child, processor)

[... skipping similar frames: UNet2DConditionModel.set_attn_processor.<locals>.fn_recursive_attn_processor at line 510 (3 times)]

File ~/Apps/diffusers/src/diffusers/models/unet_2d_condition.py:510, in UNet2DConditionModel.set_attn_processor.<locals>.fn_recursive_attn_processor(name, module, processor)
507 module.set_processor(processor.pop(f"{name}.processor"))
509 for sub_name, child in module.named_children():
--> 510 fn_recursive_attn_processor(f"{name}.{sub_name}", child, processor)

File ~/Apps/diffusers/src/diffusers/models/unet_2d_condition.py:507, in UNet2DConditionModel.set_attn_processor.<locals>.fn_recursive_attn_processor(name, module, processor)
505 module.set_processor(processor)
506 else:
--> 507 module.set_processor(processor.pop(f"{name}.processor"))
509 for sub_name, child in module.named_children():
510 fn_recursive_attn_processor(f"{name}.{sub_name}", child, processor)

KeyError: 'down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor'
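The traceback shows that `set_attn_processor` recursively walks the UNet's modules and pops each processor by its fully qualified name, so the `KeyError` means the dict built from the checkpoint no longer contains keys in the format the loader expects. A minimal pure-Python sketch of that matching logic (the `unet.` prefix in the second dict is a hypothetical example of a changed key scheme, not confirmed from the source):

```python
def set_attn_processors(module_paths, processors):
    """Mimics the recursive pop in UNet2DConditionModel.set_attn_processor:
    module_paths: attention-module names the UNet expects;
    processors: dict built from the checkpoint, keyed '<name>.processor'."""
    for name in module_paths:
        # Mirrors: module.set_processor(processor.pop(f"{name}.processor"))
        processors.pop(f"{name}.processor")  # KeyError if the naming changed

expected = ["down_blocks.0.attentions.0.transformer_blocks.0.attn1"]

# Checkpoint keys in the expected format: loading succeeds.
set_attn_processors(expected, {
    "down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor": object(),
})

# Checkpoint keys in a different format (here a hypothetical "unet." prefix):
# the pop fails with the same KeyError as in the traceback above.
try:
    set_attn_processors(expected, {
        "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor": object(),
    })
except KeyError as e:
    print(e)  # prints the missing key
```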

System Info

  • diffusers version: 0.16.0.dev0
  • Platform: Linux-5.15.0-1031-gcp-x86_64-with-glibc2.31
  • Python version: 3.10.8
  • PyTorch version (GPU?): 1.13.1 (True)
  • Huggingface_hub version: 0.13.4
  • Transformers version: 4.28.1
  • Accelerate version: 0.18.0
  • xFormers version: 0.0.18
  • Using GPU in script?: V100
  • Using distributed or parallel set-up in script?: NO
