I fine-tuned Stable Diffusion using the LoRA method on my own dataset. However, during the inference process, I encountered the error: TypeError: __init__() got an unexpected keyword argument 'lora_bias'. #10472

Closed
@qinchangchang

Description

Describe the bug

I fine-tuned Stable Diffusion using the LoRA method on my own dataset. However, during inference I encountered the error: TypeError: __init__() got an unexpected keyword argument 'lora_bias'.

Reproduction

CUDA_VISIBLE_DEVICES=1 accelerate launch train_text_to_image_lora.py \
  --mixed_precision="fp16" \
  --pretrained_model_name_or_path=$MODEL_NAME \
  --train_data_dir=$TRAIN_DATA_DIR --caption_column="additional_feature" \
  --resolution=512 --random_flip \
  --train_batch_size=1 \
  --num_train_epochs=100 --checkpointing_steps=5000 \
  --learning_rate=1e-04 --lr_scheduler="constant" --lr_warmup_steps=0 \
  --seed=42 \
  --output_dir=$OUTPUT_DIR \
  --validation_prompt=None

export MODEL_PATH="/home/new_project/prohibit/model"
export SAVE_PATH="./data_generated/"

CUDA_VISIBLE_DEVICES=0 python generate.py --model_path $MODEL_PATH --save_path $SAVE_PATH

generate.py:

import argparse

import torch
from diffusers import StableDiffusionPipeline

parser = argparse.ArgumentParser()
parser.add_argument("--model_path", type=str, required=True)
parser.add_argument("--save_path", type=str, required=True)
args = parser.parse_args()

model_path = args.model_path

pipe = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", torch_dtype=torch.float16)
pipe.safety_checker = None
pipe.requires_safety_checker = False

print("Loading LoRA weights")
pipe.unet.load_attn_procs(model_path)  # fails here with the TypeError in the logs below
pipe.to("cuda")
normal_image = pipe("a drawing of a cartoon character laying on the ground", num_inference_steps=30, guidance_scale=7.5).images[0]
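
For comparison, here is a minimal sketch of the same inference step using the pipeline-level load_lora_weights() loader rather than the deprecated unet.load_attn_procs() (the deprecation warning in the logs below points toward the newer loading path). This assumes the training run saved the LoRA weights in a layout that load_lora_weights() understands, e.g. a pytorch_lora_weights.safetensors file in model_path:

import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", torch_dtype=torch.float16)
pipe.safety_checker = None
pipe.requires_safety_checker = False
pipe.load_lora_weights(model_path)  # pipeline-level LoRA loading; model_path as above
pipe.to("cuda")
image = pipe("a drawing of a cartoon character laying on the ground", num_inference_steps=30, guidance_scale=7.5).images[0]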

Logs

/home/miniconda3/envs/new/lib/python3.9/site-packages/diffusers/loaders/unet.py:215: FutureWarning: `load_attn_procs` is deprecated and will be removed in version 0.40.0. Using the `load_attn_procs()` method has been deprecated and will be removed in a future version. Please use `load_lora_adapter()`.
  deprecate("load_attn_procs", "0.40.0", deprecation_message)
Traceback (most recent call last):
  File "/home/new_project/prohibit/generate.py", line 29, in <module>
    pipe.unet.load_attn_procs(model_path)
  File "/home/miniconda3/envs/new/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/miniconda3/envs/new/lib/python3.9/site-packages/diffusers/loaders/unet.py", line 220, in load_attn_procs
    is_model_cpu_offload, is_sequential_cpu_offload = self._process_lora(
  File "/home/miniconda3/envs/new/lib/python3.9/site-packages/diffusers/loaders/unet.py", line 346, in _process_lora
    lora_config = LoraConfig(**lora_config_kwargs)
TypeError: __init__() got an unexpected keyword argument 'lora_bias'
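
The failing call is diffusers building a peft LoraConfig from the metadata stored alongside the LoRA weights. A minimal check, assuming the mismatch is between that metadata and the installed peft version, is to confirm whether the local LoraConfig even exposes a lora_bias field:

import inspect

import peft
from peft import LoraConfig

print(peft.__version__)
print("lora_bias" in inspect.signature(LoraConfig.__init__).parameters)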

System Info

  • 🤗 Diffusers version: 0.33.0.dev0
  • Platform: Linux-6.8.0-48-generic-x86_64-with-glibc2.39
  • Running on Google Colab?: No
  • Python version: 3.9.21
  • PyTorch version (GPU?): 2.5.1+cu124 (True)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Huggingface_hub version: 0.27.0
  • Transformers version: 4.47.1
  • Accelerate version: 1.2.1
  • PEFT version: 0.7.0
  • Bitsandbytes version: not installed
  • Safetensors version: 0.5.0
  • xFormers version: not installed
  • Accelerator: NVIDIA GeForce RTX 3090 Ti, 24564 MiB
    NVIDIA GeForce RTX 3090 Ti, 24564 MiB
  • Using GPU in script?: yes
  • Using distributed or parallel set-up in script?: no

Who can help?

No response
