Describe the bug
AttributeError: 'DistributedDataParallel' object has no attribute 'dtype'. Did you mean: 'type'?
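For context, when the script is launched on multiple processes, `accelerator.prepare()` wraps the models in `torch.nn.parallel.DistributedDataParallel`, and DDP does not forward arbitrary attributes (such as the `dtype` property diffusers defines on `ModelMixin`) to the wrapped module. A minimal sketch reproducing the attribute behavior in isolation (`TinyModel` is a hypothetical stand-in for the transformer / text encoders; the single-process gloo group exists only so DDP can be constructed):

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process CPU process group, just so DDP can be constructed.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 4)

    @property
    def dtype(self):
        # Mimics diffusers' ModelMixin.dtype property.
        return next(self.parameters()).dtype

model = DDP(TinyModel())

try:
    model.dtype  # Fails: DDP does not proxy custom attributes.
except AttributeError as e:
    print(e)

print(model.module.dtype)  # Works once unwrapped: torch.float32

dist.destroy_process_group()
```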
Reproduction
export MODEL_NAME="black-forest-labs/FLUX.1-dev"
export INSTANCE_DIR="dog"
export OUTPUT_DIR="trained-flux-dev-dreambooth-lora"

accelerate launch train_dreambooth_lora_flux.py \
  --pretrained_model_name_or_path=$MODEL_NAME \
  --instance_data_dir=$INSTANCE_DIR \
  --output_dir=$OUTPUT_DIR \
  --mixed_precision="bf16" \
  --train_text_encoder \
  --instance_prompt="a photo of sks dog" \
  --resolution=512 \
  --train_batch_size=1 \
  --guidance_scale=1 \
  --gradient_accumulation_steps=4 \
  --optimizer="prodigy" \
  --learning_rate=1. \
  --report_to="wandb" \
  --lr_scheduler="constant" \
  --lr_warmup_steps=0 \
  --max_train_steps=500 \
  --validation_prompt="A photo of sks dog in a bucket" \
  --seed="0" \
  --push_to_hub
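Not a confirmed fix, but the pattern used for multi-GPU runs elsewhere in the diffusers training scripts is to unwrap a prepared model before reading attributes that only exist on the underlying diffusers model. A sketch of a safe dtype lookup, assuming an `Accelerator`-prepared model (the helper name `model_dtype` is made up for illustration):

```python
import torch
from accelerate import Accelerator

def model_dtype(model, accelerator):
    # accelerator.unwrap_model() strips wrappers such as
    # DistributedDataParallel that accelerator.prepare() may have added,
    # exposing the underlying model's attributes again.
    unwrapped = accelerator.unwrap_model(model)
    # For diffusers ModelMixin models, `unwrapped.dtype` also works here.
    return next(unwrapped.parameters()).dtype

accelerator = Accelerator()
model = accelerator.prepare(torch.nn.Linear(4, 4))
print(model_dtype(model, accelerator))  # torch.float32
```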
Logs
System Info
- 🤗 Diffusers version: 0.33.0
- Platform: Linux-5.15.0-78-generic-x86_64-with-glibc2.35
- Running on Google Colab?: No
- Python version: 3.10.12
- PyTorch version (GPU?): 2.4.0+cu121 (True)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Huggingface_hub version: 0.30.2
- Transformers version: 4.44.1
- Accelerate version: 0.32.1
- PEFT version: 0.15.2
- Bitsandbytes version: not installed
- Safetensors version: 0.4.2
- xFormers version: 0.0.27.post2
Who can help?
No response