Allow users to save SDXL LoRA weights for only one text encoder #7607

Merged

Conversation

dulacp
Contributor

@dulacp dulacp commented Apr 8, 2024

What does this PR do?

Allow users to save the LoRA weights for only one text encoder if desired. This is often needed in the DreamBooth setting to avoid overcooking the model too quickly.

Also, the method is supposed to check that at least one of the unet, text_encoder, or text_encoder_2 LoRA weights is passed, which was not reflected in the implementation.
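
For illustration, a minimal sketch of the intended call after this change; the state dicts (unet_lora_state_dict, text_encoder_one_lora_state_dict) are placeholders standing in for what a DreamBooth-style training loop would produce, not part of the PR itself:

from diffusers import StableDiffusionXLPipeline

# Placeholders: in practice these LoRA state dicts come out of the training loop
# (e.g. collected with peft's get_peft_model_state_dict).
unet_lora_state_dict = {}
text_encoder_one_lora_state_dict = {}

StableDiffusionXLPipeline.save_lora_weights(
    save_directory="sdxl-dreambooth-lora",
    unet_lora_layers=unet_lora_state_dict,
    # Only the first text encoder's LoRA layers are passed; text_encoder_2 is
    # deliberately omitted, which is what this PR makes possible.
    text_encoder_lora_layers=text_encoder_one_lora_state_dict,
)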

Before submitting

Who can review?

I'm thinking about @sayakpaul since it's a topic close to training examples.

The method checks if at least one of unet, text_encoder and
text_encoder_2 lora weights is passed, which was not reflected in the
implementation.
Comment on lines +1392 to +1394
text_encoder_2_lora_layers (`Dict[str, torch.nn.Module]` or `Dict[str, torch.Tensor]`):
State dict of the LoRA layers corresponding to the `text_encoder_2`. Must explicitly pass the text
encoder LoRA state dict because it comes from 🤗 Transformers.
Member

It's there:

text_encoder_2_lora_layers: Dict[str, Union[torch.nn.Module, torch.Tensor]] = None,

You need to look at the right class:
https://github.com/huggingface/diffusers/blob/7e39516627c69b71f8b21a2b53689028d4733b72/src/diffusers/loaders/lora.py#L1288C7-L1288C39

Contributor Author

Yes, this is what I aimed to fix.

Just added the missing parameter in the documentation (to match the signature) and split the following if condition:

if text_encoder_lora_layers and text_encoder_2_lora_layers:
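
For context, a simplified sketch of what the decoupled check looks like; build_lora_state_dict and pack_weights here are illustrative stand-ins, not the exact loader code that was merged:

def pack_weights(layers, prefix):
    # Accept either a module or a plain state dict and prefix its keys.
    layers_weights = layers.state_dict() if hasattr(layers, "state_dict") else layers
    return {f"{prefix}.{name}": weight for name, weight in layers_weights.items()}

def build_lora_state_dict(unet_lora_layers=None, text_encoder_lora_layers=None,
                          text_encoder_2_lora_layers=None):
    # Require at least one set of LoRA weights instead of silently writing nothing.
    if not (unet_lora_layers or text_encoder_lora_layers or text_encoder_2_lora_layers):
        raise ValueError(
            "You must pass at least one of `unet_lora_layers`, "
            "`text_encoder_lora_layers` or `text_encoder_2_lora_layers`."
        )
    state_dict = {}
    if unet_lora_layers:
        state_dict.update(pack_weights(unet_lora_layers, "unet"))
    # Each text encoder is handled on its own instead of requiring both.
    if text_encoder_lora_layers:
        state_dict.update(pack_weights(text_encoder_lora_layers, "text_encoder"))
    if text_encoder_2_lora_layers:
        state_dict.update(pack_weights(text_encoder_2_lora_layers, "text_encoder_2"))
    return state_dict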

Member

Thanks for explaining! LGTM.

Contributor Author

Thanks for looking into it in record time :)
Cheers

@sayakpaul sayakpaul requested a review from yiyixuxu April 8, 2024 11:00
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@yiyixuxu
Collaborator

yiyixuxu commented Apr 8, 2024

can you fix the tests?

@yiyixuxu
Collaborator

yiyixuxu commented Apr 8, 2024

ohh the failing test actually might not be relevant here - will take a look on our end!

@sayakpaul
Member

Will merge after the CI is green (barring the failing test).

Contributor

github-actions bot commented May 8, 2024

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

@github-actions github-actions bot added the stale Issues that haven't received updates label May 8, 2024
@yiyixuxu yiyixuxu removed the stale Issues that haven't received updates label May 8, 2024
@yiyixuxu yiyixuxu merged commit 75aab34 into huggingface:main May 8, 2024
15 checks passed
@yiyixuxu
Collaborator

yiyixuxu commented May 8, 2024

thank you!

sayakpaul added a commit that referenced this pull request Dec 23, 2024
SDXL LoRA weights for text encoders should be decoupled on save

The method checks if at least one of unet, text_encoder and
text_encoder_2 lora weights is passed, which was not reflected in the
implementation.

Co-authored-by: Sayak Paul <[email protected]>
Co-authored-by: YiYi Xu <[email protected]>