[2905]: Add Karras pattern to discrete euler #2956

Merged: 6 commits into huggingface:main on Apr 6, 2023

Conversation

@nipunjindal (Contributor) commented Apr 3, 2023

Discussion: #2905, #2064

This pull request enables the use of karras_sigmas in the Euler Discrete scheduler. I'm also happy to get feedback as we refine a set of guidelines for extending this functionality to all the schedulers. This update introduces two key components:

  • The construction of Karras-based sigmas.
  • The mapping of sigmas back to timesteps, which are used as input to the UNet.
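For reference, the Karras et al. (2022) rule (equation 5 of the paper) can be sketched in a dependency-free way. This is an illustrative reimplementation of what the PR adds, not the exact merged code:

```python
def karras_sigmas(n, sigma_min, sigma_max, rho=7.0):
    """Karras et al. (2022), eq. (5): interpolate n sigmas in rho-space,
    from sigma_max down to sigma_min."""
    ramp = [i / (n - 1) for i in range(n)]
    min_inv_rho = sigma_min ** (1 / rho)
    max_inv_rho = sigma_max ** (1 / rho)
    sigmas = [(max_inv_rho + r * (min_inv_rho - max_inv_rho)) ** rho for r in ramp]
    return sigmas + [0.0]  # k-diffusion appends a trailing zero

sigmas = karras_sigmas(20, 0.0292, 14.6146)
```

The schedule starts at sigma_max, ends at sigma_min, and a final 0.0 is appended, giving 21 values for 20 inference steps, matching the sigma tensors shown later in this thread.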

Also, I apologize if I overstepped any boundaries: while working on #2064, I realized it would be easier to establish the guidelines with a simpler scheduler first.

Euler Discrete Test

from diffusers import StableDiffusionPipeline, EulerDiscreteScheduler
import torch 

seed = 33

pipe = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4")
pipe = pipe.to("mps")
pipe.scheduler = EulerDiscreteScheduler.from_config(pipe.scheduler.config, use_karras_sigmas=True)

prompt = "an astronaut riding a horse on mars"

generator = torch.Generator(device="mps").manual_seed(seed)
image = pipe(prompt, generator=generator, num_inference_steps=20).images[0]

display(image)

image

Compared with KDiffusion - sample_euler

import torch
from diffusers import StableDiffusionKDiffusionPipeline

seed = 33

pipe = StableDiffusionKDiffusionPipeline.from_pretrained('CompVis/stable-diffusion-v1-4').to('mps')
pipe.set_scheduler('sample_euler')

prompt = "an astronaut riding a horse on mars"

generator = torch.Generator(device="mps").manual_seed(seed)
image = pipe(prompt, generator=generator, num_inference_steps=20, use_karras_sigmas=True).images[0]

display(image)

image

Tensor Comparison

EulerDiscreteScheduler - Sigmas

14.6146, 11.7254,  9.3402,  7.3836,  5.7894,  4.4998,  3.4648,  2.6408, 1.9909,  1.4832,  1.0908,  0.7909,  0.5647,  0.3964,  0.2730,  0.1842, 0.1213,  0.0779,  0.0485,  0.0292,  0.0000

EulerDiscreteScheduler - Timestep to UNET

999.0000, 961.7328, 921.0421, 876.3138, 826.7906, 771.5520, 709.5297, 639.6147, 560.9754, 473.7489, 380.1317, 285.3175, 197.0806, 123.3951, 69.2569,  34.6974,  15.4816,   5.9918,   1.7830,   0.0000

sample_euler - Sigmas

14.6146, 11.7254, 9.3402, 7.3836, 5.7894, 4.4998, 3.4648, 2.6408, 1.9909, 1.4832, 1.0908, 0.7909, 0.5647, 0.3964, 0.2730, 0.1842, 0.1213, 0.0779, 0.0485, 0.0292, 0.0000

sample_euler - Timestep to UNET

998.9999, 961.7327, 921.0419, 876.3136, 826.7905, 771.5519, 709.5297, 639.6146, 560.9752, 473.7487, 380.1315, 285.3175, 197.0805, 123.3950, 69.2569, 34.6974, 15.4816, 5.9918, 1.7829, 0.0, 0.0
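The agreement shown above can be checked programmatically; a small sketch over the first few reported values (the full tensors agree to the same tolerance):

```python
# First entries of the "Timestep to UNET" tensors reported above.
euler_timesteps = [999.0000, 961.7328, 921.0421, 876.3138, 826.7906]
k_timesteps     = [998.9999, 961.7327, 921.0419, 876.3136, 826.7905]

# The remaining differences are float32 rounding noise, well under 1e-3.
max_diff = max(abs(a - b) for a, b in zip(euler_timesteps, k_timesteps))
assert max_diff < 1e-3
```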

@nipunjindal
Contributor Author

Hello @sayakpaul @patrickvonplaten,

Please review and provide your feedback.
Thanks so much!

def _convert_to_karras(self, in_sigmas: torch.FloatTensor) -> torch.FloatTensor:
"""Constructs the noise schedule of Karras et al. (2022)."""

sigma_min: float = in_sigmas[-1].item()
Contributor Author

Should a config param for sigma min/max be also introduced which can be utilized here?

Member

Since sigma_min and sigma_max are derived values (from in_sigmas), I think it's fine to not register them separately.

@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Apr 3, 2023

The documentation is not available anymore as the PR was closed or merged.

Comment on lines 106 to 109
use_karras_sigmas (`bool`, *optional*, defaults to `False`):
Use karras sigmas. For example, specifying `sample_dpmpp_2m` to `set_scheduler` will be equivalent to
`DPM++2M` in stable-diffusion-webui. On top of that, setting this option to True will make it `DPM++2M
Karras`.
Member

Could we also include the reference paper that introduced it?

#2874 (comment)

Feel free to also include it in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py.

Contributor Author

Fixed.

print(sigma_min, sigma_max)

rho = 7.0
# ramp = torch.linspace(0, 1, self.num_inference_steps)
Member

Probably needs to go?

Contributor Author

Removed the section.


result_sum = torch.sum(torch.abs(sample))
result_mean = torch.mean(torch.abs(sample))
print(result_sum.item(), result_mean.item())
Member

Needs to go?

Contributor Author

Removed the section.


sigma_min: float = in_sigmas[-1].item()
sigma_max: float = in_sigmas[0].item()
print(sigma_min, sigma_max)
Member

Needs to go.

Contributor Author

Removed the section.

Member

@sayakpaul left a comment

What a clean PR! Well done 🔥

I am wondering if it would make sense to add this utility to

so that all the compatible schedulers can use Karras sigmas without having to duplicate code.

WDYT?

@sayakpaul
Member

Also, I am wondering if we could somehow find a way to reuse get_sigmas_karras() from k_diffusion as done in https://github.com/huggingface/diffusers/pull/2874/files#diff-aea44e0665dd1127ca6b341af8cc731a4f36817c19553ddb559da2a919d1d267R20 ?

@nipunjindal
Contributor Author

Also, I am wondering if we could somehow find a way to reuse get_sigmas_karras() from k_diffusion as done in https://github.com/huggingface/diffusers/pull/2874/files#diff-aea44e0665dd1127ca6b341af8cc731a4f36817c19553ddb559da2a919d1d267R20 ?

Good point, utilized this function.

@nipunjindal
Contributor Author

What a clean PR! Well done 🔥

I am wondering if it would make sense to add this utility to

so that all the compatible schedulers can use Karras sigmas without having to duplicate code.

WDYT?

I am in favor of moving it to the base class so that it hopefully becomes a few-line change for the other schedulers.

@nipunjindal
Contributor Author

nipunjindal commented Apr 4, 2023

Thank you for the review @sayakpaul 🙇, incorporated your feedback.

Member

@sayakpaul left a comment

Thanks for these changes. Looks really clean.

@patrickvonplaten WDYT?

@@ -18,6 +18,7 @@

import numpy as np
import torch
from k_diffusion.sampling import get_sigmas_karras
Contributor

We can't import from the k_diffusion library here, sadly. Could you just copy-paste the function here: https://github.com/crowsonkb/k-diffusion/blob/b43db16749d51055f813255eea2fdf1def801919/k_diffusion/sampling.py#L17 into the scheduler (ideally with a note that it's copied from the library, to cite it correctly)?

Contributor Author

Can you please take a look at the original commit? If it works for you, I'll move it to the base class to be reusable.

Contributor

Original commit lgtm

@@ -198,6 +206,11 @@ def set_timesteps(self, num_inference_steps: int, device: Union[str, torch.devic
" 'linear' or 'log_linear'"
)

if self.use_karras_sigmas:
Contributor

Very cool! This works for me :-)

@@ -174,3 +175,27 @@ def _get_compatibles(cls):
getattr(diffusers_library, c) for c in compatible_classes_str if hasattr(diffusers_library, c)
]
return compatible_classes

@classmethod
def _sigma_to_t(self, sigma, log_sigmas):
Contributor

@patrickvonplaten commented Apr 4, 2023

Can we move this function directly to the EulerDiscreteScheduler class? Not all schedulers need the sigma to t function (e.g. DDIM, DDPM, PNDM or any non-image schedulers like IPNDM won't need this)
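For context, `_sigma_to_t` inverts the training noise schedule: given a (possibly Karras-respaced) sigma, it finds the fractional training timestep whose sigma matches, by linear interpolation in log-sigma space. A simplified scalar sketch (the real implementation is vectorized over numpy arrays; the names here are illustrative):

```python
import math
from bisect import bisect_right

def sigma_to_t(sigma, train_sigmas):
    """Return the fractional timestep t at which interpolating
    log(train_sigmas) gives log(sigma).
    `train_sigmas` must be ascending (sigma grows with t)."""
    log_sigmas = [math.log(s) for s in train_sigmas]
    log_sigma = math.log(sigma)
    # Clamp sigmas outside the training range to the schedule's endpoints.
    if log_sigma <= log_sigmas[0]:
        return 0.0
    if log_sigma >= log_sigmas[-1]:
        return float(len(train_sigmas) - 1)
    # Find the bracketing interval and interpolate linearly in log space.
    hi = bisect_right(log_sigmas, log_sigma)
    lo = hi - 1
    w = (log_sigma - log_sigmas[lo]) / (log_sigmas[hi] - log_sigmas[lo])
    return lo + w
```

For example, with a geometric schedule `[0.1, 1.0, 10.0]`, a sigma lying between grid points maps to a fractional timestep, which is why the "Timestep to UNET" tensors earlier in this thread are non-integer.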

@patrickvonplaten
Contributor

Also, I am wondering if we could somehow find a way to reuse get_sigmas_karras() from k_diffusion as done in https://github.com/huggingface/diffusers/pull/2874/files#diff-aea44e0665dd1127ca6b341af8cc731a4f36817c19553ddb559da2a919d1d267R20 ?

k_diffusion is not a hard dependency of diffusers, so we cannot import from it. I think the best idea here is to just copy-paste the function between the relevant schedulers with the `# Copied from` statement.

@patrickvonplaten
Contributor

@nipunjindal amazing work! It's super cool that you're getting identical results between KDiffusion and diffusers.

Sorry for the back and forth with the review here, but it'd be nice if you could not add a dependency on k_diffusion inside diffusers-only schedulers, and also not add _sigma_to_t to the general SchedulerMixin but directly to the specific scheduler. We can use the `# Copied from` mechanism to copy-paste it everywhere.

@nipunjindal
Contributor Author

@patrickvonplaten Thank you for the reviews! I have addressed the review comments and confirmed the results are intact.

return t

# Copied from https://github.com/crowsonkb/k-diffusion/blob/686dbad0f39640ea25c8a8c6a6e56bb40eacefa2/k_diffusion/sampling.py#L17
def _convert_to_karras(self, in_sigmas: torch.FloatTensor) -> torch.FloatTensor:
Contributor

Perfect this function works for me! Happy to just go with this function as it was before :-)

Contributor Author

Thanks. Fixed.

@sayakpaul
Member

@nipunjindal sorry for the oversight on my part and that you had to go through some repetitive cycles here.

@nipunjindal
Contributor Author

No worries at all. Software is iteration :)

Contributor

@patrickvonplaten left a comment

PR is good to go for me - @sayakpaul what do you think?

@sayakpaul
Member

I was following the updates, PR is good to go for me.

Member

@pcuenca left a comment

Great work! I only have questions about the docstring :)

Comment on lines 107 to 109
Use karras sigmas. For example, specifying `sample_dpmpp_2m` to `set_scheduler` will be equivalent to
`DPM++2M` in stable-diffusion-webui. On top of that, setting this option to True will make it `DPM++2M
Karras`. Please see equation (5) https://arxiv.org/pdf/2206.00364.pdf for more details.
Member

I might be wrong, but I think this comment is only appropriate for the custom k-diffusion pipeline, whereas these sigmas can be used in other pipelines if I understand it correctly.

Also, not sure about why we are mentioning sample_dpmpp_2m in the Euler scheduler?

Contributor Author

Right, these sigmas can be used generically in other scheduler pipelines as well. Updated the README.

Member

Thanks for iterating 🙏 I love the new text!

Member

@pcuenca left a comment

Good to go imo!

@sayakpaul sayakpaul merged commit 6e8e1ed into huggingface:main Apr 6, 2023
@sayakpaul
Member

@nipunjindal we deeply appreciate this important PR from you. Really great work :)

w4ffl35 pushed a commit to w4ffl35/diffusers that referenced this pull request Apr 14, 2023
* [2905]: Add Karras pattern to discrete euler

* [2905]: Add Karras pattern to discrete euler

* Review comments

* Review comments

* Review comments

* Review comments

---------

Co-authored-by: njindal <[email protected]>
dg845 pushed a commit to dg845/diffusers that referenced this pull request May 6, 2023
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024