[2905]: Add Karras pattern to discrete euler #2956
Conversation
Hello @sayakpaul @patrickvonplaten, please review and provide your feedback.
```python
def _convert_to_karras(self, in_sigmas: torch.FloatTensor) -> torch.FloatTensor:
    """Constructs the noise schedule of Karras et al. (2022)."""

    sigma_min: float = in_sigmas[-1].item()
```
Should a config param for sigma min/max also be introduced that can be utilized here?
I think since `sigma_min` and `sigma_max` are derived values (being derived from `in_sigmas`), it's fine to not register them separately.
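For reference, the schedule being constructed follows equation (5) of Karras et al. (2022): interpolate linearly in sigma^(1/rho) space between the endpoints of the incoming schedule, then raise back to the rho-th power. A minimal standalone sketch (the function name and input values are illustrative, not the PR's exact code):

```python
import numpy as np

def convert_to_karras(in_sigmas: np.ndarray, num_steps: int, rho: float = 7.0) -> np.ndarray:
    """Map an existing sigma schedule onto the Karras et al. (2022) spacing (eq. 5)."""
    # sigma_min / sigma_max are derived from the incoming schedule,
    # which is why they need not be registered as separate config params.
    sigma_min = in_sigmas[-1]
    sigma_max = in_sigmas[0]
    ramp = np.linspace(0, 1, num_steps)
    min_inv_rho = sigma_min ** (1 / rho)
    max_inv_rho = sigma_max ** (1 / rho)
    # Interpolate in sigma^(1/rho) space, then raise back to the rho-th power.
    return (max_inv_rho + ramp * (min_inv_rho - max_inv_rho)) ** rho

sigmas = convert_to_karras(np.array([14.6, 5.0, 1.0, 0.03]), num_steps=10)
```

The result keeps the original endpoints but concentrates more steps at low noise levels, which is the point of the Karras spacing.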
```
use_karras_sigmas (`bool`, *optional*, defaults to `False`):
    Use karras sigmas. For example, specifying `sample_dpmpp_2m` to `set_scheduler` will be equivalent to
    `DPM++2M` in stable-diffusion-webui. On top of that, setting this option to True will make it `DPM++2M
    Karras`.
```
Could we also include the reference paper that introduced it? Feel free to also include it in `src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py`.
Fixed.
```python
print(sigma_min, sigma_max)

rho = 7.0
# ramp = torch.linspace(0, 1, self.num_inference_steps)
```
Probably needs to go?
Removed the section.
```python
result_sum = torch.sum(torch.abs(sample))
result_mean = torch.mean(torch.abs(sample))
print(result_sum.item(), result_mean.item())
```
Needs to go?
Removed the section.
```python
sigma_min: float = in_sigmas[-1].item()
sigma_max: float = in_sigmas[0].item()
print(sigma_min, sigma_max)
```
Needs to go.
Removed the section.
What a clean PR! Well done 🔥
I am wondering if it would make sense to add this utility to `class SchedulerMixin:` so that all the compatible schedulers can use Karras sigmas without having to duplicate code.
WDYT?
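As a sketch of that idea (hypothetical class names, not the final diffusers layout), a shared helper on a mixin would let each scheduler opt in without duplicating code:

```python
import numpy as np

class KarrasSigmasMixin:
    """Hypothetical mixin: shared Karras-sigma conversion for compatible schedulers."""

    def _convert_to_karras(self, in_sigmas, num_steps, rho=7.0):
        # Karras et al. (2022), eq. 5: interpolate in sigma^(1/rho) space.
        sigma_min, sigma_max = in_sigmas[-1], in_sigmas[0]
        ramp = np.linspace(0, 1, num_steps)
        return (sigma_max ** (1 / rho) + ramp * (sigma_min ** (1 / rho) - sigma_max ** (1 / rho))) ** rho

class ToyEulerScheduler(KarrasSigmasMixin):
    # Any scheduler inheriting the mixin gets the helper for free.
    def set_timesteps(self, num_steps):
        base = np.linspace(14.6, 0.03, num_steps)  # stand-in base schedule
        self.sigmas = self._convert_to_karras(base, num_steps)

sched = ToyEulerScheduler()
sched.set_timesteps(8)
```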
Also, I am wondering if we could somehow find a way to reuse
Good point, utilized this function.
I am in favor of moving it to the base so that it hopefully becomes a few-line change for other schedulers.
Thank you for the review @sayakpaul 🙇, incorporated your feedback.
Thanks for these changes. Looks really clean.
@patrickvonplaten WDYT?
```diff
@@ -18,6 +18,7 @@

 import numpy as np
 import torch
+from k_diffusion.sampling import get_sigmas_karras
```
We can't import from the `k_diffusion` library here sadly. Could you just copy-paste the function here: https://github.com/crowsonkb/k-diffusion/blob/b43db16749d51055f813255eea2fdf1def801919/k_diffusion/sampling.py#L17 into the scheduler (ideally with a note that it's copied from the library, to cite it correctly)?
Can you please take a look at the original commit? If this works for you, I'll move it to the base class to be reusable.
Original commit lgtm
```diff
@@ -198,6 +206,11 @@ def set_timesteps(self, num_inference_steps: int, device: Union[str, torch.devic
             " 'linear' or 'log_linear'"
         )

+        if self.use_karras_sigmas:
```
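Roughly, the new branch remaps the interpolated sigmas onto the Karras spacing and then recovers the matching (fractional) timesteps so the UNet is conditioned consistently. A simplified standalone sketch of that flow, using `np.interp` as a stand-in for the scheduler's log-space sigma-to-t helper (values and names are illustrative):

```python
import numpy as np

num_train_timesteps, num_inference_steps, rho = 1000, 10, 7.0
# Stand-in training-time sigma schedule (ascending in t), analogous to the
# scheduler's sigmas derived from alphas_cumprod.
train_sigmas = np.linspace(0.03, 14.6, num_train_timesteps)

# Default path: interpolate sigmas at evenly spaced (descending) timesteps.
timesteps = np.linspace(0, num_train_timesteps - 1, num_inference_steps)[::-1]
sigmas = np.interp(timesteps, np.arange(num_train_timesteps), train_sigmas)

# use_karras_sigmas branch: respace the sigmas per Karras et al. (2022), eq. 5 ...
sigma_min, sigma_max = sigmas[-1], sigmas[0]
ramp = np.linspace(0, 1, num_inference_steps)
karras_sigmas = (sigma_max ** (1 / rho) + ramp * (sigma_min ** (1 / rho) - sigma_max ** (1 / rho))) ** rho

# ... then map each new sigma back to a fractional timestep index.
karras_timesteps = np.interp(karras_sigmas, train_sigmas, np.arange(num_train_timesteps))
```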
Very cool! This works for me :-)
```diff
@@ -174,3 +175,27 @@ def _get_compatibles(cls):
         getattr(diffusers_library, c) for c in compatible_classes_str if hasattr(diffusers_library, c)
     ]
     return compatible_classes

+    @classmethod
+    def _sigma_to_t(self, sigma, log_sigmas):
```
Can we move this function directly to the `EulerDiscreteScheduler` class? Not all schedulers need the sigma-to-t function (e.g. DDIM, DDPM, PNDM or any non-image schedulers like IPNDM won't need this).
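For context, a standalone sketch of the kind of log-space interpolation such a `_sigma_to_t` helper performs, assuming an ascending per-timestep log-sigma schedule (illustrative, not the PR's exact code):

```python
import numpy as np

def sigma_to_t(sigma: float, log_sigmas: np.ndarray) -> float:
    """Invert an ascending log-sigma schedule: return the (fractional)
    timestep whose log-sigma linearly interpolates to log(sigma)."""
    log_sigma = np.log(sigma)
    # Index of the last schedule entry at or below log(sigma),
    # clipped so that high_idx stays in range.
    low_idx = int(np.searchsorted(log_sigmas, log_sigma, side="right")) - 1
    low_idx = max(0, min(low_idx, len(log_sigmas) - 2))
    high_idx = low_idx + 1
    low, high = log_sigmas[low_idx], log_sigmas[high_idx]
    # Linear interpolation weight between the two bracketing indices.
    w = np.clip((low - log_sigma) / (low - high), 0.0, 1.0)
    return (1 - w) * low_idx + w * high_idx

log_sigmas = np.log(np.linspace(0.03, 14.6, 1000))
```

A sigma that sits exactly on a schedule entry maps back to that integer index, and intermediate sigmas get fractional timesteps.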
@nipunjindal amazing work, it's super cool that you're getting identical results between KDiffusion and diffusers. Sorry for the back and forth with the review here, but it'd be nice if you could actually not add a dependency to `k_diffusion`.
@patrickvonplaten Thank you for the reviews! I have addressed the review comments and confirmed the results are intact.
```python
    return t

# Copied from https://github.com/crowsonkb/k-diffusion/blob/686dbad0f39640ea25c8a8c6a6e56bb40eacefa2/k_diffusion/sampling.py#L17
def _convert_to_karras(self, in_sigmas: torch.FloatTensor) -> torch.FloatTensor:
```
Perfect this function works for me! Happy to just go with this function as it was before :-)
Thanks. Fixed.
@nipunjindal sorry for the oversight on my part and that you had to go through some repetitive cycles here.
No worries at all. Software is iteration :)
PR is good to go for me - @sayakpaul what do you think?
I was following the updates, PR is good to go for me.
Great work! I only have questions about the docstring :)
```
Use karras sigmas. For example, specifying `sample_dpmpp_2m` to `set_scheduler` will be equivalent to
`DPM++2M` in stable-diffusion-webui. On top of that, setting this option to True will make it `DPM++2M
Karras`. Please see equation (5) https://arxiv.org/pdf/2206.00364.pdf for more details.
```
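For reference, the equation (5) of Karras et al. (2022) that the docstring cites, with $N$ inference steps and $\rho = 7$ by default:

```latex
\sigma_i = \left( \sigma_{\max}^{1/\rho}
  + \frac{i}{N-1}\left( \sigma_{\min}^{1/\rho} - \sigma_{\max}^{1/\rho} \right)
\right)^{\rho}, \qquad i = 0, \dots, N-1
```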
I might be wrong, but I think this comment is only appropriate for the custom k-diffusion pipeline, whereas these sigmas can be used in other pipelines if I understand it correctly. Also, not sure why we are mentioning `sample_dpmpp_2m` in the Euler scheduler?
Right, these sigmas can be used generically in other scheduler pipelines as well. Updated the README.
Thanks for iterating 🙏 I love the new text!
Good to go imo!
@nipunjindal we deeply appreciate this important PR from you. Really great work :)
* [2905]: Add Karras pattern to discrete euler
* [2905]: Add Karras pattern to discrete euler
* Review comments
* Review comments
* Review comments
* Review comments

Co-authored-by: njindal <[email protected]>
Discussion: #2905, #2064
This pull request enables the use of `karras_sigmas` in the Euler Discrete scheduler. Additionally, happy to get feedback as we refine a set of guidelines for expanding this functionality to all the schedulers. This update introduces two key components:
Also, I apologize if I overstepped any boundaries; while working on #2064, I realized it would be better to establish the guidelines with a simpler scheduler first.
Euler Discrete Test: tensor comparison against KDiffusion's `sample_euler`.

[Attached outputs: EulerDiscreteScheduler sigmas, EulerDiscreteScheduler timestep to UNET, sample_euler sigmas, sample_euler timestep to UNET]