
FluxTransformer2DModel does not have config and cannot set_default_attn_processor #11398

Open
@james-imi

Describe the bug

# Imports needed to run this snippet (not shown in the original report).
from diffusers import (
    AutoencoderKL,
    FlowMatchEulerDiscreteScheduler,
    FluxPipeline,
    FluxTransformer2DModel,
)
from optimum.quanto import freeze, qfloat8, quantize
from transformers import CLIPTextModel, CLIPTokenizer, T5EncoderModel, T5TokenizerFast

# BFL_REPO, REVISION and DTYPE are defined elsewhere in the original setup
# (the Flux checkpoint repo id, a git revision, and a torch dtype).

scheduler = FlowMatchEulerDiscreteScheduler.from_pretrained(
    BFL_REPO, subfolder="scheduler", revision=REVISION
)
text_encoder = CLIPTextModel.from_pretrained(
    "openai/clip-vit-large-patch14", torch_dtype=DTYPE
)
tokenizer = CLIPTokenizer.from_pretrained(
    "openai/clip-vit-large-patch14", torch_dtype=DTYPE
)
text_encoder_2 = T5EncoderModel.from_pretrained(
    BFL_REPO, subfolder="text_encoder_2", torch_dtype=DTYPE, revision=REVISION
)
tokenizer_2 = T5TokenizerFast.from_pretrained(
    BFL_REPO, subfolder="tokenizer_2", torch_dtype=DTYPE, revision=REVISION
)
vae = AutoencoderKL.from_pretrained(
    BFL_REPO, subfolder="vae", torch_dtype=DTYPE, revision=REVISION
)
transformer = FluxTransformer2DModel.from_pretrained(
    BFL_REPO, subfolder="transformer", torch_dtype=DTYPE, revision=REVISION
)

# Quantize the two largest components to float8 and freeze their weights.
quantize(transformer, weights=qfloat8)
freeze(transformer)

quantize(text_encoder_2, weights=qfloat8)
freeze(text_encoder_2)

# Build the pipeline without the quantized modules, then attach them afterwards.
pipe = FluxPipeline(
    scheduler=scheduler,
    text_encoder=text_encoder,
    tokenizer=tokenizer,
    text_encoder_2=None,
    tokenizer_2=tokenizer_2,
    vae=vae,
    transformer=None,
)
pipe.text_encoder_2 = text_encoder_2
pipe.transformer = transformer
pipe.enable_model_cpu_offload()

Switching the attention ops away from SDPA by calling

pipe.transformer.set_default_attn_processor()

returns an error:

  1727     if name in modules:
   1728         return modules[name]
-> 1729 raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")

AttributeError: 'FluxTransformer2DModel' object has no attribute 'set_default_attn_processor'

Accessing pipe.transformer.config also fails.
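
As a diagnostic sketch (not a fix), one can check what the pipeline actually holds after quantization and which attention-related attributes still resolve; every name below comes from the reproduction above:

# Diagnostic sketch only: inspect the quantized transformer attached to the pipeline.
print(type(pipe.transformer))                                       # still a FluxTransformer2DModel?
print(hasattr(pipe.transformer, "config"))                          # does config resolve?
print(hasattr(pipe.transformer, "set_default_attn_processor"))      # the method this issue is about
print([name for name in dir(pipe.transformer) if "attn" in name])   # attention-related API that is present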

Reproduction

Latest diffusers (see the snippet above).

Logs

System Info

Latest diffusers version (0.33.0)
A100
pytorch/pytorch:2.4.1-cuda12.4-cudnn9-devel
accelerate, peft, and transformers are also all up to date

Moreover, torch.compile does not work either.
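
For reference, an invocation along these lines is what fails; the exact call is not shown in this report, so the mode and assignment below are illustrative assumptions:

import torch

# Hypothetical illustration: the report does not show the exact torch.compile call,
# so this is only one common way of compiling the transformer of a diffusers pipeline.
pipe.transformer = torch.compile(pipe.transformer, mode="reduce-overhead", fullgraph=False)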
