[mlir] Add apply_patterns.linalg.generalize_pack_unpack TD Op #116373


Merged
@@ -41,6 +41,18 @@ def ApplyEraseUnnecessaryInputsPatternsOp : Op<Transform_Dialect,
let assemblyFormat = "attr-dict";
}

def ApplyGeneralizeTensorPackUnpackPatternsOp
Contributor:

Sorry for asking this question while the naming is following the existing style. I'm confused about the difference between Generalization and Decomposition. Should we unify the terminology? How do you interpret them?

(I could be the root cause who invented the terms when I wrote the patterns. And I feel that they could be renamed to decomposition now.)

Contributor Author:

> I'm confused about the difference between Generalization and Decomposition.

Same, and I am glad that you've raised this :) "Decompose" would make more sense to me - "Generalization" tends to be used in the context of Linalg named Ops (e.g. linalg.matmul -> linalg.generic). I am happy to rename these as "Decomposition" 👍🏻

Contributor:

Oh, that's probably what led me to use generalization in the first place -- because they were prototyped in linalg_ext, and I could have been thinking that linalg_ext -> linalg is a generalization. Anyway, I'm +1 on renaming it to decomposition!

Contributor Author (@banach-space, Nov 15, 2024):

Renamed in the 2nd commit ... that should probably become a separate PR. WDYT?

Contributor:

Having a separate PR sounds good to me. I was in meetings, and I'll take a look at your other replies after lunch.

Contributor:

Can you revert the last two commits in this case?

    : Op<Transform_Dialect, "apply_patterns.linalg.generalize_pack_unpack",
        [DeclareOpInterfaceMethods<PatternDescriptorOpInterface>]> {
  let description = [{
    Collect patterns to generalize tensor.pack and tensor.unpack (i.e. to
    decompose them into e.g. tensor::PadOp, linalg::TransposeOp, etc.). Requires
    all outer dims to be unit.
  }];

  let assemblyFormat = "attr-dict";
}
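To make the description concrete, here is a rough before/after sketch of the rewrite these patterns perform on a `tensor.pack` whose outer dims are all unit. Shapes, attribute spellings, and the exact op order are illustrative, not taken from this PR's tests:

```mlir
// Before: pack a 32x8 source into a single 1x1x32x8 tile.
%packed = tensor.pack %src inner_dims_pos = [0, 1] inner_tiles = [32, 8]
    into %dest : tensor<32x8xf32> -> tensor<1x1x32x8xf32>

// After (sketch): the pack is decomposed into a transpose of the source
// (here an identity permutation) inserted into the destination tile.
%empty = tensor.empty() : tensor<32x8xf32>
%tr = linalg.transpose ins(%src : tensor<32x8xf32>)
    outs(%empty : tensor<32x8xf32>) permutation = [0, 1]
%res = tensor.insert_slice %tr into %dest[0, 0, 0, 0] [1, 1, 32, 8] [1, 1, 1, 1]
    : tensor<32x8xf32> into tensor<1x1x32x8xf32>
```

A `tensor.pad` of the source would additionally appear when the input needs padding up to the tile size.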

def ApplyFoldUnitExtentDimsViaReshapesPatternsOp : Op<Transform_Dialect,
"apply_patterns.linalg.fold_unit_extent_dims_via_reshapes",
[DeclareOpInterfaceMethods<PatternDescriptorOpInterface>]> {
9 changes: 7 additions & 2 deletions mlir/include/mlir/Dialect/Linalg/Transforms/Transforms.h
@@ -1516,8 +1516,8 @@ struct GeneralizePadOpPattern : public OpRewritePattern<tensor::PadOp> {
};

/// Rewrites a tensor::PackOp into a sequence of:
-///  * tensor::PadOp + linalg::TransposeOp +
-///    tensor::EmptyOp + tensor::InsertSliceOp ops.
+///  * tensor::PadOp + linalg::TransposeOp + tensor::EmptyOp +
+///    tensor::InsertSliceOp ops.
///
/// Requires that all the outer dims of the input tensor::PackOp are 1.
///
@@ -1683,6 +1683,11 @@ void populateLinalgGenericOpsSpecializationPatterns(
void populateDecomposeConvolutionPatterns(RewritePatternSet &patterns,
PatternBenefit benefit = 1);

/// Populates patterns to decompose tensor.pack and tensor.unpack ops into e.g.
/// tensor.pad, linalg.transpose, tensor.{insert|extract}_slice. Requires all
/// outer dims to be unit.
void populateGeneralizePatterns(RewritePatternSet &patterns);

/// Populates patterns to transform linalg.conv_2d_xxx operations into
/// linalg.generic (for img2col packing) and linalg.matmul.
/// \see rewriteInIm2Col for more details.
5 changes: 5 additions & 0 deletions mlir/lib/Dialect/Linalg/TransformOps/LinalgTransformOps.cpp
@@ -229,6 +229,11 @@ void transform::ApplyEraseUnnecessaryInputsPatternsOp::populatePatterns(
linalg::populateEraseUnnecessaryInputsPatterns(patterns);
}

void transform::ApplyGeneralizeTensorPackUnpackPatternsOp::populatePatterns(
    RewritePatternSet &patterns) {
  linalg::populateGeneralizePatterns(patterns);
}

void transform::ApplyFoldUnitExtentDimsViaReshapesPatternsOp::populatePatterns(
    RewritePatternSet &patterns) {
  linalg::ControlDropUnitDims options;
5 changes: 5 additions & 0 deletions mlir/lib/Dialect/Linalg/Transforms/Transforms.cpp
@@ -1618,3 +1618,8 @@ void linalg::populateDecomposeConvolutionPatterns(RewritePatternSet &patterns,
DownscaleSizeOneWindowed2DConvolution<PoolingNchwMaxOp, PoolingNcwMaxOp>>(
patterns.getContext(), benefit);
}

void linalg::populateGeneralizePatterns(RewritePatternSet &patterns) {
  // TODO: Add and test patterns for tensor.unpack.
  patterns.add<GeneralizeOuterUnitDimsPackOpPattern>(patterns.getContext());
}
3 changes: 1 addition & 2 deletions mlir/test/Dialect/Linalg/generalize-tensor-pack.mlir
@@ -1,5 +1,4 @@
// RUN: mlir-opt -split-input-file --test-linalg-transform-patterns="test-generalize-tensor-pack" %s | FileCheck %s

// RUN: mlir-opt --transform-preload-library='transform-library-paths=%p/td/generalize-pack.mlir' -split-input-file --transform-interpreter %s | FileCheck %s

func.func @simple_KCRS_to_KCRSsr(%arg0: tensor<?x?xi32>, %arg1: tensor<1x1x?x1xi32>) -> tensor<1x1x?x1xi32> {
  %c8 = arith.constant 8 : index
2 changes: 2 additions & 0 deletions mlir/test/Dialect/Linalg/lit.local.cfg
@@ -0,0 +1,2 @@
# Skip the directory with input TD sequences
config.excludes = ["td"]
12 changes: 12 additions & 0 deletions mlir/test/Dialect/Linalg/td/generalize-pack.mlir
@@ -0,0 +1,12 @@
module @transforms attributes { transform.with_named_sequence } {
  transform.named_sequence @__transform_main(%module: !transform.any_op {transform.readonly}) {
    %pack = transform.structured.match ops{["tensor.pack"]} in %module : (!transform.any_op) -> !transform.any_op

    %1 = transform.get_parent_op %pack {isolated_from_above} : (!transform.any_op) -> !transform.any_op
    transform.apply_patterns to %1 {
      transform.apply_patterns.linalg.generalize_pack_unpack
    } : !transform.any_op

    transform.yield
  }
}
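For context on the sequence above: `transform.apply_patterns` requires its target to be isolated from above, which is why the sequence walks from the matched `tensor.pack` up to its enclosing isolated parent via `transform.get_parent_op` before applying the patterns. A minimal payload this sequence would act on might look like the following sketch (illustrative, not taken from the test file):

```mlir
func.func @pack_example(%src: tensor<32x8xf32>,
                        %dest: tensor<1x1x32x8xf32>) -> tensor<1x1x32x8xf32> {
  // Matched by transform.structured.match ops{["tensor.pack"]}; the
  // enclosing func.func is the isolated parent the patterns are applied to.
  %0 = tensor.pack %src inner_dims_pos = [0, 1] inner_tiles = [32, 8]
      into %dest : tensor<32x8xf32> -> tensor<1x1x32x8xf32>
  return %0 : tensor<1x1x32x8xf32>
}
```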