[RFP] Propose a better name for the Linalg dialect

This post introduces the rationale for the Linalg dialect.

One point of discussion is that the Linalg name is not a representative name for these StructuredOps abstractions:

Given the evolution of the scope, it becomes apparent that a better name than “Linalg” could remove some of the confusion related to the dialect (and the underlying approach), its goals, and its limitations.

So far, @antiagainst proposed the following to get the discussion going:

  • Structured payload container ops ( spc ): emphasizes the structured-op principle and echoes the payload-ops perspective.
  • Parallel domain innate ops ( pdi ): emphasizes the point that the loop iterators are built into the op structurally (a sketch of such an op follows this list).
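For readers less familiar with the dialect, here is a minimal sketch (my illustration, not part of the proposals above) of the op shape these names are describing: a matmul written as a `linalg.generic`, where the indexing maps and iterator types make the loop structure innate to the op and the region carries the scalar payload. The exact textual syntax varies across MLIR versions.

```mlir
// Illustrative only: a matmul as a structured op. The indexing maps and
// iterator_types encode the (i, j, k) loop nest directly on the op, and the
// region is the "payload" computed at each point of the iteration domain.
#mapA = affine_map<(i, j, k) -> (i, k)>
#mapB = affine_map<(i, j, k) -> (k, j)>
#mapC = affine_map<(i, j, k) -> (i, j)>

func.func @matmul(%A: tensor<?x?xf32>, %B: tensor<?x?xf32>,
                  %C: tensor<?x?xf32>) -> tensor<?x?xf32> {
  %0 = linalg.generic
      {indexing_maps = [#mapA, #mapB, #mapC],
       iterator_types = ["parallel", "parallel", "reduction"]}
      ins(%A, %B : tensor<?x?xf32>, tensor<?x?xf32>)
      outs(%C : tensor<?x?xf32>) {
  ^bb0(%a: f32, %b: f32, %c: f32):
    %prod = arith.mulf %a, %b : f32
    %sum = arith.addf %c, %prod : f32
    linalg.yield %sum : f32
  } -> tensor<?x?xf32>
  return %0 : tensor<?x?xf32>
}
```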

If it helps, I am personally thinking of this as “Tensor Comprehensions in the IR”, so a sort of TensorComprehensions++ (tcpp?).

Others have floated the name HLO++ in the past.

Would anyone care to propose other names?

Thank you!

tcpp seems the most descriptive to me. spc is too general (it can be applied to many other domains) and pdi focuses on the parallel aspect, which does not describe well what Linalg is, as far as I understand.

Any discussion I was involved in that mentioned “HLO++” was about extensions to HLO in MLIR, but with HLO as the starting point. That would be a clear superset of HLO with the same design roots, so I wouldn’t use this name for Linalg.

What about STC for Structured Tensor Comprehensions? It follows the Structured Ops terminology by narrowing the set of ops to tensor comprehensions.

Alternatively, Structured Tensor Ops or STOps, leading to the stops dialect :-), avoiding the comprehensions name, which comes with its own baggage, and accommodating future non-rectangular, sparse extensions.

Note: the name tensor comprehensions does not refer specifically to the Tensor Comprehensions language, but more generally to an open-ended equational language capturing, among other things, tensor contractions and tensor convolutions in the hyper-rectangular dense case, as well as tensor concatenation and conditional/predicated definitions. It is open-ended in the sense that it may be extended in the future (non-rectangular, sparse, recurrences).
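To make the “equational” flavor concrete, here are two examples of the kind of equations meant (my illustration only; the notation is not prescriptive):

```latex
% Matrix multiplication as a reduction over k:
C(i, j) = \sum_{k} A(i, k) \cdot B(k, j)

% A 2-D convolution, reducing over input channel c and window offsets (r, s):
O(n, f, h, w) = \sum_{c, r, s} I(n, c, h + r, w + s) \cdot W(f, c, r, s)
```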

Or TTOps: Top-level Tensor Ops :-)

+1 to tcpp or some permutation of “tc”.

Agreed that “spc” is too general and applies to many things.

Other options in that theme:
  • “tcm” - tensor comprehensions in MLIR
  • “stc” - structured tensor comprehensions (somewhat redundant but draws some concepts together)

The reason I would lean toward a moniker like “tensor comprehensions” (beyond the lineage, and instead of a highly generic name) is that it establishes a nice brand without being exclusive of other things that might have a similar aim and a different approach. I think it is important not to sit on too much of the generic namespace.

Other questions:

  • currently, this all ties together nicely for dense tc-like things. Do you anticipate the same moniker applying to sparse, or will/should that be a different enough abstraction to warrant a different name? If so, you may want to include “dense” in the name, or pick something that lets you tack “sparse” onto it as you expand in that direction.
  • do we want to enforce some naming distance between what we’ve been discussing as “tensor compute primitives” and this? The dialect-currently-known-as-linalg is a good candidate for something “tcp”-like, but I’m hesitant to say it is exclusively “tcp”. If we think there should be naming separation to better enable evolution, then “tcpp” is going to be confusing.

I have a strong preference for something that is not yet another TLA (three-letter acronym).
Linalg may not be perfect from a meaning perspective, but in my opinion it’s a whole lot better than any TLA.