IREE's Input Dialect and ML Frontends ODM?

Hi folks, over the past month on the IREE side, we have completed our frontend layering fixup so that all of our ML frontend integrations do not depend on IREE proper. This involved both build/repo maintenance and finishing the last bits of IR layering needed to disconnect them from certain concepts that were only represented in IREE internal dialects. We did this for our own reasons: it lets us build the core compiler in a separable way with tight dependencies and defined boundaries. However, in the back of my mind, I also had the agenda of getting this separated enough that the MLIR community could grok the layers and we could discuss whether it is time to start more of a central ML/Frontends sub-project or set of dialects.

I’m not going to claim perfection on what we have, but I will claim a fairly comprehensive starting point that covers a wide swath of programs coming from:

  • TFLite: See tests (currently being upstreamed to our main repo/test suite). This was a joint effort between IREE and the folks at ARM working on TOSA.
  • Torch-MLIR: Now at the level where it is running its first non-trivial programs on IREE.
  • JAX: Pure-Python integration for JAX-to-IREE compilation and runtime.
  • PyDM: Experimental MLIR dialect and Python interface for extracting Python programs at the frontend level. The intended use is to robustly represent native Python semantics in a lossless way for frontends that include them.
  • TensorFlow: This was our original integration and has the most history to it. It was also the last to get its layering cleaned up from our side (which just landed this week).

To make such a set of things reusable, layering is key. Is there interest from the community in an ODM presentation where I document and walk through the rationale and details of each integration, highlighting upstream opportunities? We’ve got a lot of history here, and I would rather present a comprehensive view than have a disconnected discussion of individual details on the list.

One of the big topics of such a presentation would be a walk-through of IREE’s Input Dialect, a standalone dialect (it can literally be copied out with cp and has no dependencies on anything else in the project) that contains “the rest of the story” we have needed to surface over time to represent everything. These are “thin” ops that we transform at the beginning of our pipeline into a variety of internal constructs, and they are not opinion-free: some of them represent specific opinions of IREE’s runtime model and carry additional semantics beyond anything in MLIR core. We will probably always have an “iree_input” dialect as an overflow area for new things, but I would like to start flushing the queue on as much of this as possible and seeing whether it has a better home upstream.

Please let me know if there is interest and I will schedule something.
Stella


Seems like a great topic! Thanks for offering to share 🙂

I’ve dropped a slot on the agenda doc for 2/17 to talk about this.
