
Is there an export path from TensorFlow models to VHLO? #2708

Open
ScottTodd opened this issue Feb 5, 2025 · 3 comments

@ScottTodd (Member)

Request

Is there an official/supported path from TensorFlow models to StableHLO (ideally using VHLO for backward compatibility guarantees)?

I see these tutorials already:

I scanned through https://github.com/tensorflow/tensorflow and https://github.com/openxla/stablehlo looking for a VHLO export path but did not find one. Ideally this would be an export API in a readily available Python package like tensorflow or stablehlo (if published to PyPI and available on Linux + macOS + Windows).

Background

I'm one of the maintainers on the IREE project (https://iree.dev/). We've supported compiling TensorFlow programs since the early days of MHLO in XLA/TF, through the migration to StableHLO, and up to today. We maintain a tool for exporting TensorFlow SavedModel programs to StableHLO that our compiler can consume, published in our https://pypi.org/project/iree-tools-tf/ package. That tool originally had a source dependency on TensorFlow's C++ code; we later migrated it to [experimental] Python APIs.

Today we use these APIs from https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/pywrap_mlir.py / https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/compiler/mlir/mlir.py:

The latest source is here: https://github.com/iree-org/iree/blob/main/integrations/tensorflow/python_projects/iree_tf/iree/tools/tf/scripts/iree_import_tf/__main__.py. That export path has been "working", but I see it emitting stablehlo rather than vhlo. We'd rather direct users to a non-experimental API from upstream sources with stronger compatibility properties than maintain our own import pipeline built on experimental APIs with a constant risk of version drift.
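
Roughly, that import path looks like this (a sketch based on the script linked above; the pywrap_mlir entry points are TensorFlow-internal and experimental, so names and signatures may drift between releases):

```python
# Sketch of our current (experimental) import path. The pass pipeline
# name below is the one used in our script; all of these APIs are
# experimental TF internals.
from tensorflow.python import pywrap_mlir

SAVED_MODEL_PATH = "/path/to/saved_model"  # placeholder

# Import the SavedModel into MLIR (TF dialect).
module_text = pywrap_mlir.experimental_convert_saved_model_to_mlir(
    SAVED_MODEL_PATH, exported_names="", show_debug_info=False
)

# Lower the TF dialect to StableHLO (note: stablehlo, not vhlo).
module_text = pywrap_mlir.experimental_run_pass_pipeline(
    module_text, "tf-lower-to-mlprogram-and-hlo", show_debug_info=False
)

# Write the result as MLIR bytecode.
pywrap_mlir.experimental_write_bytecode("/tmp/model.mlirbc", module_text)
```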

In iree-org/iree#19852, I tried following the instructions in https://openxla.org/stablehlo/compatibility to serialize portable artifacts from the stablehlo that we do emit, but I ran into opaque "ValueError: failed to serialize module" errors, possibly because the exported modules contain both stablehlo and ml_program operations, or because the IR was produced by mismatched versions of TF and StableHLO. Piecing together a functional pipeline from disconnected packages is tricky, so I'd rather depend on an official API that has all the right code available in a single process.
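
Concretely, what I tried looked roughly like this (a sketch following that guide, using the stablehlo package's Python bindings; exact binding names may differ by version):

```python
# Sketch of my serialization attempt per the compatibility guide.
from mlir import ir
from mlir.dialects import stablehlo

# Placeholder path: StableHLO text produced by our TF export.
stablehlo_text = open("/tmp/model.stablehlo.mlir").read()

with ir.Context() as ctx:
    stablehlo.register_dialect(ctx)
    module = ir.Module.parse(stablehlo_text)
    # This step failed with "ValueError: failed to serialize module"
    # on our exports (which mix stablehlo and ml_program ops).
    artifact = stablehlo.serialize_portable_artifact(
        module, stablehlo.get_current_version()
    )
```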

We may remove our usage of the experimental APIs soon (iree-org/iree#19917) and drop support for TensorFlow entirely if no replacement is available in upstream (TF/StableHLO) packages. At that point our only usage of StableHLO would be for JAX and other sources of StableHLO programs, and given low usage that might mean dropping StableHLO support entirely. I'd like to keep supporting these frameworks and StableHLO if possible, since supporting a variety of program sources is generally a good way to keep code quality high.

@GleasonK (Member) commented Feb 6, 2025

I believe @sdasgup3 worked on an API packaged with TF. There is also experimental_get_compiler_ir, which can emit stablehlo from a tf.function (name may be a typo, will confirm shortly).
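
For reference, the call pattern looks roughly like this (a sketch; the available stage names, including whether a stablehlo stage exists, depend on the TF version):

```python
import tensorflow as tf

@tf.function(jit_compile=True)
def f(x):
    return x * 2.0 + 1.0

x = tf.constant([1.0, 2.0, 3.0])

# Returns compiler IR for the function at the given stage; "hlo" is
# long-standing, and newer TF versions may also accept "stablehlo".
print(f.experimental_get_compiler_ir(x)(stage="hlo"))
```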

@GleasonK (Member) commented Feb 6, 2025

https://github.com/tensorflow/tensorflow/blob/b4885f395f65e5418f02e90a73a0ac6d6e288113/tensorflow/compiler/mlir/tensorflow_to_stablehlo/python/integration_test/tensorflow_to_stablehlo_test.py#L67

There's the TF to StableHLO bytecode path. As for the VHLO part, you can use the StableHLO bindings built with TF for that. It's probably safe to pass "1.0.0" as the target version for TF workloads; I don't think any post-1.0 features have been integrated into TF (you may see newer features if using JAX2TF-created SavedModels):

https://github.com/tensorflow/tensorflow/blob/b4885f395f65e5418f02e90a73a0ac6d6e288113/tensorflow/compiler/tests/xla_call_module_test.py#L41
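
Putting those two pointers together, the flow would look roughly like this (a sketch based on the linked tests; the module paths and signatures are TF internals and may shift between versions, and it assumes the string-based serialize overload accepts MLIR bytecode):

```python
# SavedModel -> StableHLO bytecode, per tensorflow_to_stablehlo_test.py.
from tensorflow.compiler.mlir.tensorflow_to_stablehlo.python import (
    pywrap_tensorflow_to_stablehlo as tensorflow_to_stablehlo,
)
# StableHLO bindings built with TF, per xla_call_module_test.py.
from tensorflow.compiler.mlir.stablehlo import stablehlo

# Convert the SavedModel; returns StableHLO as MLIR bytecode.
module_bytes = tensorflow_to_stablehlo.savedmodel_to_stablehlo(
    input_path="/path/to/saved_model",  # placeholder
)

# Serialize to a VHLO portable artifact, pinning "1.0.0" as suggested above.
artifact = stablehlo.serialize_portable_artifact(module_bytes, "1.0.0")
```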

@ScottTodd (Member, Author)

Awesome! That gives me what I need, thanks so much!

Tested a bit here: https://colab.research.google.com/gist/ScottTodd/d7c5328a0a383cd8cc0dc2a2da827f7f/tf-to-vhlo-to-iree.ipynb (merged those code samples with https://colab.research.google.com/github/iree-org/iree/blob/main/samples/colab/tensorflow_hub_import.ipynb)

  1. Download a pretrained SavedModel
  2. Preprocess the model to give it a serving signature
  3. Convert to StableHLO
  4. Serialize to VHLO (MLIR bytecode)
  5. Install the IREE tools
  6. Dump the VHLO bytecode to text for testing
  7. Compile using the IREE compiler (accepting the VHLO input)
  8. Run using the IREE runtime
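
In Python, steps 5-8 look roughly like this (a sketch using the IREE compiler/runtime Python APIs; the exported function name and input shape here are hypothetical placeholders):

```python
import numpy as np
import iree.compiler as ireec
import iree.runtime as ireert

# Compile the serialized VHLO bytecode from step 4 for CPU.
vmfb = ireec.compile_file(
    "/tmp/model.vhlo.mlirbc",      # placeholder path
    target_backends=["llvm-cpu"],
    input_type="stablehlo",        # IREE's stablehlo input pipeline accepts VHLO
)

# Load the compiled module and invoke it.
module = ireert.load_vm_flatbuffer(vmfb, driver="local-task")
result = module.main(np.zeros((1, 224, 224, 3), np.float32))
```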
