A standalone version of the next PyTorch ONNX exporter

PyTorch to ONNX Exporter

Experimental torch ONNX exporter. Compatible with torch>=2.1,<2.6.

Warning

This is an experimental project and is not designed for production use. Use torch.onnx.export(..., dynamo=True) for these purposes. The main logic from this project has been merged into PyTorch.

Installation

pip install --upgrade torch-onnx

Usage

import torch
import torch_onnx
from onnxscript import ir
import onnx

# Get an exported program with torch.export
exported = torch.export.export(...)
model = torch_onnx.exported_program_to_ir(exported)
proto = ir.to_proto(model)
onnx.save(proto, "model.onnx")

# Or patch the torch.onnx export API
# Set report=True to get a detailed error report if the export fails
torch_onnx.patch_torch(report=True, verify=True, profile=True)
torch.onnx.export(...)

# Use the analysis API to print an analysis report for unsupported ops
torch_onnx.analyze(exported)

Design

{dynamo/jit} -> {ExportedProgram} -> {torchlib} -> {ONNX IR} -> {ONNX}

  • Use ExportedProgram
    • Rely on robustness of the torch.export implementation
    • Reduce complexity in the exporter
    • This does not remove dynamo's limitations, but it avoids introducing additional breakage from running extra fx passes
  • Flat graph; Scope info as metadata, not functions
    • Because existing tools do not handle nested ONNX functions well
  • Eager optimization where appropriate
    • Because existing tools are not good at optimizing
  • Drop-in replacement for torch.onnx.export
    • Minimum migration effort
  • Build graph eagerly in the exporter
    • Give the exporter full control over the graph being built
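The flat-graph and eager-building points above can be sketched as follows. This is a hypothetical illustration, not the actual ONNX IR API: the `Node`, `Graph`, and `add_node` names are made up to show how scope can live in metadata on a single flat node list instead of in nested functions.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    op: str
    inputs: list
    outputs: list
    metadata: dict = field(default_factory=dict)  # e.g. originating module scope

@dataclass
class Graph:
    nodes: list = field(default_factory=list)

    def add_node(self, op, inputs, outputs, scope=""):
        # The node is appended to one flat list as soon as it is created;
        # the module scope is recorded as metadata rather than by nesting
        # the node inside a function for that submodule.
        node = Node(op, inputs, outputs, {"namespace": scope})
        self.nodes.append(node)
        return node

graph = Graph()
graph.add_node("Gemm", ["x", "w", "b"], ["h"], scope="model/fc")
graph.add_node("Relu", ["h"], ["y"], scope="model/act")
print([(n.op, n.metadata["namespace"]) for n in graph.nodes])
```

Keeping the graph flat means downstream tools only ever see one node list, while the scope metadata still preserves where each node came from.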

Why is this doable?