Closed
Description
Add Link
https://pytorch.org/tutorials/advanced/super_resolution_with_onnxruntime.html
Describe the bug
Running
# Input to the model
x = torch.randn(batch_size, 1, 224, 224, requires_grad=True)
torch_out = torch_model(x)

# Export the model
torch.onnx.export(torch_model,               # model being run
                  x,                         # model input (or a tuple for multiple inputs)
                  "super_resolution.onnx",   # where to save the model (can be a file or file-like object)
                  export_params=True,        # store the trained parameter weights inside the model file
                  opset_version=10,          # the ONNX version to export the model to
                  do_constant_folding=True,  # whether to execute constant folding for optimization
                  input_names=['input'],     # the model's input names
                  output_names=['output'],   # the model's output names
                  dynamic_axes={'input': {0: 'batch_size'},    # variable length axes
                                'output': {0: 'batch_size'}})
results in
ModuleNotFoundError                       Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/torch/onnx/_internal/onnx_proto_utils.py in _add_onnxscript_fn(model_bytes, custom_opsets)
    220     try:
--> 221         import onnx
    222     except ImportError as e:

ModuleNotFoundError: No module named 'onnx'

The above exception was the direct cause of the following exception:

OnnxExporterError                         Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/torch/onnx/_internal/onnx_proto_utils.py in _add_onnxscript_fn(model_bytes, custom_opsets)
    221         import onnx
    222     except ImportError as e:
--> 223         raise errors.OnnxExporterError("Module onnx is not installed!") from e
    224
    225     # For > 2GB model, onnx.load_fromstring would fail. However, because

OnnxExporterError: Module onnx is not installed!
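As the traceback shows, `torch.onnx.export` imports the `onnx` package while serializing the model, so the export fails on a runtime where that package is missing. A minimal preflight check, as a sketch (the helper name and message text are illustrative, not part of the tutorial):

```python
import importlib.util

def ensure_onnx_installed() -> bool:
    """Return True if the `onnx` package is importable.

    torch.onnx.export needs `onnx` at serialization time; on a fresh
    Colab runtime it may be absent until you run `pip install onnx`.
    """
    return importlib.util.find_spec("onnx") is not None

if not ensure_onnx_installed():
    print("Missing dependency: run `pip install onnx` before calling torch.onnx.export")
```

Installing `onnx` into the runtime (e.g. `pip install onnx` in a Colab cell) and re-running the export cell would be the expected workaround.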
Describe your environment
Colab
CUDA/GPU
Torch 2.1.0+cu118
cc @sekyondaMeta @svekars @carljparker @NicolasHug @kit1980 @subramen