Failed to export onnx model from flow #399

Open
zhangyike opened this issue Sep 14, 2024 · 1 comment

Comments

@zhangyike

env:
torch.__version__ = '2.0.1+cu118'
onnx.__version__ = '1.16.0'
command:
python cosyvoice/bin/export_onnx.py --model_dir $dir
error logs:
/root/miniconda3/envs/cosyvoice/lib/python3.8/site-packages/diffusers/models/attention_processor.py:645: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
if current_length != target_length:
/root/miniconda3/envs/cosyvoice/lib/python3.8/site-packages/diffusers/models/attention_processor.py:660: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
if attention_mask.shape[0] < batch_size * head_size:
/apdcephfs_cq8/share_784792/users/yikezhang/public_repo/CosyVoice/examples/libritts/cosyvoice/cosyvoice/bin/../../third_party/Matcha-TTS/matcha/models/components/decoder.py:149: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
assert inputs.shape[1] == self.channels
============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 1 ERROR ========================
ERROR: missing-standard-symbolic-function

Exporting the operator 'aten::scaled_dot_product_attention' to ONNX opset version 18 is not supported. Please feel free to request support or submit a pull request on PyTorch GitHub: https://github.com/pytorch/pytorch/issues.
None

Traceback (most recent call last):
File "cosyvoice/bin/export_onnx.py", line 112, in
main()
File "cosyvoice/bin/export_onnx.py", line 68, in main
torch.onnx.export(
File "/root/miniconda3/envs/cosyvoice/lib/python3.8/site-packages/torch/onnx/utils.py", line 506, in export
_export(
File "/root/miniconda3/envs/cosyvoice/lib/python3.8/site-packages/torch/onnx/utils.py", line 1548, in _export
graph, params_dict, torch_out = _model_to_graph(
File "/root/miniconda3/envs/cosyvoice/lib/python3.8/site-packages/torch/onnx/utils.py", line 1117, in _model_to_graph
graph = _optimize_graph(
File "/root/miniconda3/envs/cosyvoice/lib/python3.8/site-packages/torch/onnx/utils.py", line 665, in _optimize_graph
graph = _C._jit_pass_onnx(graph, operator_export_type)
File "/root/miniconda3/envs/cosyvoice/lib/python3.8/site-packages/torch/onnx/utils.py", line 1901, in _run_symbolic_function
raise errors.UnsupportedOperatorError(
torch.onnx.errors.UnsupportedOperatorError: Exporting the operator 'aten::scaled_dot_product_attention' to ONNX opset version 18 is not supported. Please feel free to request support or submit a pull request on PyTorch GitHub: https://github.com/pytorch/pytorch/issues.
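For reference, a minimal sketch that reproduces the same exporter error outside of CosyVoice (my assumptions: torch 2.0.x and opset 18; TinyAttn is a hypothetical stand-in for any module that calls F.scaled_dot_product_attention, which is what the diffusers attention processor shown in the warnings above ends up calling):

import torch
import torch.nn.functional as F

class TinyAttn(torch.nn.Module):
    def forward(self, q, k, v):
        # Same op that the flow decoder's attention layers hit during tracing.
        return F.scaled_dot_product_attention(q, k, v)

q = k = v = torch.randn(1, 4, 16, 8)
# On torch 2.0.x this raises the same UnsupportedOperatorError, because the
# TorchScript exporter has no symbolic function for aten::scaled_dot_product_attention.
torch.onnx.export(TinyAttn(), (q, k, v), "tiny_attn.onnx", opset_version=18)

So the failure comes from the exporter itself, not from the flow model's weights or config.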

@aluminumbox
Collaborator

Use torch==2.2+, but note that the ONNX model's RTF (real-time factor) is not very stable.
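
If you need to stay on torch 2.0.x, a possible workaround (my own sketch, not something the repo ships) is to monkey-patch F.scaled_dot_product_attention with an equivalent matmul/softmax implementation before calling torch.onnx.export, so the traced graph only contains ops that already have ONNX symbolic functions. sdpa_fallback below is a hypothetical helper; dropout is ignored since export runs in eval mode:

import math
import torch
import torch.nn.functional as F

def sdpa_fallback(query, key, value, attn_mask=None, dropout_p=0.0, is_causal=False):
    # Same math as F.scaled_dot_product_attention, written with exportable ops
    # (matmul, softmax, masked_fill). dropout_p is intentionally ignored here.
    scale = 1.0 / math.sqrt(query.size(-1))
    scores = torch.matmul(query, key.transpose(-2, -1)) * scale
    if is_causal:
        L, S = query.size(-2), key.size(-2)
        causal = torch.tril(torch.ones(L, S, device=query.device)).bool()
        scores = scores.masked_fill(~causal, float("-inf"))
    if attn_mask is not None:
        if attn_mask.dtype == torch.bool:
            scores = scores.masked_fill(~attn_mask, float("-inf"))
        else:
            scores = scores + attn_mask
    return torch.matmul(torch.softmax(scores, dim=-1), value)

# Patch before building the trace, then run the export script as usual.
F.scaled_dot_product_attention = sdpa_fallback

Numerics should match the fused kernel closely, but this path is slower and untested against export_onnx.py, so upgrading torch as suggested above is the simpler fix.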
