
examples/industrial_data_pretraining/fsmn_vad_streaming/export.py export error #2313

Open
Culturenotes opened this issue Dec 16, 2024 · 2 comments

@Culturenotes

funasr version: 1.2.0.
Check update of funasr, and it would cost few times. You may disable it by set disable_update=True in AutoModel
You are using the latest version of funasr-1.2.0
Downloading Model to directory: C:\Users\50510\.cache\modelscope\hub\iic/speech_fsmn_vad_zh-cn-16k-common-pytorch
2024-12-16 18:56:25,122 - modelscope - WARNING - Using branch: master as version is unstable, use with caution
Traceback (most recent call last):
File "c:\Users\50510\Downloads\ASR-LLM-TTS-master\vad\a.py", line 5, in
res = model.export(type="onnx", quantize=False)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\funasr\auto\auto_model.py", line 664, in export
export_dir = export_utils.export(model=model, data_in=data_list, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\funasr\utils\export_utils.py", line 24, in export
_onnx(
File "C:\Users\50510\anaconda3\Lib\site-packages\funasr\utils\export_utils.py", line 80, in _onnx
torch.onnx.export(
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\onnx\utils.py", line 551, in export
_export(
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\onnx\utils.py", line 1648, in _export
graph, params_dict, torch_out = _model_to_graph(
^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\onnx\utils.py", line 1170, in _model_to_graph
graph, params, torch_out, module = _create_jit_graph(model, args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\onnx\utils.py", line 1046, in _create_jit_graph
graph, torch_out = _trace_and_get_graph_from_model(model, args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\onnx\utils.py", line 950, in _trace_and_get_graph_from_model
trace_graph, torch_out, inputs_states = torch.jit._get_trace_graph(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\jit_trace.py", line 1497, in _get_trace_graph
outs = ONNXTracedModule(
^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\nn\modules\module.py", line 1553, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\nn\modules\module.py", line 1562, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\jit_trace.py", line 141, in forward
graph, out = torch._C._create_graph_by_tracing(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\jit_trace.py", line 132, in wrapper
outs.append(self.inner(*trace_inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\nn\modules\module.py", line 1553, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\nn\modules\module.py", line 1562, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\nn\modules\module.py", line 1543, in _slow_forward
result = self.forward(*input, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\funasr\models\fsmn_vad_streaming\export_meta.py", line 28, in export_forward
scores, out_caches = self.encoder(feats, *args)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\nn\modules\module.py", line 1553, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\nn\modules\module.py", line 1562, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\torch\nn\modules\module.py", line 1543, in _slow_forward
result = self.forward(*input, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\50510\anaconda3\Lib\site-packages\funasr\models\fsmn_vad_streaming\encoder.py", line 333, in forward
in_cache = args[i]
~~~~^^^
IndexError: tuple index out of range
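
Reading the trace, the export wrapper appears to forward *args unchanged into the encoder, which then indexes one cache tensor per FSMN layer; if the dummy inputs built for tracing contain no caches, args is empty and args[i] fails. A minimal self-contained sketch of that pattern (illustrative only, not FunASR source; the function name and layer count are made up):

# Sketch of the failure pattern seen in the traceback above.
def encoder_forward(feats, *args):
    # One cache tensor is expected per FSMN layer via positional args.
    caches = []
    for i in range(4):  # hypothetical number of FSMN layers
        in_cache = args[i]  # raises IndexError when no caches were passed
        caches.append(in_cache)
    return feats, caches

encoder_forward("dummy_feats")  # -> IndexError: tuple index out of range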

@Culturenotes Culturenotes added the bug Something isn't working label Dec 16, 2024
@LauraGPT LauraGPT self-assigned this Dec 16, 2024
@slin000111

torch                         2.3.1
onnx                          1.17.0
funasr                        1.1.16

Export succeeded.
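
If an export is needed right away, pinning to the version set reported above is one possible stopgap (these pins simply restate the working combination listed above, not an official recommendation):

pip install funasr==1.1.16 torch==2.3.1 onnx==1.17.0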

pip install funasr -U

The error is as follows:

Traceback (most recent call last):
  File "/mnt/workspace/export_vad.py", line 1, in <module>
    from funasr import AutoModel
  File "/usr/local/lib/python3.10/site-packages/funasr/__init__.py", line 39, in <module>
    from funasr.auto.auto_model import AutoModel
  File "/usr/local/lib/python3.10/site-packages/funasr/auto/auto_model.py", line 30, in <module>
    from funasr.utils import export_utils
  File "/usr/local/lib/python3.10/site-packages/funasr/utils/export_utils.py", line 5, in <module>
    from onnxconverter_common import float16
ModuleNotFoundError: No module named 'onnxconverter_common'
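
This import error is just a missing dependency; assuming the PyPI package matching the import in funasr/utils/export_utils.py is onnxconverter-common, installing it should get past this step:

pip install onnxconverter-common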
torch                             2.4.0
onnx                              1.17.0
funasr                            1.2.0

The error is as follows:

Traceback (most recent call last):
  File "/mnt/workspace/ms_issue/export_vad.py", line 13, in <module>
    res = model.export(type="onnx", quantize=False)
  File "/mnt/workspace/FunASR/funasr/auto/auto_model.py", line 664, in export
    export_dir = export_utils.export(model=model, data_in=data_list, **kwargs)
  File "/mnt/workspace/FunASR/funasr/utils/export_utils.py", line 24, in export
    _onnx(
  File "/mnt/workspace/FunASR/funasr/utils/export_utils.py", line 80, in _onnx
    torch.onnx.export(
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/onnx/utils.py", line 551, in export
    _export(
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/onnx/utils.py", line 1648, in _export
    graph, params_dict, torch_out = _model_to_graph(
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/onnx/utils.py", line 1170, in _model_to_graph
    graph, params, torch_out, module = _create_jit_graph(model, args)
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/onnx/utils.py", line 1046, in _create_jit_graph
    graph, torch_out = _trace_and_get_graph_from_model(model, args)
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/onnx/utils.py", line 950, in _trace_and_get_graph_from_model
    trace_graph, torch_out, inputs_states = torch.jit._get_trace_graph(
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/jit/_trace.py", line 1497, in _get_trace_graph
    outs = ONNXTracedModule(
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/jit/_trace.py", line 141, in forward
    graph, out = torch._C._create_graph_by_tracing(
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/jit/_trace.py", line 132, in wrapper
    outs.append(self.inner(*trace_inputs))
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1543, in _slow_forward
    result = self.forward(*input, **kwargs)
  File "/mnt/workspace/FunASR/funasr/models/fsmn_vad_streaming/export_meta.py", line 28, in export_forward
    scores, out_caches = self.encoder(feats, *args)
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/mnt/workspace/miniconda3/envs/funasr_export/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1543, in _slow_forward
    result = self.forward(*input, **kwargs)
  File "/mnt/workspace/FunASR/funasr/models/fsmn_vad_streaming/encoder.py", line 333, in forward
    in_cache = args[i]
IndexError: tuple index out of range

The test code is as follows:

from funasr import AutoModel

model = AutoModel(model="iic/speech_fsmn_vad_zh-cn-16k-common-pytorch")

res = model.export(type="onnx", quantize=False)
print(res)

@LauraGPT
Collaborator

Bugfix: 1e5ef6e
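
For anyone hitting this before the fix ships in a release, installing FunASR from the GitHub main branch is one way to pick up the referenced commit (a suggestion, not something stated in the thread):

pip install -U git+https://github.com/modelscope/FunASR.git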
