ONNX version of open-ai/whisper large model #4723
Unanswered · Kirankumar2609 asked this question in Q&A
Replies: 1 comment
-
Once you have an ONNX file, you can use ONNX Runtime to compute predictions (https://onnxruntime.ai/docs/). The documentation for the Python bindings is here: https://onnxruntime.ai/docs/api/python/tutorial.html.
-
Has anyone tried to export the open-ai/whisper large model to ONNX and then use it for transcription/translation, as described on the Hugging Face page below?
https://huggingface.co/docs/transformers/serialization#configuration-based-approach
I need help getting predictions from the whisper_model.onnx file.
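As a hedged note on the export step itself: besides the configuration-based approach on the linked page, a common route for encoder-decoder models like Whisper is the Optimum exporter. The model id and output directory below are illustrative, and `optimum[exporters]` is assumed to be installed.

```shell
# Sketch: export a Hugging Face model to ONNX with Optimum (assumes
# `pip install "optimum[exporters]"`; model id and output dir are illustrative).
optimum-cli export onnx --model openai/whisper-large whisper_onnx/
```

The exporter writes one or more `.onnx` files (for Whisper, typically separate encoder and decoder graphs) into the output directory, which can then be loaded with onnxruntime as in the reply above.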