[Question] How to export Hugging Face Whisper models to ONNX? #100
Comments
Hi, in the next week I will probably write an article on how I converted Whisper to ONNX and optimized it.
@niedev Hello, do you have any plans to release the Whisper model converted to ONNX? I'm very eager to know. Thank you.
@laixiaofeng0 All the models used by RTranslator are here in the assets.
Hi, author. First of all, thank you for this wonderful project. Could I ask when you will release the "article" about converting Whisper to ONNX (including KV-cache and quantization)? I want to load my fine-tuned Whisper model into RTranslator.
@Eric-Edf Thank you! I haven't had much time lately, so I haven't even managed to start it, sorry, but as soon as I can I'll link it here.
Thank you for your reply. Looking forward to it!
On Hugging Face the model is only distributed as safetensors files, for example:
https://huggingface.co/openai/whisper-large-v3-turbo
How can we convert these models to ONNX files that are compatible with this library?
The same question applies to the NLLB ONNX models: we need to fine-tune the models and then replace the ONNX files, but your ONNX files are different from the whisper-large-v3-turbo model files.
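A minimal sketch of one generic export path, assuming the Hugging Face Optimum library (`optimum[exporters,onnxruntime]`). This is not necessarily the pipeline RTranslator's own files were produced with, which also involves KV-cache handling and quantization; the output directory name is just an example:

```python
# Hypothetical sketch: convert the safetensors checkpoint of
# openai/whisper-large-v3-turbo into ONNX graphs using Hugging Face Optimum.
# This yields a generic export; it is NOT guaranteed to match the graph
# layout or quantization of the files shipped in RTranslator's assets.
from optimum.onnxruntime import ORTModelForSpeechSeq2Seq

model_id = "openai/whisper-large-v3-turbo"

# export=True converts the PyTorch/safetensors weights to ONNX on the fly.
model = ORTModelForSpeechSeq2Seq.from_pretrained(model_id, export=True)

# Writes the encoder and decoder ONNX graphs plus the tokenizer/config files
# to the output directory (exact file names depend on the Optimum version).
model.save_pretrained("whisper-large-v3-turbo-onnx")
```

The same approach should work for NLLB via `ORTModelForSeq2SeqLM`, but whether the exported graphs can be dropped into RTranslator as-is depends on how its ONNX files were post-processed (e.g. merged decoders or quantization).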