[QUESTION] <title> #305

Open
LeeLaugh-jason opened this issue Dec 6, 2024 · 4 comments
@LeeLaugh-jason

Question or Issue

After the model finished downloading, I tried transcribing audio to text in the web page and got this error:
Error during audio transcription: [ONNXRuntimeError] : 3 : NO_SUCHFILE : Load model from D:\Nexa-SDK_internal\faster_whisper\assets\silero_encoder_v5.onnx failed:Load model D:\Nexa-SDK_internal\faster_whisper\assets\silero_encoder_v5.onnx failed. File doesn't exist

OS

Windows10

Python Version

3.12

Nexa SDK Version

No response

GPU (if using one)

No response

@Davidqian123
Collaborator

Which model did you try?

@yorkew-east8

yorkew-east8 commented Dec 13, 2024

Same problem.

model: faster-whisper-base:bin-cpu-fp16
Env: macOS 15.1.1 (M4 Pro CPU)
Nexa SDK Version: 0.0.9.6
Error Message:
Error during audio transcription: [ONNXRuntimeError] : 3 : NO_SUCHFILE : Load model from /Applications/Nexa.app/Contents/Frameworks/faster_whisper/assets/silero_encoder_v5.onnx failed:Load model /Applications/Nexa.app/Contents/Frameworks/faster_whisper/assets/silero_encoder_v5.onnx failed. File doesn't exist

I checked the path from the error message and nothing was found:
ls: /Applications/Nexa.app/Contents/Frameworks/faster_whisper/: No such file or directory

I checked with nexa list, and the model's location shows:
/Users/xxxxxx/.cache/nexa/hub/official/faster-whisper-base/bin-cpu-fp16

So, is it looking in the wrong directory?
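
For anyone else debugging this, here is a minimal diagnostic sketch (plain Python, not part of the Nexa SDK) that checks whether the encoder asset exists at the path from the error message and then searches the ~/.cache/nexa directory reported by nexa list for a stray copy. The macOS path is used here; substitute the Windows path from the original report as needed.

from pathlib import Path

# Path the SDK tries to load, copied from the error message above.
expected = Path("/Applications/Nexa.app/Contents/Frameworks/faster_whisper/assets/silero_encoder_v5.onnx")
print("Expected asset exists:", expected.exists())

# Search the Nexa model cache (the location reported by `nexa list`)
# in case the file was downloaded there instead.
cache_root = Path.home() / ".cache" / "nexa"
matches = list(cache_root.rglob("silero_encoder_v5.onnx")) if cache_root.exists() else []
print("Copies found under the cache:", matches if matches else "none")

If the file is missing from both locations, the asset was probably never shipped with or downloaded by this SDK build.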

@yorkew-east8

I'm not sure if this is related to the issue, but I keep getting a 403 error whenever I try to pull the model now...

$ nexa pull omniVLM:q8_0
Downloading omniVLM/q8_0.gguf...
An error occurred while downloading or processing the model: An unexpected error occurred: 403 Client Error: Forbidden for url: https://public-storage.nexa4ai.com/omniVLM/q8_0.gguf
Failed to pull model omniVLM:q8_0. If you are using local path, be sure to add --local_path and --model_type flags.
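
To see whether the 403 comes from the storage server itself rather than from the CLI, a quick standard-library check (using the exact URL quoted in the error message) is:

import urllib.request
import urllib.error

# URL copied from the 403 error above.
url = "https://public-storage.nexa4ai.com/omniVLM/q8_0.gguf"
request = urllib.request.Request(url, method="HEAD")
try:
    with urllib.request.urlopen(request, timeout=30) as response:
        print("Server responded:", response.status)
except urllib.error.HTTPError as error:
    print("Server rejected the request:", error.code, error.reason)

A 403 here would confirm the block is on the server side rather than anything local to the install.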

@xsxszab
Collaborator

xsxszab commented Feb 5, 2025

This issue has been resolved. You should now be able to run nexa pull omniVLM:q8_0 without any problems.
