Multi modal (Just libraries) #1220
Conversation
It looks like llama.cpp no longer offers
A test run for CUDA 12.4 only has started here: https://github.com/martindevans/LLamaSharp/actions/runs/16153517357. I'll PR that if it passes, and then hopefully this PR should compile too.
I've created a pull request to this branch (see here). It switches over to the CUDA 12.4 build.
Switched to CUDA 12.4 build only
I'm going to merge this and kick off a test build.
@martindevans, I will be working on the changes over the next few days.
The test run (https://github.com/SciSharp/LLamaSharp/actions/runs/16231321874) mostly passed, with just a minor file-naming issue at the end. So we've got a working CUDA build for all your bits whenever they're ready :)
These are the changes to build and copy the right dynamic libraries, as a first step before introducing the code changes.
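To illustrate what "copying the right dynamic libraries" involves, here is a minimal sketch of picking the native library path per OS and backend. The folder layout (`runtimes/<rid>/native/<backend>/…`) and backend names are hypothetical, not taken from this PR:

```python
# Sketch: resolve which native llama library to copy for a given
# OS/backend combination. Layout and names are illustrative only.

def native_library_path(system: str, backend: str) -> str:
    """Return a runtimes-style relative path for the native library.

    system  -- "Windows", "Linux", or "Darwin" (as from platform.system())
    backend -- e.g. "cpu", "cuda12" (hypothetical backend folder names)
    """
    os_name = {"Windows": "win", "Linux": "linux", "Darwin": "osx"}[system]
    ext = {"win": "dll", "linux": "so", "osx": "dylib"}[os_name]
    return f"runtimes/{os_name}-x64/native/{backend}/llama.{ext}"

print(native_library_path("Windows", "cuda12"))
# runtimes/win-x64/native/cuda12/llama.dll
```

The actual PR presumably wires an equivalent copy step into the build scripts rather than runtime code.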
@martindevans, CUDA compilation on Windows doesn't work. If I see the problem correctly, it requires CUDA 12.4. But I'm not sure, and I cannot test it on Windows.
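One way to check which CUDA toolkit a build machine actually has is to parse the output of `nvcc --version`. This is a hypothetical helper, not part of this PR; the required version (12.4) is taken from the discussion above:

```python
import re
import subprocess

REQUIRED = (12, 4)  # toolkit version discussed in this PR

def parse_nvcc_version(output: str) -> tuple:
    """Extract (major, minor) from `nvcc --version` output."""
    m = re.search(r"release (\d+)\.(\d+)", output)
    if not m:
        raise ValueError("could not find CUDA release in nvcc output")
    return (int(m.group(1)), int(m.group(2)))

def cuda_toolkit_ok() -> bool:
    """Run nvcc and check the installed toolkit meets REQUIRED."""
    out = subprocess.run(["nvcc", "--version"],
                         capture_output=True, text=True).stdout
    return parse_nvcc_version(out) >= REQUIRED

# Abridged sample of real nvcc output for illustration:
sample = "Cuda compilation tools, release 12.4, V12.4.131"
print(parse_nvcc_version(sample))  # (12, 4)
```

Running something like this in CI would confirm whether the Windows runner's toolkit is the source of the compile failure.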