CUDA error with AMD GPU? #7867
mikeperalta1 started this conversation in General
Replies: 3 comments 4 replies
-
Same here, commenting to bump. I am trying to run it on an AMD RX 7700 XT. Logs below.
3 replies
-
Have you solved this problem?
1 reply
-
Running into this exact same issue on Docker.
0 replies
-
Hey all,
Trying to figure out what I'm doing wrong. Running llama with an AMD GPU (RX 6600 XT) spits out a CUDA error, which is confusing since I don't have an NVIDIA GPU:
It does detect my GPU:
Currently building with
make LLAMA_HIPBLAS=1 LLAMA_HIP_UMA=1 AMDGPU_TARGETS=gfx1032 -j8
but I've tried other variants to no avail. Here's my build script:
Clearly I'm just brute-forcing and don't know how this is supposed to work. Any advice?
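For what it's worth, here is a minimal sketch of one common approach for RDNA2 consumer cards like the 6600 XT (gfx1032), which are not on ROCm's officially supported list. The `gfx_to_override` helper, the `HSA_OVERRIDE_GFX_VERSION` mapping table, and the example run command are my assumptions for illustration, not something confirmed in this thread:

```shell
#!/bin/sh
# Hypothetical helper: map an AMD gfx ISA target to the value of the
# HSA_OVERRIDE_GFX_VERSION environment variable, which tells the ROCm
# runtime to treat the card as a closely related, supported ISA.
gfx_to_override() {
  case "$1" in
    gfx103*) echo "10.3.0" ;;   # RDNA2 (6600 XT = gfx1032 -> behave as gfx1030)
    gfx101*) echo "10.1.0" ;;   # RDNA1
    *)       echo "" ;;          # no override known for this target
  esac
}

TARGET=gfx1032
OVERRIDE="$(gfx_to_override "$TARGET")"

# Print (rather than execute) the build and run commands, so this sketch
# is safe to run without a llama.cpp checkout present:
echo "make LLAMA_HIPBLAS=1 AMDGPU_TARGETS=$TARGET -j8"
if [ -n "$OVERRIDE" ]; then
  # The override is set at *run* time, not build time.
  echo "HSA_OVERRIDE_GFX_VERSION=$OVERRIDE ./main -ngl 99 ..."
fi
```

The idea is that the build targets the card's real ISA, while the runtime override works around ROCm refusing unsupported gfx versions; whether it helps depends on the ROCm version installed.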