
Eval bug: CANNOT LINK EXECUTABLE "./llama-cli": library "libomp.so" not found: needed by main executable #11979

Open
Krallbe68 opened this issue Feb 20, 2025 · 5 comments


@Krallbe68

Name and Version

./llama-cli --version
CANNOT LINK EXECUTABLE "./llama-cli": library "libomp.so" not found: needed by main executable

Operating systems

Other? (Please let us know in description)

GGML backends

Vulkan

Hardware

Qualcomm Snapdragon 8 Gen 1
I use OpenCL.
I am using Termux.

Models

Llama 3.2 1B Q4

Problem description & steps to reproduce

When I try to run any of the built binaries, I get this error:

./llama-cli -m /sdcard/download/0.gguf
CANNOT LINK EXECUTABLE "./llama-cli": library "libomp.so" not found: needed by main executable
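
A quick way to see which shared libraries the binary expects is to read its dynamic section (a diagnostic sketch, assuming binutils is installed in Termux via pkg install binutils and $ANDROID_NDK points at the NDK used for the build):

readelf -d ./llama-cli | grep NEEDED           # lists every library the executable requires at load time
find $ANDROID_NDK -name libomp.so 2>/dev/null  # check whether the NDK used for the build actually ships libomp.so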

First Bad Commit

No response

Relevant log output

ls
android-ndk-r27b  android-ndk-r27b-aarch64.zip  dev  llama.cpp  st.sh
~ $ cd dev
~/dev $ cd llm
~/dev/llm $ ls
OpenCL-Headers  OpenCL-ICD-Loader  llama.cpp
~/dev/llm $ cd llama.cpp
~/dev/llm/llama.cpp $ l
No command l found, did you mean:
 Command ld in package binutils
 Command ld in package binutils-is-llvm
 Command c in package c-script
 Command [ in package coreutils
 Command lp in package cups
 Command dl in package gatling
 Command k in package kona
 Command lf in package lf
 Command pl in package libgnustep-base
 Command lr in package lr
 Command ml in package mercury
 Command al in package mono
 Command lz in package mtools
 Command o in package o-editor
 Command ol in package ol
 Command q in package q-dns-client
 Command sl in package sl
 Command ul in package util-linux
 Command X in package xorg-server
~/dev/llm/llama.cpp $ ls
AUTHORS            build-android                  flake.nix    pyproject.toml
CMakeLists.txt     ci                             ggml         pyrightconfig.json
CMakePresets.json  cmake                          gguf-py      requirements
CODEOWNERS         common                         grammars     requirements.txt
CONTRIBUTING.md    convert_hf_to_gguf.py          include      scripts
LICENSE            convert_hf_to_gguf_update.py   media        spm-headers
Makefile           convert_llama_ggml_to_gguf.py  models       src
Package.swift      convert_lora_to_gguf.py        mypy.ini     tests
README.md          docs                           pocs
SECURITY.md        examples                       poetry.lock
Sources            flake.lock                     prompts
~/dev/llm/llama.cpp $ build-android
build-android: command not found
~/dev/llm/llama.cpp $ cd build-android
~/.../llama.cpp/build-android $ ls
CMakeCache.txt         bin                    ggml                  pocs
CMakeFiles             build.ninja            install_manifest.txt  src
CTestTestfile.cmake    cmake_install.cmake    lib                   tests
DartConfiguration.tcl  common                 llama-config.cmake
Testing                compile_commands.json  llama-version.cmake
autogenerated          examples               llama.pc
~/.../llama.cpp/build-android $ cd bi
bash: cd: bi: No such file or directory
~/.../llama.cpp/build-android $ cd bin
~/.../build-android/bin $ ls
llama-batched                  llama-lookup-merge        test-autorelease
llama-batched-bench            llama-lookup-stats        test-backend-ops
llama-bench                    llama-minicpmv-cli        test-barrier
llama-cli                      llama-parallel            test-c
llama-convert-llama2c-to-ggml  llama-passkey             test-chat
llama-cvector-generator        llama-perplexity          test-chat-template
llama-embedding                llama-q8dot               test-gguf
llama-eval-callback            llama-quantize            test-grammar-integration
llama-export-lora              llama-quantize-stats      test-grammar-parser
llama-gbnf-validator           llama-qwen2vl-cli         test-json-schema-to-grammar
llama-gen-docs                 llama-retrieval           test-llama-grammar
llama-gguf                     llama-run                 test-log
llama-gguf-hash                llama-save-load-state     test-model-load-cancel
llama-gguf-split               llama-server              test-quantize-fns
llama-gritlm                   llama-simple              test-quantize-perf
llama-imatrix                  llama-simple-chat         test-rope
llama-infill                   llama-speculative         test-sampling
llama-llava-cli                llama-speculative-simple  test-tokenizer-0
llama-llava-clip-quantize-cli  llama-tokenize            test-tokenizer-1-bpe
llama-lookahead                llama-tts                 test-tokenizer-1-spm
llama-lookup                   llama-vdot
llama-lookup-create            test-arg-parser
~/.../build-android/bin $ ./llama-cli -m /sdcard/download/0.gguf
CANNOT LINK EXECUTABLE "./llama-cli": library "libomp.so" not found: needed by main executable
~/.../build-android/bin $
@ngxson
Copy link
Collaborator

ngxson commented Feb 20, 2025

According to https://github.com/ggml-org/llama.cpp/blob/master/docs/android.md, you can turn OpenMP off with -DGGML_OPENMP=OFF:

$ cmake \
  -DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK/build/cmake/android.toolchain.cmake \
  -DANDROID_ABI=arm64-v8a \
  -DANDROID_PLATFORM=android-28 \
  -DCMAKE_C_FLAGS="-march=armv8.7a" \
  -DCMAKE_CXX_FLAGS="-march=armv8.7a" \
  -DGGML_OPENMP=OFF \
  -DGGML_LLAMAFILE=OFF \
  -B build-android
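
After configuring, the build step is the usual CMake invocation (a sketch; adjust the job count for your device):

$ cmake --build build-android --config Release -j 4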

@Krallbe68
Author

cmake .. -G Ninja \
  -DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK/build/cmake/android.toolchain.cmake \
  -DANDROID_ABI=arm64-v8a \
  -DANDROID_PLATFORM=android-28 \
  -DGGML_OPENMP=OFF \
  -DGGML_LLAMAFILE=OFF \
  -DGGML_OPENCL=ON

I used this command to compile again, and this time I get an error like this:

~/.../build-android/bin $ ./llama-cli
CANNOT LINK EXECUTABLE "./llama-cli": library "libOpenCL.so.1" not found: needed by main executable
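
To confirm the rebuilt binary has dropped the OpenMP dependency and now needs the OpenCL ICD loader instead, its dynamic section can be checked the same way (a sketch, assuming binutils is available):

readelf -d ./llama-cli | grep NEEDED   # should now list libOpenCL.so.1 and no longer libomp.so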

@belog2867

If you use Termux:

LD_LIBRARY_PATH=/vendor/lib64:$PREFIX/lib ./llama-cli
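
The same prefix works for a full run, or it can be exported for the session (a sketch reusing the model path from above):

export LD_LIBRARY_PATH=/vendor/lib64:$PREFIX/lib
./llama-cli -m /sdcard/download/0.gguf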

@belog2867

If it still doesn't work, your phone may not support OpenCL.

@Krallbe68
Author

My device is a Xiaomi Mi 10T Pro with a Snapdragon 865 processor and an Adreno 650 GPU. From what I found online, it supports OpenCL 2.0, but there is no libOpenCL.so in /vendor/lib64.
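
To rule out the driver exposing the library under a different path, the device can be searched directly (a sketch; exact locations vary by vendor, and permission errors are silenced):

find /vendor /system -name "libOpenCL*" 2>/dev/null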
