
Update README file to include Intel Arc Graphics as supported under the "Hardware Supported" title #844

Open
n3thshan opened this issue Feb 17, 2025 · 7 comments

@n3thshan

No description provided.

@ericcurtin (Collaborator) commented Feb 17, 2025

Could you open a PR for this, @n3thshan? In general, if you are able to make the change you suggest, just open the PR and don't worry about the issue.

@n3thshan (Author) commented Feb 18, 2025

@ericcurtin Haha, I'm sorry, I'll open one. However, I also ran into an issue when trying out an 8B model. I'm not sure whether this is a ramalama problem or something else, but I get this error when I try to run it:

get_memory_info: [warning] ext_intel_free_memory is not supported (export/set ZES_ENABLE_SYSMAN=1 to support), use total memory as free memory

I tried to export the variable, but it keeps printing this message. This may sound completely irrelevant, but I'm reluctant to create a PR because of it. Any suggestions? (I installed ramalama through Homebrew, by the way.) This is what my ramalama info outputs:

        "Detected GPUs": [
            {
                "Details": "00:02.0 VGA compatible controller: Intel Corporation Meteor Lake-P [Intel Arc Graphics] (rev 08)",
                "Env": "ONEAPI_DEVICE_SELECTOR",
                "GPU": "Intel GPU",
                "VRAM": "Unknown"
            }```

@rhatdan (Member) commented Feb 19, 2025

ramalama info and ramalama run use different ways of looking at the GPUs, so the info output might not match what is actually happening at run time. I did fix some of the ramalama info GPU detection, but there could very well be other issues.

@rhatdan (Member) commented Feb 19, 2025

Here is the code that ramalama info uses for Intel GPU detection:

    def get_intel_gpu(self):
        """Detect Intel GPUs using `lspci` and `/sys/class/drm/` for VRAM info."""
        gpus = []

        # Step 1: Use lspci to detect Intel GPUs
        try:
            output = subprocess.check_output("lspci | grep -i 'VGA compatible controller'", shell=True, text=True)
            for line in output.splitlines():
                if "Intel Corporation" in line:
                    gpu_info = {"GPU": "Intel", "Details": line.strip()}
                    gpus.append(gpu_info)
        except subprocess.CalledProcessError:
            pass  # No Intel GPU found

        # Step 2: Use `/sys/class/drm/` to read VRAM info
        vram_info = self._read_gpu_memory(
            '/sys/class/drm/card*/device/mem_info_vram_total', "Intel GPU", "ONEAPI_DEVICE_SELECTOR"
        )

        # If lspci found an Intel GPU, add VRAM info
        if gpus:
            for gpu in gpus:
                gpu.update(vram_info)
        else:
            gpus.append(vram_info)  # If no lspci match, return VRAM data anyway

        return gpus
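
For reference, here is a minimal standalone sketch (mine, not ramalama code) that runs the same lspci filter and globs the same sysfs path, so you can see what these two steps would report on a given machine. Note that mem_info_vram_total may simply not be exposed for an Intel iGPU, which would be consistent with the "VRAM": "Unknown" in the ramalama info output above.

```python
# Standalone check (sketch, not part of ramalama): reproduce the two
# detection steps above and print whatever they find.
import glob
import subprocess

# Step 1: same lspci filter as get_intel_gpu()
try:
    output = subprocess.check_output(
        "lspci | grep -i 'VGA compatible controller'", shell=True, text=True
    )
    print(output.strip())
except subprocess.CalledProcessError:
    print("lspci reported no VGA compatible controller")

# Step 2: same sysfs glob that _read_gpu_memory() is given; prints nothing
# if the driver does not expose mem_info_vram_total for the card.
for path in glob.glob('/sys/class/drm/card*/device/mem_info_vram_total'):
    with open(path) as f:
        print(path, f.read().strip())
```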

@rhatdan (Member) commented Feb 19, 2025

Meanwhile, the Intel detection in ramalama run uses:


    # INTEL iGPU CASE (Look for ARC GPU)
    igpu_num = 0
    for fp in sorted(glob.glob('/sys/bus/pci/drivers/i915/*/device')):
        with open(fp, 'rb') as file:
            content = file.read()
            if b"0x7d55" in content:
                igpu_num += 1

    if igpu_num:
        os.environ["INTEL_VISIBLE_DEVICES"] = str(igpu_num)

@n3thshan (Author)

@hanthor suggested rebuilding the Intel image with ZES_ENABLE_SYSMAN=1 set, to see whether that clears the error message. Another issue is that graphical system monitors do not show the model being loaded, while btop does (gguf models only show up in btop when it is run with sudo, whereas the other models show up in btop with or without sudo).
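
For the record, the rebuild idea amounts to baking the variable into the image so it is visible to the backend inside the container, rather than only exported in the host shell. A hypothetical Containerfile sketch, where the base image name is a placeholder and not the actual image ramalama uses for Intel GPUs:

```
# Hypothetical Containerfile sketch -- base image below is a placeholder,
# not the real ramalama Intel image tag.
FROM quay.io/example/intel-gpu-image:latest
# Bake the variable into the image so the process inside the container
# sees it, instead of exporting it only on the host.
ENV ZES_ENABLE_SYSMAN=1
```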

@rhatdan (Member) commented Feb 19, 2025

Try that out and see if it works, then open a PR to better handle Intel.
