
RuntimeError: Unsloth: Please file a bug report! Error patching SFTTrainer #1673

Open
IMJONEZZ opened this issue Feb 11, 2025 · 6 comments

@IMJONEZZ

IMJONEZZ commented Feb 11, 2025

Running the newer GRPO examples unchanged on WSL with Pixi for package management. For anyone unfamiliar with Pixi, it's just uv + conda.

Here's the pixi.toml for all the packages:

[project]
channels = ["https://prefix.dev/conda-forge", "nvidia", "pytorch", "xformers"]
description = "Add a short description here"
name = "Unsloth Demo"
platforms = ["linux-64", "win-64"]
version = "0.1.0"

[tasks]
checkxform = 'python -m xformers.info'
cudacheck = { cmd = 'python -c "import torch; print(torch.cuda.is_available()); print(torch.__version__)"', depends-on = ["checkxform"] }
start_small = { cmd = 'minimal_grpo.py', depends-on = ["cudacheck"] }
start_big = { cmd = 'python llama3_1_\(8b\)_grpo.py', depends-on = ["cudacheck"] }


[system-requirements]
cuda = "12.4"

[dependencies]
python = "3.11.9*"
black = ">=25.1.0,<26"
cuda-version = "==12.4"

[pypi-dependencies]
vllm="==0.7.2"
diffusers="==0.32.2"
peft="==0.14.0"
accelerate="==1.3.0"
bitsandbytes="==0.45.2"
trl = { git = "git+https://github.com/huggingface/trl.git@e95f9fb74a3c3647b86f251b7e230ec51c64b72b" }
xformers = ">=0.0.28.post3, <0.0.30"
torch = { version = "==2.5.1", index = "https://download.pytorch.org/whl/cu124" }
torchvision = { version = "==0.20.1", index = "https://download.pytorch.org/whl/cu124" }
setuptools = ">=75.8.0, <76"
unsloth = ">=2025.2.5, <2026"
unsloth-zoo = ">=2025.2.3, <2026"

Please let me know if there's an issue with one of these packages.

Here's the output of xformers.info and torch.cuda.is_available as well as version:

xFormers 0.0.28.post3                                                                                                                                                                                                                                                                                                        
memory_efficient_attention.ckF:                    unavailable                                                                                                                                                                                                                                                               
memory_efficient_attention.ckB:                    unavailable
memory_efficient_attention.ck_decoderF:            unavailable
memory_efficient_attention.ck_splitKF:             unavailable
memory_efficient_attention.cutlassF-pt:            available
memory_efficient_attention.cutlassB-pt:            available
[email protected]:         available
[email protected]:         available
[email protected]:             unavailable
[email protected]:             unavailable
memory_efficient_attention.triton_splitKF:         available
indexing.scaled_index_addF:                        available
indexing.scaled_index_addB:                        available
indexing.index_select:                             available
sequence_parallel_fused.write_values:              available
sequence_parallel_fused.wait_values:               available
sequence_parallel_fused.cuda_memset_32b_async:     available
sp24.sparse24_sparsify_both_ways:                  available
sp24.sparse24_apply:                               available
sp24.sparse24_apply_dense_output:                  available
sp24._sparse24_gemm:                               available
[email protected]:                 available
[email protected]:                        available
swiglu.dual_gemm_silu:                             available
swiglu.gemm_fused_operand_sum:                     available
swiglu.fused.p.cpp:                                available
is_triton_available:                               True
pytorch.version:                                   2.5.1+cu124
pytorch.cuda:                                      available
gpu.compute_capability:                            8.6
gpu.name:                                          NVIDIA GeForce RTX 3090
dcgm_profiler:                                     unavailable
build.info:                                        available
build.cuda_version:                                1201
build.hip_version:                                 None
build.python_version:                              3.11.10
build.torch_version:                               2.5.1+cu121
build.env.TORCH_CUDA_ARCH_LIST:                    6.0+PTX 7.0 7.5 8.0+PTX 9.0a
build.env.PYTORCH_ROCM_ARCH:                       None
build.env.XFORMERS_BUILD_TYPE:                     Release
build.env.XFORMERS_ENABLE_DEBUG_ASSERTIONS:        None
build.env.NVCC_FLAGS:                              -allow-unsupported-compiler
build.env.XFORMERS_PACKAGE_FROM:                   wheel-v0.0.28.post3
build.nvcc_version:                                12.1.66
source.privacy:                                    open source
True
2.5.1+cu124

And finally, here's the output of running the minimal GRPO example:

$ pixi run python minimal_grpo.py
🦥 Unsloth: Will patch your computer to enable 2x faster free finetuning.                                                                                                                                                                                                                                                    
🦥 Unsloth Zoo will now patch everything to make training faster!                                                                                                                                                                                                                                                            
Traceback (most recent call last):
  File "/mnt/c/Users/me/Unsloth Demo/.pixi/envs/default/lib/python3.11/site-packages/unsloth/tokenizer_utils.py", line 1061, in <module>
    exec(trainer_text, globals())
  File "<string>", line 4
    model = <class 'inspect._empty'>,
            ^
SyntaxError: invalid syntax

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/mnt/c/Users/me/Unsloth Demo/minimal_grpo.py", line 1, in <module>
    from unsloth import FastLanguageModel, PatchFastRL
  File "/mnt/c/Users/me/Unsloth Demo/.pixi/envs/default/lib/python3.11/site-packages/unsloth/__init__.py", line 212, in <module>
    from .models import *
  File "/mnt/c/Users/me/Unsloth Demo/.pixi/envs/default/lib/python3.11/site-packages/unsloth/models/__init__.py", line 16, in <module>
    from .granite import FastGraniteModel
  File "/mnt/c/Users/me/Unsloth Demo/.pixi/envs/default/lib/python3.11/site-packages/unsloth/models/granite.py", line 15, in <module>
    from .llama import *
  File "/mnt/c/Users/me/Unsloth Demo/.pixi/envs/default/lib/python3.11/site-packages/unsloth/models/llama.py", line 36, in <module>
    from ..tokenizer_utils import *
  File "/mnt/c/Users/me/Unsloth Demo/.pixi/envs/default/lib/python3.11/site-packages/unsloth/tokenizer_utils.py", line 1063, in <module>
    raise RuntimeError(f"Unsloth: Please file a bug report! Error patching {trainer_name}")
RuntimeError: Unsloth: Please file a bug report! Error patching SFTTrainer

Is this because of the WSL mounted file system? Or maybe something else?
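The traceback points at the generated source, not the file system: `tokenizer_utils.py` builds the trainer's source text and `exec()`s it, and the text contains `model = <class 'inspect._empty'>,`, which is not valid Python. A minimal sketch (not Unsloth's actual code) of how that happens: a parameter with no default carries the sentinel `inspect.Parameter.empty`, whose `repr()` produces exactly that invalid token when interpolated into source text.

```python
import inspect

# A parameter with no default value is represented by the sentinel
# inspect.Parameter.empty. Its repr() is "<class 'inspect._empty'>",
# which is not valid Python source.
line = f"model = {inspect.Parameter.empty!r},"
print(line)  # model = <class 'inspect._empty'>,

# exec()-ing the generated text then fails exactly as in the traceback.
try:
    exec(line)
except SyntaxError as e:
    print("SyntaxError:", e.msg)
```

This matches the symptom of a signature mismatch between Unsloth's patcher and the trl dev snapshot pinned from git, which is consistent with the fix reported below.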

@Yazooliu

I installed trl 0.14.0 to fix it; the previous version was 0.15.0.dev0.
My other key packages and versions are:
unsloth 2025.2.4
unsloth_zoo 2025.2.3
torch 2.5.1
torchaudio 2.5.1
torchvision 0.20.1
vllm 0.7.2
xformers 0.0.28.post3
xgrammar 0.1.11
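For a pixi setup like the one in the original post, this fix amounts to replacing the git pin on trl with the released version (the `==0.14.0` pin is the one reported to work here; adjust if a later release fixes the incompatibility):

```toml
[pypi-dependencies]
# Pin a released trl instead of the git dev snapshot (0.15.0.dev0),
# which is what triggered the SFTTrainer patching failure.
trl = "==0.14.0"
```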

@shimmyshimmer
Collaborator

Thanks, we'll investigate.

@danielhanchen
Contributor

Oh yep working on multiple fixes to resolve this!

@mbx10br

mbx10br commented Feb 13, 2025

> I installed trl 0.14.0 to fix it; the previous version was 0.15.0.dev0. My other key packages and versions are: unsloth 2025.2.4, unsloth_zoo 2025.2.3, torch 2.5.1, torchaudio 2.5.1, torchvision 0.20.1, vllm 0.7.2, xformers 0.0.28.post3, xgrammar 0.1.11

TY, it worked !

@ZILECAO

ZILECAO commented Feb 13, 2025

> I installed trl 0.14.0 to fix it; the previous version was 0.15.0.dev0. My other key packages and versions are: unsloth 2025.2.4, unsloth_zoo 2025.2.3, torch 2.5.1, torchaudio 2.5.1, torchvision 0.20.1, vllm 0.7.2, xformers 0.0.28.post3, xgrammar 0.1.11

legend, this fixed the dependency nightmare

@kkailaasa

This issue is also being tracked in #1699 (comment), where a patch has been pushed, but I think that has opened up another issue.
