I'm running finetuning on llama-13b with "llmtune finetune --model llama-13b-4bit". Things seem to be working, but at the end I get:
```
Traceback (most recent call last):
  File "/home/xxx/miniconda3/envs/llmtune/bin/llmtune", line 33, in <module>
    sys.exit(load_entry_point('llmtune==0.1.0', 'console_scripts', 'llmtune')())
  File "/home/xxx/miniconda3/envs/llmtune/lib/python3.10/site-packages/llmtune-0.1.0-py3.10.egg/llmtune/run.py", line 101, in main
  File "/home/xxx/miniconda3/envs/llmtune/lib/python3.10/site-packages/llmtune-0.1.0-py3.10.egg/llmtune/run.py", line 147, in finetune
  File "/home/xxx/miniconda3/envs/llmtune/lib/python3.10/site-packages/llmtune-0.1.0-py3.10.egg/llmtune/executor.py", line 131, in finetune
AttributeError: 'Finetune4bConfig' object has no attribute 'adapter'
```
This is probably why I end up with neither adapter_model.bin nor adapter_config.json.
Funnily enough, checkpoints are created, but the adapter weights don't seem to be there. What's the point of checkpointing then?
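For reference, this is roughly how I checked a checkpoint for adapter tensors. It's just a sketch: it assumes the checkpoints are ordinary Hugging Face Trainer checkpoints containing a pytorch_model.bin, and the path is a placeholder.

```python
# Sketch: look for LoRA tensors inside a trainer checkpoint.
# The checkpoint path is a placeholder; adjust to wherever llmtune wrote it.
import torch

state = torch.load("checkpoint-200/pytorch_model.bin", map_location="cpu")
lora_keys = [k for k in state if "lora_" in k]
print(f"found {len(lora_keys)} lora_* tensors")
```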
I'm on py310_cu118 and a nightly torch build, but otherwise everything is per requirements.txt.
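My guess (I haven't verified this against the llmtune source) is that the final step in executor.py reads an adapter output path off the config and saves the LoRA weights there, and Finetune4bConfig simply never gets that field. Roughly something like the sketch below, where save_adapter, lora_model, and output_dir are assumed names, not the actual ones in the repo:

```python
# Hypothetical reconstruction of the save step that appears to blow up in
# executor.py::finetune. `config.adapter` is the attribute the traceback says is
# missing; everything else here is an assumed name, not verified against llmtune.
import os


def save_adapter(lora_model, config, output_dir):
    """Write adapter_model.bin / adapter_config.json, falling back to output_dir
    if the config has no `adapter` path (which seems to be the case in 0.1.0)."""
    adapter_dir = getattr(config, "adapter", None) or os.path.join(output_dir, "adapter")
    # For a peft.PeftModel, save_pretrained emits exactly the two missing files.
    lora_model.save_pretrained(adapter_dir)
    return adapter_dir
```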