[HELP] Trouble Loading Fine-Tuned Llama-2-13b Model in LocalAI #952
Unanswered · maxiannunziata asked this question in Q&A
Replies: 1 comment
-
Hi Max, I found this resource for fine-tuning and wanted to ask you for advice. Have you found a solution?
-
Hello everyone,
I'm facing an issue loading a fine-tuned Llama-2-13b model into LocalAI, and I could really use some help.
What I Have Done:
I fine-tuned a Llama-2-13b model using AutoTrain Advanced in Google Colab.
Followed the guide: created a folder called data and placed my train.csv in it.
Adjusted the parameters and started the model training successfully.
What I Have:
At the end of the fine-tuning process, I received the model weights as a .bin checkpoint split into three parts.
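For context, a checkpoint saved as several .bin parts is usually a sharded Hugging Face checkpoint: the shards ship alongside a pytorch_model.bin.index.json file whose weight_map tells the loader which shard holds each tensor. Below is a minimal sketch of how that index ties the parts together; the tensor and file names are illustrative, not taken from the actual checkpoint:

```python
import json

# Illustrative excerpt of a pytorch_model.bin.index.json for a
# three-shard checkpoint (names are made up for this example).
index = {
    "metadata": {"total_size": 26031738880},
    "weight_map": {
        "model.embed_tokens.weight": "pytorch_model-00001-of-00003.bin",
        "model.layers.0.self_attn.q_proj.weight": "pytorch_model-00001-of-00003.bin",
        "model.layers.20.mlp.up_proj.weight": "pytorch_model-00002-of-00003.bin",
        "lm_head.weight": "pytorch_model-00003-of-00003.bin",
    },
}

# The set of shard files a loader must read to reconstruct the model.
shards = sorted(set(index["weight_map"].values()))
print(shards)
```

Libraries that understand this format (e.g. transformers' from_pretrained) read the index and load all shards automatically, so you normally never merge the parts by hand. LocalAI's llama backend, however, expects a single GGUF/GGML file, so the usual route is to convert the sharded checkpoint first (for example with llama.cpp's conversion script) and then quantize the result.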
My Issue:
I am unsure how to proceed with loading this multi-part .bin model into LocalAI.
What I've Tried:
I have looked through the documentation but haven't found a straightforward method for loading a fine-tuned model into LocalAI, particularly one that has been split into multiple parts.
My Question:
How can I load this multi-part .bin model into LocalAI?
How can I load a single .bin model into LocalAI?
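For the single-file case, LocalAI picks up model definition YAML files placed in its models directory. A hedged sketch of such a definition follows; the model name, file name, and parameter values here are assumptions for illustration, not taken from this setup:

```yaml
# Hypothetical models/llama2-13b-finetune.yaml (a LocalAI model definition).
# The gguf file name below is an assumption; point it at your converted file.
name: llama2-13b-finetune
backend: llama
parameters:
  model: llama-2-13b-finetune.q4_0.gguf
context_size: 4096
```

With this file in place next to the model file, the model should be addressable by its `name` through LocalAI's OpenAI-compatible API (e.g. `"model": "llama2-13b-finetune"` in a chat completion request).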
Any assistance or guidance on this matter would be highly appreciated.
Thank you!