This repository has been archived by the owner on May 12, 2023. It is now read-only.
Commit 8d8470d (parent 463037e)

Authored and committed by Richard Guo on May 11, 2023

Showing 2 changed files with 89 additions and 76 deletions.
The first changed file replaces the original 82-line README with a 13-line archive notice (`@@ -1,82 +1,13 @@`). The removed lines are the original README, which this commit preserves in the second changed file. The new README reads:

# Pygpt4all

We've moved the Python bindings into the main gpt4all repo.

Future development, issues, and the like will be handled in the main repo.

This repo will be archived and set to read-only.

The main repo is here: https://github.com/nomic-ai/gpt4all

The subdirectory for this repo is here: https://github.com/nomic-ai/gpt4all/tree/main/gpt4all-bindings/python

Thank you!
The second changed file is new: it preserves the original README in full (`@@ -0,0 +1,82 @@`).
# PyGPT4All

Official Python CPU inference for [GPT4All](https://github.com/nomic-ai/gpt4all) language models based on [llama.cpp](https://github.com/ggerganov/llama.cpp) and [ggml](https://github.com/ggerganov/ggml).

[](https://opensource.org/licenses/MIT)
[](https://pypi.org/project/pygpt4all/)
<!-- TOC -->

- [Installation](#installation)
- [Tutorial](#tutorial)
  - [Model instantiation](#model-instantiation)
  - [Simple generation](#simple-generation)
  - [Interactive Dialogue](#interactive-dialogue)
- [API reference](#api-reference)
- [License](#license)

<!-- TOC -->
# Installation

```bash
pip install pygpt4all
```
# Tutorial

First, download the model weights; you can find and download all the supported models from [here](https://github.com/nomic-ai/gpt4all-chat#manual-download-of-models).
### Model instantiation

Once the weights are downloaded, you can instantiate the models as follows:

- GPT4All model

```python
from pygpt4all import GPT4All

model = GPT4All('path/to/ggml-gpt4all-l13b-snoozy.bin')
```
- GPT4All-J model

```python
from pygpt4all import GPT4All_J

model = GPT4All_J('path/to/ggml-gpt4all-j-v1.3-groovy.bin')
```
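A missing or misspelled weights path is the most common instantiation failure. A small guard can fail early with a readable error before the model constructor runs; `resolve_weights` is a hypothetical helper written for this sketch, not part of pygpt4all:

```python
from pathlib import Path

def resolve_weights(path_str):
    # Hypothetical helper: validate the weights path up front so a typo
    # fails fast, instead of surfacing as a cryptic load error later.
    path = Path(path_str).expanduser()
    if not path.is_file():
        raise FileNotFoundError(f"model weights not found: {path}")
    return str(path)
```

For example: `model = GPT4All(resolve_weights('path/to/ggml-gpt4all-l13b-snoozy.bin'))`.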
### Simple generation

The `generate` function produces new tokens from the `prompt` given as input:

```python
for token in model.generate("Tell me a joke ?\n"):
    print(token, end='', flush=True)
```
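Because `generate` yields tokens incrementally, any iterable of strings can stand in for it when experimenting with the consumption pattern. `fake_generate` below is a hypothetical stand-in for `model.generate`, useful for trying the loop without downloading weights:

```python
def fake_generate(prompt):
    # Hypothetical stand-in for model.generate: yields string tokens
    # one at a time, like the real streaming API.
    for token in ["Why", " did", " the", " chicken", " cross", " the", " road?"]:
        yield token

# Stream to the terminal while also assembling the full reply.
reply = ""
for token in fake_generate("Tell me a joke ?\n"):
    print(token, end='', flush=True)
    reply += token
print()
```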
### Interactive Dialogue

You can set up an interactive dialogue by simply keeping the `model` variable alive:

```python
while True:
    try:
        prompt = input("You: ")
        if prompt == '':
            continue
        print("AI: ", end='')
        for token in model.generate(prompt):
            print(token, end='', flush=True)
        print()
    except KeyboardInterrupt:
        break
```
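The loop above mixes terminal I/O with generation, which makes it hard to test. One illustrative refactor (not part of pygpt4all) is to move the token streaming into a helper that writes to any file-like object and returns the assembled reply:

```python
import sys

def stream_reply(tokens, out=sys.stdout):
    # Echo each token as it arrives and return the full reply.
    # `tokens` is any iterable of strings, e.g. model.generate(prompt).
    pieces = []
    for token in tokens:
        out.write(token)
        out.flush()
        pieces.append(token)
    return "".join(pieces)
```

Each dialogue turn then becomes `stream_reply(model.generate(prompt))`, and a test can pass a plain list of tokens with an `io.StringIO` sink.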
# API reference

You can check the [API reference documentation](https://nomic-ai.github.io/pygpt4all/) for more details.
# License

This project is licensed under the MIT [License](./LICENSE).