  provider: "sambanova", // or together, fal-ai, replicate, cohere …
});

await inference.textToImage({
  model: "black-forest-labs/FLUX.1-dev",
+  provider: "replicate",
  inputs: "a picture of a green bird",
});
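The snippet above passes a `provider` to `textToImage` to choose where the request runs. As a rough illustration of the idea only (this is not the library's actual routing code, and the path shape below is an invented placeholder), provider selection can be pictured as picking a route per provider:

```typescript
// Illustration only: the real @huggingface/inference client resolves
// provider routes internally. `providerPath` and the URL shape are
// invented placeholders, not actual endpoints.
type Provider = "replicate" | "sambanova" | "together" | "fal-ai" | "cohere";

function providerPath(provider: Provider, model: string): string {
  // One route per provider; the model id is appended to that route.
  return `/providers/${provider}/models/${model}`;
}

console.log(providerPath("replicate", "black-forest-labs/FLUX.1-dev"));
// → /providers/replicate/models/black-forest-labs/FLUX.1-dev
```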
@@ -54,7 +55,7 @@ await inference.textToImage({
This is a collection of JS libraries to interact with the Hugging Face API, with TS types included.
-- [@huggingface/inference](packages/inference/README.md): Use HF Inference API (serverless), Inference Endpoints (dedicated) and third-party Inference Providers to make calls to 100,000+ Machine Learning models
+- [@huggingface/inference](packages/inference/README.md): Use HF Inference API (serverless), Inference Endpoints (dedicated) and all supported Inference Providers to make calls to 100,000+ Machine Learning models
- [@huggingface/hub](packages/hub/README.md): Interact with huggingface.co to create or delete repos and commit / download files
- [@huggingface/agents](packages/agents/README.md): Interact with HF models through a natural language interface
- [@huggingface/gguf](packages/gguf/README.md): A GGUF parser that works on remotely hosted files.
packages/inference/README.md (+11/-11)
@@ -1,7 +1,7 @@
# 🤗 Hugging Face Inference
-A Typescript powered wrapper for the HF Inference API (serverless), Inference Endpoints (dedicated), and third-party Inference Providers.
-It works with [Inference API (serverless)](https://huggingface.co/docs/api-inference/index) and [Inference Endpoints (dedicated)](https://huggingface.co/docs/inference-endpoints/index), and even with supported third-party Inference Providers.
+A Typescript powered wrapper for the HF Inference API (serverless), Inference Endpoints (dedicated), and all supported Inference Providers.
+It works with [Inference API (serverless)](https://huggingface.co/docs/api-inference/index) and [Inference Endpoints (dedicated)](https://huggingface.co/docs/inference-endpoints/index), and even with all supported third-party Inference Providers.
Check out the [full documentation](https://huggingface.co/docs/huggingface.js/inference/README).
❗**Important note:** Using an access token is optional to get started, but you will eventually be rate limited. Join [Hugging Face](https://huggingface.co/join) and then visit [access tokens](https://huggingface.co/settings/tokens) to generate your access token for **free**.
Your access token should be kept private. If you need to protect it in front-end applications, we suggest setting up a proxy server that stores the access token.
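One way to follow this advice, sketched under assumptions (the names `ACCESS_TOKEN` and `withAuth` and the hard-coded token are illustrative, not library APIs; in practice read the token from an environment variable and forward the request with your server's HTTP client):

```typescript
// Sketch of the proxy idea: the token lives only on the server, which
// injects the Authorization header before forwarding a browser request.
// `ACCESS_TOKEN` and `withAuth` are illustrative names, not library APIs.
const ACCESS_TOKEN = "hf_example_token"; // in real code: process.env.HF_TOKEN

function withAuth(headers: Record<string, string>): Record<string, string> {
  // The browser never sees the token; only the proxy adds it.
  return { ...headers, authorization: `Bearer ${ACCESS_TOKEN}` };
}

console.log(withAuth({ accept: "image/png" }).authorization);
// → Bearer hf_example_token
```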
-### Third-party inference providers
+### All supported inference providers
You can send inference requests to third-party providers with the inference client.
@@ -63,7 +63,7 @@ To send requests to a third-party provider, you have to pass the `provider` para
```ts
const accessToken = "hf_..."; // Either a HF access token, or an API key from the third-party provider (Replicate in this example)

-const client = new HfInference(accessToken);
+const client = new InferenceClient(accessToken);
await client.textToImage({
  provider: "replicate",
  model: "black-forest-labs/Flux.1-dev",
@@ -93,7 +93,7 @@ This is not an issue for LLMs as everyone converged on the OpenAI API anyways, b
### Tree-shaking
-You can import the functions you need directly from the module instead of using the `HfInference` class.
+You can import the functions you need directly from the module instead of using the `InferenceClient` class.