`funcchain` is the *most pythonic* way of writing cognitive systems. Leveraging pydantic models as output schemas, combined with langchain in the backend, allows for seamless integration of LLMs into your apps.
It works perfectly with OpenAI Functions or LlamaCpp grammars (json-schema mode) for efficient structured output.
In the backend it compiles the funcchain syntax into langchain runnables, so you can easily invoke, stream, or batch process your pipelines.

[Open this demo in GitHub Codespaces](https://codespaces.new/ricklamers/funcchain-demo)
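The funcchain pattern in a nutshell: write a plain Python function whose docstring is the prompt and whose return annotation is the output schema, then return `chain()`. A sketch of that shape, using a stand-in `chain()` (hypothetical, so the snippet runs without funcchain installed or an API key):

```python
# Stand-in for funcchain's `chain()`, which would call the LLM using the
# calling function's docstring as the prompt and its return annotation
# as the output schema. This stub only demonstrates the function shape.
def chain() -> str:
    return "stub LLM response"

def generate_poem(topic: str) -> str:
    """
    Generate a poem inspired by the given topic.
    """
    return chain()

print(generate_poem("birds"))
```

With the real `chain()`, the return type can also be a pydantic model, and funcchain constrains the LLM output to that schema.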
## Vision Models

```python
from funcchain import Image
from pydantic import BaseModel, Field
from funcchain import chain, settings
# ...
```
```python
from pydantic import BaseModel, Field
from funcchain import chain, settings

# auto-download the model from huggingface
settings.llm = "ollama/openchat"

class SentimentAnalysis(BaseModel):
    analysis: str
# ...
```
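The `SentimentAnalysis` class above is an ordinary pydantic model, and the json-schema mode mentioned earlier works off its schema. A minimal sketch of inspecting that schema (assuming pydantic v2):

```python
from pydantic import BaseModel

class SentimentAnalysis(BaseModel):
    analysis: str

# json-schema mode constrains the LLM's output to this schema
schema = SentimentAnalysis.model_json_schema()
print(schema["properties"])
```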
## Features

- pythonic
- easy swap between openai or local models
- dynamic output types (pydantic models, or primitives)
- vision llm support
- langchain_core as backend
- jinja templating for prompts
- reliable structured output
- auto retry parsing
- langsmith support
- sync, async, streaming, parallel, fallbacks
- gguf download from huggingface
- type hints for all functions and mypy support
- chat router component
- composable with langchain LCEL
- easy error handling
- enums and literal support
- custom parsing types
## Documentation

[Check out the docs here](https://shroominic.github.io/funcchain/) 👈

We also highly recommend trying out and running the examples in the `./examples` folder.

## Contribution

You want to contribute? Thanks, that's great!

For more information, check out the [Contributing Guide](docs/contributing/dev-setup.md).
Asynchronous programming is a way to easily parallelize processes in Python.
This is very useful when dealing with LLMs, because every request takes a long time and the Python interpreter should do a lot of other things in the meantime instead of waiting for the request.

Check out [this brilliant async tutorial](https://fastapi.tiangolo.com/async/) if you have never coded in an asynchronous way.
## Async in FuncChain

You can use async in funcchain by creating your functions using `achain()` instead of the normal `chain()`.
It would then look like this:

```python
from funcchain import achain

async def generate_poem(topic: str) -> str:
    """
    Generate a poem inspired by the given topic.
    """
    return await achain()
```

You can then `await` the async `generate_poem` function inside another async function, or call it directly using `asyncio.run(generate_poem("birds"))`.
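Because each `achain()` call yields control while waiting on the network, several of them can run concurrently with `asyncio.gather`. A minimal sketch of that pattern, using a stand-in coroutine (`fake_llm_call` is hypothetical, not part of funcchain) so the snippet runs without an API key:

```python
import asyncio

async def fake_llm_call(topic: str) -> str:
    # stand-in for an LLM request: sleep instead of real network I/O
    await asyncio.sleep(0.1)
    return f"poem about {topic}"

async def main() -> list[str]:
    # all three "requests" wait concurrently, so this takes ~0.1s, not ~0.3s
    return await asyncio.gather(
        *(fake_llm_call(t) for t in ["birds", "rivers", "stars"])
    )

print(asyncio.run(main()))
```

`asyncio.gather` preserves the order of its arguments, so the results line up with the topics.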
## Async in LangChain

When converting your funcchains into a langchain runnable, you can use the native langchain way of doing async.
This means calling `.ainvoke(...)`, `.astream(...)` or `.abatch(...)` on the runnable.
## Async Streaming

You can use langchain's async streaming interface, but you can also use the `stream_to(...)` wrapper (explained [here](../concepts/streaming.md#strem_to-wrapper)) as an async context manager.
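To illustrate the async streaming idea independently of funcchain, here is a generic sketch that consumes tokens from an async generator with `async for` (`fake_token_stream` is a hypothetical stand-in for a streaming LLM response):

```python
import asyncio
from typing import AsyncIterator

async def fake_token_stream(text: str) -> AsyncIterator[str]:
    # stand-in for a streaming LLM response: yield one token at a time
    for token in text.split():
        await asyncio.sleep(0)  # yield control, like real network I/O would
        yield token

async def collect() -> str:
    pieces = []
    # async for suspends between tokens, letting other tasks run
    async for token in fake_token_stream("streamed poem about birds"):
        pieces.append(token)
    return " ".join(pieces)

print(asyncio.run(collect()))
```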