63 changes: 63 additions & 0 deletions docs/aws-lambda.md
@@ -0,0 +1,63 @@
# Deploying a Workflow to AWS Lambda

This guide shows how to package a simple NodeTool workflow and run it as an AWS Lambda function.

## Lambda Handler Example

Create a handler that executes your workflow. Save the following as `examples/aws_lambda_handler.py`:

```python
#!/usr/bin/env python3
import asyncio
import json
from typing import Any, Dict

from nodetool.dsl.graph import graph, run_graph
from nodetool.dsl.providers.openai import ChatCompletion
from nodetool.metadata.types import OpenAIModel

async def run_workflow(prompt: str) -> str:
workflow = ChatCompletion(
model=OpenAIModel(model="gpt-4o"),
messages=[{"role": "user", "content": prompt}],
)
return await run_graph(graph(workflow))


def lambda_handler(event: Dict[str, Any], context: Any) -> Dict[str, Any]:
prompt = event.get("prompt", "Hello from NodeTool!")
result = asyncio.run(run_workflow(prompt))
return {"statusCode": 200, "body": json.dumps({"result": result})}
```
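
If you later put the function behind API Gateway, the payload arrives as a JSON string in `event["body"]` rather than as top-level fields. A small adapter can normalize both shapes; `extract_prompt` below is a hypothetical helper sketched for illustration, not part of NodeTool:

```python
import json
from typing import Any, Dict


def extract_prompt(event: Dict[str, Any], default: str = "Hello from NodeTool!") -> str:
    """Return the prompt from a direct-invoke or API Gateway proxy event."""
    # API Gateway's proxy integration wraps the request payload in a JSON string.
    if isinstance(event.get("body"), str):
        try:
            event = json.loads(event["body"])
        except json.JSONDecodeError:
            return default
    return event.get("prompt", default)


print(extract_prompt({"prompt": "hi"}))                        # direct invocation
print(extract_prompt({"body": json.dumps({"prompt": "hi"})}))  # via API Gateway
```

Calling `extract_prompt(event)` in place of `event.get("prompt", ...)` lets the same handler serve both invocation styles.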

## Packaging and Deployment

1. Install NodeTool Core and its dependencies into a local directory:

```bash
pip install nodetool-core -t ./package
```
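
   If any dependency ships compiled wheels, install for Lambda's Linux runtime rather than your local platform, or the import will fail in the cloud. One way to do this with pip (the platform tag shown is illustrative; match it to your Lambda architecture and Python version):

   ```bash
   pip install nodetool-core -t ./package \
     --platform manylinux2014_x86_64 \
     --implementation cp \
     --python-version 3.11 \
     --only-binary=:all:
   ```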

2. Create the deployment package:

```bash
cd package
zip -r9 ../function.zip .
cd ..
# -j drops the examples/ directory so the handler file lands at the zip root,
# matching the handler string aws_lambda_handler.lambda_handler
zip -gj function.zip examples/aws_lambda_handler.py
```

3. Deploy using the AWS CLI:

```bash
aws lambda create-function \
--function-name nodetool-example \
--runtime python3.11 \
--handler aws_lambda_handler.lambda_handler \
--zip-file fileb://function.zip \
--role arn:aws:iam::123456789012:role/lambda-execution-role
```

Replace the role ARN with an IAM role that Lambda can assume (its trust policy must allow `lambda.amazonaws.com`) and that grants basic execution permissions, such as writing CloudWatch logs.
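
Lambda's default 3-second timeout is usually too short for an LLM call, and the workflow needs the provider's API key at runtime. Raising the limits and setting the key might look like this (the timeout, memory size, and variable name are illustrative values, not NodeTool requirements):

```bash
aws lambda update-function-configuration \
  --function-name nodetool-example \
  --timeout 60 \
  --memory-size 512 \
  --environment "Variables={OPENAI_API_KEY=your-key-here}"
```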

Once deployed, invoke the function by sending a JSON payload with a `prompt` field.
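
For example, with the AWS CLI (v2 requires the `--cli-binary-format` flag to pass a raw JSON payload):

```bash
aws lambda invoke \
  --function-name nodetool-example \
  --cli-binary-format raw-in-base64-out \
  --payload '{"prompt": "Write a haiku about serverless"}' \
  response.json
cat response.json
```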
1 change: 1 addition & 0 deletions docs/index.md
@@ -40,6 +40,7 @@ The documentation is organized into the following sections:
- [**Agents**](agents.md) - Multi-step agent framework
- [**Chat Module**](chat.md) - Conversational interface
- [**Chat Providers**](chat-providers.md) - Supported LLM backends
- [**AWS Lambda Example**](aws-lambda.md) - Deploy workflows to Lambda
- [**Examples**](../examples/README.md) - Example workflows

## Community
7 changes: 7 additions & 0 deletions examples/README.md
@@ -79,6 +79,13 @@ Before running these examples, you'll need to:
- Integrates OpenAI's capabilities with web search
- Shows combined AI and search functionality

### 5. Deployment

#### AWS Lambda Handler (`aws_lambda_handler.py`)

- Demonstrates deploying a workflow as an AWS Lambda function
- Provides a simple `lambda_handler` entry point

## Running the Examples

To run any of these examples:
36 changes: 36 additions & 0 deletions examples/aws_lambda_handler.py
@@ -0,0 +1,36 @@
#!/usr/bin/env python3
"""AWS Lambda handler example for running a NodeTool workflow.

This script defines a minimal `lambda_handler` function that executes a simple
ChatCompletion workflow using NodeTool Core. You can package this file with its
dependencies and deploy it to AWS Lambda.
"""

import asyncio
import json
from typing import Any, Dict

from nodetool.dsl.graph import graph, run_graph
from nodetool.dsl.providers.openai import ChatCompletion
from nodetool.metadata.types import OpenAIModel


async def run_workflow(prompt: str) -> str:
"""Execute the workflow using the provided prompt."""
workflow = ChatCompletion(
model=OpenAIModel(model="gpt-4o"),
messages=[{"role": "user", "content": prompt}],
)
return await run_graph(graph(workflow))


def lambda_handler(event: Dict[str, Any], context: Any) -> Dict[str, Any]:
"""Entry point for AWS Lambda."""
prompt = event.get("prompt", "Hello from NodeTool!")
result = asyncio.run(run_workflow(prompt))
return {"statusCode": 200, "body": json.dumps({"result": result})}


if __name__ == "__main__":
    # Simple local test; the OpenAI provider typically requires
    # OPENAI_API_KEY to be set in the environment.
    print(lambda_handler({"prompt": "Test"}, None))