diff --git a/docs/user-guide/concepts/model-providers/portkey.md b/docs/user-guide/concepts/model-providers/portkey.md
new file mode 100644
index 00000000..adeb8fff
--- /dev/null
+++ b/docs/user-guide/concepts/model-providers/portkey.md
@@ -0,0 +1,64 @@
+# Portkey
+
+[Portkey](https://docs.portkey.ai) is a gateway platform that enables integration with multiple language model providers such as OpenAI, Anthropic, and Amazon Bedrock. The Strands Agents SDK includes a Portkey-based provider, allowing you to run agents against OpenAI-compatible models (i.e., those using OpenAI's API schema) and other supported models through a unified interface.
+
+## Installation
+
+Portkey is configured as an optional dependency in Strands Agents. To install, run:
+
+```bash
+pip install 'strands-agents[portkey]'
+```
+
+## Usage
+
+After installing the `portkey` dependency, you can import and initialize the Strands Agents Portkey provider as follows:
+
+```python
+from strands import Agent
+from strands.models.portkey import PortkeyModel
+from strands_tools import calculator
+
+# Route all model calls through the Portkey gateway
+model = PortkeyModel(
+    api_key="<your-portkey-api-key>",
+    model_id="anthropic.claude-3-5-sonnet-20241022-v2:0",  # example model ID
+    virtual_key="<your-virtual-key>",  # required for providers like Bedrock; see https://docs.portkey.ai for provider-specific notes
+    provider="bedrock",  # set to 'anthropic', 'bedrock', 'openai', etc. depending on your model and API setup
+    base_url="http://portkey-service-gateway.service.prod.example.com/v1",
+)
+
+# You can also add tools such as web_search, code_interpreter, etc.
+agent = Agent(model=model, tools=[calculator])
+response = agent("What is 2+2")
+print(response)
+```
+
+## Configuration
+
+### Client Configuration
+
+The `client_args` configure the underlying OpenAI client. For a complete list of available arguments, please refer to the OpenAI [source](https://github.com/openai/openai-python). A sketch showing `client_args` being passed appears at the end of the Troubleshooting section below.
+
+### Model Configuration
+
+The `model_config` configures the underlying model selected for inference. The supported configurations are:
+
+| Parameter     | Description                      | Example                                                       | Options                                                         |
+|---------------|----------------------------------|---------------------------------------------------------------|-----------------------------------------------------------------|
+| `model_id`    | ID of the model to use           | `anthropic.claude-3-5-sonnet-20241022-v2:0`                   | [reference](https://docs.portkey.ai/docs/llm-routing/overview)  |
+| `base_url`    | Base URL of the Portkey gateway  | `http://portkey-service-gateway.service.prod.example.com/v1`  | [reference](https://portkey-ai.com/docs)                        |
+| `provider`    | Model provider                   | `bedrock`                                                      | `openai`, `bedrock`, `anthropic`, etc.                          |
+| `virtual_key` | Virtual key for authentication   | `<your-virtual-key>`                                           | [reference](https://portkey-ai.com/docs/authentication)         |
+
+## Troubleshooting
+
+### Module Not Found
+
+If you encounter the error `ModuleNotFoundError: No module named 'portkey'`, it means the `portkey` dependency is not installed in your environment. To fix this, run `pip install 'strands-agents[portkey]'`.
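+
+As a quick end-to-end check once the dependency is installed, the sketch below constructs the model and sends a trivial prompt. It assumes that `client_args` is accepted as a constructor argument and forwarded to the underlying OpenAI client (see Client Configuration above); the `timeout` and `max_retries` values, the placeholder keys, and the gateway URL are illustrative, not requirements of the provider.
+
+```python
+from strands import Agent
+from strands.models.portkey import PortkeyModel
+
+model = PortkeyModel(
+    api_key="<your-portkey-api-key>",    # placeholder: your Portkey API key
+    model_id="anthropic.claude-3-5-sonnet-20241022-v2:0",
+    virtual_key="<your-virtual-key>",    # placeholder: virtual key for your provider
+    provider="bedrock",
+    base_url="http://portkey-service-gateway.service.prod.example.com/v1",
+    # Assumed: client_args is passed through to the underlying OpenAI client,
+    # as described in the Client Configuration section.
+    client_args={
+        "timeout": 60,      # seconds to wait before abandoning a request
+        "max_retries": 2,   # automatic retries on transient failures
+    },
+)
+
+agent = Agent(model=model)
+print(agent("Reply with the word: pong"))
+```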
+
+## References
+
+- [API](../../../api-reference/models.md)
+- [PortkeyAI](https://docs.portkey.ai)
diff --git a/mkdocs.yml b/mkdocs.yml
index cbd8c8c5..1413e2f3 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -87,6 +87,7 @@ nav:
           - LlamaAPI: user-guide/concepts/model-providers/llamaapi.md
           - Ollama: user-guide/concepts/model-providers/ollama.md
           - OpenAI: user-guide/concepts/model-providers/openai.md
+          - Portkey: user-guide/concepts/model-providers/portkey.md
           - Custom Providers: user-guide/concepts/model-providers/custom_model_provider.md
       - Streaming:
           - Async Iterators: user-guide/concepts/streaming/async-iterators.md