47 changes: 46 additions & 1 deletion README.md
@@ -35,7 +35,8 @@ With provider integrations:
```bash
pip install "ethicore-engine-guardian[openai]"
pip install "ethicore-engine-guardian[anthropic]"
pip install "ethicore-engine-guardian[openai,anthropic]"
pip install "ethicore-engine-guardian[minimax]"
pip install "ethicore-engine-guardian[openai,anthropic,minimax]"
```

---
@@ -279,6 +280,50 @@ guardian = Guardian(config=GuardianConfig(api_key="my-app"))
client = guardian.wrap(anthropic.Anthropic())
```

### MiniMax

[MiniMax](https://www.minimax.io) provides large language models (M2.7, M2.5) through an
OpenAI-compatible API, so Guardian protects MiniMax calls the same way it protects OpenAI.

```python
import openai
from ethicore_guardian import Guardian, GuardianConfig
from ethicore_guardian.providers.minimax_provider import MiniMaxProvider

guardian = Guardian(config=GuardianConfig(api_key="my-app"))

# Create an OpenAI client pointed at MiniMax
minimax_client = openai.OpenAI(
    api_key="your-minimax-api-key",
    base_url="https://api.minimax.io/v1",
)

# Wrap with Guardian protection
provider = MiniMaxProvider(guardian)
client = provider.wrap_client(minimax_client)

# Use exactly like normal — Guardian intercepts every input
response = client.chat.completions.create(
    model="MiniMax-M2.7",
    messages=[{"role": "user", "content": user_input}],
)
```

Or use the one-step convenience factory:

```python
from ethicore_guardian.providers.minimax_provider import create_protected_minimax_client

client = create_protected_minimax_client(
    api_key="your-minimax-api-key",
    guardian_api_key="ethicore-...",
)
response = client.chat.completions.create(
    model="MiniMax-M2.7",
    messages=[{"role": "user", "content": user_input}],
)
```

### Ollama (local LLMs)

```python
4 changes: 4 additions & 0 deletions ethicore_guardian/providers/base_provider.py
@@ -299,6 +299,10 @@ def get_provider_for_client(client: Any) -> str:

    # Check client type and module for provider indicators
    if 'openai' in client_type or 'openai' in client_module:
        # Check if the client is configured for MiniMax (OpenAI-compatible API)
        base_url = str(getattr(client, 'base_url', '') or '')
        if 'minimax' in base_url.lower():
            return 'minimax'
        return 'openai'
    elif 'anthropic' in client_type or 'anthropic' in client_module:
        return 'anthropic'
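The detection heuristic in this hunk can be sketched standalone. The snippet below mirrors the diff's logic with a hypothetical `detect_provider` helper and a stand-in `OpenAI` class (neither is part of the library); the point is that an OpenAI-typed client is reclassified as `'minimax'` purely from its `base_url`:

```python
class OpenAI:
    """Stand-in for openai.OpenAI, for demonstration only."""
    def __init__(self, base_url=None):
        self.base_url = base_url

def detect_provider(client) -> str:
    """Mirror of the diff's heuristic (hypothetical helper, not library API)."""
    client_type = type(client).__name__.lower()
    client_module = type(client).__module__.lower()
    if 'openai' in client_type or 'openai' in client_module:
        # An OpenAI client pointed at MiniMax's endpoint counts as MiniMax
        base_url = str(getattr(client, 'base_url', '') or '')
        if 'minimax' in base_url.lower():
            return 'minimax'
        return 'openai'
    return 'unknown'
```

Because the check runs before the plain `'openai'` return, ordinary OpenAI clients are unaffected; only clients whose `base_url` mentions MiniMax take the new branch.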