
Add MiniMax M2.7 tutorial notebook and litellm integration#7

Open

octo-patch wants to merge 1 commit into curiousily:master from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add minimax-m2.7.ipynb: a hands-on tutorial notebook for MiniMax M2.7 (204K context window) covering text generation, streaming, JSON structured output, Pydantic validation, tool/function calling, summarization, and data labelling via the OpenAI-compatible API
  • Add MiniMax M2.7 as a third provider in notebook 26 (multiple LLM providers with litellm), alongside GPT-4.1-mini and Gemini 2.5 Flash
  • Update README.md with Model Explorations section
  • Add 28 unit tests and 6 integration tests (34 total)
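The core pattern the tutorial notebook teaches can be sketched as follows. The model name and base URL come from this PR; `build_request` is a hypothetical helper, not code from the notebook:

```python
import json

# OpenAI-compatible endpoint, per the PR description.
MINIMAX_BASE_URL = "https://api.minimax.io/v1"

def build_request(prompt: str, system: str = "You are a helpful assistant.") -> dict:
    # Hypothetical helper: assembles a chat-completions payload in the
    # standard OpenAI message format that MiniMax M2.7 accepts.
    return {
        "model": "MiniMax-M2.7",
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": prompt},
        ],
    }

payload = build_request("Summarize this PR in one sentence.")
print(json.dumps(payload, indent=2))

# With MINIMAX_API_KEY set, the same payload works with the official
# OpenAI Python client pointed at the MiniMax endpoint:
#   client = OpenAI(base_url=MINIMAX_BASE_URL, api_key=os.environ["MINIMAX_API_KEY"])
#   resp = client.chat.completions.create(**payload)
```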

Why MiniMax?

MiniMax M2.7 is a powerful LLM with a 204K context window and an OpenAI-compatible API (https://api.minimax.io/v1). It supports:

  • Chat completions with system prompts
  • Streaming responses
  • JSON structured output
  • Function/tool calling
  • Both standard (MiniMax-M2.7) and highspeed (MiniMax-M2.7-highspeed) variants
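As a sketch of the tool-calling support listed above: the request carries a `tools` array in the standard OpenAI function-calling schema. The `get_weather` function below is a hypothetical example, not part of the PR:

```python
import json

# Hypothetical tool definition in the OpenAI function-calling schema,
# which MiniMax's OpenAI-compatible API accepts per the PR description.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

request = {
    "model": "MiniMax-M2.7",
    "messages": [{"role": "user", "content": "What's the weather in Sofia?"}],
    "tools": tools,
}
print(json.dumps(request, indent=2))
```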

This makes it a great addition to the bootcamp's multi-provider tutorial, showing learners how easy it is to switch between different LLM providers.
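The provider switch in notebook 26 can be sketched like this. The model identifiers match the PR, but the registry and the `ask` helper are illustrative, assuming litellm's standard `completion()` interface:

```python
# Hypothetical provider registry for the multi-provider setup described above.
PROVIDERS = {
    "openai": {"model": "gpt-4.1-mini"},
    "gemini": {"model": "gemini/gemini-2.5-flash"},
    # MiniMax is reached through litellm's OpenAI-compatible routing.
    "minimax": {
        "model": "openai/MiniMax-M2.7",
        "api_base": "https://api.minimax.io/v1",
    },
}

def ask(provider: str, prompt: str) -> str:
    import litellm  # deferred so the registry is usable without litellm installed

    cfg = PROVIDERS[provider]
    resp = litellm.completion(
        messages=[{"role": "user", "content": prompt}],
        **cfg,
    )
    return resp.choices[0].message.content

# Usage (requires the matching API keys in the environment):
#   ask("minimax", "Explain context windows in one sentence.")
```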

Files Changed

| File | Change |
| --- | --- |
| `minimax-m2.7.ipynb` | New tutorial notebook (32 cells) |
| `26.multiple-llm-providers-with-litellm.ipynb` | Added MiniMax M2.7 via litellm |
| `README.md` | Added Model Explorations section |
| `tests/test_minimax.py` | 28 unit tests |
| `tests/test_minimax_integration.py` | 6 integration tests |
| `tests/__init__.py` | Test package init |

Test Plan

  • 28 unit tests pass (config, format, structured output, tools, notebooks)
  • 6 integration tests pass (completion, system prompt, JSON, streaming, highspeed, usage)
  • Verify notebooks run correctly in Google Colab with MINIMAX_API_KEY set
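One of the structured-output unit tests might look like the sketch below. The `SentimentLabel` model is hypothetical, but it mirrors the Pydantic-validation pattern the tutorial notebook covers, and runs without any network call:

```python
import json
from pydantic import BaseModel

class SentimentLabel(BaseModel):
    # Hypothetical schema for the data-labelling example in the notebook.
    text: str
    sentiment: str

def test_structured_output_parses():
    # Simulated JSON-mode model reply; no API request is made.
    raw = json.dumps({"text": "great bootcamp", "sentiment": "positive"})
    label = SentimentLabel(**json.loads(raw))
    assert label.sentiment == "positive"

test_structured_output_parses()
```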

