feat: add MiniMax provider support to RAG templates #127

Open

octo-patch wants to merge 8 commits into pathwaycom:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

This PR adds MiniMax as an alternative LLM provider for the Pathway RAG templates.

MiniMax offers an OpenAI-compatible API at https://api.minimax.io/v1, which means it can be used as a drop-in replacement for OpenAI models via pw.xpacks.llm.llms.OpenAIChat by setting base_url and api_key.

Changes

  • Added commented MiniMax configuration examples to app.yaml in adaptive_rag, question_answering_rag, multimodal_rag, and slides_ai_search templates
  • Added MINIMAX_API_KEY to the corresponding .env.example files
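For reference, the `.env.example` entry described above would take a form like the following (the variable name comes from the PR description; the placeholder value is illustrative):

```shell
# MiniMax API key (obtain from https://platform.minimax.io)
MINIMAX_API_KEY=your_minimax_api_key_here
```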

How to use MiniMax

Uncomment the MiniMax section in any template's app.yaml and set MINIMAX_API_KEY in your .env file.
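As a sketch of what such a commented section could look like, assuming the templates' usual YAML tag syntax for `pw.xpacks.llm.llms.OpenAIChat` and `$VAR`-style environment variable substitution (the model name `MiniMax-Text-01` is an illustrative assumption, not taken from this PR):

```yaml
# MiniMax (OpenAI-compatible endpoint)
llm: !pw.xpacks.llm.llms.OpenAIChat
  model: "MiniMax-Text-01"        # assumed model name; pick any MiniMax chat model
  base_url: "https://api.minimax.io/v1"
  api_key: $MINIMAX_API_KEY
  temperature: 0.7                # MiniMax requires temperature in (0.0, 1.0]
```

Because the endpoint is OpenAI-compatible, no other template changes should be needed: the same `OpenAIChat` wrapper is reused with only `base_url` and `api_key` redirected.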

Note: MiniMax requires temperature to be in the range (0.0, 1.0].

API Reference: https://platform.minimax.io/docs/api-reference/text-openai-api

@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.

