[feat] [model] add MiniMax as LLM provider protocol by octo-patch · Pull Request #466 · coze-dev/coze-loop


Add MiniMax as the 10th LLM provider in the coze-loop model protocol system.
MiniMax exposes an OpenAI-compatible API, so it is integrated through the
existing eino-ext openai component, with the MiniMax default base URL wired in.

Changes:
- Add ProtocolMiniMax constant and ProtocolConfigMiniMax struct in entity
- Add miniMaxBuilder in eino/init.go using openai component with MiniMax defaults
- Update thrift IDL with protocol_minimax and ProtocolConfigMiniMax
- Update kitex_gen constants and frontend TypeScript types
- Add 10 unit tests and 3 integration tests
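The entity-layer change described above can be sketched roughly as follows. This is an illustrative standalone sketch, not the actual coze-loop code: the constant and struct names come from this PR description, but the field set, the `Protocol` string type, and the default base URL value are assumptions.

```go
package main

import "fmt"

// Protocol identifies an LLM provider protocol; the string type is assumed.
type Protocol string

// ProtocolMiniMax is the new provider constant named in this PR.
const ProtocolMiniMax Protocol = "minimax"

// Assumed MiniMax default base URL; the real value lives in the builder.
const defaultMiniMaxBaseURL = "https://api.minimax.chat/v1"

// ProtocolConfigMiniMax mirrors OpenAI-compatible settings, since MiniMax
// is served through the existing openai component. Fields are illustrative.
type ProtocolConfigMiniMax struct {
	APIKey  string
	BaseURL string // falls back to the MiniMax default when empty
	Model   string
}

// effectiveBaseURL applies the MiniMax default when no URL is configured.
func (c ProtocolConfigMiniMax) effectiveBaseURL() string {
	if c.BaseURL == "" {
		return defaultMiniMaxBaseURL
	}
	return c.BaseURL
}

func main() {
	cfg := ProtocolConfigMiniMax{Model: "MiniMax-M2.5"}
	fmt.Println(ProtocolMiniMax, cfg.effectiveBaseURL())
}
```

Keeping the config shape identical to the other OpenAI-compatible providers is what lets the existing openai component be reused without a new client implementation.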

Supported models: MiniMax-M2.7, MiniMax-M2.5, MiniMax-M2.5-highspeed
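A minimal sketch of how `miniMaxBuilder` might pair the MiniMax defaults with the supported-model list. The `openaiConfig` struct and the validation helper are hypothetical stand-ins, not the real eino-ext API; only the builder name, the model names, and the "openai component + MiniMax base URL" approach come from this PR.

```go
package main

import (
	"fmt"
	"slices"
)

// openaiConfig is a stand-in for the eino-ext openai component's config.
type openaiConfig struct {
	BaseURL string
	APIKey  string
	Model   string
}

// Supported models as listed in this PR.
var supportedMiniMaxModels = []string{
	"MiniMax-M2.7", "MiniMax-M2.5", "MiniMax-M2.5-highspeed",
}

// miniMaxBuilder reuses the OpenAI-compatible config with MiniMax defaults.
// The base URL value is assumed for illustration.
func miniMaxBuilder(apiKey, model string) (openaiConfig, error) {
	if !slices.Contains(supportedMiniMaxModels, model) {
		return openaiConfig{}, fmt.Errorf("unsupported MiniMax model: %q", model)
	}
	return openaiConfig{
		BaseURL: "https://api.minimax.chat/v1", // assumed MiniMax default
		APIKey:  apiKey,
		Model:   model,
	}, nil
}

func main() {
	cfg, err := miniMaxBuilder("sk-test", "MiniMax-M2.5")
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s via %s\n", cfg.Model, cfg.BaseURL)
}
```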