# llmapi

Modern C++23 LLM client built with modules
llmapi provides a typed `Client<Provider>` API for chat, streaming, embeddings, tool calls, and conversation persistence. The default config alias `Config` maps to OpenAI-style providers, so the common case does not need an explicit `openai::OpenAI` wrapper.
## Features
- `import mcpplibs.llmapi;` with C++23 modules
- Strongly typed messages, tools, and response structs
- Sync, async, and streaming chat APIs
- Embeddings via the OpenAI provider
- Conversation save/load helpers
- OpenAI-compatible endpoint support through `openai::Config::baseUrl`
## Quick Start
```cpp
import mcpplibs.llmapi;
import std;

int main() {
    using namespace mcpplibs::llmapi;

    auto apiKey = std::getenv("OPENAI_API_KEY");
    if (!apiKey) {
        std::cerr << "OPENAI_API_KEY not set\n";
        return 1;
    }

    auto client = Client(Config{
        .apiKey = apiKey,
        .model = "gpt-4o-mini",
    });

    client.system("You are a concise assistant.");
    auto resp = client.chat("Explain why C++23 modules are useful in two sentences.");
    std::cout << resp.text() << '\n';
    return 0;
}
```
## Providers
- `openai::OpenAI` for OpenAI chat, streaming, embeddings, and OpenAI-compatible endpoints
- `anthropic::Anthropic` for Anthropic chat and streaming
- `Config` as a convenient alias for `openai::Config`
Compatible endpoints can reuse the OpenAI provider:
```cpp
auto provider = openai::OpenAI({
    .apiKey = std::getenv("DEEPSEEK_API_KEY"),
    .baseUrl = std::string(URL::DeepSeek),
    .model = "deepseek-chat",
});
```
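Switching providers is a type change rather than a config flag. A hypothetical sketch of the Anthropic path, assuming `anthropic::Anthropic` accepts a designated-initializer config analogous to `openai::Config` (the field names, the model string, and the `ANTHROPIC_API_KEY` variable are assumptions, not confirmed llmapi API):

```cpp
auto provider = anthropic::Anthropic({
    .apiKey = std::getenv("ANTHROPIC_API_KEY"),  // assumed variable name
    .model = "claude-3-5-sonnet-latest",         // assumed model id
});
auto client = Client(provider);
```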
## Build And Run
```shell
xmake
xmake run hello_mcpp
xmake run basic
xmake run chat
```
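The demo programs read provider keys from the environment. The variable names below are the ones used in the snippets in this README; export only the ones a given demo needs:

```shell
# Key for the Quick Start and chat demos
export OPENAI_API_KEY=...
# Only needed for the OpenAI-compatible (DeepSeek) endpoint example
export DEEPSEEK_API_KEY=...
xmake run chat
```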
## Package Usage
```lua
add_repositories("mcpplibs-index https://github.com/mcpplibs/mcpplibs-index.git")
add_requires("llmapi 0.1.0")

target("demo")
    set_kind("binary")
    set_languages("c++23")
    set_policy("build.c++.modules", true)
    add_files("src/*.cpp")
    add_packages("llmapi")
```
See `docs/en/getting-started.md`, `docs/en/providers.md`, and `docs/en/README.md` for more detail on setup and provider support.
## License
Apache-2.0 - see LICENSE