SDK Reference
Supported Libraries

Maev automatically instruments the LLM providers and frameworks listed below. You do not need to change your existing code or add any library-specific configuration.

LLM providers

All of the following providers are auto-instrumented:

  • OpenAI
  • Anthropic
  • Google Gemini
  • AWS Bedrock
  • Cohere
  • Mistral
  • Groq
  • Together AI
  • Ollama

Frameworks

All of the following frameworks are auto-instrumented:

  • LangChain
  • LlamaIndex
  • Haystack
  • CrewAI
  • AutoGen
  • LiteLLM

What gets captured per library

For every LLM call, Maev captures:

  • Model name and version
  • Input prompt (truncated at 500 characters per event)
  • Output completion (truncated at 500 characters per event)
  • Token counts (prompt, completion, total)
  • Cost (computed from token counts and model pricing)
  • Latency in milliseconds
  • Any error messages
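The fields above can be pictured as a single event record. The sketch below is illustrative only: the field names, the `build_llm_event` helper, and the per-1k-token pricing are assumptions, not Maev's actual schema or pricing, but it shows how the 500-character truncation and the cost computation from token counts would work.

```python
TRUNCATE_AT = 500  # prompts and completions are truncated at 500 characters per event

def truncate(text: str, limit: int = TRUNCATE_AT) -> str:
    """Cap captured text at the documented 500-character limit."""
    return text if len(text) <= limit else text[:limit]

def build_llm_event(model, prompt, completion, prompt_tokens, completion_tokens,
                    price_per_1k_prompt, price_per_1k_completion, latency_ms,
                    error=None):
    """Assemble the captured fields listed above (illustrative schema)."""
    return {
        "model": model,
        "prompt": truncate(prompt),
        "completion": truncate(completion),
        "prompt_tokens": prompt_tokens,
        "completion_tokens": completion_tokens,
        "total_tokens": prompt_tokens + completion_tokens,
        # cost is computed from token counts and per-model pricing
        "cost_usd": (prompt_tokens / 1000) * price_per_1k_prompt
                  + (completion_tokens / 1000) * price_per_1k_completion,
        "latency_ms": latency_ms,
        "error": error,
    }

event = build_llm_event(
    model="gpt-4o",
    prompt="x" * 1200,            # longer than the limit: stored truncated
    completion="Hello!",
    prompt_tokens=300,
    completion_tokens=5,
    price_per_1k_prompt=0.005,    # hypothetical pricing
    price_per_1k_completion=0.015,
    latency_ms=420,
)
print(len(event["prompt"]))   # → 500
print(event["total_tokens"])  # → 305
```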

For tool/function calls, Maev additionally captures:

  • Tool name
  • Tool input arguments
  • Tool output or error
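A tool-call capture can be sketched the same way. The class and helper below are hypothetical (Maev's internal representation is not documented here); they simply show the three captured fields, with output and error being mutually exclusive.

```python
from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class ToolCallEvent:
    # Fields mirror the list above; names are illustrative, not Maev's schema.
    tool_name: str
    arguments: dict
    output: Optional[str] = None
    error: Optional[str] = None

def record_tool_call(tool_name: str, arguments: dict, fn) -> ToolCallEvent:
    """Run a tool and capture its name, input arguments, and output or error."""
    try:
        return ToolCallEvent(tool_name, arguments, output=str(fn(**arguments)))
    except Exception as exc:
        return ToolCallEvent(tool_name, arguments, error=str(exc))

ok = record_tool_call("add", {"a": 2, "b": 3}, lambda a, b: a + b)
bad = record_tool_call("div", {"a": 1, "b": 0}, lambda a, b: a / b)
print(ok.output)  # → 5
```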

Not seeing your library?

Maev captures telemetry at the OpenTelemetry layer, which means any library that emits standard OpenTelemetry spans will work automatically. If your library is not in the list above, open an issue or contact support.
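"Standard OpenTelemetry spans" here means spans carrying the OTel GenAI semantic-convention attributes (the `gen_ai.*` namespace, e.g. `gen_ai.system`, `gen_ai.request.model`). Whether Maev keys on exactly these attributes is an assumption; the sketch below just shows what such a span's attributes look like and a trivial recognizer for them.

```python
# Attributes from the OTel GenAI semantic conventions; any library that
# emits spans shaped like this is visible to an OTel-based backend.
span_attributes = {
    "gen_ai.system": "openai",          # provider
    "gen_ai.request.model": "gpt-4o",   # requested model
    "gen_ai.usage.input_tokens": 300,
    "gen_ai.usage.output_tokens": 5,
}

def looks_like_llm_span(attrs: dict) -> bool:
    """Heuristic sketch: GenAI spans use the gen_ai.* attribute namespace."""
    return any(key.startswith("gen_ai.") for key in attrs)

print(looks_like_llm_span(span_attributes))  # → True
```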

All captured data is processed server-side. Nothing is stored locally on your machine.