- Phoenix Client — API for the Phoenix platform
- Phoenix OTEL — OpenTelemetry tracing with Phoenix defaults
- Phoenix Evals — LLM evaluation and metrics toolkit
- OpenInference — Instrumentation and tracing helpers
Installation
Install all packages together or individually based on your needs:
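For example, using the PyPI distribution names (assumed here to be the standard published packages; adjust to the pieces you actually need):

```bash
# All packages at once
pip install arize-phoenix-client arize-phoenix-otel arize-phoenix-evals openinference-instrumentation

# Or individually, e.g. only the tracing helper
pip install arize-phoenix-otel
```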
Environment Variables
All packages respect common Phoenix environment variables for seamless configuration:

| Variable | Description | Used By |
|---|---|---|
| `PHOENIX_COLLECTOR_ENDPOINT` | Trace collector URL | OTEL |
| `PHOENIX_BASE_URL` | Phoenix server URL | Client |
| `PHOENIX_API_KEY` | API key for authentication | Client, OTEL |
| `PHOENIX_PROJECT_NAME` | Default project name | OTEL |
| `PHOENIX_CLIENT_HEADERS` | Custom HTTP headers | Client, OTEL |
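As a quick example, a local setup might export placeholder values like these before starting your application (the endpoint shown assumes a Phoenix instance running locally):

```bash
# Placeholder values for a locally running Phoenix instance
export PHOENIX_COLLECTOR_ENDPOINT="http://localhost:6006"
export PHOENIX_BASE_URL="http://localhost:6006"
export PHOENIX_API_KEY="your-api-key"
export PHOENIX_PROJECT_NAME="my-app"
```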
Phoenix Client
- Prompts — Create, version, and invoke prompt templates with variable substitution
- Datasets — Build evaluation datasets from DataFrames, CSV files, or dictionaries
- Experiments — Run evaluations and track experiment results over time
- Spans — Query and analyze traces with powerful filtering capabilities
- Annotations — Add human feedback and automated evaluations to spans
- Projects — Organize your work across multiple AI applications
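A minimal sketch of how these pieces fit together. The `Client` import comes from the arize-phoenix-client package; the resource methods shown (`client.prompts.get`, `client.spans.get_spans_dataframe`) are assumptions based on the feature list above and may differ across versions:

```python
# Minimal sketch; resource and method names are assumptions, not a verified API reference.
from phoenix.client import Client

# Reads PHOENIX_BASE_URL and PHOENIX_API_KEY from the environment,
# or pass base_url=... / api_key=... explicitly.
client = Client()

# Fetch a prompt template by name (assumed method and argument names).
prompt = client.prompts.get(prompt_identifier="support-assistant")

# Pull spans for a project into a DataFrame for filtering and analysis
# (assumed method and argument names).
spans_df = client.spans.get_spans_dataframe(project_identifier="default")
```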
Phoenix OTEL
- Zero-config tracing — Enable `auto_instrument=True` to automatically trace AI libraries
- Phoenix-aware defaults — Reads `PHOENIX_COLLECTOR_ENDPOINT`, `PHOENIX_API_KEY`, and other environment variables
- Production ready — Built-in batching and authentication support
- Tracing decorators — `@tracer.chain`, `@tracer.tool`, and more for manual instrumentation
- OpenTelemetry compatible — Works with existing OTel infrastructure
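A short sketch of a typical setup, assuming arize-phoenix-otel is installed along with OpenInference instrumentors for your LLM libraries; argument names may vary by version:

```python
# Sketch of enabling tracing with phoenix.otel.
from phoenix.otel import register

# register() reads PHOENIX_COLLECTOR_ENDPOINT, PHOENIX_API_KEY, and
# PHOENIX_PROJECT_NAME from the environment; auto_instrument=True enables
# any installed OpenInference instrumentors automatically.
tracer_provider = register(project_name="my-app", auto_instrument=True)
tracer = tracer_provider.get_tracer(__name__)

# Manual instrumentation with the tracing decorators mentioned above.
@tracer.chain
def summarize(text: str) -> str:
    return text[:100]
```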
Phoenix Evals
- Model adapters — Works with OpenAI, LiteLLM, LangChain, and other providers
- Pre-built metrics — Hallucination detection, relevance, toxicity, and more
- Input mapping — Powerful binding for complex data structures
- Native instrumentation — OpenTelemetry tracing for observability
- High performance — Up to 20x speedup with built-in concurrency and batching
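A sketch of a hallucination check using the `llm_classify` helper; the model identifier, column names, and keyword arguments are assumptions that may need adjusting for your provider and package version:

```python
# Sketch of a hallucination evaluation; column names and model are assumptions.
import pandas as pd

from phoenix.evals import (
    HALLUCINATION_PROMPT_RAILS_MAP,
    HALLUCINATION_PROMPT_TEMPLATE,
    OpenAIModel,
    llm_classify,
)

df = pd.DataFrame(
    {
        "input": ["What is the capital of France?"],
        "reference": ["Paris is the capital of France."],
        "output": ["The capital of France is Paris."],
    }
)

results = llm_classify(
    df,                                             # rows to grade
    OpenAIModel(model="gpt-4o-mini"),               # any supported model adapter
    HALLUCINATION_PROMPT_TEMPLATE,                  # pre-built hallucination prompt
    list(HALLUCINATION_PROMPT_RAILS_MAP.values()),  # allowed output labels
    provide_explanation=True,                       # adds a rationale column
)
print(results[["label", "explanation"]])
```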
OpenInference
- Decorators — Use `@tracer.agent`, `@tracer.chain`, and `@tracer.tool` to trace custom functions
- Context managers — Wrap code blocks with `using_*` helpers for fine-grained control
- Data masking — Redact sensitive information from traces with built-in masking utilities
- Framework instrumentors — Auto-trace OpenAI, LangChain, LlamaIndex, Anthropic, and more
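A sketch combining a framework instrumentor with a context helper and data masking; it assumes the openinference-instrumentation and openinference-instrumentation-openai packages, plus `phoenix.otel.register` for the tracer provider:

```python
# Sketch of auto-instrumentation plus context attributes and masking.
from openinference.instrumentation import TraceConfig, using_attributes
from openinference.instrumentation.openai import OpenAIInstrumentor
from phoenix.otel import register

tracer_provider = register(project_name="my-app")

# Auto-trace OpenAI calls; TraceConfig redacts raw input/output payloads.
OpenAIInstrumentor().instrument(
    tracer_provider=tracer_provider,
    config=TraceConfig(hide_inputs=True, hide_outputs=True),
)

# Attach session and user identifiers to every span created inside the block.
with using_attributes(session_id="session-123", user_id="user-456"):
    ...  # make your OpenAI calls here; their spans inherit these attributes
```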

