07.25.2025: Average Metrics in Experiment Comparison Table

Available in Phoenix 11.12+
The experiment comparison table now displays average experiment run data in the table headers, making it easier to spot high-level differences across runs at a glance.
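The header averages are simple means over each experiment's per-run evaluator scores. As a rough illustration of that aggregation (plain Python with made-up data and metric names, not the Phoenix implementation):

```python
from statistics import mean

# Hypothetical evaluator scores for the runs of two experiments
# (metric names and values are illustrative only).
experiment_runs = {
    "experiment-a": {"correctness": [1.0, 0.0, 1.0, 1.0], "latency_s": [0.8, 1.1, 0.9, 1.0]},
    "experiment-b": {"correctness": [1.0, 1.0, 1.0, 0.0], "latency_s": [0.5, 0.6, 0.7, 0.6]},
}

# Average each metric per experiment -- the kind of summary the
# comparison table headers now surface for each run column.
averages = {
    name: {metric: round(mean(values), 2) for metric, values in metrics.items()}
    for name, metrics in experiment_runs.items()
}
print(averages)
```

With these numbers, both experiments average 0.75 on correctness, so the header averages make the latency difference (0.95 s vs. 0.6 s) the obvious distinction between the two runs.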
Related PR: feat(experiments): display average experiment run data in headers of experiment compare table by axiomofjoy · Pull Request #8737 · Arize-ai/phoenix (GitHub)