This is documentation for the next version of Grafana Tempo. For the latest stable release, go to the latest version.
Model Context Protocol (MCP) Server
Tempo includes an MCP (Model Context Protocol) server that provides AI assistants and Large Language Models with direct access to distributed tracing data through TraceQL queries and other endpoints.
Configuration
Enable the MCP server in your Tempo configuration:
```yaml
query_frontend:
  mcp_server:
    enabled: true
```
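Once enabled, the server speaks MCP over HTTP. A minimal smoke test is to send the JSON-RPC `initialize` request that every MCP session starts with. The sketch below is an illustration only: the endpoint URL, protocol version string, and client name are assumptions, not values taken from Tempo's documentation.

```python
import json
import urllib.request

# Assumed endpoint for a locally running Tempo instance.
MCP_URL = "http://localhost:3200/api/mcp"

# Standard MCP "initialize" handshake message (JSON-RPC 2.0).
# protocolVersion and clientInfo are illustrative assumptions.
request_body = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
    },
}

def initialize(url: str = MCP_URL) -> dict:
    """POST the initialize request and return the decoded response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(request_body).encode(),
        headers={
            "Content-Type": "application/json",
            # Streamable HTTP servers may answer with JSON or an SSE stream.
            "Accept": "application/json, text/event-stream",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

If the server is enabled and reachable, the response should be a JSON-RPC result describing the server's capabilities; a 404 usually means the `mcp_server` block is missing or disabled.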
Warning
Be aware that using this feature will likely cause tracing data to be passed to an LLM or LLM provider. Consider the content of your tracing data and organizational policies when enabling this.
Quick start
To experiment with the MCP server using dummy data and Claude Code:
- Run the local docker-compose example in `/example/docker-compose`. This exposes the MCP server at `http://localhost:3200/api/mcp`.
- Run `claude mcp add -t stdio -s user tempo npx mcp-remote http://localhost:3200/api/mcp` to add a reference to Claude Code.
- Run `claude` and ask some questions.
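You can also script against the endpoint directly instead of going through a client such as Claude Code. The sketch below asks the server which tools it exposes using the standard MCP `tools/list` method (an `initialize` handshake must have happened first on the same session); the URL comes from the quick start above, but everything else is an assumption, not a documented Tempo interface.

```python
import json
import urllib.request

MCP_URL = "http://localhost:3200/api/mcp"  # endpoint from the quick start

def rpc(method: str, params: dict, id_: int = 1) -> bytes:
    """Encode a JSON-RPC 2.0 request body for the MCP endpoint."""
    return json.dumps(
        {"jsonrpc": "2.0", "id": id_, "method": method, "params": params}
    ).encode()

def list_tools() -> dict:
    """Ask the server which tools (e.g. TraceQL query helpers) it offers.

    "tools/list" is a standard MCP method; the exact tool names Tempo
    returns are not assumed here.
    """
    req = urllib.request.Request(
        MCP_URL,
        data=rpc("tools/list", {}),
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json, text/event-stream",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

In practice, clients like Claude Code or Cursor perform this handshake for you via `mcp-remote`; direct JSON-RPC calls are mainly useful for debugging whether the endpoint is up.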
This MCP server has also been tested successfully in Cursor using the `mcp-remote` package.