This repo is a small playground that combines a standard Kedro example project with an MCP server and two different MCP agents.
The underlying project structure was generated with the `kedro new` command using the Spaceflights example. If you want a full walkthrough of Kedro concepts (project layout, pipelines, configuration, CLI, etc.), follow the official Kedro course rather than this README.
Below we focus only on the MCP/MCP-agent pieces: `mcp_server.py`, `agent.py`, and `agent_langgraph.py`.
`mcp_server.py` turns the Kedro project into a Model Context Protocol (MCP) server using `FastMCP`:
- Bootstraps the Kedro project in the current directory using `bootstrap_project`.
- Uses Kedro’s pipeline registry (`kedro.framework.project.pipelines`) to discover available pipelines.
- Exposes Kedro operations as MCP tools, for example:
  - `list_pipelines` – returns names and descriptions for all registered pipelines (descriptions are taken from each pipeline module’s `create_pipeline` docstring when available).
  - `run_pipeline` – runs a selected pipeline inside a `KedroSession`.
- Communicates over stdio, so it can be launched by any MCP-compatible client (including the agents in this repo).
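Put together, a minimal sketch of such a server could look like the following. This is an illustration under assumptions, not a copy of `mcp_server.py`: the real `list_pipelines` also returns the docstring-based descriptions, which the sketch omits for brevity.

```python
from pathlib import Path

from kedro.framework.project import pipelines  # Kedro's pipeline registry
from kedro.framework.session import KedroSession
from kedro.framework.startup import bootstrap_project
from mcp.server.fastmcp import FastMCP

PROJECT_PATH = Path.cwd()
bootstrap_project(PROJECT_PATH)  # load project settings so the registry is populated

mcp = FastMCP("kedro")

@mcp.tool()
def list_pipelines() -> list[str]:
    """Return the names of all registered Kedro pipelines."""
    return sorted(pipelines.keys())

@mcp.tool()
def run_pipeline(pipeline_name: str) -> str:
    """Run the named pipeline inside a fresh KedroSession."""
    with KedroSession.create(project_path=PROJECT_PATH) as session:
        session.run(pipeline_name=pipeline_name)
    return f"Pipeline '{pipeline_name}' finished."

if __name__ == "__main__":
    mcp.run(transport="stdio")  # stdio transport so MCP clients can spawn this process
```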
You typically don’t run `mcp_server.py` by hand; instead, the agents start it using `StdioServerParameters` and `stdio_client` from the MCP Python SDK.
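That spawn-and-connect handshake looks roughly like this (a sketch assuming the server script sits next to the agent; the real agents keep the session open for the whole conversation rather than a single call):

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch mcp_server.py as a subprocess and talk to it over stdio.
    server = StdioServerParameters(command="python", args=["mcp_server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```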
`agent.py` is a simple but fully traced MCP agent:
- Starts `mcp_server.py` as a subprocess over stdio and opens an MCP `ClientSession`.
- Calls `session.list_tools()` to discover the tools exposed by the server.
- Implements a manual tool-calling loop on top of the OpenAI Chat Completions API:
  - Converts MCP tool schemas into OpenAI tool definitions.
  - Sends user prompts and tool definitions to the model.
  - Reads `tool_calls` from the model’s response and executes them on the MCP server with `session.call_tool`.
  - Feeds tool outputs back into the LLM until it returns a final natural-language answer.
- Integrates Langfuse for observability in an “old style” way:
  - Imports `AsyncOpenAI` from `langfuse.openai`.
  - All calls to `self.llm.chat.completions.create(...)` are automatically traced to your Langfuse project, assuming the usual `LANGFUSE_*` and `OPENAI_API_KEY` environment variables are set.
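As a condensed illustration of that loop, here is a sketch under assumptions: the class and method names, the model name, and the exact message plumbing are invented for this example; `agent.py` itself is the authoritative version.

```python
import json

from langfuse.openai import AsyncOpenAI  # drop-in client: every call is traced to Langfuse

class MCPAgent:
    def __init__(self, session, mcp_tools):
        self.session = session  # an initialized MCP ClientSession
        self.llm = AsyncOpenAI()
        # Convert MCP tool schemas into OpenAI tool definitions.
        self.tools = [
            {
                "type": "function",
                "function": {
                    "name": t.name,
                    "description": t.description or "",
                    "parameters": t.inputSchema,
                },
            }
            for t in mcp_tools
        ]

    async def ask(self, prompt: str) -> str:
        messages = [{"role": "user", "content": prompt}]
        while True:
            response = await self.llm.chat.completions.create(
                model="gpt-4o",  # assumed model name
                messages=messages,
                tools=self.tools,
            )
            msg = response.choices[0].message
            if not msg.tool_calls:
                return msg.content  # no tool calls left: final natural-language answer
            messages.append(msg)
            for call in msg.tool_calls:
                # Execute the requested tool on the MCP server and feed the result back.
                result = await self.session.call_tool(
                    call.function.name, json.loads(call.function.arguments)
                )
                messages.append({
                    "role": "tool",
                    "tool_call_id": call.id,
                    "content": result.content[0].text if result.content else "",
                })
```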
This file is the reference implementation if you want to see a working, end‑to‑end integration of MCP + Kedro + OpenAI + Langfuse without any LangChain/LangGraph abstractions.
Run it from an activated virtual environment:

```bash
python agent.py
```

You’ll get an interactive prompt where you can ask questions; the agent will inspect and run Kedro pipelines via MCP tools, and all LLM calls will appear as traces in Langfuse.
`agent_langgraph.py` is an alternative agent built on LangGraph and LangChain. It keeps the same MCP server but delegates orchestration to a graph-based agent:
- Starts and connects to `mcp_server.py` via the MCP stdio client (same as `agent.py`).
- Uses `langchain_mcp_adapters.tools.load_mcp_tools` to turn MCP tools into LangChain tools automatically.
- Creates a ReAct-style agent using LangGraph / LangChain:
  - LLM: `ChatOpenAI` from `langchain_openai`.
  - Tools: the MCP tools returned by `load_mcp_tools`.
- Provides:
  - `process_query(...)` – run a single query through the LangGraph agent.
  - `chat()` – interactive loop with conversation memory (state carries previous messages across turns).
  - `stream_query(...)` – stream partial responses as they’re produced.
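A rough sketch of the wiring, under assumptions (the model name and the entry-point structure are illustrative; `process_query`, `chat`, and `stream_query` wrap something like this):

```python
from langchain_mcp_adapters.tools import load_mcp_tools
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def run_query(query: str) -> None:
    server = StdioServerParameters(command="python", args=["mcp_server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await load_mcp_tools(session)  # MCP tools -> LangChain tools
            agent = create_react_agent(ChatOpenAI(model="gpt-4o"), tools)
            result = await agent.ainvoke({"messages": [("user", query)]})
            print(result["messages"][-1].content)
```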
Run it with:

```bash
python agent_langgraph.py
```

You’ll get a LangGraph-based conversational agent that can still inspect and run Kedro pipelines via MCP.