
kedro-mcp-agent

This repo is a small playground that combines a standard Kedro example project with an MCP server and two different MCP agents.

The underlying project structure was generated with the kedro new command using the Spaceflights example. If you want a full walkthrough of Kedro concepts (project layout, pipelines, configuration, CLI, etc.), follow the official course instead of this README:

https://docs.kedro.org/en/stable/getting-started/course/

Below we focus only on the MCP/MCP-agent pieces: mcp_server.py, agent.py, and agent_langgraph.py.


mcp_server.py – MCP server over a Kedro project

mcp_server.py turns the Kedro project into a Model Context Protocol (MCP) server using FastMCP:

  • Bootstraps the Kedro project in the current directory using bootstrap_project.
  • Uses Kedro’s pipeline registry (kedro.framework.project.pipelines) to discover available pipelines.
  • Exposes Kedro operations as MCP tools (see the sketch after this list), for example:
    • list_pipelines – returns names and descriptions for all registered pipelines (descriptions are taken from each pipeline module’s create_pipeline docstring when available).
    • run_pipeline – runs a selected pipeline inside a KedroSession.
  • Communicates over stdio, so it can be launched by any MCP-compatible client (including the agents in this repo).
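
A minimal sketch of such a server, assuming the Kedro project lives in the current working directory (the docstring-based pipeline descriptions and error handling of the real mcp_server.py are omitted here):

    from pathlib import Path

    from kedro.framework.project import pipelines
    from kedro.framework.session import KedroSession
    from kedro.framework.startup import bootstrap_project
    from mcp.server.fastmcp import FastMCP

    PROJECT_PATH = Path.cwd()
    bootstrap_project(PROJECT_PATH)  # register the project's settings and pipeline registry

    mcp = FastMCP("kedro")

    @mcp.tool()
    def list_pipelines() -> list[str]:
        """Names of all pipelines registered in the Kedro project."""
        return sorted(pipelines.keys())

    @mcp.tool()
    def run_pipeline(pipeline_name: str) -> str:
        """Run one pipeline inside a fresh KedroSession."""
        with KedroSession.create(project_path=PROJECT_PATH) as session:
            session.run(pipeline_name=pipeline_name)
        return f"Pipeline '{pipeline_name}' finished."

    if __name__ == "__main__":
        mcp.run(transport="stdio")  # stdio transport, as used by the agents below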

You typically don’t run mcp_server.py by hand; instead, the agents start it using StdioServerParameters and stdio_client from the MCP Python SDK.
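
Connecting from a client follows the standard MCP SDK pattern. A sketch, assuming the server is launched as python mcp_server.py (the exact command and arguments in the agents may differ):

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main() -> None:
        # Launch mcp_server.py as a subprocess and talk to it over stdio.
        params = StdioServerParameters(command="python", args=["mcp_server.py"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print([t.name for t in tools.tools])

    asyncio.run(main())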


agent.py – MCP agent with Langfuse tracing

agent.py is a simple but fully traced MCP agent:

  • Starts mcp_server.py as a subprocess over stdio and opens an MCP ClientSession.
  • Calls session.list_tools() to discover the tools exposed by the server.
  • Implements a manual tool-calling loop on top of the OpenAI Chat Completions API (see the sketch after this list):
    • Converts MCP tool schemas into OpenAI tool definitions.
    • Sends user prompts and tool definitions to the model.
    • Reads tool_calls from the model’s response and executes them on the MCP server with session.call_tool.
    • Feeds tool outputs back into the LLM until it returns a final natural language answer.
  • Integrates Langfuse for observability in the classic drop-in style:
    • Imports AsyncOpenAI from langfuse.openai instead of the plain openai package.
    • Every call to self.llm.chat.completions.create(...) is then automatically traced to your Langfuse project, provided the usual LANGFUSE_* and OPENAI_API_KEY environment variables are set.
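
The heart of the file is the loop that shuttles messages between the model and the MCP server. A condensed sketch, assuming a session opened as shown earlier; the helper names and the gpt-4o model are illustrative, not the exact code:

    import json

    from langfuse.openai import AsyncOpenAI  # drop-in wrapper that traces every call

    def to_openai_tools(mcp_tools):
        """Convert MCP tool schemas into OpenAI function-calling definitions."""
        return [
            {
                "type": "function",
                "function": {
                    "name": t.name,
                    "description": t.description or "",
                    "parameters": t.inputSchema,
                },
            }
            for t in mcp_tools
        ]

    async def answer(session, messages, tools):
        llm = AsyncOpenAI()
        while True:
            response = await llm.chat.completions.create(
                model="gpt-4o", messages=messages, tools=tools
            )
            msg = response.choices[0].message
            if not msg.tool_calls:           # no tool requests -> final answer
                return msg.content
            messages.append(msg)
            for call in msg.tool_calls:      # execute each requested MCP tool
                result = await session.call_tool(
                    call.function.name, json.loads(call.function.arguments)
                )
                messages.append({
                    "role": "tool",
                    "tool_call_id": call.id,
                    # assumes the tool returned text content
                    "content": result.content[0].text,
                })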

This file is the reference implementation if you want to see a working, end‑to‑end integration of MCP + Kedro + OpenAI + Langfuse without any LangChain/LangGraph abstractions.

Run it from an activated virtual environment:

    python agent.py

You’ll get an interactive prompt where you can ask questions; the agent will inspect and run Kedro pipelines via MCP tools, and all LLM calls will appear as traces in Langfuse.


agent_langgraph.py – LangGraph-based MCP agent (no Langfuse)

agent_langgraph.py is an alternative agent built on LangGraph and LangChain. It keeps the same MCP server but delegates orchestration to a graph-based agent:

  • Starts and connects to mcp_server.py via the MCP stdio client (same as agent.py).
  • Uses langchain_mcp_adapters.tools.load_mcp_tools to turn MCP tools into LangChain tools automatically.
  • Creates a ReAct-style agent using LangGraph / LangChain (see the sketch after this list):
    • LLM: ChatOpenAI from langchain_openai.
    • Tools: the MCP tools returned by load_mcp_tools.
  • Provides:
    • process_query(...) – run a single query through the LangGraph agent.
    • chat() – interactive loop with conversation memory (state carries previous messages across turns).
    • stream_query(...) – stream partial responses as they’re produced.
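
Condensed, the setup looks roughly like this (the gpt-4o model and the prompt are illustrative; create_react_agent comes from langgraph.prebuilt):

    from langchain_mcp_adapters.tools import load_mcp_tools
    from langchain_openai import ChatOpenAI
    from langgraph.prebuilt import create_react_agent
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def demo() -> None:
        params = StdioServerParameters(command="python", args=["mcp_server.py"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await load_mcp_tools(session)  # MCP tools -> LangChain tools
                agent = create_react_agent(ChatOpenAI(model="gpt-4o"), tools)
                result = await agent.ainvoke(
                    {"messages": [("user", "Which pipelines are available?")]}
                )
                print(result["messages"][-1].content)

stream_query(...) presumably builds on the same graph's streaming interface (e.g. agent.astream(...)); the exact wiring lives in the file itself.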

Run it with:

    python agent_langgraph.py

You’ll get a LangGraph-based conversational agent that can still inspect and run Kedro pipelines via MCP.
