
Creating an MCP Server with n8n for Intelligent Automation


The synergy between workflow automation and artificial intelligence is paving the way for increasingly sophisticated and autonomous systems. This article delves into creating a Model Context Protocol (MCP) Server using n8n, a powerful open-source workflow automation tool, and demonstrates how to apply a simple AI-driven automation through this setup. This approach allows Large Language Models (LLMs) and other AI agents to interact with your n8n workflows, opening up a new realm of possibilities for intelligent task execution.

Understanding MCP Servers and n8n’s Role

What is an MCP Server?

The Model Context Protocol (MCP) is an open standard designed to enable AI models, particularly LLMs, to interact seamlessly with external tools, data sources, and systems. An MCP Server acts as this crucial intermediary, exposing specific capabilities or “tools” that an AI can call upon to fetch information or perform actions in the real world. These servers are typically lightweight and focused on providing standardized access to these external functionalities.

How n8n Fits In

n8n is a versatile, node-based workflow automation tool that allows users to connect various applications and services to automate complex processes with minimal or no code. Significantly, n8n can be configured to act as an MCP Server itself or interact with other MCP Servers. This is primarily achieved through:

  • MCP Server Trigger Node: This built-in n8n node allows you to expose an entire n8n workflow as an MCP Server endpoint. When an AI model sends a request to this endpoint, the n8n workflow is triggered, processes the request (potentially using other integrated AI services), and can return a result to the AI.
  • MCP Client Node: Conversely, this node enables your n8n workflows to interact with external MCP Servers, calling upon tools they expose.
  • Standalone n8n-mcp-server: For more advanced scenarios, a dedicated n8n-mcp-server (available on GitHub) can be set up. This server interacts with your n8n instance’s API, allowing AI assistants to manage n8n workflows (list, create, execute, etc.) through natural language.

For this article, we will focus on using n8n’s MCP Server Trigger node to create an MCP Server endpoint directly within an n8n workflow.
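To make the server/client relationship concrete, here is a minimal sketch of what a tool-call payload for such an endpoint might look like. The field names (`tool`, `tool_input`) are assumptions that mirror the structure used later in this article by n8n's MCP Server Trigger; the exact shape depends on your server's configuration.

```python
import json

# Build a hypothetical MCP-style tool-call payload. "tool" and
# "tool_input" are assumed field names matching this article's
# workflow, not a definitive wire format.
def build_tool_call(tool_name: str, tool_input: dict) -> str:
    payload = {
        "tool": tool_name,
        "tool_input": tool_input,
    }
    return json.dumps(payload)

body = build_tool_call("summarizeText", {"textToSummarize": "Long article text..."})
print(body)
```

An AI agent acting as an MCP client would POST a body like this to the server endpoint and read the tool's result from the response.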

Setting Up Your n8n Instance

Before creating an MCP Server, you need a running n8n instance. You have several options:

  • n8n Cloud: The quickest way to get started, managed by n8n.
  • Self-Hosting: Provides full control and data privacy. Common methods include:
    • Docker: The recommended and often easiest method for beginners. n8n provides an official Docker image.
    • npm: Installing n8n globally using Node Package Manager.

Basic Self-Hosting Steps (using Docker):

  1. Install Docker: Download and install Docker Desktop for your operating system (Windows, macOS, or Linux).
  2. Pull the n8n Docker Image: Open your terminal or command prompt and run:
    docker pull n8nio/n8n
  3. Run the n8n Container:
    docker run -it --rm --name n8n -p 5678:5678 -v n8n_data:/home/node/.n8n n8nio/n8n

    This command starts n8n, maps port 5678 on your host to the container, and creates a volume (n8n_data) to persist your workflow data.

  4. Access n8n: Open your web browser and navigate to http://localhost:5678. You’ll be prompted to set up an owner account.
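If you prefer a declarative setup, the `docker run` command above can be captured in a `docker-compose.yml`. This is a sketch assuming the official `n8nio/n8n` image and the default port; adjust to your environment:

```yaml
services:
  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"          # expose the n8n UI and webhooks on the host
    volumes:
      - n8n_data:/home/node/.n8n   # persist workflows and credentials

volumes:
  n8n_data:
```

Run it with `docker compose up -d`; the data volume plays the same role as the `-v n8n_data:/home/node/.n8n` flag in the manual command.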

Creating an MCP Server Endpoint with n8n

Let’s create a simple n8n workflow that acts as an MCP Server. This server will accept a text input, use an AI model (e.g., OpenAI’s GPT or Deepseek) to summarize it, and return the summary.

Workflow Steps:

  1. New Workflow: In your n8n instance, create a new, empty workflow.
  2. Add the MCP Server Trigger Node:
    • Click the + button to add your first node.
    • Search for “MCP Server Trigger” and select it.
    • Configuration:
      • Path: n8n will generate a unique path (e.g., /mcp/my-summary-tool), which you can customize. Combined with your n8n base URL, this path forms your MCP endpoint; for a local setup it will look something like http://localhost:5678/webhook/mcp/my-summary-tool. Because n8n serves these endpoints through its webhook system, copy the exact full URL displayed by the node rather than constructing it by hand.
      • Authentication: For simplicity in this example, you can set it to “None.” In a production environment, you should secure this endpoint using API Key or other authentication methods n8n supports for its webhook nodes.
      • Tools (Optional but good practice for MCP): You can define the “tools” this MCP server provides. Click “Add Tool” and define:
        • Name: summarizeText
        • Description: Summarizes the input text using an AI model.
        • Input Schema (JSON): Define what input this tool expects. For example:
          {
            "type": "object",
            "properties": {
              "textToSummarize": {
                "type": "string",
                "description": "The text that needs to be summarized."
              }
            },
            "required": ["textToSummarize"]
          }
        • Output Schema (JSON): Define the output format. (Optional for this simple example).
  3. Add an AI Node (e.g., OpenAI):
    • Click the + button after the MCP Server Trigger node and search for “OpenAI” (or your preferred AI model provider like Hugging Face, Azure OpenAI, etc.).
    • Authentication: You’ll need to add your OpenAI API key. Click “Create New” under Credentials, give it a name, and paste your API key.
    • Configuration:
      • Resource: Chat
      • Operation: Message
      • Model: Choose a suitable model (e.g., gpt-3.5-turbo).
      • Messages:
        • Role: User
        • Content Type: Text
        • Text: Here, you’ll use an expression to get the input text from the MCP Server Trigger. Click the “Add Expression” button (looks like fx) and navigate to Nodes > MCP Server Trigger > Output Data > JSON > body > tool_input > textToSummarize. The expression will look something like: {{ $json.body.tool_input.textToSummarize }}.
        • You can prepend a prompt instruction, for example: Summarize the following text: {{ $json.body.tool_input.textToSummarize }}
  4. Respond to the MCP Request (Implicit): By default, the MCP Server Trigger node returns the output of the last node connected to it (or the data you explicitly configure in its “Response Data” parameter). In this case, the output from the OpenAI node (the summary) is sent back to the caller.
  5. Activate and Save:
    • Save your workflow.
    • Toggle the workflow to “Active” using the switch at the top left.

Your n8n workflow is now an active MCP Server endpoint! The MCP Server Trigger node will display the URL you need to use to call this “summarizeText” tool.
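To exercise the finished endpoint, a small client can POST the expected input and read back the summary. This is a sketch under assumptions: the URL below is the illustrative local path from earlier, and the `tool`/`tool_input` body shape mirrors the expression used in the OpenAI node; substitute the exact URL your MCP Server Trigger node displays.

```python
import json
import urllib.request

# Assumed endpoint; replace with the URL shown by your MCP Server Trigger node.
ENDPOINT = "http://localhost:5678/webhook/mcp/my-summary-tool"

def build_request(text: str) -> urllib.request.Request:
    """Construct the POST request for the summarizeText tool."""
    body = json.dumps({
        "tool": "summarizeText",
        "tool_input": {"textToSummarize": text},
    }).encode()
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},
    )

def summarize(text: str) -> str:
    """Call the n8n workflow and return the raw response body."""
    with urllib.request.urlopen(build_request(text)) as resp:
        return resp.read().decode()

if __name__ == "__main__":
    print(summarize("n8n is a node-based workflow automation tool..."))
```

If you enabled authentication on the trigger node, add the corresponding header (for example, an API key) to the request before sending it.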