USB in the AI world! An introduction to the MCP protocol

Written by Caleb Hayes
Updated on: July 11, 2025

Recommendation

A new step forward in AI technology, the MCP protocol opens up a wide range of possibilities for connecting models to the outside world.

Core content:
1. Basic concepts and functions of the MCP protocol
2. MCP architecture design and protocol layering
3. Security mechanism and permission control


"When you tell Claude to send the quarterly report to Slack, AI can automatically complete the entire process of reading the file, generating summaries, and sending messages! This is driven by Anthropic's latest MCP protocol - this technology, known as the 'USB-C in the AI ​​world', is triggering a revolution in smart applications..."

This article will give you a comprehensive understanding of MCP from architecture design to code practice: what it is, how it works, what it can do, and how to implement it.


What is MCP


MCP (Model Context Protocol) is an open standard launched by Anthropic that unifies the communication protocol between large language models (LLMs) and external data sources and tools. Its goal is to standardize the way AI interacts with the outside world: through one set of rules, a model can safely and conveniently call local files, databases, API services, and other resources, without separate adapter code having to be written for each data source.

Analogy: Just as the USB-C interface lets phones and computers connect to all kinds of peripherals (keyboards, hard drives, etc.), MCP lets AI models connect seamlessly to different tools (such as Excel, GitHub, and blockchain services).


MCP Architecture

Architecture Definition

MCP follows a client-server architecture:

  • Hosts

Refers to a Large Language Model (LLM) application, such as Claude Desktop or an Integrated Development Environment (IDE). As the initiator of a protocol connection, it is responsible for initializing the communication link with external resources.

  • Clients

Runs inside the host application and maintains a 1:1 exclusive connection with the server. Each client instance is bound to only one server, ensuring session isolation and resource permission control.

  • Servers

A lightweight service program that provides three types of functions to the client through a standardized interface:

  • Context: structured or unstructured data resources (such as files and database records).

  • Tools: executable function interfaces (such as API calls and local operations).

  • Prompts: predefined task templates and interaction logic.

The server supports access to local (such as file systems) and remote (such as cloud services) resources.

MCP architecture (Figure 1)
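
To make these three server-side capabilities concrete, here is a minimal sketch using the FastMCP class from the official Python SDK (the same class used in the Quick Start below). The resource URI, tool, and prompt shown here are made-up examples for illustration, not part of the weather demo.

from mcp.server.fastmcp import FastMCP

# Initialize a demo MCP server
mcp = FastMCP("demo")

# Context: a read-only data resource identified by a (hypothetical) URI
@mcp.resource("config://app-settings")
def app_settings() -> str:
    """Expose application settings as a context resource."""
    return '{"theme": "dark", "language": "en"}'

# Tool: an executable function that the model may call
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

# Prompt: a reusable task template
@mcp.prompt()
def review_code(code: str) -> str:
    """Build a code-review prompt for the model."""
    return f"Please review the following code and point out any problems:\n\n{code}"

if __name__ == "__main__":
    mcp.run(transport="stdio")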

Architecture Details

  • Protocol Layering
  • Protocol layer: the message framing is defined by the JSON-RPC 2.0 specification and covers four message types: requests, notifications, results, and errors (a sketch of these shapes follows this list).

  • Transport layer: supports two communication mechanisms:

  1. Standard input/output (stdio): suited to local inter-process communication.

  2. HTTP with Server-Sent Events (SSE): used for remote communication; the server streams messages to the client over SSE while the client sends requests via HTTP POST.

  • Connection lifecycle management
  • Initialization phase: negotiate the protocol version and supported capabilities through the initialize request.

  • Conversation phase: message exchange supports two modes, request-response and one-way notification.

  • Termination phase: supports explicit shutdown or disconnection on timeout.

  • Security Mechanisms
  • Hierarchical permission control: for example, restricting the LLM to read only files in a specific directory.

  • Operation approval flow: sensitive tool calls require explicit user authorization.

  • Credential isolation: the server manages authentication credentials itself and never exposes them to the client.
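
To ground the protocol layer, the four JSON-RPC 2.0 message shapes look roughly like the following (written here as Python dictionaries; tools/call and notifications/progress are methods defined by the MCP specification, while the concrete arguments are illustrative):

# Request: the client asks the server to invoke a tool and expects a response
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "get_forecast", "arguments": {"latitude": 31.23, "longitude": 121.47}},
}

# Result: a successful response, correlated with the request by its id
result = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "Sunny, 25°C"}]},
}

# Error: a failed response, also correlated by id
error = {
    "jsonrpc": "2.0",
    "id": 1,
    "error": {"code": -32602, "message": "Invalid params"},
}

# Notification: a one-way message with no id and no response expected
notification = {
    "jsonrpc": "2.0",
    "method": "notifications/progress",
    "params": {"progressToken": "abc", "progress": 0.5, "total": 1.0},
}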

Architecture Advantages

  • Modular scalability

Plug-in style development: new capabilities are deployed as independent servers without modifying the host application code.

  • Cross-platform interoperability

The standardized protocol supports multi-language SDKs (Python/JavaScript, etc.) and is adaptable to heterogeneous systems.

  • Real-time interactive capabilities

The two-way communication mechanism allows the LLM to actively trigger server-side operations (such as code submission or device control).


Comparison with Function Call


Function calling is another important tool-use mechanism for AI agents. How exactly does it differ from the MCP protocol? Here is a comparison:

| Comparison Dimension | MCP Protocol | Function Call |
| --- | --- | --- |
| Communication protocol | Standardized JSON-RPC 2.0 or gRPC | Vendor-defined HTTP/WebSocket interface |
| Tool lifecycle management | Supports dynamic loading/unloading (similar to Linux kernel modules) without restarting the service | The model service needs to be restarted to update tools |
| Cross-platform capability | The tool service can be deployed in any environment that supports the protocol (local/remote/containerized) | Limited to the model vendor's runtime environment (such as OpenAI's specific function library) |
| Security boundary | The tool runs in an independent process/container, isolating execution risk through sandboxing | The tool and model share process memory space, which may lead to the risk of API key leakage |

Intuitively, the difference is similar to that between microservices in a microservice architecture and in-process functions in a monolithic application.
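
To make the contrast concrete, here is a rough sketch of the two styles. The first snippet shows the JSON-schema-style tool definition that function-calling vendors such as OpenAI expect (field names follow that convention), while the second registers the same capability on a standalone MCP server; both are illustrative sketches rather than drop-in code.

# Function calling: the tool is described to the model as a JSON schema and
# executed inside the host application's own process.
function_call_style = {
    "name": "get_forecast",
    "description": "Get the weather forecast for a location",
    "parameters": {
        "type": "object",
        "properties": {
            "latitude": {"type": "number"},
            "longitude": {"type": "number"},
        },
        "required": ["latitude", "longitude"],
    },
}

# MCP: the same capability lives in a separate server process and is
# discovered and invoked over the standardized protocol.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

@mcp.tool()
def get_forecast(latitude: float, longitude: float) -> str:
    """Get the weather forecast for a location."""
    return "Sunny, 25°C"  # placeholder result for illustration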


Application Scenarios


The MCP protocol can turn large models into "executors". The following are some common application scenarios:

  • File service: a Claude model can organize files in a specified directory.

  • Data query: through an MCP server, the model can query specified data sources such as MySQL, Elasticsearch, or MongoDB (a minimal sketch follows this list).

  • Web search: by wrapping a search engine in an MCP server, the large model gains the ability to search the web.

In theory, within an enterprise, all systems can be interconnected through the MCP protocol.
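
As a sketch of the data query scenario, the server below wraps a single read-only SQL query as an MCP tool. The database file, table, and column names are hypothetical, and sqlite3 simply stands in for whatever database the server actually fronts.

import sqlite3

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("orders-db")

@mcp.tool()
def query_recent_orders(limit: int = 10) -> str:
    """Return the most recent orders as plain text (read-only)."""
    # Hypothetical local database file; a real server would manage its own
    # credentials and never expose them to the client.
    conn = sqlite3.connect("orders.db")
    try:
        rows = conn.execute(
            "SELECT id, customer, total FROM orders ORDER BY id DESC LIMIT ?",
            (limit,),
        ).fetchall()
    finally:
        conn.close()
    return "\n".join(f"#{row[0]} {row[1]} {row[2]:.2f}" for row in rows)

if __name__ == "__main__":
    mcp.run(transport="stdio")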


Quick Start


Next, we will use Python to develop a simple weather query tool based on the MCP protocol and call it from the Cursor client, to give you an intuitive feel for what MCP can do.

1. Install the Python package management tool uv
# Install uv
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
# Verify uv
uv --version
2. Initialize the project
# Initialize the project directory and use the Python 3.12.0 environment
uv init weather --python 3.12.0
cd weather
# Create a virtual environment
uv venv
# Activate the virtual environment
.venv\Scripts\activate
# Install project dependencies
uv add mcp[cli] httpx
# Create the weather.py file
new-item weather.py
3. Write the weather query tool server code
# Use the QWeather (Hefeng Weather) API to query the weather
from typing import Any

import httpx
from mcp.server.fastmcp import FastMCP

# Initialize the FastMCP server
mcp = FastMCP("weather")

# Constants
API_BASE = "https://devapi.qweather.com/v7"
API_KEY = "your api key"


async def query_weather(url: str) -> dict[str, Any] | None:
    """Make a request to the QWeather API with proper error handling."""
    headers = {
        "X-QW-Api-Key": API_KEY,
    }
    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(url, headers=headers, timeout=30.0)
            response.raise_for_status()
            return response.json()
        except Exception:
            return None


@mcp.tool()
async def get_forecast(latitude: float, longitude: float) -> str:
    """Get the weather forecast for a location.

    Args:
        latitude: latitude of the location
        longitude: longitude of the location
    """
    # QWeather expects the location parameter as "longitude,latitude"
    weather_url = f"{API_BASE}/weather/7d?location={longitude},{latitude}"
    weather_data = await query_weather(weather_url)

    if not weather_data or "daily" not in weather_data:
        return "Unable to fetch forecast data for this location."

    forecasts = []
    for period in weather_data["daily"]:
        forecast = f"""
{period['fxDate']} {period['textDay']}:
Temperature: {period['tempMin']}~{period['tempMax']}°C
Wind: {period['windSpeedDay']} {period['windDirDay']}
"""
        forecasts.append(forecast)

    return "\n---\n".join(forecasts)


if __name__ == "__main__":
    # Initialize and run the server over stdio
    mcp.run(transport="stdio")

4. Verify that the service is correct
uv run weather.py

If no error is reported, the server code is at least starting correctly; next we will test actual calls from Cursor.

5. Register the MCP server in Cursor
6. Add the MCP server parameters

This demonstration runs a local server, so the type is command (a follow-up article will explore remote servers). In the Command field, enter the command that starts weather.py, taking care to use the full path of the directory containing weather.py:

uv --directory G:\\project\\mcp\\weather run weather.py

If everything goes well, the MCP server is now registered with the Cursor client.

7. Use the large model to validate the weather query tool

In Cursor's Composer, select agent mode and enter "What's the weather like in Shanghai?" in the chat box; the model will prompt you to run the weather query tool. Click Run and wait for the output.

As shown in the figure, the large model correctly calls the weather query tool and organizes the results into a concrete weather report. With that, a simple weather query tool is complete.

Calling process


Conclusion


This article introduced the core concepts and architecture of the MCP protocol, compared it with function calling, and walked through a hands-on weather query demo. I hope it helps you build a solid understanding of MCP and start thinking about how to apply it to your own scenarios. Thanks for reading.