From 0.1 to ∞: The technical humility and ecological ambition of the MCP protocol

MCP protocol: an expansion dock for AI models, opening a new era of technology.
Core content:
1. MCP protocol definition and function overview
2. Case analysis of MCP in actual application scenarios
3. New opportunities for MCP workflow and middleware
MCP, the Model Context Protocol, is an open protocol launched by Anthropic at the end of 2024 to standardize the way applications provide context to large language models.
If you imagine an MCP server as an expansion dock for an LLM, then the MCP protocol is its USB-C interface. Just as USB-C provides a standardized way for devices to connect to various peripherals and accessories, MCP provides a standardized way for AI models to connect to different data sources and tools.
Docking stations such as MCP exist because an LLM's parameters encode a wealth of general knowledge but typically lack two kinds of information:
1. Private content: an LLM cannot access your private data, such as files in the file system, orders in a database, or text in private wikis and notes.
2. Real-time information: without a network connection, an LLM cannot obtain current stock prices, the latest financial reports, tomorrow's weather forecast, or breaking technology news.
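In practice, a host application discovers these docking stations through a configuration file. A minimal sketch in the style of Claude Desktop's `claude_desktop_config.json` (the server entry and the local path shown here are illustrative assumptions):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/notes"]
    }
  }
}
```

Each entry tells the client how to launch one MCP server; once launched, the server's tools become callable by the model.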
1. Current popular application scenarios of MCP
Case 1: Claude's desktop application. The following shows how the Resend MCP server works with multiple MCP clients.
Give the model an instruction to send an email to a specified address. Through the MCP client, the model checks which tools are available; finding that send_email can fulfil the request, it initiates a call to the send_email tool and completes the instruction.
Case 2: LLM + Amap MCP server for travel planning.
The model calls each MCP tool in turn to obtain the location and nearby-hotel information mentioned in the instruction, thereby completing the travel-planning task.
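A toy version of this chaining might look like the following; the tool names (`search_location`, `search_nearby_hotels`) and their return values are hypothetical stand-ins for the Amap MCP server's actual tools:

```python
def search_location(keyword: str) -> dict:
    """Stub: resolve a place name to coordinates (would be an MCP tool call)."""
    return {"name": keyword, "lng": 116.397, "lat": 39.908}

def search_nearby_hotels(lng: float, lat: float) -> list[dict]:
    """Stub: hotels near a coordinate (would be an MCP tool call)."""
    return [
        {"name": "Hotel A", "distance_m": 350},
        {"name": "Hotel B", "distance_m": 800},
    ]

def plan_trip(destination: str) -> str:
    # Step 1: the LLM asks for the destination's coordinates.
    loc = search_location(destination)
    # Step 2: with the coordinates, it asks for nearby hotels.
    hotels = search_nearby_hotels(loc["lng"], loc["lat"])
    nearest = min(hotels, key=lambda h: h["distance_m"])
    return f"Stay at {nearest['name']} ({nearest['distance_m']} m from {loc['name']})."

print(plan_trip("Tiananmen"))
```

The point of the sketch is the chaining: the output of one tool call becomes the input of the next, with the LLM deciding the sequence.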
2. MCP Workflow
1. After startup, the client reads its configuration file, connects to the server, and obtains the tool list according to the protocol.
2. Unlike traditional question answering or reasoning, when MCP tools are available, the list of available tools is sent along with the user's question.
3. The LLM decides whether a tool call is needed to complete the task and returns that decision to the client.
4. If a tool call is requested, the client fills in the parameters and invokes the tool on the server, following the LLM's instructions and the calling convention specified by MCP.
5. The client sends the call result back to the LLM, which organizes the final answer.
MCP workflow:
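Concretely, the client-server exchange uses JSON-RPC 2.0 with methods such as `tools/list` and `tools/call`. A sketch of the core messages, with fields trimmed to essentials (consult the MCP specification for the full handshake):

```python
import json

# 1. After connecting, the client asks the server what tools it exposes.
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# 2. The server answers with its tool list, which the client forwards
#    to the LLM alongside the user's question.
list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"tools": [{"name": "send_email", "description": "Send an email"}]},
}

# 3. If the LLM decides a tool is needed, the client issues tools/call
#    with the arguments the LLM chose, then feeds the result back to it.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "send_email", "arguments": {"to": "user@example.com"}},
}

for msg in (list_request, list_response, call_request):
    print(json.dumps(msg))
```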
3. MCP+Middleware
MCP brings new development opportunities to middleware.
Nacos released MCP Registry, which enables upgrading existing application interfaces to the MCP protocol with zero code changes: https://mp.weixin.qq.com/s/MuK-YTVhuBqPzH7iz6Ep7A
mcp-kafka service:
https://github.com/kanapuli/mcp-kafka/blob/main/static/demo.gif
The large model can use the mcp-kafka tools to create topics, produce messages, and consume messages from Kafka.
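As a hedged illustration of what such a call looks like on the wire (the tool name `create_topic` and its argument names are assumptions; mcp-kafka's actual tool schema may differ):

```python
import json

# Hypothetical tools/call to an mcp-kafka-style server asking it to
# create a Kafka topic on the model's behalf.
create_topic_call = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_topic",
        "arguments": {"topic": "orders", "partitions": 3, "replication_factor": 1},
    },
}
print(json.dumps(create_topic_call))
```

From the model's perspective, operating Kafka becomes just another tool call, identical in shape to sending an email or querying a map.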
Conclusion
MCP may be a small step in the development of technology, but it leaves enormous room for imagination and may prove a turning point for AI. I believe that as more and more MCP servers are built, they will extend the capabilities of hardware such as the Rabbit R1.