MCP and RAG: just go all in on MCP's Tools!

How the MCP protocol turns an LLM from a "talker" into a "doer", and why Tools end up as the protagonist in real projects.
Core content:
1. A detailed analysis of the three core concepts of MCP: Resources, Tools, and Prompts
2. Analysis of the advantages and limitations of Tools in real projects
3. Discussion of the potential and ecosystem maturity of Prompts and Resources
Deconstructing the three core concepts of MCP: how to turn an LLM from an "empty talker" into a "doer"
1. Resources (resource manager) - install a "memory plug-in" for the LLM
Core role:
- RAG-style data provision: like a librarian, Resources tell the LLM **"what data is available here"** (databases, folders, APIs, etc.).
- Dynamic notification mechanism: when a resource is updated, the client is notified proactively (for example, a newly added product document is synchronized to the LLM in real time).
Typical scenarios:
- Enterprise knowledge-base Q&A (automatically fetches the latest version of a manual)
- Real-time data queries (stocks, weather, and other dynamic information)
"Resources allows LLM to no longer rely on training data, but to call up the latest information at any time."
2. Tools (toolset) - teach the LLM to get "hands-on"
Easy to skim past, but crucial:
- Standardized tool calls: Python functions, APIs, and so on are wrapped via the MCP protocol. The LLM only needs to say "help me do X" without worrying about implementation details.

Example:

```yaml
# Tool declared via the MCP protocol (YAML format)
- name: "send_email"
  endpoint: "http://api.example.com/mail"
  params: ["recipient", "subject", "content"]
```
"Tools are the 'hands' of LLM, turning 'I want to send an email' into a real action."
3. Prompts (template master) - write a "script" for the LLM
What makes it special:
- Not an ordinary prompt, but a reusable interaction template, like a film's storyboard.
- The server predefines the reasoning flow (such as a "five-step troubleshooting method"); the client only fills in parameters.

Workflow example:
1. The server defines the prompt template: "You are a customer service representative. Please answer questions about {product} in a friendly tone, refer to {resources}, and finally ask the user if they need further help."
2. The client fills in the parameters when calling:

```python
prompt = get_prompt("Customer Service Template", product="iPhone15", resources="Latest Product Manual")
```
Why is it included in the MCP protocol?
- Standardized interaction: no need to redesign prompts every time; template structure and parameters are agreed on through the protocol.
- Linkage with Tools: for example, embedding `{{call tool=search_docs}}` in a Prompt can directly trigger a tool call.
"Prompts is the 'script' of LLM, turning free play into controllable industrial production."
Tool's "dominance": Why do people only rely on Tool in real projects?
Core conclusions:
- Prompts and Resources are great in theory, but in practice Tools are sufficient and often more flexible.
- Cherry (an MCP reference implementation) did not take Prompts/Resources seriously, which led everyone to assume that "Tools are everything".
- Prompts and Resources still have potential, but the ecosystem is not mature yet.
1. Why are Prompts and Resources left out?
(1) Prompts: great on paper, bleak in practice
Theory: predefine interaction templates and let the LLM follow the script.
Reality:
- LLMs themselves are strong at understanding context and can handle ad-hoc prompts just fine.
- Business needs are too dynamic: requirements change constantly, and fixed templates can limit performance.
- Cherry's implementation is very bare-bones, which makes everyone think "it's easier to just write a Tool".
✅ The Tool alternative: wrap common prompt logic in a tool, for example:

```python
# Assumes an `llm` client object with a .run() method is already configured.
def generate_response(prompt_template, **kwargs):
    return llm.run(prompt_template.format(**kwargs))
```

The effect is the same, but freer, with no dependency on MCP's Prompts mechanism.
(2) Resources: Another name for RAG
Theory: dynamic data sources let the LLM fetch the latest information in real time.
Reality:
- RAG is mature enough and can be implemented directly with a vector DB plus a retrieval API.
- Cherry's Resources implementation is too simple and has no advantage over traditional RAG.
- A Tool can wrap the data query directly, for example:

```python
# Assumes `vector_db` and `format_docs` are provided by your RAG stack.
def query_knowledge_base(question):
    docs = vector_db.search(question)
    return format_docs(docs)
```

This is more controllable and does not depend on MCP's Resources mechanism.
2. Why Tools can "dominate the market"
(1) Highest flexibility
Tools can simulate both Prompts and Resources:
- Need a fixed interaction pattern → wrap it as a Tool (such as `customer_service_prompt`).
- Need dynamic data → wrap it as a Tool (such as `query_latest_news`).
This skips MCP's extra abstractions and keeps control in plain code, which fits how engineers think.
(2) Better ecosystem support
- Frameworks such as LangChain, LlamaIndex, and AutoGen are all designed around Tools, while MCP's Prompts/Resources look "non-mainstream".
- Cherry (an MCP reference implementation) did not take Prompts/Resources seriously, which led everyone to assume that "Tools are enough".
(3) Real-project requirements fit Tools better
- Enterprise applications need precise control over processes, and Tools are more reliable than Prompts.
- Complex logic, e.g. "query the database first, then call an API, then generate a report", is more intuitive as a composition of Tools than as Prompts (see the sketch below).
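A minimal sketch of that kind of composition; all three step functions are hypothetical stand-ins for the real database, API, and report code:

```python
# Hypothetical tool chain: every step is an ordinary function, so the
# "database -> API -> report" sequencing lives in plain, reviewable code.
def fetch_orders(customer_id: str) -> list[dict]:
    # Stand-in for a real SQL query.
    return [{"id": 1, "customer": customer_id, "item": "widget"}]

def enrich_with_shipping(orders: list[dict]) -> list[dict]:
    # Stand-in for a shipping-status API call.
    return [{**order, "shipping": "in transit"} for order in orders]

def render_report(orders: list[dict]) -> str:
    # Stand-in for report generation.
    lines = [f"- order {o['id']}: {o['item']} ({o['shipping']})" for o in orders]
    return "Weekly report:\n" + "\n".join(lines)

def weekly_report(customer_id: str) -> str:
    """Compose the three tools in a fixed, testable order."""
    return render_report(enrich_with_shipping(fetch_orders(customer_id)))

print(weekly_report("acme"))
```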
3. But Prompts and Resources still have potential
(1) Prompts: with standardization, duplicated work could shrink
They may evolve into "reusable skill packs", e.g. a `customer service script template`, a `code review template`, and so on, called directly instead of rewritten every time. That requires a more capable MCP implementation (Cherry is not there yet).
(2) Resources: If dynamic subscription can be achieved, it will be more flexible than traditional RAG
Ideal scenario: the LLM could "subscribe" to data sources and receive updates in real time (e.g. stock quotes pushed automatically), as the sketch below illustrates. But for now RAG + Tools is sufficient, so there is little motivation to use MCP's Resources.
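To make the "subscription" idea concrete: the shape is a plain observer pattern. This generic sketch is not the MCP wire protocol (all names are ours); it only shows the push model being described:

```python
from typing import Callable

class QuoteFeed:
    """Toy push-based data source: subscribers receive updates without polling."""

    def __init__(self) -> None:
        self._subscribers: list[Callable[[str, float], None]] = []

    def subscribe(self, callback: Callable[[str, float], None]) -> None:
        self._subscribers.append(callback)

    def publish(self, symbol: str, price: float) -> None:
        # Push each update to every subscriber as it happens.
        for callback in self._subscribers:
            callback(symbol, price)

feed = QuoteFeed()
# In the MCP vision, the client would forward such updates into the LLM's context.
feed.subscribe(lambda symbol, price: print(f"update: {symbol} -> {price}"))
feed.publish("AAPL", 213.50)
```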
Practical advice: focus on Tools, wait and see on Prompts/Resources
- Prioritize Tools for implementing features: flexible, controllable, and backed by a mature ecosystem.
- If you find a lot of repeated prompt logic, consider wrapping it as MCP Prompts (though Cherry may not be capable enough yet).
- If you need dynamic data, use RAG + Tools directly; don't wait for MCP Resources.
Summary:
- Now: Tools are king; Prompts/Resources have not really shown their value yet.
- Future: if the MCP ecosystem matures, Prompts/Resources may become an "advanced play", but for now Tools solve 99% of the problems.
(So don't overthink it: go all in on Tools first!)