Cherry Replaces Manus: AI Coordinates Multiple MCPs to Process Excel Locally and Generate Visual Reports

Use Cherry instead of Manus to implement a new method for AI to locally process Excel and generate visual reports.
Core content:
1. The processing logic of AI Agents and why tool combination matters
2. What Cherry Studio does and how to install MCP servers in it
3. A complete walkthrough of using Cherry and MCP to produce an Excel data analysis report
Yang Fangxian
Founder of 53A/Most Valuable Expert of Tencent Cloud (TVP)
Manus is arguably the hottest thing in the Agent space, but many people still can't use it because of network restrictions and its pitifully small credit allowance.
The processing logic of an AI Agent boils down to this: the AI plans what needs to be done based on the user's request, then keeps calling different tools until the job is done.
With that logic in mind, realizing any given scenario is just a matter of finding the right combination of tools for the AI to call.
As for the tools, plenty of mature MCP servers already exist and can be called directly from an AI conversation.
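To make that loop concrete, here is a minimal, purely illustrative Python sketch of the plan-then-call pattern; run_agent, llm_plan_next_step, and tools are hypothetical placeholders, not the API of any particular framework.

# Illustrative agent loop: the model plans a step, we execute the chosen tool,
# feed the result back, and repeat until the model produces a final answer.
# All names here are hypothetical placeholders, not a real framework's API.
def run_agent(user_request, llm_plan_next_step, tools):
    context = [{"role": "user", "content": user_request}]
    while True:
        step = llm_plan_next_step(context)            # model decides: call a tool, or answer
        if step["type"] == "final_answer":
            return step["content"]
        result = tools[step["tool"]](**step["args"])  # invoke the chosen tool (e.g. an MCP server)
        context.append({"role": "tool", "name": step["tool"], "content": str(result)})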
Couldn't we hand-roll a "Manus" of our own, locally?
Let's just do it, with a classic AI data analysis scenario: analyze an Excel table, then automatically generate a data analysis report.
Previously I shared a visual data analysis workflow built with Dify, but for many readers a workflow is still too hard, so this time let's try the simplest possible approach, pure conversation plus MCP, and see how it goes.
Where do we build it? Here I recommend a free, open-source AI tool: Cherry Studio.
What is Cherry?
It appeared in an example I shared before: ArXiv MCP hand-holding tutorial | DeepSeek becomes a paper-study powerhouse, letting AI automatically search for, read, and download papers to achieve "paper freedom".
It can call almost any large language model via API, build a local knowledge base, generate charts, output results in various formats, and more; it's very comprehensive.
More information about Cherry, including installation, can be found on the official website: https://www.cherry-ai.com/
For complete beginners, the only hurdle may be obtaining an API key for a large language model; providers such as SiliconFlow, Volcano Engine, or OpenRouter are recommended, and you can search for how to sign up. If you want to move from beginner to power user of AI, learning to call APIs is a must.
MCP Installation Preparation
What exactly is MCP? See the introduction I wrote earlier, which I won't repeat here: one article to clarify Agent, MCP, and Function Call, with hands-on code examples.
Finding MCPs
Next, you need to figure out where to find the MCP servers.
You can browse MCP marketplaces such as the ModelScope community (https://www.modelscope.cn/) and https://mcp.so/.
Taking the former as an example: open the "MCP Plaza" and search by keyword.
In today's case, we need to use at least 4 MCP Servers:
1. excel-mcp-server
Lets the AI read and write Excel files.
2. sequential-thinking
Forces the AI into step-by-step deep thinking, which helps with the shallow or incomplete reasoning many large language models produce.
3. quickchart-mcp-server
Lets the AI generate visualization charts directly.
4. filesystem
Lets the AI read and write files on the local computer. Here I simply use Cherry's built-in one; AI programming tools (such as Cursor) generally ship with this capability, so a separate setup is usually unnecessary.
Since Cherry is, after all, a chat tool, you do need to configure an allowed file path for it; you can't just let it rummage through arbitrary folders on your computer (a standalone config sketch follows below if you'd rather install it explicitly).
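If you'd rather install the filesystem tool explicitly instead of using Cherry's built-in one, a JSON import along these lines should work; it uses the reference @modelcontextprotocol/server-filesystem package, and the path is a placeholder you must replace with the folder you want to allow:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "C:/path/to/your/allowed-folder"
      ]
    }
  }
}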
Installing MCP
How to install the MCP Server in Cherry?
As shown in the screenshot, click Settings - MCP Server - Add Server - Import from JSON.
A box will pop up; paste the configuration of the corresponding tool into it:
This is for excel-mcp-server
{
  "mcpServers": {
    "excel": {
      "command": "npx",
      "args": ["--yes", "@negokaz/excel-mcp-server"],
      "env": {
        "EXCEL_MCP_PAGING_CELLS_LIMIT": "4000"
      }
    }
  }
}
This is the config for quickchart-mcp-server:
{
  "mcpServers": {
    "quickchart-server": {
      "command": "npx",
      "args": ["-y", "@gongrzhe/quickchart-mcp-server"]
    }
  }
}
If you get an error after pasting it in, just open the server entry again and switch it on; it will then install automatically.
The other two I found directly through Cherry's built-in search, and they installed automatically.
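In case the built-in search doesn't surface sequential thinking for you, importing the official reference server via JSON should also work; this sketch assumes the @modelcontextprotocol/server-sequential-thinking package, so verify the name against the marketplace listing you use:

{
  "mcpServers": {
    "sequential-thinking": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-sequential-thinking"]
    }
  }
}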
With that, the 4 MCP tools we need are installed; make sure each one shows green before moving on.
Data Preparation
Here I use an order table that I previously used for testing SQL queries. Move the Excel file into the path configured for the filesystem tool; Cherry can only operate inside that folder.
Large Model Selection
Because this case runs purely on MCP, and MCP is really a test of a large model's ability to orchestrate tools, you need a model with solid function-calling ability, preferably one that is also good at coding (coding ability usually goes hand in hand with orchestration ability), such as Gemini 2.5 Pro, Claude 3.5, Grok 3, or DeepSeek V3.
Second, try to pick a model with a long context window; otherwise the model will start throwing errors once the Excel data gets larger.
For example, if my table holds 3 years of data and I ask the AI to count the orders for each month, it ends in tragedy.
That is the model's context-length ceiling at work. Grok's is relatively long; for anything longer you basically have to turn to Gemini.
But Gemini 2.5 Pro isn't free, and 2.5 Flash, which I did try, was outrageous: it ignored the data entirely and just made numbers up for me!
Here I test with Grok 3.
Starting a Dialog
Select an agent and enable MCP
Create a new conversation in Cherry. You can search for "data analysis" in the agent marketplace, where you'll find agents that come pre-configured with prompts.
Choose a suitable one and enter the conversation, as shown below:
Below the input box, click the MCP settings and select the MCP tools we need, so the AI can use them within this conversation.
Finally, start the data analysis
First, let's simply ask what data the table contains, to test the Excel MCP's capabilities.
You can see that it first calls the filesystem tool to list the files in the folder and locate the Excel file we specified.
It then calls the excel-mcp-server tool to read the Excel file and give the answer.
No problem, let’s move on to the next step and complete the Excel analysis and visualization report in one step:
Please make a data analysis report HTML based on this data, requiring as many dimensions and chart types as possible, and save the finished html file locally.
The answer is too long, so I'll only show excerpts; you can see it starting to call quickchart-mcp to produce charts:
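For context, QuickChart renders charts from a Chart.js-style config encoded into an image URL (https://quickchart.io/chart?c=...), so the chart definitions that end up behind the image URLs embedded in the HTML look roughly like this; the labels and numbers below are made-up placeholders, not my actual order data:

{
  "type": "bar",
  "data": {
    "labels": ["Jan", "Feb", "Mar"],
    "datasets": [{ "label": "Orders", "data": [120, 95, 140] }]
  }
}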
Eventually, after producing seven charts, it assembles the HTML:
At this point an HTML file appears in the folder we configured for Cherry earlier.
Double-click to open it, and you'll see the various charts built from the Excel data.
With that, we have chained multiple MCP calls in Cherry and generated a data analysis report.
Admittedly the final report is rather ugly, but that's because I wanted to show the logic and didn't bother polishing the prompt; you can refer to the HTML prompt tips I shared before:
The ultimate way to have AI produce HTML: one set of prompt templates covering PPTs, resumes, high-fidelity prototypes, knowledge cards, dynamic interactive components, and more!
Support for running in Python
Besides the approach above, data analysis in Cherry can also tap into its Python capabilities.
Yes, that's right, it can run Python code directly.
That means that where we previously had to copy the AI's code into an editor to run it, we can now run it directly inside Cherry.
Let's see how it works. For example:
Count the number of orders per month for me; just give me the Python code.
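For reference, the code it returns is typically along these lines; note that the file name orders.xlsx and the order_date column are assumptions about my table used purely for illustration, so adjust them to your own data:

# Count orders per month from an Excel order table.
# NOTE: "orders.xlsx" and the "order_date" column name are assumed placeholders.
import pandas as pd

df = pd.read_excel("orders.xlsx")                    # reading .xlsx requires openpyxl
df["order_date"] = pd.to_datetime(df["order_date"])  # ensure the date column is datetime
orders_per_month = (
    df.groupby(df["order_date"].dt.to_period("M"))   # bucket rows by year-month
      .size()
      .rename("order_count")
)
print(orders_per_month)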
At this point you won't find anywhere in the reply to run the Python code; a bit of configuration is needed first.
Open "Settings" on the left, find the code block settings, and enable the "Code Execution" function.
At this point, the button to run the code appears on the right side of the code.
However, it is important to note that since Python is running in Cherry's sandbox environment, it is not actually interoperable with your local computer.
Does that mean the local-file-processing approach above won't work here? Right, it won't, but other Python code that doesn't touch local files runs fine.
After all, this is a brand-new feature; genuine local-file interaction will have to wait for Cherry to iterate.
Problems
OK, that's it for today's example.
But realistically, this setup is only suitable for playing with small datasets.
And remember what I said last time? It's not even really a context-length problem: letting the AI do the computation itself directly runs into a serious hallucination problem.
The right approach is to have the AI produce the processing steps, such as Python code, and then run them ourselves.
That is exactly what the Dify workflow I shared last time implemented: using AI to build a slick HTML data dashboard while keeping the data reliably updated, something that actually holds up in practice!
Nonetheless, building today's case from scratch and getting familiar with tools like Cherry and MCP should deepen your command of AI another notch.