Configuring Cherry Studio to use a local Ollama connection: the mistral dialogue model and the bge-m3 embedding model

Written by Silas Grey
Updated on: June 23, 2025

An in-depth look at the seamless connection between Cherry Studio and Ollama models, making local AI dialogue more convenient!

Core content:
1. How to configure Cherry Studio to work with Ollama models
2. Applications and advantages of local models in Cherry Studio
3. Practical guidance and suggestions for technical beginners


In yesterday's article, we covered how to download and install Ollama from its official website, configure the Windows environment variables, open the firewall port, and finally download and run two models: "mistral-small3.1" (an LLM with reasoning dialogue, vision, and tool-calling capabilities) and "bge-m3" (an embedding model suitable for knowledge bases). With everything in place, let's see how to configure Cherry Studio, the "AI assistant" or "AI dialog box" that users interact with directly, to work with the underlying models.
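Before touching Cherry Studio, it helps to confirm that the Ollama service from yesterday's setup is actually reachable. Here is a minimal sketch in Python, assuming Ollama is running on its default port 11434; the server answers a plain-text status message on its root path:

```python
# Minimal reachability check for a local Ollama server (default port assumed).
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listening address

with urllib.request.urlopen(OLLAMA_URL) as resp:
    # Ollama replies with a plain-text status on its root path.
    print(resp.read().decode())  # expected output: "Ollama is running"
```

If this prints "Ollama is running", the address you are about to give Cherry Studio is valid.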

First, open "Settings" in Cherry Studio, go to "Model Service", and select "Ollama". On the detailed configuration page, fill in "http://localhost:11434" as the "API Address" (this is the address where the model serves its interface), then click "Manage".
This opens the Ollama model configuration page, where you will see the "bge-m3:latest" and "mistral-small3.1:latest" models that were downloaded earlier. Simply click the "+" sign to the right of each one to add it.
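If a model you expect does not appear on this page, you can cross-check what Ollama itself reports. A sketch using Ollama's /api/tags endpoint, which returns the same inventory of locally installed models that Cherry Studio reads:

```python
# List the models installed in the local Ollama instance via /api/tags.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    data = json.load(resp)

for model in data["models"]:
    print(model["name"])  # e.g. "bge-m3:latest", "mistral-small3.1:latest"
```

Any model missing from this list simply has not been pulled yet and will not show up in Cherry Studio either.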
Note that a knowledge base built earlier with the API-key method cannot have its embedding model swapped out afterwards; you have to create a new knowledge base instead. When creating it, select Ollama's "bge-m3:latest" model in the "Embedding Model" field.
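For context, this is roughly what the knowledge base does behind the scenes when it indexes a document: each text chunk is sent to bge-m3, which returns an embedding vector. A hedged sketch against Ollama's /api/embeddings endpoint (the sample text is just an illustration):

```python
# Ask bge-m3 for an embedding vector via Ollama's /api/embeddings endpoint.
import json
import urllib.request

payload = json.dumps({
    "model": "bge-m3:latest",
    "prompt": "Cherry Studio connects to local Ollama models.",  # sample text
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/embeddings",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    embedding = json.load(resp)["embedding"]

print(len(embedding))  # bge-m3 produces 1024-dimensional dense vectors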

Finally, go back to Cherry Studio's settings and set the "Default Model" to the "mistral-small3.1:latest" model deployed via Ollama.
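With the default model switched over, every conversation in the dialog box is now answered by the local model. If you ever want to sanity-check it outside the app, here is a minimal sketch of the same kind of request, sent directly to Ollama's /api/chat endpoint:

```python
# Send one dialogue turn to the locally deployed model via /api/chat.
import json
import urllib.request

payload = json.dumps({
    "model": "mistral-small3.1:latest",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "stream": False,  # return the full reply at once instead of streaming
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)["message"]["content"]

print(reply)
```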

Now you can enjoy the convenience of a local model in the Cherry Studio dialog box, free of restrictions.

My articles include plenty of step-by-step screenshots, and the content is kept deliberately easy to follow. The reason is simple: tools only deliver efficiency when ordinary people know how to use them, so I write with the ordinary reader's level of understanding in mind. Technical experts are welcome to skip those parts...