Obsidian AI Best Practices: OpenRouter Integrates 90+ Large Models with Zero Barrier to Entry

Obsidian and OpenRouter combine into a new intelligent knowledge-management workflow that lets you call 90+ large models seamlessly from your local notes.
Core points:
1. The OpenRouter gateway aggregates 93+ large models behind one interface, enabling seamless cross-model workflows
2. Five-minute setup, visual parameter tuning, and free credits
3. With the chatgpt-md plugin, conversation records are embedded directly in Markdown notes, doubling efficiency
Are you still switching AI platforms by hand to process your notes? The Obsidian + OpenRouter combination lets you call 93+ large models such as Claude and GPT-4 seamlessly from your local vault. Five minutes of configuration gets you automatically connected cross-model workflows, visual parameter tuning, free credits, and a way around regional access restrictions. Paired with the chatgpt-md plugin, conversation records are embedded directly in your Markdown notes, doubling the efficiency of your knowledge management. Drawing on the pitfalls I hit with personal AI plugins, this guide shows you how to build your own intelligent knowledge base.
As a heavy Obsidian user, do you face these pain points?
• After organizing your notes, you have to jump to a third-party platform just to have AI summarize or analyze them
• You want to try top models such as Claude or GPT-4, but regional access restrictions stand in the way
• Configuring an Obsidian AI plugin means digging through a different API document each time just to confirm a model name
• Switching models requires manual configuration changes, and core parameters such as temperature and top_p have nowhere to be adjusted
Today we introduce a new solution. Through the OpenRouter gateway you can:
✓ Integrate 47+ mainstream AI providers (Google, OpenAI, DeepSeek, Ollama, etc.)
✓ Call 93+ mainstream large models with one click (GPT-4, Claude, DeepSeek, etc.)
✓ Skip the hassles of regional restrictions and per-provider API-key management
✓ Visually adjust every conversation parameter
✓ Build truly seamless cross-model workflows
The rest of this article explains in detail how to build your all-round AI knowledge assistant in five minutes.
Ways to use large models
1. App or website: many Chinese large-model apps (such as Doubao and Kimi) are completely free, and some foreign products (such as OpenAI's ChatGPT and Gemini) offer free trials of their smaller models, though you need a VPN to reach them.
2. API: call the model's interface directly. With any large-model cloud service, simply register and create a new API key.
Benefits of API calls to large models
An app suits ordinary users, while an API offers more flexibility and functionality for users who need deep customization.
1. APIs integrate deeply with your workflow and support automated tasks.
2. APIs offer a larger input context window, overcoming the input limits of the apps.
3. APIs make it easy to switch between models, improving flexibility.
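To make benefit 3 concrete, here is a minimal Python sketch (not from the original article) that posts a chat request to OpenRouter's OpenAI-compatible endpoint. The endpoint URL is OpenRouter's public one; the model IDs are illustrative, and switching models is literally a one-string change:

```python
import json
import os
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_payload(model: str, prompt: str, temperature: float = 0.3) -> dict:
    """Build an OpenAI-compatible chat payload; the model is just a string."""
    return {
        "model": model,
        "temperature": temperature,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(payload: dict, api_key: str) -> str:
    """POST the payload to OpenRouter and return the assistant's reply text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Only runs when a key is configured; the same call works for any model ID.
if os.environ.get("OPENROUTER_API_KEY"):
    key = os.environ["OPENROUTER_API_KEY"]
    for model in ("deepseek/deepseek-chat-v3-0324", "anthropic/claude-3.5-sonnet"):
        print(chat(build_chat_payload(model, "Say hi in five words."), key))
```

Because every provider behind OpenRouter speaks the same schema, "seamless switching" reduces to changing the `model` field.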
What can you do with a large-model API?
1. Generate summaries of your diaries and notes
2. When entering an unfamiliar field, bootstrap an initial knowledge graph, then dig into each node yourself
3. Restructure your messy spoken notes into a clear, structured form
4. Get revision suggestions for drafted content
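For use case 1, the prompt can be templated once and reused across notes. A small sketch (the system-prompt wording is my own, not from the article) that builds the message list for a summarization call:

```python
# Hypothetical prompt builder: turn one note's text into an OpenAI-style
# message list asking for a bounded bullet-point summary.
def summary_messages(note_text: str, max_bullets: int = 5) -> list[dict]:
    system = (
        f"You are a note-taking assistant. Summarize the note into at most "
        f"{max_bullets} bullet points, keeping concrete facts and dates."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": note_text},
    ]
```

The returned list drops straight into the `messages` field of any OpenAI-compatible chat request.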
Commonly used large model API service providers
1. OpenRouter
• Features: a wide variety of models, covering both commercial and open-source options.
• Note: the free quota is limited; only some models are marked "free".
• Free-tier limits: free models (IDs ending in `:free`) allow up to 20 requests per minute; accounts with less than $10 in credits are limited to 50 requests per day, and accounts with $10 or more get 1,000 per day.
2. Siliconflow
• Features: free Qwen, Gemma, and Stable Diffusion (image-generation) models.
• Advantage: directly accessible from mainland China, no VPN needed.
3. Google AI Studio (VPN required)
• Features: offers Google's most advanced models (the Gemini series), with 1,500 free requests per day, capped at 1 million tokens and 15 requests per minute.
• Advantage: individual users usually get a quota that never expires.
4. Groq (VPN required)
• Features: supports the Llama and Gemma series; most models offer a free quota of 500k tokens per day.

The chatgpt-md plugin
• Plugin ID: chatgpt-md
• Project address: https://github.com/bramses/chatgpt-md
• Project description: makes ChatGPT and Obsidian integrate (almost) seamlessly.
• Supports three kinds of model services; each has its own dedicated URL parameter in the plugin settings and note metadata:
  • openaiUrl for the OpenAI API
  • openrouterUrl for OpenRouter.ai
  • ollamaUrl for local Ollama models
• Features:
  • Create notes from prompt templates
  • Use YAML frontmatter fields as model parameters, and switch models from an online list via a command
  • Conversation records are stored in the note as role:user and role:assistant turns

Installing chatgpt-md:
1. Install ChatGPT MD: go to Settings > Community Plugins > Browse, search for "ChatGPT MD", and click Install.
2. Create an API key on the OpenRouter website.
3. Add the key in the plugin settings (an OpenAI API key also works, or install Ollama and local LLMs of your choice).
4. Start chatting: run the ChatGPT MD: Chat command (cmd + p or ctrl + p) from any note.
How to use large-model APIs for free
This article focuses on using OpenRouter to make the most of the free tiers on offer. All of the providers above can be reached through it: OpenRouter is a large-model API router that connects the various AI model services and integrates them behind a unified interface. By lowering the barrier to entry, OpenRouter lets more people easily pick the right large model to solve practical problems.
OpenRouter model rankings
Model token rankings: https://openrouter.ai/rankings
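Besides the rankings page, OpenRouter exposes a public models endpoint (no API key needed) that lists every model it routes to. A small sketch that fetches the list and filters for the `:free` variants discussed in this article:

```python
import json
import urllib.request

MODELS_URL = "https://openrouter.ai/api/v1/models"  # public, no auth required

def fetch_models() -> list[dict]:
    """Download the full model catalogue from OpenRouter."""
    with urllib.request.urlopen(MODELS_URL) as resp:
        return json.load(resp)["data"]

def free_model_ids(models: list[dict]) -> list[str]:
    """IDs ending in ':free' are the free-tier variants."""
    return [m["id"] for m in models if m["id"].endswith(":free")]

# Usage: print(free_model_ids(fetch_models()))
```

This is handy for checking which free models exist before hard-coding one into your note frontmatter.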
OpenRouter free-tier limits
Free-tier limits depend on the model ID (the `:free` suffix) and your credit balance, as detailed in the OpenRouter entry above.
OpenRouter integration with third-party providers
Integration address: https://openrouter.ai/settings/integrations
For example, I have integrated Google AI Studio, Groq, and Anthropic. These providers offer free quotas, and routing them through OpenRouter removes the need for a VPN. There are also Chinese providers such as DeepSeek, which is both cheap and good.
Use this key as a fallback
With this option enabled, OpenRouter uses its own API keys first and only falls back to your key when it hits a rate limit or a request fails.
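The fallback behaviour described above can be illustrated with a tiny sketch (my own illustration, not OpenRouter's implementation): try each credential source in order and move on when a call fails.

```python
# Hypothetical fallback router: call each provider in turn, falling back
# when one raises (e.g. on a rate limit), and fail only if all of them do.
def call_with_fallback(callers):
    errors = []
    for call in callers:
        try:
            return call()
        except Exception as exc:  # a real router would catch specific errors
            errors.append(exc)
    raise RuntimeError(f"all {len(errors)} providers failed")
```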
How to use AI smoothly in Obsidian
Obsidian AI plugin
chatgpt-md usage examples
Press ctrl + p and run the switch-model command; you can refer to OpenRouter's model rankings to select a model.
chatgpt-md Quick Start
Get started in just a few easy steps:
Adjusting the generation parameters of large models
Ordinary users can stick with the default parameters.
The YAML frontmatter at the top of the note supplies the model parameters; you can also switch models from the online list via a command.
```yaml
---
system_commands: ['I am a helpful assistant.']
temperature: 0.3
top_p: 1
max_tokens: 4096
presence_penalty: 0.5
frequency_penalty: 0.5
stream: true
n: 1
model: openrouter@deepseek/deepseek-chat-v3-0324
# Service-specific URLs (optional; global settings are used if not specified)
openaiUrl: https://api.openai.com
# openrouterUrl: https://openrouter.ai
# ollamaUrl: http://localhost:11434
---
```
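Note the `model` field's `service@model-id` convention in the example above. A small sketch of how such a value splits into a service name and a model ID (the fallback to `openai` when no `@` is present is my guess at a sensible default, not documented plugin behaviour):

```python
# Split a frontmatter model string like "openrouter@deepseek/deepseek-chat-v3-0324"
# into (service, model_id). Values without "@" are assumed (hypothetically)
# to target the default OpenAI service.
def split_model(value: str) -> tuple[str, str]:
    service, _, model_id = value.partition("@")
    return (service, model_id) if model_id else ("openai", service)

# split_model("openrouter@deepseek/deepseek-chat-v3-0324")
# → ("openrouter", "deepseek/deepseek-chat-v3-0324")
```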