Error log: connecting a locally deployed large model to RagFlow

Updated on: July 1, 2025
Recommendation
A write-up of practical experience connecting RagFlow to a locally deployed QwQ-32B model.
Core content:
1. Basic steps for connecting RagFlow to the QwQ-32B model
2. Network configuration errors and solutions
3. API-Key setting problems and solutions
Yang Fangxian
Founder of 53AI/Most Valuable Expert of Tencent Cloud (TVP)
First, make sure the host machine can reach the large-model service. I logged in to the ragflow container to test the connection. Sure enough, the connection failed:
docker exec -it 0b527d272baa /bin/bash
curl -I http://10.10.10.10:8080
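The same reachability probe can also be scripted. Below is a minimal sketch in Python; `check_port` is a hypothetical helper (not part of RagFlow), and the host/port shown in the comment are the ones from the curl test above:

```python
import socket

def check_port(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# In the real scenario you would call, e.g.:
#   check_port("10.10.10.10", 8080)
# Here we demo against a throwaway local listener so the snippet is self-contained.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
print(check_port("127.0.0.1", port))  # True: something is listening
srv.close()
```

A `False` here corresponds to the failed `curl -I` above: the container simply cannot reach the service address.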
I suspected a problem with the container's network configuration, so I checked the relevant configuration file, docker-compose-CN-oc9.yml, and found the issue. To avoid port conflicts when deploying ragflow, I had changed port 80 in the original file (docker-compose.yml) to 8090, i.e. container port 80 was mapped to host port 8090, but I had not made the same change in docker-compose-CN-oc9.yml!
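For reference, the corrected mapping looks roughly like this in the compose file. Only the 80-to-8090 mapping comes from the text above; the service name and surrounding keys are assumptions about a typical ragflow compose layout:

```yaml
# docker-compose-CN-oc9.yml (sketch; service name is an assumption)
services:
  ragflow:
    ports:
      - "8090:80"   # host port 8090 -> container port 80, matching docker-compose.yml
```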
After making the change, restart the Docker service:
sudo systemctl restart docker
Re-enter the ragflow container to verify connectivity
curl -X POST http://10.128.32.23:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "QWQ-32B",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
The connection was successful, and I thought everything was fine, but...
Continuing to investigate, I found that although I had not set an API-Key when deploying QwQ-32B, an API-Key must still be filled in when connecting via code or a framework! After adding it, the connection succeeded!
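In code terms, the fix amounts to sending the key as a bearer token with each request. Here is a minimal sketch; the base URL and model name are taken from the curl test above, while `build_chat_request` and the placeholder key are hypothetical illustrations, not RagFlow APIs:

```python
import json

def build_chat_request(base_url: str, api_key: str, model: str, user_msg: str):
    """Assemble an OpenAI-style chat-completions request (built, not sent)."""
    url = f"{base_url}/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        # The API-Key is supplied as a bearer token, even though the
        # backend was deployed without one configured.
        "Authorization": f"Bearer {api_key}",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_msg}],
    })
    return url, headers, body

# Placeholder values; substitute your own deployment's address and key.
url, headers, body = build_chat_request(
    "http://10.128.32.23:8080", "sk-placeholder", "QWQ-32B", "Hello")
print(url)
```

The point is simply that the key field cannot be left empty when a client framework like RagFlow talks to the model, regardless of whether the server itself enforces one.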