When I test and run knowledge retrieval in the knowledge base chatflow I created, there is no feedback; it directly returns empty results and seems unresponsive.
Are there only three nodes, and is the first one the start node? Please show me your LLM configuration screenshot and the reply node screenshot.
I found the reason: the model was not loaded. It seems we (the user) needed to load the model on our side.
Thanks, great hero, thanks, great hero, thanks, great hero, thanks, great hero
@Dify_Forum_Helper Summarize it
This post can be summarized as follows:
Problem Description
- When users were working on “Knowledge Base Chatflow”, during the testing and running phases:
- The knowledge retrieval node provided no feedback;
- It output `result: []`, which appeared to be “unresponsive”.
Troubleshooting Process (inferred from conversation)
- The Admin first confirmed:
- Does the Chatflow only have 3 nodes? Is the first one the start node?
- Asked the user to provide: screenshots of the LLM node configuration and the reply node, to troubleshoot whether it was a workflow or model configuration issue.
Root Cause
- The user later discovered through self-troubleshooting that the model was not loaded; it was on the user’s side to load it.
- In other words: the underlying LLM model was not actually in an available/loaded state, so the knowledge retrieval and answer nodes executed but never received a model response, ultimately outputting an empty array.
Solution
- On the model management / invocation side, correctly load and enable the corresponding model, ensuring that:
- The model selected in the Chatflow’s LLM node is already available in the current environment;
- The model service itself has successfully started, connected, and has no errors;
- (If it’s a local or self-built model) Confirm that the backend inference service is running.
After the user loaded the model, the problem was resolved, and the user expressed gratitude multiple times in the post.
Reference Checklist for Similar Issues for Future Users
If you encounter situations where “knowledge base retrieval result is empty” or “appears unresponsive”, you can prioritize checking:
- Is the model truly online and available?
  - Check the model configuration page to see if the model is displayed as available / connected;
  - If using a local model, confirm that the inference service process is running.
- Is the LLM node in the Chatflow correctly selecting the model?
  - Not selecting a model, or selecting a non-existent / deleted model, can also lead to empty results.
- Are there any errors in the logs / console?
  - If the model is not connected, the backend usually shows connection or timeout errors, which helps with troubleshooting.
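The first check above can be scripted as a quick shell probe; a minimal sketch, where the base URL is a placeholder for wherever your local inference service listens (11434 is only the common Ollama default, so adjust it to your setup):

```shell
# Placeholder base URL for the local inference service -- adjust to your setup.
SERVICE_URL="${SERVICE_URL:-http://localhost:11434}"

# Probe the endpoint; curl -f exits non-zero on HTTP errors or no connection.
if curl -fsS --max-time 5 "$SERVICE_URL" >/dev/null 2>&1; then
  STATUS="reachable"
else
  STATUS="not reachable"
fi
echo "inference service at $SERVICE_URL: $STATUS"
```

If this prints “not reachable”, the empty `result: []` is almost certainly a connectivity or model-loading problem rather than a retrieval one.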
The final conclusion of this post: Empty results in Knowledge Base Chatflow were due to the model not being correctly loaded; loading the model resolved the issue.
@Dify_Forum_Helper This is a post about Ollama. When summarizing, don’t forget to mention the vendor in relation to the story point.
The story points for this post can be organized as follows (with added Ollama provider information):
1. Symptoms
- Scenario: User is working on a ‘Knowledge Base Chatflow’.
- Phenomenon:
- During testing and actual operation, the knowledge retrieval node appears to have ‘no feedback’;
- The returned content shows `result: []`, an empty array, which looks like “no response at all”.
2. Environment and Provider
- Application type: Dify Knowledge Base Chatflow.
- LLM Provider: Ollama (this post concerns an Ollama setup, which the story points should note).
- Model type: a local / self-hosted model, loaded and run by the user on the Ollama side.
3. Investigation
- Admin starts by checking the flow configuration, confirming:
- Chatflow general structure: whether there are only 3 nodes, and if the first one is the start node;
- Ask the user to provide:
- Screenshot of the LLM node configuration;
- Screenshot of the reply node.
- Investigation focuses on:
- Whether the flow connections are correct;
- Whether there are issues with the LLM model configuration, especially in local model / Ollama scenarios, if the model is truly in an available state.
4. Root Cause
-
The user later checked on their own and found:
The reason was that the model was not loaded.
It might be that we, the user, loaded the model. -
Summary:
- The Ollama model selected in the Chatflow was not truly ‘loaded / started’ on the Ollama side;
- This led to the knowledge retrieval and answer nodes executing but failing to get a response from the LLM, ultimately outputting an empty `result: []`.
5. Resolution
- Complete correct loading and activation on both the Ollama and Dify sides:
  - On the Ollama side:
    - Confirm the corresponding model has been pulled and is in an available state;
    - Confirm the Ollama service process is running and the port configuration is correct.
  - On the Dify side:
    - The corresponding Ollama model connection in model management is normal, with no errors;
    - In the Chatflow’s LLM node, this available Ollama model has been selected.
- After the user completed the above operations, the problem disappeared, Chatflow output returned to normal, and the user expressed gratitude multiple times.
6. Checklist for Future Users (with Ollama focus)
When encountering ‘knowledge base retrieval no response / result is empty’, especially in Ollama scenarios, you can troubleshoot in the following order:
- Is the Ollama model correctly loaded?
  - Confirm in Ollama:
    - The model has been pulled and loaded;
    - The Ollama process is running normally, with no port conflicts or abnormal exits.
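The Ollama-side checks map directly onto the CLI; a sketch that assumes the `ollama` CLI is on PATH, with `llama3` below purely as an example model name:

```shell
# Guarded so the snippet degrades gracefully when the CLI is not installed.
if command -v ollama >/dev/null 2>&1; then
  OLLAMA_PRESENT=yes
  ollama list   # models that have been pulled to local disk
  ollama ps     # models currently loaded in memory and serving
else
  OLLAMA_PRESENT=no
  echo "ollama CLI not found -- is Ollama installed on this host?"
fi

# If the model the Chatflow selects is missing, pull and warm it up, e.g.:
#   ollama pull llama3
#   ollama run llama3 "hello"
```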
- Ollama model connection status in Dify
  - Check on the model configuration page:
    - Does this Ollama model show as “Available”?
    - If there are connection errors or timeouts, prioritize resolving those connection issues.
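Before debugging inside Dify, it can help to hit Ollama’s HTTP API directly; a sketch assuming the default port 11434, with `llama3` as a stand-in for whatever model the Chatflow actually selects:

```shell
OLLAMA_URL="${OLLAMA_URL:-http://localhost:11434}"
MODEL="llama3"   # stand-in name -- use the model your Chatflow actually selects

# List the models the server exposes (what Dify's provider check talks to).
curl -fsS --max-time 5 "$OLLAMA_URL/api/tags" || echo "cannot reach $OLLAMA_URL"

# One-shot generation smoke test; a failure here explains empty output in Dify.
curl -fsS --max-time 60 "$OLLAMA_URL/api/generate" \
  -d "{\"model\": \"$MODEL\", \"prompt\": \"ping\", \"stream\": false}" \
  || echo "generate call failed -- model not pulled or not loaded?"
```

One common pitfall when Dify itself runs in Docker: `localhost` inside the Dify container is not the host machine, so the provider URL in Dify usually needs to be something like `http://host.docker.internal:11434` rather than `http://localhost:11434`.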
- Does the Chatflow LLM node configuration point to this model?
  - Confirm that a deleted / non-existent model has not been selected;
  - If there are multiple providers, confirm that the current node is using the Ollama path.
- Are the flow connections complete?
  - Start node → Knowledge retrieval node → LLM / Reply node;
  - Is the retrieval result correctly passed to the LLM, and is the LLM output connected to the final reply?
- Check the logs
  - If the interface only shows `result: []`, check the backend logs / browser console for:
    - Ollama call failures;
    - “model not loaded”, connection refused, timeout, or other errors.
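For a self-hosted Dify deployed with the stock Docker Compose setup, those errors usually surface in the `api` (or `worker`) container logs; a sketch, where the service name and the error strings being grepped are assumptions about a typical deployment:

```shell
# Pattern covering the typical failure modes named above.
PATTERN="ollama|not loaded|connection refused|timeout"

# Tail recent API-container logs and filter; guarded so it degrades without Docker.
docker compose logs --tail=200 api 2>/dev/null | grep -iE "$PATTERN" \
  || echo "no matching log lines (or docker compose is unavailable here)"
```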
- Confirm the knowledge base itself is fine
  - Are the KB documents successfully vectorized / indexed?
  - Does the query content have matching text in the documents?
Story point key tags:
- Feature: Knowledge Base Chatflow
- Provider: Ollama
- Symptoms: retrieval node gives no feedback, returns `result: []`
- Root Cause: Ollama model not loaded / unavailable on the provider side
- Resolution: load and enable the model on the Ollama side, then select the available model in Dify to restore normal operation.
