Dify cannot connect to a locally deployed large model

  1. Dify 1.11.4 is deployed locally with Docker.
  2. The AI large-model server is deployed locally (not via Ollama); a Python client can access the model normally through the OpenAI SDK (`base_url=dsv_ai_url`, `api_key=api_key`).
  3. The model server and Dify run on two different internal enterprise servers, and the two servers can ping each other.
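A detail worth checking in the setup above: many OpenAI-compatible servers serve their API under a `/v1` path prefix, and a base URL missing (or doubling) that prefix is a frequent cause of failed requests from Dify even when the raw host is reachable. A small illustrative helper (hypothetical, not part of Dify or the plugin) that normalizes the URL:

```python
def normalize_base_url(url: str) -> str:
    """Ensure an OpenAI-compatible base URL ends with exactly one '/v1'.

    Hypothetical sketch: some servers expect 'http://host:port/v1' while a
    bare 'http://host:port' will 404 on /chat/completions.
    """
    url = url.rstrip("/")
    if not url.endswith("/v1"):
        url += "/v1"
    return url
```

Comparing the URL that works from the Python OpenAI client against what is entered in the Dify plugin form (with and without `/v1`) is a quick way to rule this out.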

Issue: In Dify, when configuring the OpenAI-API-compatible plugin or the vLLM plugin with the corresponding URL, API key, and model name, the connection fails with the error:
req_id: 5cb05fbd36 PluginInvokeError: {"args":{},"error_type":"UnboundLocalError","message":"cannot access local variable 'response' where it is not associated with a value"}
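The `UnboundLocalError` itself suggests the plugin's HTTP request failed (unreachable host, wrong path, TLS or proxy error) before a `response` variable was ever assigned, and the real network error was swallowed. A minimal sketch of that failure pattern (hypothetical names, not the plugin's actual code):

```python
def invoke_model(make_request):
    """Sketch of the bug pattern behind the reported error."""
    try:
        response = make_request()
    except Exception:
        # The real cause (connection refused, 404, timeout, ...) is
        # swallowed here, so 'response' is never bound on failure.
        pass
    # Raises UnboundLocalError when the request above failed:
    return response


def failing_request():
    # Simulate an unreachable model server.
    raise ConnectionError("server unreachable")
```

If this reading is right, the error message is masking the underlying connectivity problem; checking whether the Dify Docker containers (not just the host) can reach the model server's URL would be the next step.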

Could you please advise on how to resolve this?

I've encountered the same problem. May I ask whether the original poster has solved it?

Someone also filed an issue for this. The suggested fix there did not resolve it, but the issue was closed and nobody followed up: openAI compatible add-on faild to add LLM · Issue #2203 · langgenius/dify-official-plugins · GitHub