- Local deployment of Dify 1.11.4 using Docker;
- Local deployment of an AI large-model server (not using Ollama); from Python, the model is accessible normally via the OpenAI client (base_url=dsv_ai_url, api_key=api_key);
- The AI large model and Dify are deployed on two different internal enterprise servers, and they can ping each other.
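Since the two servers only confirm reachability via ping, it may help to verify that the model endpoint is actually reachable over HTTP from the Dify host (Dify runs in Docker, so container networking can differ from the host's). Below is a minimal sketch of such a check using only the standard library; `base_url` and `api_key` stand in for the `dsv_ai_url` and `api_key` values that work from plain Python, and the `/models` path is an assumption based on the usual OpenAI-compatible API layout.

```python
# Hypothetical reachability check for an OpenAI-compatible endpoint.
# base_url / api_key correspond to the dsv_ai_url / api_key that work
# from plain Python; the /models route is assumed from the OpenAI API spec.
import json
import urllib.request


def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    # Build a GET request for the model-listing endpoint, tolerating a
    # trailing slash on the configured base URL.
    url = base_url.rstrip("/") + "/models"
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {api_key}"},
    )


def list_models(base_url: str, api_key: str) -> dict:
    # Run this from inside the Dify host (or the plugin container) to
    # confirm the endpoint answers over HTTP, not just ICMP ping.
    req = build_models_request(base_url, api_key)
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

If this call fails from the Dify side while succeeding from the other server, the problem is network/config rather than the plugin itself.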
Issue: In Dify, when adding the model through either the OpenAI-API-compatible plugin or the vLLM plugin, with the URL, API key, and model name configured, the connection fails with the error:
req_id: 5cb05fbd36 PluginInvokeError: {"args":{},"error_type":"UnboundLocalError","message":"cannot access local variable 'response' where it is not associated with a value"}
Could you please advise how to resolve this?