[HELP] Error When Use Gemini AI Model

Hello everyone,

I’m currently facing an issue when using the Gemini AI model in my chatbot flow.
Today, when I run the workflow, the LLM/classifier node that uses Gemini shows this error message:

```
Run failed: req_id: 5bad4ad747 PluginInvokeError: {"args":{},"error_type":"RuntimeError","message":"Cannot send a request, as the client has been closed."}
```

However, when I switch the model to DeepSeek, it works fine without any errors.

Has anyone experienced this issue before or knows how to fix it?
Thanks in advance for your help!

Hello Freddi, I'd first like to confirm whether you're using the latest version of the Gemini Model Plugin. You can click "install" here to check whether an update is available. If the problem persists after updating, we can continue the discussion.

It will be fixed by this PR: "fix httpx.client unexpected auto closed by the genai.client" by hjlarry · Pull Request #2007 · langgenius/dify-official-plugins · GitHub
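For context on what that PR fixes: the `RuntimeError` in the log is what httpx raises when you try to send a request through a client whose `close()` has already been called; the bug report above suggests `genai.client` was closing the shared httpx client prematurely, so subsequent workflow runs failed. Below is a minimal, hypothetical stdlib-only sketch of that failure pattern (the `MiniClient` class is an illustration I made up to mimic httpx's closed-state guard, not code from the plugin or from httpx itself):

```python
# Hypothetical minimal client mimicking the closed-state check that
# httpx performs before sending a request. Once close() is called,
# any further send() raises a RuntimeError, which is what surfaced
# as the PluginInvokeError in the workflow log.

class MiniClient:
    def __init__(self):
        self._closed = False

    def close(self):
        # Marks the client unusable, like httpx.Client.close().
        self._closed = True

    def send(self, request: str) -> str:
        if self._closed:
            raise RuntimeError(
                "Cannot send a request, as the client has been closed."
            )
        return f"sent: {request}"


client = MiniClient()
print(client.send("GET /models"))  # works while the client is open

client.close()
try:
    client.send("GET /models")     # fails: client was already closed
except RuntimeError as e:
    print(e)
```

The takeaway is that this is not a credentials or model-configuration problem on your side; a library closed the underlying HTTP client, and updating the plugin once the fix lands should resolve it.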
