Hello everyone,
I’m currently facing an issue when using the Gemini AI model in my chatbot flow.
Today, when I run the workflow, the LLM/classifier node that uses Gemini shows this error message:
"Run failed: req_id: 5bad4ad747 PluginInvokeError: {"args":{},"error_type":"RuntimeError","message":"Cannot send a request, as the client has been closed."}"
However, when I switch the model to DeepSeek, it works fine without any errors.
Has anyone experienced this issue before or knows how to fix it?
Thanks in advance for your help!
