The LLM node's model input keeps erroring; see the details below. The same error also occurs with the built-in templates.

req_id: 1eb791beb7 PluginInvokeError: {"args":{"description":"[models] Error: 'TextPromptMessageContent' object has no attribute 'startswith'"},"error_type":"InvokeError","message":"[models] Error: 'TextPromptMessageContent' object has no attribute 'startswith'"}
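For context, this kind of AttributeError typically means some code path is calling a string-only method like `startswith()` directly on the message-content object instead of on its text field. The sketch below uses a hypothetical stand-in class (the real `TextPromptMessageContent` lives in Dify's model runtime / plugin SDK, and its text attribute name may differ) just to reproduce the failure mode:

```python
# Hypothetical stand-in for Dify's TextPromptMessageContent; the real class
# and its attribute names come from the plugin SDK and may differ.
class TextPromptMessageContent:
    def __init__(self, data: str):
        self.data = data  # assumed name of the underlying text field

content = TextPromptMessageContent("Hello")

# Reproduces the reported error: the object is not a str, so str-only
# methods such as startswith() do not exist on it.
try:
    content.startswith("He")
except AttributeError as e:
    print(e)  # 'TextPromptMessageContent' object has no attribute 'startswith'

# A caller expecting plain text would need to unwrap the string first:
assert content.data.startswith("He")
```

If this is the cause, the bug is in whatever code builds or consumes the prompt messages, not in the workflow configuration, which would explain why built-in templates hit it too.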