I installed two models in Ollama: one for inference (qwen) and one for text embedding (bge). The inference model installs and runs fine, but the embedding model always fails to integrate, with the error: "get tei model extra parameter failed". I copied the model name exactly from the `ollama list` output, but I still get this error.
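To rule out the model itself, one sanity check is to query Ollama's embeddings endpoint directly (a sketch, assuming the default port 11434; `bge-m3` below is a placeholder for whatever tag `ollama list` actually reports):

```shell
# Query Ollama's embeddings API directly, bypassing the integration layer.
# "bge-m3" is a placeholder model tag -- substitute the exact name shown by `ollama list`.
curl -s http://localhost:11434/api/embeddings \
  -d '{"model": "bge-m3", "prompt": "hello world"}'
# A working embedding model returns a JSON object containing an "embedding" array.
# If this call fails too, the problem is on the Ollama side rather than in the
# integration that reports "get tei model extra parameter failed".
```

If the curl call succeeds but the integration still fails, the error is likely in how the consuming application parses the model's parameters, not in Ollama itself.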