Could not connect to Weaviate:Connection to Weaviate failed. Details: .

When I try to upload a new file to the Knowledge Base, the upload fails and I get this error message.
If anyone knows how to fix this issue, please let me know.

You’re running a self-hosted environment, right?

You’re encountering an error connecting to Weaviate, but with just the fragment you’ve provided, it’s difficult to identify the cause. It would be helpful if you could share the full error message or a screenshot.

Also, depending on the error, I recommend checking the container’s startup status and reviewing the logs of the api, worker, and weaviate containers.
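For example, something along these lines (a sketch; the service names come from the default Dify compose setup):

```
# Confirm that every container is actually up
docker compose ps

# Review recent logs from the containers most involved in indexing
docker compose logs --tail=100 api worker weaviate
```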

Yes, I am using the self-hosted version (built with Docker).

When uploading files, I’m getting the following error:
“Could not connect to Weaviate: Connection to Weaviate failed. Details: .”

■ Container Status

From the result of docker compose ps, I confirmed that all containers including api, worker, and weaviate are in the Up state.

(Results are as follows)

```
docker compose ps
NAME                     IMAGE                                       COMMAND                  SERVICE         CREATED        STATUS                  PORTS
docker-api-1             langgenius/dify-api:1.11.2                  "/bin/bash /entrypoi…"   api             16 hours ago   Up 16 hours             5001/tcp
docker-db_postgres-1     postgres:15-alpine                          "docker-entrypoint.s…"   db_postgres     16 hours ago   Up 16 hours (healthy)   5432/tcp
docker-nginx-1           nginx:latest                                "sh -c 'cp /docker-e…"   nginx           16 hours ago   Up 16 hours             0.0.0.0:80->80/tcp, [::]:80->80/tcp, 0.0.0.0:443->443/tcp, [::]:443->443/tcp
docker-pgvector-1        pgvector/pgvector:pg16                      "/docker-entrypoint.…"   pgvector        5 weeks ago    Up 6 days (healthy)     5432/tcp
docker-plugin_daemon-1   langgenius/dify-plugin-daemon:0.5.2-local   "/bin/bash -c /app/e…"   plugin_daemon   16 hours ago   Up 16 hours             0.0.0.0:5003->5003/tcp, [::]:5003->5003/tcp
docker-redis-1           redis:6-alpine                              "docker-entrypoint.s…"   redis           16 hours ago   Up 16 hours (healthy)   6379/tcp
docker-sandbox-1         langgenius/dify-sandbox:0.2.12              "/main"                  sandbox         16 hours ago   Up 16 hours (healthy)
docker-ssrf_proxy-1      ubuntu/squid:latest                         "sh -c 'cp /docker-e…"   ssrf_proxy      16 hours ago   Up 16 hours             3128/tcp
docker-weaviate-1        semitechnologies/weaviate:1.27.0            "/bin/weaviate --hos…"   weaviate        16 hours ago   Up 16 hours             0.0.0.0:8080->8080/tcp, [::]:8080->8080/tcp
docker-web-1             langgenius/dify-web:1.11.2                  "/bin/sh ./entrypoin…"   web             16 hours ago   Up 16 hours             3000/tcp
docker-worker-1          langgenius/dify-api:1.11.2                  "/bin/bash /entrypoi…"   worker          16 hours ago   Up 16 hours             5001/tcp
docker-worker_beat-1     langgenius/dify-api:1.11.2                  "/bin/bash /entrypoi…"   worker_beat     16 hours ago   Up 16 hours             5001/tcp
```

■ Container Logs

1. api container

When checking docker logs docker-api-1, there are no startup errors or Weaviate connection errors, and communication with plugin_daemon appears to be working normally.

2. worker container

```
docker logs docker-worker-1 --tail=100
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 456, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/usr/local/lib/python3.12/concurrent/futures/thread.py", line 59, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/api/core/indexing_runner.py", line 684, in _process_chunk
    index_processor.load(
  File "/app/api/core/rag/index_processor/processor/parent_child_index_processor.py", line 125, in load
    vector = Vector(dataset)
             ^^^^^^^^^^^^^^^
  File "/app/api/core/rag/datasource/vdb/vector_factory.py", line 45, in __init__
    self._vector_processor = self._init_vector()
                             ^^^^^^^^^^^^^^^^^^^
  File "/app/api/core/rag/datasource/vdb/vector_factory.py", line 65, in _init_vector
    return vector_factory_cls().init_vector(self._dataset, self._attributes, self._embeddings)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/api/core/rag/datasource/vdb/weaviate/weaviate_vector.py", line 463, in init_vector
    return WeaviateVector(
           ^^^^^^^^^^^^^^^
  File "/app/api/core/rag/datasource/vdb/weaviate/weaviate_vector.py", line 79, in __init__
    self._client = self._init_client(config)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/api/core/rag/datasource/vdb/weaviate/weaviate_vector.py", line 122, in _init_client
    client = weaviate.connect_to_custom(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/api/.venv/lib/python3.12/site-packages/weaviate/connect/helpers.py", line 347, in connect_to_custom
    return __connect(
           ^^^^^^^^^^^
  File "/app/api/.venv/lib/python3.12/site-packages/weaviate/connect/helpers.py", line 371, in __connect
    raise e
  File "/app/api/.venv/lib/python3.12/site-packages/weaviate/connect/helpers.py", line 367, in __connect
    client.connect()
  File "/app/api/.venv/lib/python3.12/site-packages/weaviate/client_executor.py", line 149, in connect
    return executor.execute(
           ^^^^^^^^^^^^^^^^^
  File "/app/api/.venv/lib/python3.12/site-packages/weaviate/connect/executor.py", line 99, in execute
    return cast(T, exception_callback(e))
           ^^^^^^^^^^^^^^^^^^^^^
  File "/app/api/.venv/lib/python3.12/site-packages/weaviate/connect/executor.py", line 38, in raise_exception
    raise e
  File "/app/api/.venv/lib/python3.12/site-packages/weaviate/connect/executor.py", line 80, in execute
    call = method(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/api/.venv/lib/python3.12/site-packages/weaviate/connect/v4.py", line 934, in connect
    raise WeaviateStartUpError(f"Could not connect to Weaviate:{e}.") from e
weaviate.exceptions.WeaviateStartUpError: Could not connect to Weaviate:Connection to Weaviate failed. Details: .
2026-01-15 00:29:13.492 INFO [Dummy-1047] [document_indexing_task.py:102] - Processed dataset: 12ea72e3-c9ff-4b33-a519-e288ce5cabb4 latency: 0.5789082440314814
2026-01-15 00:29:13.493 INFO [Dummy-1047] [document_indexing_task.py:131] - document indexing tenant isolation queue 081fa228-d2d2-4b9c-a586-dca51415c1b5 next tasks: []
2026-01-15 00:29:13.502 INFO [Dummy-1047] [trace.py:128] - Task tasks.document_indexing_task.priority_document_indexing_task[16c8ad3d-b01e-4049-a916-ed73dae6870a] succeeded in 0.5901001710444689s: None
2026-01-15 00:29:42.851 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[550440d9-6185-49f4-97ae-18678d7aaf1d] received
2026-01-15 00:29:42.865 INFO [Dummy-1048] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[550440d9-6185-49f4-97ae-18678d7aaf1d] succeeded in 0.013193074963055551s: None
2026-01-15 00:30:00.002 INFO [MainThread] [strategy.py:161] - Task schedule.check_upgradable_plugin_task.check_upgradable_plugin_task[5ba74d5e-703e-43c2-992e-1b55e53dc80f] received
2026-01-15 00:30:00.003 WARNING [Dummy-1049] [log.py:232] - Start check upgradable plugin.
2026-01-15 00:30:00.003 WARNING [Dummy-1049] [log.py:232] - Now seconds of day: 1770.0032949447632
2026-01-15 00:30:00.007 WARNING [Dummy-1049] [log.py:232] - Total strategies: 0
2026-01-15 00:30:00.008 WARNING [Dummy-1049] [log.py:232] - Checked upgradable plugin success latency: 0.004957632045261562
2026-01-15 00:30:00.019 INFO [Dummy-1049] [trace.py:128] - Task schedule.check_upgradable_plugin_task.check_upgradable_plugin_task[5ba74d5e-703e-43c2-992e-1b55e53dc80f] succeeded in 0.01647809403948486s: None
2026-01-15 00:30:42.851 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[c8788ac7-0ce2-46ee-91a0-de6a630afba6] received
2026-01-15 00:30:42.863 INFO [Dummy-1050] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[c8788ac7-0ce2-46ee-91a0-de6a630afba6] succeeded in 0.011534405988641083s: None
2026-01-15 00:31:42.851 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[bc91dfa5-fb3c-4622-ac85-f9ef9f647f67] received
2026-01-15 00:31:42.863 INFO [Dummy-1051] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[bc91dfa5-fb3c-4622-ac85-f9ef9f647f67] succeeded in 0.011604055063799024s: None
2026-01-15 00:32:42.852 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[d243d57b-10de-4756-8a12-4775cef1fd22] received
2026-01-15 00:32:42.865 INFO [Dummy-1052] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[d243d57b-10de-4756-8a12-4775cef1fd22] succeeded in 0.012991473078727722s: None
2026-01-15 00:33:42.852 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[c599d787-27cf-4eb7-9feb-da14ded388d1] received
2026-01-15 00:33:42.867 INFO [Dummy-1053] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[c599d787-27cf-4eb7-9feb-da14ded388d1] succeeded in 0.014653912978246808s: None
2026-01-15 00:34:42.851 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[3c418b4e-cfe6-497f-9be7-554cc210b621] received
2026-01-15 00:34:42.864 INFO [Dummy-1054] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[3c418b4e-cfe6-497f-9be7-554cc210b621] succeeded in 0.012223441968671978s: None
2026-01-15 00:35:42.851 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[450bf627-115d-4317-9679-d992d16ea990] received
2026-01-15 00:35:42.865 INFO [Dummy-1055] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[450bf627-115d-4317-9679-d992d16ea990] succeeded in 0.013502061017788947s: None
2026-01-15 00:36:42.852 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[28dfc0f2-4617-4059-b0e7-d3897c32d123] received
2026-01-15 00:36:42.865 INFO [Dummy-1056] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[28dfc0f2-4617-4059-b0e7-d3897c32d123] succeeded in 0.012684233952313662s: None
2026-01-15 00:37:42.852 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[389d8a7c-ca0a-4d79-a546-5de234e4c341] received
2026-01-15 00:37:42.863 INFO [Dummy-1057] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[389d8a7c-ca0a-4d79-a546-5de234e4c341] succeeded in 0.010889502940699458s: None
2026-01-15 00:38:42.853 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[ef3e0bac-cd20-4065-8255-fd1bdf5ecfb4] received
2026-01-15 00:38:42.866 INFO [Dummy-1058] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[ef3e0bac-cd20-4065-8255-fd1bdf5ecfb4] succeeded in 0.012295693974010646s: None
2026-01-15 00:39:42.852 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[c5ce7903-eb95-490d-8b6a-17a9f3631eb0] received
2026-01-15 00:39:42.865 INFO [Dummy-1059] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[c5ce7903-eb95-490d-8b6a-17a9f3631eb0] succeeded in 0.012386396992951632s: None
2026-01-15 00:40:42.852 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[f5084a94-fbfb-407b-9be3-8ff83a636105] received
2026-01-15 00:40:42.864 INFO [Dummy-1060] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[f5084a94-fbfb-407b-9be3-8ff83a636105] succeeded in 0.011906969011761248s: None
2026-01-15 00:41:42.851 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[764d16ac-cfe8-4303-a5df-c13219fcebab] received
2026-01-15 00:41:42.865 INFO [Dummy-1061] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[764d16ac-cfe8-4303-a5df-c13219fcebab] succeeded in 0.013232778990641236s: None
2026-01-15 00:42:42.852 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[edd67a18-d230-4d6c-97e2-f1a8c936742e] received
2026-01-15 00:42:42.865 INFO [Dummy-1062] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[edd67a18-d230-4d6c-97e2-f1a8c936742e] succeeded in 0.013191823964007199s: None
2026-01-15 00:43:42.852 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[58c62185-ebe0-4747-85cd-83f2cb3d0eb9] received
2026-01-15 00:43:42.864 INFO [Dummy-1063] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[58c62185-ebe0-4747-85cd-83f2cb3d0eb9] succeeded in 0.012177053955383599s: None
2026-01-15 00:44:42.852 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[ffd45049-93e7-4218-b226-50e63ff2c8cb] received
2026-01-15 00:44:42.865 INFO [Dummy-1064] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[ffd45049-93e7-4218-b226-50e63ff2c8cb] succeeded in 0.012952374061569571s: None
2026-01-15 00:45:00.005 INFO [MainThread] [strategy.py:161] - Task schedule.check_upgradable_plugin_task.check_upgradable_plugin_task[9faa50ff-0305-4727-84ea-10ea78288e17] received
2026-01-15 00:45:00.007 WARNING [Dummy-1065] [log.py:232] - Start check upgradable plugin.
2026-01-15 00:45:00.007 WARNING [Dummy-1065] [log.py:232] - Now seconds of day: 2670.007495164871
2026-01-15 00:45:00.011 WARNING [Dummy-1065] [log.py:232] - Total strategies: 0
2026-01-15 00:45:00.012 WARNING [Dummy-1065] [log.py:232] - Checked upgradable plugin success latency: 0.004497095942497253
2026-01-15 00:45:00.022 INFO [Dummy-1065] [trace.py:128] - Task schedule.check_upgradable_plugin_task.check_upgradable_plugin_task[9faa50ff-0305-4727-84ea-10ea78288e17] succeeded in 0.01553877501282841s: None
2026-01-15 00:45:42.852 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[fa448614-8c08-4142-8291-08f261611692] received
2026-01-15 00:45:42.880 INFO [Dummy-1066] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[fa448614-8c08-4142-8291-08f261611692] succeeded in 0.028055537957698107s: None
2026-01-15 00:46:42.852 INFO [MainThread] [strategy.py:161] - Task schedule.workflow_schedule_task.poll_workflow_schedules[06097d2c-05f7-4af4-b523-d4f42b9c36e7] received
2026-01-15 00:46:42.864 INFO [Dummy-1067] [trace.py:128] - Task schedule.workflow_schedule_task.poll_workflow_schedules[06097d2c-05f7-4af4-b523-d4f42b9c36e7] succeeded in 0.012092349003069103s: None
```

3. weaviate container

```
docker logs docker-weaviate-1 --tail=100
{"action":"startup","build_git_commit":"6c571ff","build_go_version":"go1.22.8","build_image_tag":"","build_wv_version":"","default_vectorizer_module":"none","level":"info","msg":"the default vectorizer modules is set to \"none\", as a result all new schema classes without an explicit vectorizer setting, will use this vectorizer","time":"2026-01-14T08:18:23Z"}
```

(That single startup line was the only output; the rest of the paste was PowerShell’s NativeCommandError wrapper around it.)

Hmm, there don’t seem to be any straightforward error messages…

I apologize for suggesting a rather ad-hoc approach, but could you try completely stopping and restarting the environment?

```
docker compose down
docker compose up -d
```
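If the error comes back after the restart, it may also be worth probing Weaviate directly from the api container to see whether it is reachable at all. A minimal check, assuming the default service name weaviate on port 8080 and that curl is available in the image:

```
# Weaviate's readiness endpoint returns HTTP 200 once the instance is up
docker exec docker-api-1 curl -s -o /dev/null -w "%{http_code}\n" \
  http://weaviate:8080/v1/.well-known/ready
```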

Other potential causes might involve the contents of .env… Have you made any changes around Weaviate?
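For reference, the Weaviate-related entries in a default .env look roughly like this (a sketch; exact defaults can differ between Dify versions, and the API key shown is the published default, not a private secret):

```
VECTOR_STORE=weaviate
WEAVIATE_ENDPOINT=http://weaviate:8080
WEAVIATE_API_KEY=WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih
```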

This might be unrelated to the error, but just as a friendly reminder: the docker compose ps output shows a pgvector container that was started 6 days ago. If you’re no longer using it, you might want to clean it up.

When pasting command outputs or logs, you can format them neatly by wrapping them between lines containing only ``` (three backticks), like this:

```
Paste your log here
```

It seems that Weaviate is not actually listening on port 8080, which prevents Dify from connecting to it and therefore makes it impossible to upload new files into the Knowledge Base.

```
docker exec -it docker-weaviate-1 sh
/ # ps aux | grep weaviate
1 root 0:07 /bin/weaviate --host 0.0.0.0 --port 8080 --scheme http
27 root 0:00 grep weaviate
/ # netstat -tlnp
Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name
tcp 0 0 127.0.0.11:43709 0.0.0.0:* LISTEN -
tcp 0 0 :::7946 :::* LISTEN 1/weaviate
tcp 0 0 :::6060 :::* LISTEN 1/weaviate
```

Do you have any suggestions on how to make Weaviate properly listen on port 8080 and restore the connection?

I would really appreciate your help. Thank you.

@rose_1001
Thanks for the update!
I took another look at the output of your docker compose ps, and I see that Weaviate is exposing port 8080 to the host network.

```
docker-weaviate-1 ... 0.0.0.0:8080->8080/tcp, [::]:8080->8080/tcp
```

This isn’t the default configuration, so I assume you’ve customized docker-compose.yaml or added a docker-compose.override.yaml.
Could you share how you have defined weaviate and worker in your compose file?

Also, since the output of docker logs docker-weaviate-1 is quite short, it’s possible that Weaviate isn’t starting up properly due to a port conflict or some network-related issue; it might be caused by a customization, but it also might not be.
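If you want to rule those out, two quick checks (sketches; on older hosts ss may be netstat -tlnp instead):

```
# On the Docker host: is anything else already listening on 8080?
sudo ss -tlnp | grep ':8080'

# Show the weaviate service configuration as Compose actually resolves it
docker compose config weaviate
```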

@kurokobo
Thank you very much for your guidance.

I have updated my docker-compose configuration as suggested and restarted the services. However, I am still unable to upload new files into the Knowledge Base, so the issue has not been resolved yet.

Below is my current Weaviate configuration in the docker-compose.yml file:

```
weaviate:
  image: semitechnologies/weaviate:1.27.0
  profiles:
    - weaviate
  restart: always
  volumes:
    - ./volumes/weaviate:/var/lib/weaviate
  environment:
    PERSISTENCE_DATA_PATH: ${WEAVIATE_PERSISTENCE_DATA_PATH:-/var/lib/weaviate}
    QUERY_DEFAULTS_LIMIT: ${WEAVIATE_QUERY_DEFAULTS_LIMIT:-25}
    AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: ${WEAVIATE_AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED:-true}
    DEFAULT_VECTORIZER_MODULE: ${WEAVIATE_DEFAULT_VECTORIZER_MODULE:-none}
    CLUSTER_HOSTNAME: ${WEAVIATE_CLUSTER_HOSTNAME:-node1}

    AUTHENTICATION_APIKEY_ENABLED: ${WEAVIATE_AUTHENTICATION_APIKEY_ENABLED:-true}
    AUTHENTICATION_APIKEY_ALLOWED_KEYS: ${WEAVIATE_AUTHENTICATION_APIKEY_ALLOWED_KEYS:-WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih}
    AUTHENTICATION_APIKEY_USERS: ${WEAVIATE_AUTHENTICATION_APIKEY_USERS:-hello@dify.ai}

    AUTHORIZATION_ADMINLIST_ENABLED: ${WEAVIATE_AUTHORIZATION_ADMINLIST_ENABLED:-true}
    AUTHORIZATION_ADMINLIST_USERS: ${WEAVIATE_AUTHORIZATION_ADMINLIST_USERS:-hello@dify.ai}

    DISABLE_TELEMETRY: ${WEAVIATE_DISABLE_TELEMETRY:-false}
    ENABLE_TOKENIZER_GSE: ${WEAVIATE_ENABLE_TOKENIZER_GSE:-false}
    ENABLE_TOKENIZER_KAGOME_JA: ${WEAVIATE_ENABLE_TOKENIZER_KAGOME_JA:-false}
    ENABLE_TOKENIZER_KAGOME_KR: ${WEAVIATE_ENABLE_TOKENIZER_KAGOME_KR:-false}
```

After restarting Weaviate, I checked the logs and this is all that appears:

```
docker ps
CONTAINER ID   IMAGE                                       COMMAND                  CREATED              STATUS                        PORTS                                                                          NAMES
8331ab3b0bcb   nginx:latest                                "sh -c 'cp /docker-e…"   About a minute ago   Up About a minute             0.0.0.0:80->80/tcp, [::]:80->80/tcp, 0.0.0.0:443->443/tcp, [::]:443->443/tcp   docker-nginx-1
d2ca69e4a8d5   langgenius/dify-api:1.11.2                  "/bin/bash /entrypoi…"   About a minute ago   Up About a minute             5001/tcp                                                                       docker-worker_beat-1
139ece472876   langgenius/dify-api:1.11.2                  "/bin/bash /entrypoi…"   About a minute ago   Up About a minute             5001/tcp                                                                       docker-worker-1
3009802ec24e   langgenius/dify-api:1.11.2                  "/bin/bash /entrypoi…"   About a minute ago   Up About a minute             5001/tcp                                                                       docker-api-1
a1a0cb827a47   langgenius/dify-plugin-daemon:0.5.2-local   "/bin/bash -c /app/e…"   About a minute ago   Up About a minute             0.0.0.0:5003->5003/tcp, [::]:5003->5003/tcp                                    docker-plugin_daemon-1
fdad17ccce73   postgres:15-alpine                          "docker-entrypoint.s…"   About a minute ago   Up About a minute (healthy)   5432/tcp                                                                       docker-db_postgres-1
e8bec6c3a779   redis:6-alpine                              "docker-entrypoint.s…"   About a minute ago   Up About a minute (healthy)   6379/tcp                                                                       docker-redis-1
60cf986d8a46   langgenius/dify-web:1.11.2                  "/bin/sh ./entrypoin…"   About a minute ago   Up About a minute             3000/tcp                                                                       docker-web-1
f4f3dc3785c5   semitechnologies/weaviate:1.27.0            "/bin/weaviate --hos…"   About a minute ago   Up About a minute                                                                                            docker-weaviate-1
04a5ff4b82a6   langgenius/dify-sandbox:0.2.12              "/main"                  About a minute ago   Up About a minute (healthy)                                                                                  docker-sandbox-1
cf007e85bd6a   ubuntu/squid:latest                         "sh -c 'cp /docker-e…"   About a minute ago   Up About a minute             3128/tcp                                                                       docker-ssrf_proxy-1
53f4d6cbcf72   pgvector/pgvector:pg16                      "/docker-entrypoint.…"   6 weeks ago          Up 10 days (healthy)          5432/tcp                                                                       docker-pgvector-1
```

```
docker logs docker-weaviate-1

{"action":"startup","build_git_commit":"6c571ff","build_go_version":"go1.22.8","build_image_tag":"","build_wv_version":"","default_vectorizer_module":"none","level":"info","msg":"the default vectorizer modules is set to \"none\", as a result all new schema classes without an explicit vectorizer setting, will use this vectorizer","time":"2026-01-19T00:20:19Z"}
{"action":"startup","auto_schema_enabled":true,"build_git_commit":"6c571ff","build_go_version":"go1.22.8","build_image_tag":"","build_wv_version":"","level":"info","msg":"auto schema enabled setting is set to \"true\"","time":"2026-01-19T00:20:19Z"}
{"build_git_commit":"6c571ff","build_go_version":"go1.22.8","build_image_tag":"","build_wv_version":"","level":"info","msg":"No resource limits set, weaviate will use all available memory and CPU. To limit resources, set LIMIT_RESOURCES=true","time":"2026-01-19T00:20:19Z"}
{"build_git_commit":"6c571ff","build_go_version":"go1.22.8","build_image_tag":"","build_wv_version":"","level":"info","msg":"module offload-s3 is enabled","time":"2026-01-19T00:20:19Z"}

I would really appreciate it if you could take another look and let me know whether this configuration might still cause a startup, port-listening, or module-related issue.

@kurokobo
I changed

```
volumes:
  - ./volumes/weaviate:/var/lib/weaviate
```

to

```
volumes:
  - weaviate_data:/var/lib/weaviate
```

and added the following at the end of the compose file:

```
volumes:
  weaviate_data:
```
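For reference, you can confirm where Docker actually stores the named volume with the command below (the docker_ prefix is an assumption based on the Compose project name implied by the container names in this thread):

```
docker volume inspect docker_weaviate_data --format '{{ .Mountpoint }}'
```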

After making this change, I was able to upload files to the Knowledge Base successfully.

Thank you very much for your kind and detailed support. I truly appreciate it and hope to receive your help again in the future.

@rose_1001
I’m glad to hear that it seems to have been resolved.

If your fix worked, the issue may have been related to file system permissions, ACLs, SELinux, or similar mechanisms on your Docker host. Docker might also be running in rootless mode.

In any case, there shouldn’t be any issues with everyday use if you use a named volume (weaviate_data:/var/lib/weaviate) instead of a bind mount (./volumes/weaviate:/var/lib/weaviate).
However, if you ever plan a file-based backup or restore of your environment, note that your Weaviate files are no longer inside the ./volumes/weaviate directory on your Docker host.
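For example, a file-based backup of the named volume could look roughly like this (a sketch; docker_weaviate_data is an assumed volume name based on your project prefix, and the stack is stopped first so the archived files are consistent):

```
docker compose down
# Archive the volume's contents to weaviate_data.tar.gz in the current directory
docker run --rm -v docker_weaviate_data:/data -v "$PWD":/backup alpine \
  tar czf /backup/weaviate_data.tar.gz -C /data .
docker compose up -d
```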

Enjoy!


@Dify_Forum_Helper, please summarize the Q&A.

Q:
Self‑hosted Dify (Docker) shows the error:

Could not connect to Weaviate: Connection to Weaviate failed. Details: .

when uploading documents to the Knowledge Base.
All containers (including weaviate) are Up, but inside the Weaviate container netstat shows no listener on port 8080. Why is this happening and how was it fixed?


Summary

  • Environment: Self-hosted Dify (Docker, v1.11.2), Weaviate 1.27.0 as middleware.
  • Symptoms:
    • “Could not connect to Weaviate” error when uploading files to the KB.
    • docker compose ps shows docker-weaviate-1 as Up.
    • However, upon checking inside the container, the Weaviate process is running, but it is not listening on :8080.

Root Cause

The cause is believed to be that Weaviate’s data directory was assigned using a bind mount to a host directory.

weaviate:
  ...
  volumes:
    - ./volumes/weaviate:/var/lib/weaviate   # ← bind mount

It is highly probable that one of the following issues occurred on the host side for this path:

  • Permission / ownership inconsistencies
  • Restrictions such as ACL / SELinux / AppArmor
  • Docker running in rootless mode, restricting access to this path

As a result, Weaviate failed to initialize storage, and although the process itself existed, it was unable to properly bind to HTTP port 8080.
From Dify’s perspective, this resulted in a state where “the container is Up, but a TCP connection cannot be established,” leading to the error.


Actual Actions Taken and Resolution

  1. First, a restart was attempted with docker compose down && docker compose up -d, but it did not resolve the issue.
  2. Weaviate’s volume setting was changed from a host directory bind mount to a Docker named volume.
weaviate:
  image: semitechnologies/weaviate:1.27.0
  profiles:
    - weaviate
  restart: always
  volumes:
    - weaviate_data:/var/lib/weaviate    # Changed here
  environment:
    PERSISTENCE_DATA_PATH: ${WEAVIATE_PERSISTENCE_DATA_PATH:-/var/lib/weaviate}
    ...

# Add to the end of the file
volumes:
  weaviate_data:
  3. After making the change and restarting the containers, Weaviate started normally, began listening on 8080, and file uploads to the Knowledge Base succeeded.

Best Practices from This Case

  1. Named Volumes Recommended for Weaviate

    • Using a named volume like weaviate_data:/var/lib/weaviate helps avoid issues stemming from host OS permissions or SELinux settings.
    • If there’s no specific need to use a bind mount (./volumes/weaviate:/var/lib/weaviate), it’s safer to use a named volume.
  2. Things to Check When Using Bind Mounts

    • Does the relevant directory (./volumes/weaviate) actually exist?
    • Check with ls -ld ./volumes ./volumes/weaviate whether the user running Docker has read/write access.
    • In an SELinux-enabled environment, you might need to consider :z or :Z options, or adjust the context.
    • If Docker is in rootless mode, confirm if the path is accessible by that user.
  3. Troubleshooting “Could not connect to Weaviate” when the container is Up

    • Check if the port is actually being listened to from within the container:
      docker exec -it docker-weaviate-1 sh
      netstat -tlnp | grep 8080
      
    • If 8080 does not appear, Weaviate has failed to start, and you should first suspect volume settings or environment variables.
  4. Backup Considerations

    • When using named volumes, Weaviate’s data is no longer placed in a host directory of your choosing (like ./volumes/weaviate).
    • If you want to back up based on files, you will need procedures that explicitly handle volumes, such as docker run --rm -v weaviate_data:/data ....

The conclusion of this topic is:

By switching Weaviate’s data directory from a host bind mount to a Docker named volume, port 8080 started listening correctly, resolving the connection error from Dify.

This completes the Q&A.