Running into a problem deploying open-webui

version: '3'
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "3000:9090"
    volumes:
      - ./data:/app/backend/data
    environment:
      - HF_ENDPOINT=https://hf-mirror.com/
    restart: always

The above is my docker-compose configuration. After starting it, I get the following errors:

cal disk and outgoing traffic has been disabled. To enable repo look-ups and downloads online, pass 'local_files_only=False' as input.
INFO: Started server process [1]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
INFO [open_webui.apps.openai.main] get_all_models()
ERROR [open_webui.apps.openai.main] Connection error:
INFO [open_webui.apps.ollama.main] get_all_models()
ERROR [open_webui.apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Name or service not known]
INFO [open_webui.apps.openai.main] get_all_models()
ERROR [open_webui.apps.openai.main] Connection error:
INFO [open_webui.apps.ollama.main] get_all_models()
ERROR [open_webui.apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Name or service not known]

Why is it trying to connect to Ollama?
Does anyone know how to fix this?

environment:
  - HF_ENDPOINT=https://hf-mirror.com/
  - ENABLE_OLLAMA_API=False

Give this a try.
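For context: the Ollama connection attempts happen because the image ships with an Ollama backend pointed at host.docker.internal:11434 by default, and that hostname doesn't resolve inside a Linux container unless you map it yourself (hence the "Name or service not known" error). Disabling the Ollama API is the simplest fix if you don't run Ollama. If you do run Ollama on the host and want to keep that integration, a rough sketch like the following should work instead (assuming the OLLAMA_BASE_URL variable and Docker's host-gateway mapping):

services:
  open-webui:
    # map host.docker.internal to the Docker host so the container can reach
    # an Ollama instance listening on the host's port 11434
    extra_hosts:
      - "host.docker.internal:host-gateway"
    environment:
      - OLLAMA_BASE_URL=http://host.docker.internal:11434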

After adding - ENABLE_OLLAMA_API=False it no longer tries to connect to Ollama. Thanks!

Also, my port mapping was wrong: the app listens on 8080 inside the container (as the Uvicorn log shows), so 8080 is what I should have published, not 9090.
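Putting both fixes together, the compose file would look roughly like this (container port 8080 comes from the Uvicorn startup log above; host port 3000 is just the original choice):

version: '3'
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      # the app listens on 8080 inside the container (see the Uvicorn log)
      - "3000:8080"
    volumes:
      - ./data:/app/backend/data
    environment:
      - HF_ENDPOINT=https://hf-mirror.com/
      # skip the Ollama backend since none is running here
      - ENABLE_OLLAMA_API=False
    restart: always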