
ComfyUI CUDA Environment Deployment

NVIDIA GPU Environment

At least 4 GB of VRAM is recommended. Check with nvidia-smi: the output shows your driver version and the CUDA version it supports. In the next step, pick an image that matches your CUDA version.

root@feiniu:~# nvidia-smi
Sat Mar 14 22:47:16 2026       
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 575.57.08              Driver Version: 575.57.08      CUDA Version: 12.9     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 5060        Off |   00000000:01:00.0 Off |                  N/A |
|  0%   45C    P8              8W /  145W |       0MiB /   8151MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+

+-----------------------------------------------------------------------------------------+
| Processes:                                                                              |
|  GPU   GI   CI              PID   Type   Process name                        GPU Memory |
|        ID   ID                                                               Usage      |
|=========================================================================================|
|  No running processes found                                                             |
+-----------------------------------------------------------------------------------------+
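The "CUDA Version" in the header (12.9 here) is the highest CUDA runtime the driver supports, so any cuXXX image tag at or below it should work. For scripting, the same fields can be queried in machine-readable form; the snippet below is a sketch, guarded so it degrades cleanly on machines without an NVIDIA driver:

```shell
# Print driver version and total VRAM as CSV (no table decoration).
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=driver_version,memory.total --format=csv,noheader
else
  echo "nvidia-smi not found"
fi
```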

Docker Compose Deployment

Use the following script as a reference and change the storage paths to your own. More variants are available on Docker Hub; I have mirrored the commonly used image versions here:

registry.cn-hangzhou.aliyuncs.com/gpg_dev/yanwk/comfyui-boot:cu130-slim

registry.cn-hangzhou.aliyuncs.com/gpg_dev/comfyui-boot:cu128-slim

registry.cn-hangzhou.aliyuncs.com/gpg_dev/comfyui-boot:cu126-slim

registry.cn-hangzhou.aliyuncs.com/gpg_dev/comfyui-boot:rocm7
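To use one of these mirrors while keeping the upstream image name in your compose file, pull from the mirror and retag it locally. This is a sketch: the cu128-slim tag is just one example, and the docker calls are guarded so the snippet is harmless without Docker or network access:

```shell
MIRROR=registry.cn-hangzhou.aliyuncs.com/gpg_dev/comfyui-boot:cu128-slim
LOCAL="yanwk/${MIRROR##*/}"   # strip the registry/namespace prefix -> yanwk/comfyui-boot:cu128-slim

if command -v docker >/dev/null 2>&1; then
  docker pull "$MIRROR" && docker tag "$MIRROR" "$LOCAL" \
    || echo "pull failed (offline?)"
fi
echo "$LOCAL"
```

After retagging, compose files that reference the Docker Hub name resolve to the mirrored image.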

services:
  comfyui:
    image: yanwk/comfyui-boot:cu130-slim
    container_name: comfyui-cuda
    #restart: unless-stopped
    # Key part: enable NVIDIA GPU support
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ['0']
              capabilities: [gpu]
    # Port mapping
    ports:
      - "8188:8188"
    # Environment variables (equivalent to -e CLI_ARGS)
    environment:
      - CLI_ARGS=
      # Optional: add a HuggingFace token here if you need one
      # - HF_TOKEN=your_token_here
      # Recommended: tune the PyTorch allocator to reduce VRAM fragmentation (useful on 8 GB cards)
      - PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512
    security_opt:
      - "label=type:nvidia_container_t"
    # Volume mounts (equivalent to -v)
    volumes:
      # Main storage directory
      - /vol1/1000/comfyui/root:/root
      # Model cache directories
      - /vol1/1000/comfyui/models:/root/ComfyUI/models
      - /vol1/1000/comfyui/hf-hub:/root/.cache/huggingface/hub
      - /vol1/1000/comfyui/torch-hub:/root/.cache/torch/hub
      # User data directories
      - /vol1/1000/comfyui/input:/root/ComfyUI/input
      - /vol1/1000/comfyui/output:/root/ComfyUI/output
      - /vol1/1000/comfyui/workflows:/root/ComfyUI/user/default/workflows

Deployment command

docker compose -f comfyui-cuda.yml up -d
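Once the stack is up, a quick smoke test confirms the container can see the GPU and that the web UI answers on the mapped port. The container name comfyui-cuda and port 8188 come from the compose file above; the checks are guarded so the snippet is a no-op without Docker:

```shell
URL="http://localhost:8188"
if command -v docker >/dev/null 2>&1; then
  # GPU visible inside the container?
  docker exec comfyui-cuda nvidia-smi || echo "GPU not visible inside the container"
  # Web UI responding?
  curl -sf "$URL" >/dev/null && echo "ComfyUI is up at $URL" || echo "ComfyUI not responding yet"
else
  echo "docker not found"
fi
```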

Docker CLI Deployment

mkdir -p \
  storage \
  storage-models/models \
  storage-models/hf-hub \
  storage-models/torch-hub \
  storage-user/input \
  storage-user/output \
  storage-user/workflows

docker run -it --rm \
  --runtime nvidia \
  --gpus all \
  -p 8188:8188 \
  -v "$(pwd)"/storage:/root \
  -v "$(pwd)"/storage-models/models:/root/ComfyUI/models \
  -v "$(pwd)"/storage-models/hf-hub:/root/.cache/huggingface/hub \
  -v "$(pwd)"/storage-models/torch-hub:/root/.cache/torch/hub \
  -v "$(pwd)"/storage-user/input:/root/ComfyUI/input \
  -v "$(pwd)"/storage-user/output:/root/ComfyUI/output \
  -v "$(pwd)"/storage-user/workflows:/root/ComfyUI/user/default/workflows \
  -e CLI_ARGS="--disable-xformers" \
  yanwk/comfyui-boot:cu128-slim
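The CLI_ARGS variable is forwarded to ComfyUI's launcher, so any of ComfyUI's standard command-line flags can go there. A sketch of a few commonly used ones (pick what fits your GPU):

```shell
# Standard ComfyUI flags that can be passed via CLI_ARGS:
#   --lowvram            offload models aggressively on small-VRAM GPUs
#   --disable-xformers   skip xFormers attention (as in the run command above)
#   --listen 0.0.0.0     accept connections from other machines on the LAN
CLI_ARGS="--lowvram --disable-xformers"
echo "$CLI_ARGS"
```

Pass it with `-e CLI_ARGS="$CLI_ARGS"` in `docker run`, or set it under `environment:` in the compose file.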

Other Image Tags

CUDA Image Tags - Slim
Start with only ComfyUI and ComfyUI-Manager, yet include many dependencies to make future Custom Node installation easier. Recommended for beginners.

cu126-slim: CUDA 12.6, Python 3.12
cu128-slim ⭐: CUDA 12.8, Python 3.12
cu130-slim: CUDA 13.0, Python 3.13 (with GIL). No xFormers by default

CUDA Image Tags - MEGAPAK
All-in-one bundles, including dev kits and many Custom Nodes for ComfyUI.

cu126-megapak: CUDA 12.6, Python 3.12, GCC 13
cu128-megapak: CUDA 12.8, Python 3.12, GCC 14

More Image Tags

rocm: for AMD GPUs with ROCm
xpu: for Intel GPUs with XPU
cpu: for CPU only
nightly: using preview builds of PyTorch (CUDA)
archived: archived Dockerfiles of retired image tags