Domino on Linux/Unix, Troubleshooting, Best Practices, Tips and more ...


Daniel Nashed

Docker Desktop LLM support with NVIDIA

Daniel Nashed – 5 May 2025 10:48:17

After my first, simple test, I updated Docker Desktop on my lab machine.
A GPU option shows up in the experimental settings when a matching GPU is detected.


Once it is enabled, the following command loads the model into the NVIDIA GPU:


docker model run ai/qwen3:0.6B-Q4_0
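
The model is pulled on demand, but you can also pull it up front and check what is available locally. A quick sketch using the Docker Model Runner CLI (sub-command names as of current Docker Desktop versions):

# pull the model ahead of time and list locally available models
docker model pull ai/qwen3:0.6B-Q4_0
docker model list
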


To check the details, I ran an Ubuntu container in Docker, also using the GPU.


Besides nvidia-smi to check the card and driver version, I installed "nvtop" (which is part of Ubuntu) to watch the GPU performance.


docker run --gpus all -it --rm ubuntu bash
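
Inside the container, a couple of commands are enough to check the driver and watch the GPU load (a small sketch, assuming the standard Ubuntu repositories are reachable from the container):

# show the NVIDIA driver and CUDA version exposed to the container
nvidia-smi

# install and start nvtop to monitor GPU utilization while the model runs
apt-get update && apt-get install -y nvtop
nvtop
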


Looking a bit deeper into the installed binaries, you can indeed see that Docker also leverages the llama.cpp project (https://github.com/ggml-org/llama.cpp).

Interestingly, the GPU option mentions that additional software will be downloaded. But so far I have only found these files:



 Directory of C:\Program Files\Docker\Docker\resources\model-runner\bin

05/05/2025  12:32    <DIR>          .
05/05/2025  12:32    <DIR>          ..
05/05/2025  12:31                71 com.docker.llama-server.digest
05/05/2025  12:31         1.838.320 com.docker.llama-server.exe
05/05/2025  12:31            44.784 com.docker.nv-gpu-info.exe
05/05/2025  12:31           481.520 ggml-base.dll
05/05/2025  12:31           492.272 ggml-cpu.dll
05/05/2025  12:31            65.776 ggml.dll
05/05/2025  12:31         1.212.144 llama.dll
              7 File(s)      4.134.887 bytes
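
The com.docker.llama-server.exe binary is the llama.cpp server that actually serves the models. A quick way to see it in action is the OpenAI-compatible endpoint Docker Model Runner exposes. A minimal sketch, assuming host-side TCP support is enabled on the default port 12434:

# send a chat completion request to the local model runner
curl http://localhost:12434/engines/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "ai/qwen3:0.6B-Q4_0", "messages": [{"role": "user", "content": "Say hello"}]}'
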



