Running Ollama with AMD GPUs can hit compatibility issues between the ROCm stack and the amdgpu driver shipped with different Linux kernel versions, and a mismatched host driver can also break ROCm-based containers such as vLLM. However, ROCm containers (e.g. vLLM's ROCm image) typically bundle their own user-space ROCm libraries, so the host mainly needs a compatible amdgpu kernel driver and access to the GPU device nodes.
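
As a quick sanity check, the sketch below (an assumption: it requires a ROCm build of PyTorch in the environment or container, which exposes GPUs through the `torch.cuda` API and sets `torch.version.hip`) reports whether the ROCm runtime can actually see the GPU:

```python
# Minimal diagnostic: confirm the ROCm user-space runtime can reach the GPU.
# Assumes a ROCm build of PyTorch (torch.version.hip is a string on ROCm builds).
import torch

def report_rocm_gpu() -> None:
    hip_version = getattr(torch.version, "hip", None)
    if hip_version is None:
        print("This PyTorch build has no HIP/ROCm support.")
        return
    print(f"HIP runtime version: {hip_version}")
    # ROCm builds of PyTorch expose AMD GPUs through the torch.cuda API.
    if torch.cuda.is_available():
        print(f"Visible device: {torch.cuda.get_device_name(0)}")
    else:
        print("No GPU visible -- check the amdgpu/KFD driver and container device mappings.")

if __name__ == "__main__":
    report_rocm_gpu()
```

If the GPU is visible on the host but not inside the container, the problem is usually the container's device access (e.g. /dev/kfd and /dev/dri not being passed through) rather than the kernel driver itself.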