llama.cpp server-cuda-b3895 (Public · Latest)

Install from the command line
$ docker pull ghcr.io/shankarg87/llama.cpp:server-cuda-b3895
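
To run the image once pulled, a minimal sketch, assuming the NVIDIA Container Toolkit is available on the host, a GGUF model exists at /models/model.gguf, and this image follows the upstream llama.cpp server-cuda entrypoint conventions (model path, port, and --n-gpu-layers value are placeholders to adapt):

$ docker run --gpus all -v /models:/models -p 8080:8080 \
    ghcr.io/shankarg87/llama.cpp:server-cuda-b3895 \
    -m /models/model.gguf --host 0.0.0.0 --port 8080 --n-gpu-layers 99

Arguments after the image name are passed to the llama.cpp server binary; --host 0.0.0.0 makes the server reachable through the published port.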

Recent tagged image versions

  • Published 3 months ago · Digest sha256:e87648969690a021adb33b8fbc4c04ea1b5eda2e1dd4014c7161953daecd3c57 · 0 version downloads
  • Published 3 months ago · Digest sha256:abe42b1fc18b72d0157d6c0b39fd189e9ca1d27dec54a0f79adb8c53f538e5c0 · 0 version downloads
  • Published 3 months ago · Digest sha256:72e56bf7b54d374626639bb975cb26a3707286548048bcf1902fe356ffebb6c6 · 0 version downloads
  • Published 3 months ago · Digest sha256:2d056946395f4f80d7ca6d2b2d69d7c4698f0902ef4ecf6631a90d6312457f0b · 0 version downloads
  • Published 3 months ago · Digest sha256:6ef9bf6977d831fe08ca16b7696c49761bce91525f58278f28175d85e2caeafc · 0 version downloads

Details


Last published: 3 months ago
Total downloads: 194