
llama.cpp — server-cuda-b5586 (public, latest)

Install from the command line
$ docker pull ghcr.io/qiyuangong/llama.cpp:server-cuda-b5586
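
Once pulled, the image can be started like other llama.cpp server-cuda containers. The command below is a minimal sketch, not part of this package's metadata: it assumes an NVIDIA GPU exposed with --gpus, a local directory of GGUF models mounted at /models, and a placeholder model filename; the llama-server flags (-m, --host, --port, --n-gpu-layers) follow upstream llama.cpp usage.

$ docker run --gpus all -v /path/to/models:/models -p 8080:8080 \
    ghcr.io/qiyuangong/llama.cpp:server-cuda-b5586 \
    -m /models/model.gguf --host 0.0.0.0 --port 8080 --n-gpu-layers 99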

Recent tagged image versions

  • Published 4 months ago · Digest sha256:b5dfda3b5ff790db51d068ddd0e7b10f1d4e9d671e5feb41f0db6eb36f57c2c5 · 2 version downloads
  • Published 4 months ago · Digest sha256:e554d5a9d1319cce37f786c5be5fa1049977a62e053fab6e1cf5d4b69cba4288 · 2 version downloads
  • Published 4 months ago · Digest sha256:4f64ad23bbdc5b4ec6938a724c4de9c9ebfcf960fa7700e551185763e509cefc · 2 version downloads
  • Published 4 months ago · Digest sha256:96ed2fd195e97e82809efb401e72126312be3e64a2af9b69bbd399b6f637b698 · 2 version downloads
  • Published 4 months ago · Digest sha256:03cfba90e41ffd099909db1d3e47f645e13f3efc5c1d7cfef5091c809a71a513 · 2 version downloads

Details

  • Last published: 4 months ago
  • Total downloads: 25