llamacpp binary package in openKylin Nile V2.0 amd64

This package contains the llama.cpp core runtime libraries.
The main goal of llama.cpp is to enable LLM inference with minimal setup and state-of-the-art performance on a wide range of hardware, both locally and in the cloud.

Publishing history

  Date:            2025-06-12 08:25:27 UTC
  Status:          Published
  Target:          openKylin Nile V2.0 amd64
  Pocket:          proposed
  Component:       main
  Section:         libs
  Priority:        Optional
  Phased updates:
  Version:         1.0.0-ok0.1

  • Published
  • Copied from openKylin nile-release amd64 in 2.0 sp2 development