llamacpp-dev 1.0.0-ok0.1 (amd64 binary) in openkylin nile.bedrock

 This package contains the header files needed to compile applications that use llama.cpp, as well as the binaries for running llama.cpp.
 The main goal of llama.cpp is to enable LLM inference with minimal setup and state-of-the-art performance on a wide variety of hardware - locally and in the cloud.
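
 Since this -dev package ships the llama.h header for building applications against llama.cpp, a minimal usage sketch follows. It is only an illustration: the C API changes between llama.cpp releases, so the calls used here (llama_backend_init, llama_load_model_from_file, llama_new_context_with_model, and their cleanup counterparts) should be checked against the header installed by this package, and "model.gguf" is a placeholder path.

    // demo.cpp - minimal sketch: load a GGUF model via the llama.cpp C API, then release it.
    // Function names follow llama.h as shipped in recent releases; older or newer
    // versions may rename some of them (e.g. llama_backend_init once took a numa flag).
    #include <cstdio>
    #include "llama.h"

    int main(int argc, char ** argv) {
        const char * model_path = argc > 1 ? argv[1] : "model.gguf";  // placeholder path

        llama_backend_init();  // initialize the ggml backends

        llama_model_params mparams = llama_model_default_params();
        llama_model * model = llama_load_model_from_file(model_path, mparams);
        if (model == nullptr) {
            fprintf(stderr, "failed to load %s\n", model_path);
            return 1;
        }

        llama_context_params cparams = llama_context_default_params();
        llama_context * ctx = llama_new_context_with_model(model, cparams);
        if (ctx == nullptr) {
            fprintf(stderr, "failed to create context\n");
            llama_free_model(model);
            return 1;
        }

        printf("model loaded, context size: %u\n", llama_n_ctx(ctx));

        llama_free(ctx);
        llama_free_model(model);
        llama_backend_free();
        return 0;
    }

 Assuming the header and shared library install to standard paths, a build line along the lines of "g++ -o demo demo.cpp -lllama" should work; the exact library name and flags depend on how this package lays out its files.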

Details

Package version: 1.0.0-ok0.1
Source: llamacpp 1.0.0-ok0.1 source package in openKylin
Status: Published
Component: main
Priority: Optional