llamacpp 1.0.0-ok0.1 (amd64 binary) in openkylin nile

 This package contains the llama.cpp core runtime libraries.
 The main goal of llama.cpp is to enable LLM inference with minimal setup and state-of-the-art performance on a wide variety of hardware - locally and in the cloud.
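 .
 As a quick orientation, below is a hedged sketch of how a downstream program might confirm that the llama.cpp runtime shared library shipped by this package can be found and loaded at run time. The library name "llama" (i.e. libllama.so) is an assumption based on the description above; the exact soname in the openKylin package may differ.

    # Hedged sketch: locate and load the llama.cpp runtime library installed
    # by this package. "llama" / libllama.so is an assumed library name.
    import ctypes
    import ctypes.util

    name = ctypes.util.find_library("llama")  # searches the standard linker paths
    if name:
        lib = ctypes.CDLL(name)               # dlopen the shared library
        print(f"Loaded llama.cpp runtime: {name}")
    else:
        print("libllama not found; is the llamacpp package installed?")

 Because this is a runtime-only package (no development headers), loading the shared library dynamically like this is the simplest smoke test that the libraries are installed and resolvable.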

Details

Package version: 1.0.0-ok0.1
Source: llamacpp 1.0.0-ok0.1 source package in openKylin
Status: Published
Component: main
Priority: Optional