llama.cpp.zig

A build.zig for llama.cpp, with Vulkan support.

You can use llama.cpp from Zig projects, and cross-compile it for different targets.

Support

Supported targets are:

  • Linux x86_64
  • Linux aarch64
  • Windows x86_64
  • Windows aarch64

Supported backends are:

  • CPU
  • Vulkan

Other targets and backends can be added over time, as test devices become available.

Test devices

  • Random x86_64 running Linux: all good.
  • Random x86_64 running Windows: all good.
  • Raspberry Pi 5 (aarch64 Linux): CPU works; Vulkan compiles but doesn't run due to lack of memory.
  • Surface Pro X SQ2 (aarch64 Windows): CPU works; Vulkan compiles but doesn't run due to a missing feature.
  • Termux (aarch64 Android/Linux): CPU works; Vulkan compiles but doesn't run.

How to build

All you need is Zig installed. All dependencies are pulled and compiled.

You can compile with:

zig build install

You can choose the backend used:

zig build install -Dbackend=vulkan
zig build install -Dbackend=cpu #default

And choose a target architecture and OS:

zig build install -Dtarget=x86_64-linux
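The backend and target options can also be combined. For example, a hypothetical cross-compile of the Vulkan backend for aarch64 Windows (the flag values follow the forms shown above):

```
zig build install -Dtarget=aarch64-windows -Dbackend=vulkan
```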

The first compilation can take several minutes on some platforms.

Use in Zig

Add it as a dependency to your project:

zig fetch --save git+https://github.com/diogok/llama.cpp.zig

Then link the library in your build.zig:

const llama_cpp_dep = b.dependency("llama_cpp_zig", .{
    .target = target,
    .optimize = optimize,
    .backend = backend, // like `.vulkan` or `.cpu`
});
const llama_cpp_lib = llama_cpp_dep.artifact("llama_cpp");
your_module.linkLibrary(llama_cpp_lib);
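Once linked, the llama.cpp C API can be imported from the consuming module. A minimal sketch, assuming the `llama_cpp` artifact installs the llama.h header; exact function signatures depend on the llama.cpp revision this package pins, so treat this as illustrative rather than definitive:

```zig
// Sketch of a consumer module; assumes llama.h is on the include path
// via the linked `llama_cpp` artifact.
const std = @import("std");
const c = @cImport({
    @cInclude("llama.h");
});

pub fn main() void {
    // Initialize the backend selected at build time (CPU or Vulkan).
    c.llama_backend_init();
    defer c.llama_backend_free();

    // Print what llama.cpp detected on this system.
    std.debug.print("{s}\n", .{c.llama_print_system_info()});
}
```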

Refer to src/demo.zig for a usage example.

Licenses

MIT
