llama.cpp/ggml
Name            Last commit message                                                                     Last commit date
cmake           cmake : enable building llama.cpp using system libggml (#12321)                         2025-03-17 11:05:23 +02:00
include         llama: Add support for RWKV v7 architecture (#12412)                                    2025-03-18 07:27:50 +08:00
src             vulkan: workaround for AMD Windows driver 16 bit unpack8 bug (#12472)                   2025-03-21 20:27:47 +01:00
.gitignore      vulkan : cmake integration (#8119)                                                      2024-07-13 18:12:39 +02:00
CMakeLists.txt  SYCL: using graphs is configurable by environment variable and compile option (#12371)  2025-03-18 11:16:31 +01:00