llama.cpp/.devops
Latest commit 6e1c4cebdb (Chenguang Li): CANN: Support Opt CONV_TRANSPOSE_1D and ELU (#12786)
* [CANN] Support ELU and CONV_TRANSPOSE_1D

* [CANN] Address review comments

* [CANN] Address further review comments

* [CANN] Name adjustment

* [CANN] Remove lambda used in template

* [CANN] Use std::function instead of template

* [CANN] Modify the code according to the review comments

---------

Signed-off-by: noemotiovon <noemotiovon@gmail.com>
2025-04-09 14:04:14 +08:00
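The "Use std::function instead of template" bullet refers to passing the element-wise operator as a `std::function` argument rather than a template parameter, which avoids instantiating the dispatch code once per lambda. A minimal sketch of that pattern, combined with the ELU formula (ELU(x) = x for x > 0, otherwise alpha * (exp(x) - 1)) that the commit adds; the names `make_elu` and `apply_unary` are illustrative, not the actual llama.cpp/CANN API:

```cpp
#include <cmath>
#include <functional>
#include <vector>

// Hypothetical factory for the ELU operator: returns a std::function so the
// caller does not need to be templated on the lambda's closure type.
static std::function<float(float)> make_elu(float alpha) {
    // ELU(x) = x            if x > 0
    //        = alpha*(e^x-1) otherwise
    return [alpha](float x) {
        return x > 0.0f ? x : alpha * (std::exp(x) - 1.0f);
    };
}

// Hypothetical non-template dispatcher: one compiled instance handles every
// unary operator, because the operator arrives as a std::function.
static void apply_unary(std::vector<float> & data,
                        const std::function<float(float)> & op) {
    for (float & v : data) {
        v = op(v);
    }
}
```

With a template parameter `template <typename F> void apply_unary(..., F op)`, each distinct lambda would generate a separate instantiation; the `std::function` version trades a small call-indirection cost for a single shared implementation, which matches the commit's cleanup direction.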
nix                         repo : update links to new url (#11886)                                                            2025-02-15 16:40:57 +02:00
cloud-v-pipeline            build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809)  2024-06-13 00:41:52 +01:00
cpu.Dockerfile              cmake : enable curl by default (#12761)                                                            2025-04-07 13:35:19 +02:00
cuda.Dockerfile             cmake : enable curl by default (#12761)                                                            2025-04-07 13:35:19 +02:00
intel.Dockerfile            cmake : enable curl by default (#12761)                                                            2025-04-07 13:35:19 +02:00
llama-cli-cann.Dockerfile   CANN: Support Opt CONV_TRANSPOSE_1D and ELU (#12786)                                               2025-04-09 14:04:14 +08:00
llama-cpp-cuda.srpm.spec    repo : update links to new url (#11886)                                                            2025-02-15 16:40:57 +02:00
llama-cpp.srpm.spec         repo : update links to new url (#11886)                                                            2025-02-15 16:40:57 +02:00
musa.Dockerfile             cmake : enable curl by default (#12761)                                                            2025-04-07 13:35:19 +02:00
rocm.Dockerfile             cmake : enable curl by default (#12761)                                                            2025-04-07 13:35:19 +02:00
tools.sh                    docker: add perplexity and bench commands to full image (#11438)                                   2025-01-28 10:42:32 +00:00
vulkan.Dockerfile           ci : fix build CPU arm64 (#11472)                                                                  2025-01-29 00:02:56 +01:00