llama.cpp/ggml/src/ggml-cann
William Tambellini 70680c48e5
ggml : upgrade init_tensor API to return a ggml_status (#11854)
* Upgrade init_tensor API to return a ggml_status

To prepare for an 'abort-free' ggml
(ggml that does not abort on OOM but returns an OOM status),
as agreed with Diego in the ggml repo,
upgrade the init_tensor() and view_init() APIs
to return a ggml_status (see the sketch below).

* misc fixes

---------

Co-authored-by: slaren <slarengh@gmail.com>
2025-02-28 14:41:47 +01:00
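A minimal sketch of the shape of this change, not the actual CANN backend code: the `init_tensor` hook used to return `void`, so a backend had no choice but to abort on failure; after this commit it returns a `ggml_status`, letting failures such as OOM propagate to the caller. The `GGML_STATUS_*` values mirror `ggml.h`; all `example_*` names are hypothetical.

```cpp
#include <cstddef>

// Mirrors enum ggml_status from ggml.h (values as of this commit).
enum ggml_status {
    GGML_STATUS_ALLOC_FAILED = -2,
    GGML_STATUS_FAILED       = -1,
    GGML_STATUS_SUCCESS      = 0,
    GGML_STATUS_ABORTED      = 1,
};

// Hypothetical stand-ins for the real buffer/tensor types.
struct example_tensor { size_t nbytes; void * data; };
struct example_buffer { size_t free_bytes; };

// Old shape of the hook: a failure had to abort inside the backend.
//   void (*init_tensor)(example_buffer * buf, example_tensor * t);
// New shape: the result is reported back as a status.
typedef enum ggml_status (*example_init_tensor_fn)(example_buffer * buf,
                                                   example_tensor * t);

// Hypothetical backend implementation: return a status instead of aborting on OOM.
static enum ggml_status example_init_tensor(example_buffer * buf,
                                            example_tensor * t) {
    if (t->nbytes > buf->free_bytes) {
        return GGML_STATUS_ALLOC_FAILED;  // caller decides how to handle OOM
    }
    // ... device-specific tensor initialization would go here ...
    buf->free_bytes -= t->nbytes;
    return GGML_STATUS_SUCCESS;
}
```

Per the commit title, view_init() gets the same treatment, so code that previously called these hooks and assumed success now checks the returned status.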
kernels CANN: Fix build error with GCC 13 (#11990) 2025-02-28 15:23:47 +08:00
.clang-format [CANN] Add Ascend NPU backend (#6035) 2024-07-17 14:23:50 +03:00
acl_tensor.cpp cann: support q4_0 model (#8822) 2024-08-05 12:22:30 +08:00
acl_tensor.h cann: support q4_0 model (#8822) 2024-08-05 12:22:30 +08:00
aclnn_ops.cpp CANN: RoPE operator optimization (#10563) 2024-11-29 14:46:55 +08:00
aclnn_ops.h [CANN] Add Ascend NPU backend (#6035) 2024-07-17 14:23:50 +03:00
CMakeLists.txt CANN: Fix SOC_TYPE compile bug (#10519) 2024-11-28 15:25:24 +08:00
common.h CANN: Improve the Inferencing Performance for Ascend NPU Device (#10454) 2024-11-26 18:08:37 +08:00
Doxyfile cann : fix doxy (ggml/0) 2024-09-08 11:05:55 +03:00
ggml-cann.cpp ggml : upgrade init_tensor API to return a ggml_status (#11854) 2025-02-28 14:41:47 +01:00