* add exaone model support

* add chat template

* fix whitespace

Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>

* add ftype

* add exaone pre-tokenizer in `llama-vocab.cpp`

Co-Authored-By: compilade <113953597+compilade@users.noreply.github.com>

* fix lint

Co-Authored-By: compilade <113953597+compilade@users.noreply.github.com>

* add `EXAONE` to supported models in `README.md`

* fix space

Co-authored-by: compilade <git@compilade.net>

---------

Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>
Co-authored-by: compilade <113953597+compilade@users.noreply.github.com>
Co-authored-by: compilade <git@compilade.net>
Directory contents:

- CMakeLists.txt
- llama-grammar.cpp
- llama-grammar.h
- llama-impl.h
- llama-sampling.cpp
- llama-sampling.h
- llama-vocab.cpp
- llama-vocab.h
- llama.cpp
- unicode-data.cpp
- unicode-data.h
- unicode.cpp
- unicode.h
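For context on the `add chat template` step in the commit above: the EXAONE-3.0-Instruct format wraps each turn in `[|system|]`, `[|user|]`, and `[|assistant|]` tags, closing system and assistant turns with `[|endofturn|]`. Below is a minimal standalone sketch of that turn layout, assuming the tag conventions from the EXAONE-3.0-Instruct model card; it is not the llama.cpp implementation, and the `ChatMsg` struct and `format_exaone_chat` helper are illustrative names only (whitespace handling may differ from what `llama_chat_apply_template` produces).

```cpp
// Standalone sketch of EXAONE-3.0-style chat formatting.
// Illustrative only: ChatMsg and format_exaone_chat are not llama.cpp APIs.
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

struct ChatMsg {
    std::string role;    // "system", "user", or "assistant"
    std::string content;
};

// Render a conversation using [|role|] ... [|endofturn|] tags; when
// add_assistant is true, append the generation prompt so the model
// continues as the assistant.
static std::string format_exaone_chat(const std::vector<ChatMsg> & chat, bool add_assistant) {
    std::ostringstream ss;
    for (const auto & msg : chat) {
        if (msg.role == "system") {
            ss << "[|system|]" << msg.content << "[|endofturn|]\n";
        } else if (msg.role == "user") {
            ss << "[|user|]" << msg.content << "\n";
        } else if (msg.role == "assistant") {
            ss << "[|assistant|]" << msg.content << "[|endofturn|]\n";
        }
    }
    if (add_assistant) {
        ss << "[|assistant|]";
    }
    return ss.str();
}

int main() {
    const std::vector<ChatMsg> chat = {
        {"system", "You are a helpful assistant."},
        {"user",   "What is the capital of South Korea?"},
    };
    std::cout << format_exaone_chat(chat, /*add_assistant=*/true) << std::endl;
    return 0;
}
```

Run with a system and a user message, the sketch emits both turns followed by a trailing `[|assistant|]` generation prompt, which is the point at which the model is expected to continue.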