# perplexity
TODO
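
While the documentation is still a TODO, here is a minimal sketch of what a tool by this name presumably computes: the standard definition of perplexity, i.e. the exponentiated average negative log-likelihood of the tokenized evaluation text. This is an assumption based on the tool's name; the exact windowing and averaging this example uses are not documented here.

$$
\mathrm{PPL}(x_{1:N}) = \exp\!\left(-\frac{1}{N}\sum_{i=1}^{N}\log p\!\left(x_i \mid x_{<i}\right)\right)
$$

Lower values indicate the model assigns higher probability to the reference text.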