* Nomic Embed Text V2 with Mixture-of-Experts (MoE) architecture

  - Adds MoE-based embedding model supporting multilingual embeddings.
  - Selects architecture variant based on hyperparameter detection (MoE layers).
  - Removes unnecessary subclass initialization checks for clarity.

  https://www.nomic.ai/blog/posts/nomic-embed-text-v2

  Co-authored-by: Jared Van Bortel <jared@nomic.ai>

* fix tokenizer

* don't rename this tensor

---------

Co-authored-by: Jared Van Bortel <jared@nomic.ai>
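The hyperparameter detection mentioned above can be sketched as follows. This is a minimal illustration, not the converter's actual code: the config key `moe_every_n_layers` and the enum names below are assumptions chosen for the example.

```python
# Minimal sketch of hyperparameter-based architecture selection in the
# style of the gguf-py converters. The key "moe_every_n_layers" and the
# enum values are illustrative assumptions, not the exact converter code.
import json
from enum import Enum, auto
from pathlib import Path


class ModelArch(Enum):
    NOMIC_BERT = auto()      # dense variant
    NOMIC_BERT_MOE = auto()  # Mixture-of-Experts variant


def detect_arch(model_dir: Path) -> ModelArch:
    """Pick the architecture variant from the model's config.json."""
    hparams = json.loads((model_dir / "config.json").read_text())
    # An MoE checkpoint advertises how often expert layers appear;
    # a dense checkpoint omits the key (or sets it to 0).
    if hparams.get("moe_every_n_layers", 0) > 0:
        return ModelArch.NOMIC_BERT_MOE
    return ModelArch.NOMIC_BERT
```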
scripts/
__init__.py
constants.py
gguf.py
gguf_reader.py
gguf_writer.py
lazy.py
metadata.py
py.typed
quants.py
tensor_mapping.py
utility.py
vocab.py
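These modules form the `gguf` Python package used by the conversion scripts; `gguf_reader.py` in particular provides a reader for inspecting existing GGUF files. A minimal usage sketch, assuming the package is installed as `gguf` and using a placeholder file path:

```python
# Minimal sketch of inspecting a GGUF file with the gguf package
# (pip install gguf). "example.gguf" is a placeholder path.
from gguf import GGUFReader

reader = GGUFReader("example.gguf")

# Key-value metadata stored in the file header.
for key, field in reader.fields.items():
    print(key, field.types)

# Tensor names, shapes, and quantization types recorded in the file.
for tensor in reader.tensors:
    print(tensor.name, tensor.shape, tensor.tensor_type)
```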