This repository enables the model kotoba-tech/kotomamba-2.8B-v1.0 to be loaded with the Transformers library alone, without any custom modeling code.
For details, see: https://huggingface.co/kotoba-tech/kotomamba-2.8B-v1.0
The tokenizer is code20K_en40K_ja60K.ver2.2, the same tokenizer used by llm-jp/llm-jp-13b-v2.0.
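
In practice, "loadable with the Transformers library alone" means the checkpoint can be loaded through the standard auto classes without `trust_remote_code`. The snippet below is a minimal sketch assuming a Transformers version with built-in Mamba support; the prompt and generation settings are illustrative and not taken from the original card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the converted checkpoint using only the stock Transformers classes.
tokenizer = AutoTokenizer.from_pretrained("yuji96/kotomamba-2.8B-v1.0-hf")
model = AutoModelForCausalLM.from_pretrained("yuji96/kotomamba-2.8B-v1.0-hf")

# Arbitrary Japanese prompt ("Hello"), used here only as an illustration.
input_ids = tokenizer("こんにちは", return_tensors="pt").input_ids
outputs = model.generate(input_ids, max_new_tokens=32, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```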
## Uses

### Direct Use
```python
from transformers import pipeline

# Build a text-generation pipeline directly from the Hugging Face Hub.
pipe = pipeline(
    "text-generation",
    "yuji96/kotomamba-2.8B-v1.0-hf",
    do_sample=True,
    max_new_tokens=1024,
    early_stopping=True,
)
print(pipe("はじめまして")[0]["generated_text"])
```
## Model Examination [optional]

[More Information Needed]