```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("HooshvareLab/gpt2-fa")
model = AutoModelForCausalLM.from_pretrained("HooshvareLab/gpt2-fa")
```
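A minimal sketch of generating text with the loaded model. The Persian prompt ("در ایران", "in Iran") and the sampling settings are illustrative assumptions, not part of the model card:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("HooshvareLab/gpt2-fa")
model = AutoModelForCausalLM.from_pretrained("HooshvareLab/gpt2-fa")

# Encode an illustrative Persian prompt ("in Iran") and sample a continuation
inputs = tokenizer("در ایران", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; reuse EOS
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```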
## BibTeX entry and citation info

Please cite in publications as follows:
```bibtex
@misc{ParsGPT2,
  author = {Hooshvare Team},
  title = {ParsGPT2 the Persian version of GPT2},
  year = {2021},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/hooshvare/parsgpt}},
}
```
## Questions?

Post a GitHub issue on the ParsGPT2 repo.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="HooshvareLab/gpt2-fa")
```
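A short usage sketch for the pipeline above. The Persian prompt ("در ایران", "in Iran") and the generation parameters are illustrative assumptions:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="HooshvareLab/gpt2-fa")

# Generate one continuation for an illustrative Persian prompt ("in Iran")
result = pipe("در ایران", max_new_tokens=20, num_return_sequences=1)
print(result[0]["generated_text"])
```

The pipeline returns a list of dicts, one per requested sequence, each with a `generated_text` key that includes the prompt followed by the sampled continuation.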