How to use

Use from the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="HooshvareLab/gpt2-fa")
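The pipeline above can then be called directly on a prompt. A minimal sketch follows; the Persian prompt and the `max_new_tokens` setting are illustrative assumptions, not part of the model card.

```python
from transformers import pipeline

# Load the text-generation pipeline for the Persian GPT-2 model.
pipe = pipeline("text-generation", model="HooshvareLab/gpt2-fa")

# The prompt and generation length here are illustrative assumptions.
result = pipe("در یک اتفاق شگفت انگیز", max_new_tokens=40)
print(result[0]["generated_text"])
```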
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("HooshvareLab/gpt2-fa")
model = AutoModelForCausalLM.from_pretrained("HooshvareLab/gpt2-fa")
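With the tokenizer and model loaded directly, generation goes through `model.generate`. The sketch below shows one way to do this; the prompt and the sampling parameters (`max_new_tokens`, `top_k`, `top_p`) are assumptions for illustration, not settings recommended by the model card.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("HooshvareLab/gpt2-fa")
model = AutoModelForCausalLM.from_pretrained("HooshvareLab/gpt2-fa")

# Illustrative Persian prompt (an assumption, not from the model card).
prompt = "در یک اتفاق شگفت انگیز"
inputs = tokenizer(prompt, return_tensors="pt")

# Sampling settings below are illustrative defaults.
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```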
Quick Links

ParsGPT2

BibTeX entry and citation info

Please cite in publications as follows:

@misc{ParsGPT2,
  author = {Hooshvare Team},
  title = {ParsGPT2 the Persian version of GPT2},
  year = {2021},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/hooshvare/parsgpt}},
}

Questions?

Post a GitHub issue on the ParsGPT2 issues repo.
