An Empirical Survey of the Effectiveness of Debiasing Techniques for Pre-trained Language Models
Paper: arXiv:2110.08527
This model is a fine-tuned version of gpt2 on an English Wikipedia dataset.
It was debiased with dropout regularization, using the hyperparameters specified in *Measuring and Reducing Gendered Correlations in Pre-trained Models* (Webster et al., 2021) and the code from *An Empirical Survey of the Effectiveness of Debiasing Techniques for Pre-trained Language Models* (Meade et al., 2022).
The following hyperparameters were used during training:
Base model
openai-community/gpt2
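The idea behind dropout debiasing is to fine-tune the pretrained model with increased dropout probabilities. A minimal sketch using `transformers` is shown below; the 0.15 rates are illustrative placeholders, not the hyperparameters actually used for this model (those come from Webster et al., 2021).

```python
from transformers import GPT2Config

# Dropout debiasing: fine-tuning proceeds as usual, but the model's
# dropout probabilities are raised above the GPT-2 defaults (0.1).
# The 0.15 values are placeholders, NOT this model's actual settings.
config = GPT2Config(
    resid_pdrop=0.15,  # dropout on residual connections
    embd_pdrop=0.15,   # dropout on embeddings
    attn_pdrop=0.15,   # dropout on attention probabilities
)

# In a real run, the pretrained weights would then be loaded with this
# config and fine-tuned on the Wikipedia corpus, e.g.:
# model = GPT2LMHeadModel.from_pretrained("openai-community/gpt2", config=config)
```

At inference time dropout is disabled as usual; the debiasing effect comes from the regularization applied during fine-tuning.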