A pretrained GPT-2 Turkish model intended as an entry point for fine-tuning on other texts.
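As a minimal sketch of how such a checkpoint could be used, the snippet below loads a GPT-2 model together with its byte-level BPE tokenizer and computes the language-modeling loss on a new text, which is the quantity one would minimize when fine-tuning. The model identifier `gpt2-turkish-cased` is a placeholder assumption, not the repository's actual Hub id.

```python
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

MODEL_ID = "gpt2-turkish-cased"  # placeholder: replace with the actual Hub id

# GPT-2 uses a byte-level BPE tokenizer, so any UTF-8 text (including Turkish
# characters such as ğ, ş, ı) can be encoded without out-of-vocabulary tokens.
tokenizer = GPT2TokenizerFast.from_pretrained(MODEL_ID)
model = GPT2LMHeadModel.from_pretrained(MODEL_ID)

# Tokenize a sample text and compute the causal language-modeling loss,
# the objective used when continuing training (fine-tuning) on another corpus.
inputs = tokenizer("Merhaba dünya", return_tensors="pt")
outputs = model(**inputs, labels=inputs["input_ids"])
print(outputs.loss)
```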