VanekPetr/my-own-GPT

A simple PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) language model from scratch. The model follows the Transformer architecture from the "Attention Is All You Need" paper, with inspiration from OpenAI's GPT-2 and GPT-3, and is trained and fine-tuned on a Shakespeare text corpus.
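For flavor, the sketch below shows the kind of building block such a re-implementation centers on: a single-head causal self-attention layer, the mechanism introduced in "Attention Is All You Need" and used (with masking) in GPT-style decoders. This is an illustrative sketch, not the repository's actual code; the names and dimensions (`n_embd`, `head_size`, `block_size`) are assumptions made for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalSelfAttention(nn.Module):
    """Single-head causal self-attention: each position may only
    attend to itself and earlier positions in the sequence."""

    def __init__(self, n_embd: int, head_size: int, block_size: int):
        super().__init__()
        self.key = nn.Linear(n_embd, head_size, bias=False)
        self.query = nn.Linear(n_embd, head_size, bias=False)
        self.value = nn.Linear(n_embd, head_size, bias=False)
        # Lower-triangular mask enforcing the causal (decoder) constraint.
        self.register_buffer(
            "tril", torch.tril(torch.ones(block_size, block_size))
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, _ = x.shape
        k = self.key(x)    # (B, T, head_size)
        q = self.query(x)  # (B, T, head_size)
        # Scaled dot-product attention scores.
        wei = q @ k.transpose(-2, -1) * k.shape[-1] ** -0.5  # (B, T, T)
        # Mask out future positions, then normalize into weights.
        wei = wei.masked_fill(self.tril[:T, :T] == 0, float("-inf"))
        wei = F.softmax(wei, dim=-1)
        v = self.value(x)  # (B, T, head_size)
        return wei @ v     # (B, T, head_size)


# Example usage with toy dimensions (batch=4, context=8, channels=32).
torch.manual_seed(0)
attn = CausalSelfAttention(n_embd=32, head_size=16, block_size=8)
x = torch.randn(4, 8, 32)
print(attn(x).shape)  # torch.Size([4, 8, 16])
```

A full GPT stacks several of these heads in parallel, adds feed-forward layers and residual connections, and trains the result with a next-token prediction loss on the encoded Shakespeare text.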