# GPT Language Model from Scratch

This project implements a GPT-style language model from scratch, using the "Attention Is All You Need" paper as a guide and drawing inspiration from OpenAI's GPT-2 and GPT-3 models. We use a Shakespeare text corpus as the dataset for training and fine-tuning the model.
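
Below is a minimal sketch of the causal (masked) self-attention layer at the core of a decoder-only Transformer like GPT, assuming a standard PyTorch implementation. The class name, hyperparameters, and usage are illustrative only and are not the repository's actual API.

```python
# Minimal sketch of a causal self-attention head in PyTorch, illustrating the
# decoder-only Transformer block from "Attention Is All You Need" as used by
# GPT-2/GPT-3. Names and hyperparameters are illustrative, not this repo's API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    def __init__(self, n_embd: int = 128, n_head: int = 4, block_size: int = 256):
        super().__init__()
        assert n_embd % n_head == 0
        self.n_head = n_head
        # Project the input to queries, keys, and values in one matmul.
        self.qkv = nn.Linear(n_embd, 3 * n_embd)
        self.proj = nn.Linear(n_embd, n_embd)
        # Lower-triangular mask so each position attends only to earlier tokens.
        mask = torch.tril(torch.ones(block_size, block_size))
        self.register_buffer("mask", mask.view(1, 1, block_size, block_size))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        q, k, v = self.qkv(x).split(C, dim=2)
        # Reshape to (B, n_head, T, head_dim) for multi-head attention.
        q = q.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        k = k.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        v = v.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        # Scaled dot-product attention with the causal mask applied.
        att = (q @ k.transpose(-2, -1)) / (k.size(-1) ** 0.5)
        att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        y = att @ v                                      # (B, n_head, T, head_dim)
        y = y.transpose(1, 2).contiguous().view(B, T, C)  # re-merge the heads
        return self.proj(y)

# Toy usage: a batch of 2 sequences of 16 token embeddings.
if __name__ == "__main__":
    attn = CausalSelfAttention()
    x = torch.randn(2, 16, 128)
    print(attn(x).shape)  # torch.Size([2, 16, 128])
```

In a full GPT model this layer would sit inside a Transformer block alongside layer normalization and an MLP, stacked several times and followed by a linear head that predicts the next character or token of the Shakespeare text.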