
GPT-from-scratch

Pre-train a character-level language model on a Shakespeare dataset using a decoder-only transformer.
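A character-level model needs a way to map characters to integer ids and back. A minimal sketch of the usual approach (build the vocabulary from the distinct characters in the text; the names `stoi`/`itos`/`encode`/`decode` here are illustrative, not necessarily those used in `gpt.py`):

```python
text = "hello shakespeare"                       # stand-in for the dataset text
chars = sorted(set(text))                        # vocabulary: every distinct character
stoi = {ch: i for i, ch in enumerate(chars)}     # char -> integer id
itos = {i: ch for ch, i in stoi.items()}         # integer id -> char

def encode(s):
    """Turn a string into a list of integer ids."""
    return [stoi[ch] for ch in s]

def decode(ids):
    """Turn a list of integer ids back into a string."""
    return "".join(itos[i] for i in ids)
```

Round-tripping `decode(encode(text))` recovers the original string, which is a quick sanity check on the vocabulary.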

Multi-head attention is implemented from scratch rather than using a pre-built transformer module.
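The core of a decoder-only block is causal (masked) self-attention: each position may attend only to itself and earlier positions. A minimal NumPy sketch of one head plus head concatenation, assuming per-head projection matrices `W_q`, `W_k`, `W_v` (illustrative names; the repo's actual implementation may differ):

```python
import numpy as np

def causal_self_attention(x, W_q, W_k, W_v):
    # x: (T, d_model); project into queries, keys, values
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)               # (T, T) pairwise affinities
    mask = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores = np.where(mask, -np.inf, scores)      # block attention to future positions
    # numerically stable softmax over each row
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                            # (T, d_head)

def multi_head(x, heads):
    # heads: list of (W_q, W_k, W_v) tuples; concatenate per-head outputs
    return np.concatenate([causal_self_attention(x, *h) for h in heads], axis=-1)
```

The causal mask is what makes this a decoder: perturbing a later token cannot change the output at earlier positions.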

input.txt contains the Shakespeare text used for pre-training.
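Training examples for next-character prediction are typically random windows of the text, with the target being the same window shifted by one character. A sketch of that sampling, with an inline string standing in for the contents of input.txt (`get_batch` is an illustrative helper, not necessarily the code in `gpt.py`):

```python
import random

# Stand-in for open("input.txt").read()
text = "First Citizen: Before we proceed any further, hear me speak."

def get_batch(data, block_size, batch_size):
    """Sample random (context, target) window pairs from the text."""
    ix = [random.randrange(len(data) - block_size) for _ in range(batch_size)]
    xs = [data[i:i + block_size] for i in ix]          # context windows
    ys = [data[i + 1:i + 1 + block_size] for i in ix]  # same windows shifted by one
    return xs, ys
```

Because each target is the input shifted by one position, every window yields `block_size` prediction examples at once.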

Bigram.ipynb walks through a basic bigram language model as a starting point.
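The bigram idea is to predict the next character from the current character alone. A count-based sketch of that model (illustrative; the notebook's version is a learned, neural variant of the same idea):

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Estimate P(next char | current char) from raw counts."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(text, text[1:]):
        counts[cur][nxt] += 1
    # Normalize each row of counts into a probability distribution.
    return {c: {n: k / sum(cnt.values()) for n, k in cnt.items()}
            for c, cnt in counts.items()}
```

Sampling from these conditional distributions one character at a time already produces text with the dataset's local character statistics, which motivates the jump to longer-context transformer models.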

Run gpt.py to start pre-training.
