Deep070203/NanoGPT
NanoGPT

Code for a bigram language model and a Transformer model, building up to a simple GPT for text generation.
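To illustrate the bigram idea (the repo itself presumably uses a PyTorch `nn.Embedding`-based bigram model as in Karpathy's lecture; this is a simpler count-based sketch using only the standard library — the function names here are illustrative, not from the repo):

```python
import random
from collections import defaultdict

def train_bigram(text):
    """Count character-bigram transitions in the training text."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, length, seed=0):
    """Sample characters, each conditioned only on the previous character."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:          # dead end: no observed successor
            break
        chars, weights = zip(*followers.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

counts = train_bigram("hello world, hello there")
print(generate(counts, "h", 10))
```

A neural bigram model learns the same next-character distribution as a lookup table of logits, which is why it is the natural starting point before adding attention.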

The GPT has roughly 10M parameters and is trained on around 1M tokens.
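The Transformer half of the repo rests on causal self-attention: each position attends only to itself and earlier positions. A minimal pure-Python sketch of one scaled dot-product attention head (toy dimensions, no PyTorch; a sketch of the mechanism, not the repo's implementation):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def causal_attention(q, k, v):
    """Scaled dot-product attention with a causal mask.

    q, k, v: lists of T vectors of dimension d. Position i only
    attends to keys/values at positions 0..i (the GPT mask).
    """
    d = len(q[0])
    out = []
    for i, qi in enumerate(q):
        # scores against keys up to and including position i
        scores = [sum(a * b for a, b in zip(qi, k[j])) / math.sqrt(d)
                  for j in range(i + 1)]
        w = softmax(scores)
        # weighted sum of the visible values
        out.append([sum(w[j] * v[j][c] for j in range(i + 1))
                    for c in range(len(v[0]))])
    return out

# toy example: 2 positions, head dimension 2
q = [[1.0, 0.0], [0.0, 1.0]]
k = [[1.0, 0.0], [0.0, 1.0]]
v = [[1.0, 2.0], [3.0, 4.0]]
print(causal_attention(q, k, v))
```

Because of the mask, the first position can only see itself, so its output is exactly `v[0]`; later positions mix in earlier values according to the softmaxed scores.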

This follows Andrej Karpathy's tutorial on YouTube.
