
NanoGPT

Code for a bigram model and a Transformer model, building up to a simple GPT for character-level language generation.

The Transformer has roughly 10M parameters and is trained on around 1M tokens.
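To illustrate the bigram idea the repo starts from, here is a minimal counts-based sketch in plain Python. This is an assumption-laden simplification: the actual model in the repo learns next-character probabilities with a trainable embedding table in PyTorch, whereas this version just counts character pairs and samples from them.

```python
from collections import defaultdict
import random

def train_bigram(text):
    # Count how often each character follows each other character.
    # (The repo's bigram model learns this table with gradient descent instead.)
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, length, seed=0):
    # Sample each next character in proportion to how often it
    # followed the previous character in the training text.
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:
            break  # no observed continuation for this character
        chars, weights = zip(*followers.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)
```

The Transformer improves on this by conditioning each prediction on a whole window of preceding characters via self-attention, rather than only the single previous one.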

Based on Andrej Karpathy's YouTube lecture on building GPT from scratch.