NanoGPT-style code for a Bigram and a Transformer language model: a simple GPT for text generation, built with roughly 10M parameters and trained on around 1M tokens. Based on Andrej Karpathy's YouTube lecture "Let's build GPT: from scratch, in code, spelled out."
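
The bigram model is the baseline the lecture starts from: every token predicts the next one from a single lookup table, with no context beyond the current token. Below is a minimal sketch in the spirit of that lecture; the class and method names follow its conventions, and `vocab_size` is whatever your tokenizer produces (about 65 for character-level Tiny Shakespeare).

```python
import torch
import torch.nn as nn
from torch.nn import functional as F

class BigramLanguageModel(nn.Module):
    """Each token reads the logits for the next token straight
    out of an embedding table -- no attention, no context."""

    def __init__(self, vocab_size):
        super().__init__()
        self.token_embedding_table = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx, targets=None):
        # idx: (B, T) batch of token indices
        logits = self.token_embedding_table(idx)  # (B, T, vocab_size)
        loss = None
        if targets is not None:
            B, T, C = logits.shape
            loss = F.cross_entropy(logits.view(B * T, C), targets.view(B * T))
        return logits, loss

    @torch.no_grad()
    def generate(self, idx, max_new_tokens):
        # sample autoregressively, one token at a time
        for _ in range(max_new_tokens):
            logits, _ = self(idx)
            logits = logits[:, -1, :]              # last time step only
            probs = F.softmax(logits, dim=-1)      # (B, vocab_size)
            idx_next = torch.multinomial(probs, 1) # sample one token
            idx = torch.cat((idx, idx_next), dim=1)
        return idx
```

Usage is a two-liner: `model = BigramLanguageModel(vocab_size)` and then `model.generate(torch.zeros((1, 1), dtype=torch.long), max_new_tokens=100)` to sample from an untrained (or trained) model.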
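
The Transformer model replaces the lookup table with stacked blocks of causal self-attention and feed-forward layers. The sketch below shows a single attention head, again following the lecture's style; `n_embd`, `head_size`, and `block_size` (the maximum context length) are hyperparameters you would set in your own config, and in the full model several such heads run in parallel inside each block.

```python
import torch
import torch.nn as nn
from torch.nn import functional as F

class Head(nn.Module):
    """One head of causal (decoder-style) self-attention."""

    def __init__(self, n_embd, head_size, block_size):
        super().__init__()
        self.key = nn.Linear(n_embd, head_size, bias=False)
        self.query = nn.Linear(n_embd, head_size, bias=False)
        self.value = nn.Linear(n_embd, head_size, bias=False)
        # lower-triangular mask: position t may only attend to positions <= t
        self.register_buffer('tril', torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x):
        B, T, C = x.shape
        k = self.key(x)    # (B, T, head_size)
        q = self.query(x)  # (B, T, head_size)
        # scaled dot-product attention scores
        wei = q @ k.transpose(-2, -1) * k.shape[-1] ** -0.5  # (B, T, T)
        wei = wei.masked_fill(self.tril[:T, :T] == 0, float('-inf'))
        wei = F.softmax(wei, dim=-1)
        v = self.value(x)  # (B, T, head_size)
        return wei @ v     # weighted aggregation of values
```

The causal mask is what makes this a language model rather than an encoder: each position predicts the next token using only what came before it.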