
# YukiGPT

A decoder-only version of the model outlined in *Attention Is All You Need*. It does not implement the cross-attention section, since we are generating unconditioned text and there is no encoder output to attend to.
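As a rough sketch of the idea (not the actual code in this repo), a single decoder head boils down to causal self-attention: each position attends only to itself and earlier positions, which is exactly why the encoder/cross-attention half of the original architecture can be dropped. All names and shapes below are illustrative.

```python
import numpy as np

def causal_self_attention(x, W_q, W_k, W_v):
    """Single-head causal self-attention over a (T, C) sequence.
    A lower-triangular mask blocks attention to future positions,
    so no encoder (and hence no cross-attention) is needed."""
    T, _ = x.shape
    q, k, v = x @ W_q, x @ W_k, x @ W_v              # (T, head_size) each
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (T, T) scaled affinities
    mask = np.tril(np.ones((T, T), dtype=bool))      # causal (lower-triangular) mask
    scores = np.where(mask, scores, -np.inf)         # future positions -> -inf
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v                               # (T, head_size)

rng = np.random.default_rng(0)
T, C, H = 4, 8, 16
x = rng.normal(size=(T, C))
out = causal_self_attention(x,
                            rng.normal(size=(C, H)),
                            rng.normal(size=(C, H)),
                            rng.normal(size=(C, H)))
print(out.shape)  # (4, 16)
```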

## What's in here?

The decoder-only GPT is implemented in gpt.py.

The bigram file implements a simple bigram language model.
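The bigram file itself follows Karpathy's neural version; as a hedged, count-based sketch of the same idea, the model below predicts the next character from the current character alone (all names are illustrative, not taken from the repo):

```python
import random
from collections import defaultdict

def train_bigram(text):
    """Count how often each character follows each other character."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, length, seed=0):
    """Sample one character at a time; the next character depends
    only on the current one (the bigram assumption)."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        nxt = counts.get(out[-1]) or counts[start]  # fall back on dead ends
        chars, weights = zip(*nxt.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

counts = train_bigram("box box radio check box radio")
print(generate(counts, "b", 10))
```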

The dataset_info.py file prints basic information about a dataset, which is useful when importing a new one.

The self_attention.py file explores the mathematical trick behind self-attention.
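The trick (as presented in Karpathy's video) is that averaging each position over its prefix with a loop gives the same result as one matrix multiply with a row-normalised lower-triangular matrix, which is what makes masked attention efficient. A minimal numpy sketch, with illustrative variable names:

```python
import numpy as np

T, C = 4, 2
rng = np.random.default_rng(1)
x = rng.normal(size=(T, C))

# Version 1: explicit loop — position t averages tokens 0..t.
xbow = np.zeros((T, C))
for t in range(T):
    xbow[t] = x[: t + 1].mean(axis=0)

# Version 2: the same aggregation as a single matmul with a
# row-normalised lower-triangular matrix.
wei = np.tril(np.ones((T, T)))
wei /= wei.sum(axis=1, keepdims=True)
xbow2 = wei @ x

print(np.allclose(xbow, xbow2))  # True
```

Replacing the uniform rows of `wei` with a softmax over learned affinities turns this averaging into full self-attention.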

## TODO

- Collect and train on F1 radio
- Implement a tokeniser
- Expand to make more like nanoGPT

## Credits

This follows Andrej Karpathy's *Let's build GPT: from scratch, in code, spelled out.*
