Training repo for Toy GPT: context-3 model + small structured corpus (001_animals.txt)
Updated Apr 4, 2026 - Python
Interactive Monte Carlo simulation of the VNAE framework. A visual playground to test how Theta and Beta parameters influence win rates in asymmetric systems.
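A minimal sketch of the Monte Carlo idea behind this entry. The VNAE framework's actual win-rate function is not specified here, so the sigmoid form `p = 1 / (1 + exp(-(theta - beta)))` is an assumed stand-in for how Theta and Beta might shape an asymmetric contest; only the simulation pattern itself (estimate a probability by repeated Bernoulli trials) is general:

```python
# Monte Carlo estimate of a win rate driven by two parameters.
# ASSUMPTION: the sigmoid mapping from (theta, beta) to win probability
# is illustrative, not the VNAE framework's actual model.
import math
import random

def win_rate(theta, beta, trials=100_000, seed=0):
    """Estimate P(win) by simulating Bernoulli trials."""
    rng = random.Random(seed)
    p = 1.0 / (1.0 + math.exp(-(theta - beta)))  # assumed win-probability model
    wins = sum(rng.random() < p for _ in range(trials))
    return wins / trials
```

With a fixed seed the estimate is deterministic, which makes parameter sweeps over Theta and Beta reproducible.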
Training repo for Toy GPT: bigram model + small domain corpus (010_llm_glossary.txt)
Training repo for Toy GPT: context-2 model + small domain corpus (010_llm_glossary.txt)
Training repo for Toy GPT: context-3 model + small neutral corpus (000_cat_dog.txt)
Training repo for Toy GPT: unigram model + small neutral corpus (000_cat_dog.txt)
Training repo for Toy GPT: unigram model + small structured corpus (001_animals.txt)
Training repo for Toy GPT: context-2 model + small structured corpus (001_animals.txt)
Training repo for Toy GPT: bigram model + small structured corpus (001_animals.txt)
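Several of the entries above train bigram models. A minimal sketch of the idea, count next-token frequencies and sample proportionally, using a hypothetical inline corpus rather than the repos' actual 001_animals.txt or 000_cat_dog.txt files:

```python
# Minimal bigram language model: count (current, next) pairs,
# then sample successors proportionally to their counts.
# NOTE: "corpus" is a hypothetical inline example, not a repo corpus file.
import random
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the dog sat on the rug".split()

# transitions[w1][w2] = number of times w2 followed w1 in the corpus
transitions = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    transitions[cur][nxt] += 1

def sample_next(word, rng=random):
    """Sample the next token with probability proportional to bigram counts."""
    counts = transitions[word]
    words = list(counts)
    return rng.choices(words, weights=[counts[w] for w in words], k=1)[0]

def generate(start, length=8, seed=0):
    """Random sampling walk; stops early at a token with no recorded successor."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < length and transitions[out[-1]]:
        out.append(sample_next(out[-1], rng))
    return " ".join(out)
```

The context-2 and context-3 variants in the other entries generalize this by keying the count table on the last two or three tokens instead of one.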
Training repo for Toy GPT: context-3 model with attention + small domain corpus (030_analytics.txt). Attention requires scale.
Training repo for Toy GPT: context-2 model with attention + small domain corpus (030_analytics.txt). Attention requires scale.
Training repo for Toy GPT: context-2 model with embeddings + small domain corpus (030_analytics.txt). Embeddings make much more efficient use of space.
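The space-efficiency claim for embeddings can be made concrete with back-of-envelope arithmetic. The vocabulary size and embedding dimension below are illustrative assumptions, not values taken from the 030_analytics.txt repo:

```python
# Parameter count: dense context-2 lookup table vs. an embedding model.
# ASSUMPTION: V and d are illustrative, not the repo's actual values.
V = 1_000  # assumed vocabulary size
d = 32     # assumed embedding dimension

# A dense context-2 table stores a V-way distribution for each of the
# V*V two-token contexts.
table_params = V * V * V

# The embedding model stores V vectors of size d, plus a linear head
# mapping the concatenated 2*d context vector back to V logits.
embed_params = V * d + (2 * d) * V

print(f"table: {table_params:,}  embeddings: {embed_params:,}")
```

Even at this toy scale the table needs a billion entries while the embedding model needs under a hundred thousand parameters, which is the "efficient use of space" the entry refers to.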
Training repo for Toy GPT: context-2 model + small neutral corpus (000_cat_dog.txt)
Training repo for Toy GPT: bigram model + small neutral corpus (000_cat_dog.txt)
Can a cellular neural network learn how to reliably express meaningful patterns? This is a toy model for a gene regulatory network for cell specification.
Toy models for ordinal realization, residue, selective crystallization, and regime analysis in relational systems.
🚀 Train a custom unigram model using simple and efficient methods, enabling easy adoption for natural language processing tasks.
Training repo for Toy GPT: context-3 model + small domain corpus (010_llm_glossary.txt)
Numerical consistency tests for a relational realisation-budget formulation of SR and static Schwarzschild sector.
🚀 Train a 200-bigram language model with ease, enhancing text generation tasks and improving natural language processing capabilities.