Record: Alpha=144 LoRA + Warm-start A + WD 1.0 — val_bpb 1.07209 (3-seed mean) #1767
Open
renqianluo wants to merge 1 commit into openai:main
Conversation
…eed mean)

Four composable novel changes on top of dexhunter's phased-TTT code:

1. Alpha/rank LoRA scaling enables a stable higher rank (128 vs 96)
2. Warm-starting LoRA A across batches lets feature directions accumulate
3. Raised TTT weight decay (0.5 -> 1.0) prevents warm-A overfit
4. Alpha lifted 96 -> 144 gives LoRA more adaptation strength; WD keeps it stable

3-seed mean 1.07209 BPB (seeds 1337, 42, 314). All seeds improve monotonically across each of the four changes. Closely approaches dexhunter's 1.07193 despite a different seed set.
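A minimal sketch of how changes 1, 2, and 4 land in the LoRA module, assuming the usual low-rank decomposition. The class and parameter names below are illustrative stand-ins for the record's `BatchedLinearLoRA`, not its actual code:

```python
import torch
import torch.nn as nn

class LoRASketch(nn.Module):
    """Illustrative LoRA path with the record's alpha/rank scaling."""

    def __init__(self, d_in: int, d_out: int, rank: int = 128, alpha: float = 144.0):
        super().__init__()
        self.rank, self.alpha = rank, alpha
        # Change 2: A is warm-started, i.e. carried over across batches
        # so its feature directions accumulate.
        self.A = nn.Parameter(torch.randn(d_in, rank) / d_in**0.5)
        # B is zeroed at every reset, so the update A @ B restarts from zero.
        self.B = nn.Parameter(torch.zeros(rank, d_out))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Changes 1 and 4: scale the low-rank update by alpha/rank (144/128),
        # which keeps update magnitude stable when rank rises 96 -> 128.
        return (x @ self.A @ self.B) * (self.alpha / self.rank)
```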
leon2k2k2k added a commit to leon2k2k2k/parameter-golf that referenced this pull request on Apr 22, 2026:
TTT_LORA_ALPHA env var (default 96, spec uses 144). Only zero B on reset; A accumulates feature directions across batches. Output scaled by alpha/rank. Validated by renqianluo (openai#1767) and bigbag (openai#1771). Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
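A sketch of what this commit describes, assuming a module holding LoRA factors A and B like the one sketched above. The `TTT_LORA_ALPHA` env var and the zero-B-only reset come from the commit message; the helper name is hypothetical:

```python
import os

import torch

# Alpha is read from the TTT_LORA_ALPHA env var (default 96; this record runs 144).
TTT_LORA_ALPHA = float(os.environ.get("TTT_LORA_ALPHA", "96"))

@torch.no_grad()
def reset_lora(adapter) -> None:
    """Hypothetical reset helper: zero only B between batches.

    A is deliberately left untouched so its feature directions accumulate
    across batches; the product A @ B still restarts at zero because B is 0.
    """
    adapter.B.zero_()
```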
Summary
Four composable small-LOC changes to `BatchedLinearLoRA`, on top of @dexhunter's 1.07193 phased-TTT code. Everything outside the LoRA module (VarLen attention, Fused MLP, multi-phase global SGD, trimmed GPTQ, triple depth recurrence) is unchanged. The LoRA output is scaled as `forward(x) * (alpha/rank)`; without this scaling, raising the rank directly diverges on some seeds.
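Change 3 is a single-value bump. A minimal sketch, assuming the TTT inner loop optimizes the adapter parameters with torch SGD; the PR does not show the optimizer setup, and the learning rate and module here are placeholders:

```python
import torch
import torch.nn as nn

# Placeholder stand-in for the LoRA adapter parameters (see the sketch above).
adapter = nn.Linear(1024, 1024, bias=False)

# Change 3: TTT weight decay raised 0.5 -> 1.0. The stronger decay keeps the
# warm-started A matrix from overfitting as it accumulates across batches.
# Only the weight_decay value comes from this PR; lr is a placeholder.
ttt_optimizer = torch.optim.SGD(adapter.parameters(), lr=0.01, weight_decay=1.0)
```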
Results
Every seed improves monotonically across every change.
Compliance
All train runs ≤596 s, eval 455.7–456.7 s, artifacts ≤15.94 MB. Conditions 1–4 of issue #1017 verified.
Attribution