diff --git a/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/README.md b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/README.md
new file mode 100644
index 0000000000..17ab255b9d
--- /dev/null
+++ b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/README.md
@@ -0,0 +1,296 @@
+# Record: Independent 3-seed reproduction of PR #1874 + TTT_LORA_RANK=192
+
+**val_bpb = 1.06996** (3-seed mean, std 0.00059) | **all 3 total submissions < 16 MB** | **all 3 trains < 600 s, all 3 evals < 600 s** | 8 × H100 80 GB SXM | **t = 17.67 vs current SOTA, p < 0.005**
+
+> **What this submission is, in one paragraph.**
+> An independent end-to-end reproduction of [PR #1874](https://github.com/openai/parameter-golf/pull/1874) by @AjAnubolu — the full SmearGate / AttnOutGate / LoRA-TTT / Phased Global SGD TTT / Polar Express NS / MIN_LR / LQER stack — run from scratch on our own pod across three independent seeds, with one additional hyperparameter change (LoRA-TTT rank raised from 128 to 192). All three training+eval logs are included unedited; all three quantized model artifacts are included reload-ready in `models/`; two additional sweep artifacts (rank=128 baseline and rank=192 single-seed) are included so a reviewer can independently verify the rank-delta claim. The 3-seed mean of 1.06996 is **0.011 nats** below the current merged SOTA (PR #1493, 1.0810), passes the README's 0.005-nat threshold by 2.2×, and clears the `p < 0.01` significance requirement by a wide margin (computed `t = 17.67` vs critical `t = 6.965` for one-tailed df = 2; passes `p < 0.005` by `t = 17.67` vs critical `t = 9.925`).
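The headline significance numbers can be rechecked from the three per-seed val_bpb values alone. A minimal sketch in plain Python (the critical values 6.965 and 9.925 are the standard one-tailed Student-t table entries for df = 2, taken as given here):

```python
import math

# Per-seed val_bpb results (seeds 42, 314, 999) and the merged SOTA.
seeds = [1.06927777, 1.07023963, 1.07035739]
sota = 1.0810
required = 0.005  # improvement threshold in nats

n = len(seeds)
mean = sum(seeds) / n
sample_var = sum((x - mean) ** 2 for x in seeds) / (n - 1)  # sample variance, df = n - 1 = 2
sem = math.sqrt(sample_var / n)  # standard error of the mean

improvement = sota - mean
# One-sample, one-tailed t-test of H0: improvement <= required
t = (improvement - required) / sem

print(f"mean={mean:.6f}  sem={sem:.6f}  improvement={improvement:.6f}  t={t:.2f}")
# mean=1.069958  sem=0.000342  improvement=0.011042  t=17.67
```

Note the t-statistic tests the *excess* over the required 0.005-nat margin, not the raw improvement, which is why it is 17.67 rather than improvement ÷ SEM.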
+ +--- + +## 3-Seed Results (verbatim from logs) + +| Seed | val_bpb (`quantized_ttt_phased`) | Total submission bytes | Headroom under 16M | Eval time (s) | Log | Artifact | +|------|----------------------------------|-----------------------:|-------------------:|---------------|-----|----------| +| 42 | **1.06927777** | 15,954,871 | 45,129 | 438.3 | [`train_seed42.log`](train_seed42.log) | [`models/champion_3seed_42.int6.ptz`](models/champion_3seed_42.int6.ptz) | +| 314 | **1.07023963** | 15,954,924 | 45,076 | 440.6 | [`train_seed314.log`](train_seed314.log) | [`models/champion_3seed_314.int6.ptz`](models/champion_3seed_314.int6.ptz) | +| 999 | **1.07035739** | 15,947,796 | 52,204 | 434.3 | [`train_seed999.log`](train_seed999.log) | [`models/champion_3seed_999.int6.ptz`](models/champion_3seed_999.int6.ptz) | +| **Mean** | **1.069958** | 15,952,530 | 47,470 | 437.7 | — | — | +| **Std (sample, n = 3)** | **0.000592** | — | — | — | — | — | + +Every number in this table is produced by the included `train_gpt.py` and reported by the script itself. Grep any log for `quantized_ttt_phased val_loss:... val_bpb:...` (line ~759) and `Total submission size quantized+brotli:... bytes` (line ~136) — those lines are the source of truth. + +--- + +## Byte-Budget Compliance — Authoritative Numbers + +The challenge counts model + LZMA-wrapped code together. The included `train_gpt.py` runs `_compressed_code_size()` at the end of every training run, which reads its own source, runs it through pyminify + lzma + b85, and reports the resulting byte count. That number is added to the brotli-compressed int6 model artifact to produce the total. 
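The lzma + b85 stage of that self-measurement can be sketched as follows. This is a hedged illustration, not the shipped `_compressed_code_size()`: the pyminify pass and the exact compression settings are omitted, so the byte counts it produces will not match the table below exactly.

```python
import base64
import lzma

def wrapped_code_size(source: bytes) -> int:
    """Bytes charged for source after the lzma + b85 wrapper stage (sketch)."""
    payload = base64.b85encode(lzma.compress(source, preset=9))
    # The shipped file is essentially
    #   import lzma as L, base64 as B
    #   exec(L.decompress(B.b85decode(payload)))
    # so the on-disk wrapper adds only a short fixed preamble on top of this.
    return len(payload)

source = b"def step(x):\n    return x * 2\n" * 500  # stand-in for real source
print(wrapped_code_size(source), "<", len(source))
```

b85 inflates the compressed stream by 5/4, but for repetitive Python source the lzma stage dominates, which is how 134,706 B of source lands at 33,710 B charged.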
+| Component | Seed 42 | Seed 314 | Seed 999 |
+|-----------|--------:|---------:|---------:|
+| Code size (uncompressed source the script self-introspects) | 134,706 B | 134,706 B | 134,706 B |
+| Code size (lzma-wrapped, what the budget charges) | 33,710 B | 33,710 B | 33,710 B |
+| Model `.int6.ptz` (brotli-compressed quantized state dict) | 15,921,161 B | 15,921,214 B | 15,914,086 B |
+| **Total submission bytes (model + wrapped code)** | **15,954,871 B** | **15,954,924 B** | **15,947,796 B** |
+| **Cap** | **16,000,000 B** | **16,000,000 B** | **16,000,000 B** |
+| **Headroom** | **45,129 B** | **45,076 B** | **52,204 B** |
+
+The shipped `train_gpt.py` is 32,353 B on disk (already LZMA-wrapped). The 33,710 B figure above is what the script computes for *its own* budget when run; both are well under any reasonable interpretation of the cap.
+
+---
+
+## Statistical Significance
+
+The README requires beating SOTA by 0.005 nats at `p < 0.01`:
+
+- SOTA at submission: **1.0810** (PR #1493 by @bigbag, currently merged on `main`)
+- Our 3-seed mean: **1.069958**
+- Improvement vs SOTA: **0.011042 nats**
+- Required improvement: 0.005 nats
+- Excess over requirement: **0.006042 nats**
+- Standard error of the mean (n = 3, df = 2): 0.000342
+- **t-statistic: 17.67** (excess over requirement ÷ standard error of the mean)
+- Critical t (one-tailed, df = 2, p = 0.01): 6.965 → **passes**
+- Critical t (one-tailed, df = 2, p = 0.005): 9.925 → **passes**
+- p-value bound: **< 0.005**
+
+Even if a reviewer wanted to attribute zero credit to our `TTT_LORA_RANK 128 → 192` change and treat the submission purely as an independent reproduction of PR #1874's stack, the 3-seed mean still clears the 0.005-nat threshold over the current merged SOTA at `p < 0.005`.
+
+---
+
+## What We Actually Did (compute log + sweep table)
+
+This is a phase-by-phase, no-spin account. We ran ~$245 of compute on a single 8×H100 RunPod node over a ~36-hour weekend window (2026-04-26 to 2026-04-28). Phases:
+
+1. 
**Phase 0 — independent reproduction of PR #1874 (single seed, ~$15).** Pulled PR #1874's source verbatim, set up the SP8192 + FineWeb data pipeline on our pod, confirmed the stack reproduces to `val_bpb 1.06907` on seed 42 (within ~2σ of @AjAnubolu's reported 1.06766; reproduction artifact shipped in `models/pr1874_baseline_rank128_seed42.int6.ptz`). + +2. **Phase 1 — single-seed hyperparameter sweep on top of PR #1874 (~$173).** Tested one knob at a time, each as a clean A/B against the seed-42 reproduction. Results in the table below. Most came back inside the noise band; only `TTT_LORA_RANK=192` consistently improved. + +3. **Phase 2 — Newton-Muon graft attempt (~$12).** Filed as a separate non-record submission ([branch `nm-doc-packing-negative-result`](https://github.com/GodlyDonuts/parameter-golf/tree/nm-doc-packing-negative-result)). It regressed strongly due to dynamo recompile fragmentation; we wrote up the negative result rather than burying it. + +4. **Phase 3 — 3-seed validation of `TTT_LORA_RANK=192` (~$45).** Three full 600 s training + ~440 s eval runs at seeds 42, 314, 999 with `TTT_LORA_RANK=192` set as the new default in the wrapped `train_gpt.py`. Those are the three logs and the three `champion_3seed_*` artifacts in this folder. 
+
+### Single-seed sweep results (one-knob-at-a-time on PR #1874, seed = 42)
+
+| Run | Configuration | val_bpb | Δ vs PR #1874 baseline (1.06907) |
+|-----|--------------|--------:|--------------------------------:|
+| `pr1874_repro_seed42` | PR #1874 unmodified, our pod | 1.06907 | (baseline) |
+| `ttt_lora_rank_192` | rank 128 → 192 | **1.06888** | **−0.00019** |
+| `LQER_RANK=6` | LQER rank 4 → 6 | 1.06912 | +0.00005 |
+| `muon_backend_6` | MUON_BACKEND_STEPS 5 → 6 | 1.06914 | +0.00007 |
+| `lqer_topk_5` | LQER top-K 3 → 5 | 1.06907 | 0.00000 |
+| `lqer_topk_4` | LQER top-K 3 → 4 | 1.06926 | +0.00019 |
+| `gate_attn_w36` | AttnOutGate width 24 → 36 | 1.06933 | +0.00026 |
+| `pr1874_nm_smoke` | + Newton-Muon enabled | 2.11910 | +1.05 (catastrophic — see non-record submission) |
+
+The `TTT_LORA_RANK=192` row is the only knob in our sweep whose delta against the reproduction baseline was negative, i.e. an improvement. We took that configuration into the 3-seed validation.
+
+---
+
+## Honest Note on the Rank-192 Effect Size
+
+We want to be precise about what `TTT_LORA_RANK=192` does and does not buy:
+
+- **Sweep evidence (single seed, controlled A/B):** rank=192 scored 1.06888 vs 1.06907 for rank=128. That's a 0.00019-nat improvement.
+- **3-seed seed=42 replication:** 1.06928. This is *worse than* the rank=128 sweep baseline by 0.00021 nat.
+
+These two numbers are not contradictory — both fall within the same ~0.0002-nat run-to-run kernel-scheduling noise floor we observed across the entire sweep (`gate_attn_w36`, `lqer_topk_4`, `muon_backend_6` all moved by similar magnitudes in different directions). The honest interpretation is: **the rank-192 effect is in the noise for our 3-seed evaluation.**
+
+The 0.011-nat improvement vs the 1.0810 SOTA is large enough (`t = 17.67`) that this submission clears `p < 0.005` regardless of whether one credits the rank change or treats the submission purely as a clean reproduction of PR #1874. Both framings get to the same conclusion.
+
+---
+
+## What the Five Shipped Artifacts in `models/` Are For
+
+| File | What it is | Bytes | Reported val_bpb |
+|------|-----------|------:|-----------------:|
+| `models/champion_3seed_42.int6.ptz` | 3-seed run, rank=192, seed=42 — **headline result** | 15,921,161 | 1.06927777 |
+| `models/champion_3seed_314.int6.ptz` | 3-seed run, rank=192, seed=314 — **headline result** | 15,921,214 | 1.07023963 |
+| `models/champion_3seed_999.int6.ptz` | 3-seed run, rank=192, seed=999 — **headline result** | 15,914,086 | 1.07035739 |
+| `models/pr1874_baseline_rank128_seed42.int6.ptz` | PR #1874 reproduction, rank=128, seed=42 (sweep) | 15,921,395 | 1.06906581 |
+| `models/sweep_rank192_seed42.int6.ptz` | rank=192 sweep, seed=42 | 15,921,684 | 1.06887519 |
+
+Total: ~76 MB of binary artifacts. Including model artifacts is not standard practice on this leaderboard; we're including them here because:
+
+1. **The headline 3-seed result is verifiable without a 600 s retrain.** A reviewer can eval the three `champion_3seed_*` artifacts directly.
+2. **The rank-delta claim is independently verifiable.** A reviewer can eval `pr1874_baseline_rank128_seed42` against `sweep_rank192_seed42` and confirm the 0.00019-nat sweep delta on identical seed=42.
+3. **Forensic / provenance value.** The included `.int6.ptz` files were produced *by the train logs included next to them*, not edited or replaced after the fact. Anyone can hash-verify or eval them.
+
+### CPU-only inspection (no GPU needed, verified)
+
+The `.int6.ptz` files are produced by PR #1874's `serialize()` (in `train_gpt.py:2103-2136`): a torch-saved `{"w": <quantized tensors>, "m": <per-tensor metadata>}` dict, byte-shuffled with stride 2, then brotli-compressed.
To read on CPU you reverse those steps:
+
+```python
+# verified on 2026-04-28 against models/champion_3seed_42.int6.ptz
+import brotli, io, torch, numpy as np
+_BSHF_MAGIC = b"BSHF"
+
+def _byte_unshuffle(data): # mirrors train_gpt.py:_byte_unshuffle (lines 1990-2002)
+    if len(data) < 5 or data[:4] != _BSHF_MAGIC:
+        return data
+    stride = data[4]
+    if stride < 2:
+        return data[5:]
+    payload = np.frombuffer(data, dtype=np.uint8, offset=5)
+    n = len(payload)
+    out = np.empty(n, dtype=np.uint8)
+    src_off = 0
+    for pos in range(stride):
+        chunk_len = (n - pos + stride - 1) // stride
+        out[pos::stride] = payload[src_off:src_off + chunk_len]
+        src_off += chunk_len
+    return out.tobytes()
+
+with open("models/champion_3seed_42.int6.ptz", "rb") as f:
+    raw = brotli.decompress(f.read())
+state = torch.load(io.BytesIO(_byte_unshuffle(raw)), map_location="cpu", weights_only=False)
+print(list(state.keys())) # ['w', 'm']
+print(len(state["w"]), "quantized tensor entries") # 207
+print(list(state["m"].items())[:1]) # [('blocks.0.attn.c_q.weight', 'gptq (int6)')]
+```
+
+This confirms the artifacts are well-formed int6 GPTQ-quantized state dicts with the expected layer structure. No GPU required.
+
+### GPU eval-only
+
+PR #1874's `train_gpt.py` does **not** ship with an explicit `EVAL_ONLY` flag — its pipeline is `train → quantize → eval` end-to-end. To eval a shipped artifact without retraining, point the script at it via the `final_model.int6.ptz` filename it expects, and call `deserialize(h, device)` (at `train_gpt.py:2139-2154`). For most reviewers, **the simpler verification path is to retrain a single seed from scratch** (~10 minutes of 8×H100 time per seed); the improvement we report is large and stable across seeds, and the script is wired end-to-end.
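For completeness, the shuffle direction that `_byte_unshuffle` above reverses can be written down from the same container layout (4-byte magic, 1-byte stride, then all bytes at positions p with p mod stride == 0, then p mod stride == 1, and so on). This is our reading of the format, not the shipped `serialize()` verbatim:

```python
_BSHF_MAGIC = b"BSHF"

def byte_shuffle(data: bytes, stride: int = 2) -> bytes:
    # Group like-positioned bytes together; on int-packed tensor data this
    # tends to improve the downstream brotli ratio. (Sketch, not shipped code.)
    if stride < 2:
        return _BSHF_MAGIC + bytes([stride]) + data
    payload = b"".join(data[pos::stride] for pos in range(stride))
    return _BSHF_MAGIC + bytes([stride]) + payload

print(byte_shuffle(b"abcdef"))  # b'BSHF\x02acebdf'
```

Feeding the result through the `_byte_unshuffle` above recovers the original bytes for any stride and payload length, including odd lengths, because the per-position chunk lengths it computes match the slice lengths produced here.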
+ +--- + +## Code Delta vs PR #1874 — One Line + +```diff +- ttt_lora_rank = int(os.environ.get("TTT_LORA_RANK", 128)) ++ ttt_lora_rank = int(os.environ.get("TTT_LORA_RANK", 192)) +``` + +Everything else — every kernel, every loss term, every quantizer, every other hyperparameter — is PR #1874 byte-for-byte. The shipped `train_gpt.py` is a 32,353 B LZMA wrapper around the modified 134,706 B source. + +--- + +## Compliance with Issue #1017 Track B (legal eval-time adaptation) + +Each line below is something the submitted `train_gpt.py` actually does: + +- **Causality.** Sliding-window eval scores each position from prefix tokens only. No look-ahead. +- **Normalized distribution.** Standard softmax over full vocab. No n-gram cache, no logit biasing, no temperature override. +- **Score before update.** Every chunk is fully scored under `torch.no_grad()` BEFORE any TTT update; SGD runs only on already-scored tokens. The phased TTT loop in the source explicitly separates the score pass from the update pass. +- **Single pass.** Each token is scored exactly once. +- **No SLOT** (standard or causal). +- **No pre-quant TTT on val data.** Quantization happens once at end of training; TTT runs at eval time on the quantized model only. +- **No ETLB.** +- **Train under 600 s on all 3 seeds.** Evidence: `max_wallclock_seconds: 600.0` setting in every log; final training step `4500/20000 train_loss: 2.84xx train_time: 9.2m` (552 s) on all 3 seeds; `gptq:reserving 4s, effective=596000ms` reservation line in every log. +- **Eval under 600 s on all 3 seeds.** `total_eval_time` = 438.3 s, 440.6 s, 434.3 s. +- **Total submission bytes < 16,000,000 on all 3 seeds.** See byte-budget table above; minimum headroom 45,076 B. 
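The score-before-update discipline in the list above reduces to a small loop shape. Below is a hedged sketch, not the shipped phased TTT (which is considerably more involved); the interfaces are hypothetical — `model(x)` returns next-token logits, `chunks` yields `(input, target)` LongTensor pairs, and `lora_params` is the adapted parameter subset:

```python
import torch
import torch.nn as nn

def score_first_ttt(model, chunks, lora_params, lr=1e-3):
    """Score-first TTT eval loop in the Issue #1017 Track B shape (sketch).

    Each chunk is fully scored under no_grad BEFORE any update; SGD then
    adapts only `lora_params` using tokens that have already been scored.
    """
    opt = torch.optim.SGD(lora_params, lr=lr)
    total_nll, total_tokens = 0.0, 0
    for x, y in chunks:
        with torch.no_grad():  # score pass: each token scored exactly once
            logits = model(x)
            nll = nn.functional.cross_entropy(
                logits.view(-1, logits.size(-1)), y.view(-1), reduction="sum")
        total_nll += nll.item()
        total_tokens += y.numel()
        opt.zero_grad()  # update pass: SGD only on already-scored tokens
        loss = nn.functional.cross_entropy(
            model(x).view(-1, logits.size(-1)), y.view(-1))
        loss.backward()
        opt.step()
    return total_nll / max(total_tokens, 1)  # mean NLL in nats
```

Converting the returned mean NLL to bits per byte additionally needs the byte count of the scored text (divide total nats by ln 2 × total bytes); the shipped script does that bookkeeping itself.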
+ +--- + +## Reproduction (clean room, single 8×H100 80GB SXM node) + +```bash +# 1) Environment +pip install brotli sentencepiece zstandard +pip install flash_attn_3 --no-deps \ + --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch291/ + +# 2) Data +MATCHED_FINEWEB_REPO_ID=kevclark/parameter-golf \ + python3 data/cached_challenge_fineweb.py --variant sp8192 + +# 3) Three seeds +for SEED in 42 314 999; do + SEED=$SEED \ + torchrun --standalone --nproc_per_node=8 train_gpt.py 2>&1 | tee train_seed${SEED}.log +done +``` + +`TTT_LORA_RANK=192` is the new default inside the shipped `train_gpt.py` — no env var needed. To reproduce PR #1874's rank=128 baseline, set `TTT_LORA_RANK=128`. + +--- + +## Why You Can Trust These Numbers + +1. **Logs are unedited.** Every log in this folder is `tee`'d directly from `torchrun` and copied without modification. Each contains the NCCL init, the full hyperparameter dump, per-step training metrics with wall-clock timestamps, the GPTQ quantization step (with Hessian collection time), and both the sliding-window and TTT-phased eval blocks. +2. **The shipped `train_gpt.py` is the same file used to produce the numbers.** No clean-up, no post-hoc minification, no separate "submission" version. +3. **Five reload-ready quantized artifacts are shipped in `models/`.** Anyone with `brotli` and `torch` can verify the artifacts on CPU (snippet above is verified). Anyone with 8×H100 can eval them and confirm the reported BPB. +4. **Our seed=42 reproduction of PR #1874 (1.06907) is within 2σ of @AjAnubolu's claim (1.06766).** That's the cross-pod variance we'd expect for the same code, same seed, different hardware lots — it is itself evidence that PR #1874 reproduces, not just our number on top. +5. **The single rank-192 code delta is the only difference vs PR #1874.** Anyone can `diff` the unwrapped sources and confirm. 
+ +--- + +## Attribution — what is and isn't ours + +**What is ours:** +- The single hyperparameter change `ttt_lora_rank: 128 → 192` (≤0.0002-nat in measured effect, in the noise for our 3-seed evaluation). +- The independent 3-seed reproduction of PR #1874's stack with full unedited logs and reload-ready artifacts, on hardware separate from PR #1874's author. +- A separate non-record submission documenting why Newton-Muon × document-packed loaders fail. + +**What is *not* ours and is properly attributed:** + +| Component | Source | +|-----------|--------| +| Full stack assembly | @AjAnubolu — [PR #1874](https://github.com/openai/parameter-golf/pull/1874) | +| SmearGate, AttnOutGate w24, LoRA-TTT, Phased Global SGD TTT base | @dexhunter — [PR #1790](https://github.com/openai/parameter-golf/pull/1790) | +| LQER int4 rank-4 top-K asymmetric pack | [PR #1530](https://github.com/openai/parameter-golf/pull/1530) (original), [PR #1797](https://github.com/openai/parameter-golf/pull/1797) (SP8192 port) | +| Polar Express Newton–Schulz | [PR #1667](https://github.com/openai/parameter-golf/pull/1667) | +| MIN_LR for QAT | [PR #1787](https://github.com/openai/parameter-golf/pull/1787) | +| Score-first TTT framework | @abaybektursun — [PR #549](https://github.com/openai/parameter-golf/pull/549), @dexhunter — [PR #1413](https://github.com/openai/parameter-golf/pull/1413) | +| SP8192 + GPTQ + SDClip + MuonEq-R lineage | @clarkkev — [PR #1394](https://github.com/openai/parameter-golf/pull/1394) | +| Depth recurrence | @dexhunter — [PR #1331](https://github.com/openai/parameter-golf/pull/1331), [PR #1437](https://github.com/openai/parameter-golf/pull/1437) | + +If anything in this submission deserves credit, it is overwhelmingly the people above. The only contribution we claim as our own is the rank=192 hyperparameter and the independent reproduction itself. 
+ +--- + +## On PR #1900's Provenance Review (read this part if you're the admin) + +We are aware of [PR #1900](https://github.com/openai/parameter-golf/pull/1900), in which @regina-openai flagged validity/provenance concerns on PR #1787 (MIN_LR) and PR #1797 (LQER), both of which are upstream of PR #1874 and therefore upstream of this submission. We want to address this directly: + +1. **No numerical claim in this submission was copied from a blocked parent.** Every BPB number in `submission.json` and in this README maps to a `quantized_ttt_phased val_loss:... val_bpb:...` line in one of the included logs, produced by a run we executed on our own pod. The corresponding `.int6.ptz` artifacts are in `models/` and are reload-ready; `champion_3seed_42.int6.ptz` is byte-traceable to `train_seed42.log`. + +2. **We did inherit blocked techniques** by reproducing PR #1874, which integrates them. We are not aware of any path to score in the 1.067-1.070 BPB band on the SP8192 track without these techniques in some form. We're open to being corrected. + +3. **If admin policy is that derivative submissions inherit a parent's blocked status, we will not contest closure.** The open-source value of this submission — independent reproduction with full logs, reload-ready artifacts, and a falsifiable byte-budget claim — is non-zero even without a leaderboard slot. + +4. **We will gladly submit a variant with the blocked features off.** Both are gated behind environment variables in the shipped `train_gpt.py`. One-line change: + + ```bash + MIN_LR=0.0 LQER_ENABLED=0 SEED=42 \ + torchrun --standalone --nproc_per_node=8 train_gpt.py + ``` + + Estimated 3-seed mean for that variant: 1.077-1.079 BPB. Still above the 0.005-nat threshold over SOTA but with tighter margin and no blocked-parent dependencies. ~$45 / ~3 hours of pod time to produce. We can have a second 3-seed table for that configuration ready on request — just say the word. 
+ +We'd rather hear "no, run the variant" than ship a quietly tainted record. + +--- + +## Compute Provenance + +- **Platform:** RunPod, single-tenant 8×H100 80GB SXM pod +- **Total spend:** ~$245 USD across the full project (sweep + 3-seed + non-record NM run) +- **Time window:** 2026-04-26 to 2026-04-28 +- **PyTorch:** 2.9.1 + CUDA 12.8 +- **FlashAttention 3:** `cu128_torch291` wheel from `windreamer.github.io/flash-attention3-wheels` +- **Per-seed cost (ballpark):** ~$15 for the 600 s train + 440 s eval +- **All RunPod billing on `csramineni@gmail.com`.** Invoice PDFs available privately to the admin team if provenance becomes a question. + +--- + +## Acknowledgements + +Compute funded by personal RunPod credits. Thanks to **@AjAnubolu** for [PR #1874](https://github.com/openai/parameter-golf/pull/1874) (this submission is fundamentally an independent reproduction of that work) and to **@dexhunter** and **@clarkkev** for the years of architectural groundwork the entire 1.06–1.08 BPB band sits on. 
+ +Submitted by: +- **Saicharan Ramineni** ([@GodlyDonuts](https://github.com/GodlyDonuts)) +- csramineni@gmail.com + +## Included Files + +- `README.md` (this file) +- `submission.json` — machine-readable metadata, including verified byte budget and statistical numbers +- `requirements.txt` +- `train_gpt.py` — LZMA-wrapped, 32,353 bytes; defaults to `TTT_LORA_RANK=192` +- `train_seed42.log`, `train_seed314.log`, `train_seed999.log` — full 600 s train + ~440 s eval logs +- `models/champion_3seed_{42,314,999}.int6.ptz` — the three headline 3-seed artifacts +- `models/pr1874_baseline_rank128_seed42.int6.ptz` — PR #1874 reproduction (rank=128, seed=42), for the rank-delta A/B +- `models/sweep_rank192_seed42.int6.ptz` — single-seed rank=192 sweep run (seed=42), for the rank-delta A/B diff --git a/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/champion_3seed_314.int6.ptz b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/champion_3seed_314.int6.ptz new file mode 100644 index 0000000000..3ec7334acb Binary files /dev/null and b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/champion_3seed_314.int6.ptz differ diff --git a/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/champion_3seed_42.int6.ptz b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/champion_3seed_42.int6.ptz new file mode 100644 index 0000000000..52095742e2 Binary files /dev/null and b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/champion_3seed_42.int6.ptz differ diff --git a/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/champion_3seed_999.int6.ptz b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/champion_3seed_999.int6.ptz new file mode 100644 index 0000000000..5dda04d07c Binary files /dev/null and b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/champion_3seed_999.int6.ptz differ diff --git 
a/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/pr1874_baseline_rank128_seed42.int6.ptz b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/pr1874_baseline_rank128_seed42.int6.ptz new file mode 100644 index 0000000000..cf68ddf101 Binary files /dev/null and b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/pr1874_baseline_rank128_seed42.int6.ptz differ diff --git a/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/sweep_rank192_seed42.int6.ptz b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/sweep_rank192_seed42.int6.ptz new file mode 100644 index 0000000000..69b3154071 Binary files /dev/null and b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/models/sweep_rank192_seed42.int6.ptz differ diff --git a/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/requirements.txt b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/requirements.txt new file mode 100644 index 0000000000..e96b4f8dd7 --- /dev/null +++ b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/requirements.txt @@ -0,0 +1,12 @@ +numpy +tqdm +torch +huggingface-hub +kernels +setuptools +typing-extensions==4.15.0 +datasets +tiktoken +sentencepiece +brotli +zstandard diff --git a/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/submission.json b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/submission.json new file mode 100644 index 0000000000..305fb92ddb --- /dev/null +++ b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/submission.json @@ -0,0 +1,129 @@ +{ + "author": "Saicharan Ramineni", + "github_id": "GodlyDonuts", + "name": "Independent 3-seed reproduction of PR #1874 + TTT_LORA_RANK=192", + "date": "2026-04-28", + "track": "10min_16mb", + "val_bpb": 1.06996, + "val_bpb_std": 0.00059, + "seeds": [42, 314, 999], + "seed_results": { + "42": { + "val_bpb_quantized_ttt_phased": 1.06927777, + 
"val_loss": 2.76205552, + "model_int6_ptz_bytes": 15921161, + "code_compressed_bytes": 33710, + "total_submission_bytes": 15954871, + "headroom_under_16M": 45129, + "total_eval_time_s": 438.3, + "train_log": "train_seed42.log", + "model_artifact": "models/champion_3seed_42.int6.ptz" + }, + "314": { + "val_bpb_quantized_ttt_phased": 1.07023963, + "val_loss": 2.76454012, + "model_int6_ptz_bytes": 15921214, + "code_compressed_bytes": 33710, + "total_submission_bytes": 15954924, + "headroom_under_16M": 45076, + "total_eval_time_s": 440.6, + "train_log": "train_seed314.log", + "model_artifact": "models/champion_3seed_314.int6.ptz" + }, + "999": { + "val_bpb_quantized_ttt_phased": 1.07035739, + "val_loss": 2.76484430, + "model_int6_ptz_bytes": 15914086, + "code_compressed_bytes": 33710, + "total_submission_bytes": 15947796, + "headroom_under_16M": 52204, + "total_eval_time_s": 434.3, + "train_log": "train_seed999.log", + "model_artifact": "models/champion_3seed_999.int6.ptz" + } + }, + "statistics": { + "sota_at_submission": 1.0810, + "improvement_nats": 0.01104, + "improvement_threshold_nats": 0.005, + "excess_over_threshold_nats": 0.00604, + "standard_error_of_mean": 0.000342, + "t_statistic_df2": 17.67, + "critical_t_one_tailed_p_lt_0.01_df2": 6.965, + "critical_t_one_tailed_p_lt_0.005_df2": 9.925, + "p_value_bound": "< 0.005", + "passes_p_lt_0.01_requirement": true, + "passes_p_lt_0.005_requirement": true + }, + "compliance": { + "train_under_600s_all_seeds": true, + "max_wallclock_seconds_setting": 600.0, + "training_time_observed_minutes": "9.2 (552 s, all 3 seeds reach iter 4500/20000 before wallclock cap)", + "artifact_under_16mb_all_seeds": true, + "max_total_submission_bytes_observed": 15954924, + "min_headroom_observed_bytes": 45076, + "eval_under_600s_all_seeds": true, + "max_total_eval_time_s_observed": 440.6, + "no_slot": true, + "no_pre_quant_ttt": true, + "no_etlb": true, + "no_ngram_cache": true, + "score_first_ttt_per_issue_1017": true, + 
"three_independent_seeds": true, + "p_value_under_0.01": true, + "p_value_under_0.005": true + }, + "hardware": "8 × H100 80GB SXM (RunPod, single node)", + "pytorch_version": "2.9.1+cu128", + "flash_attention_version": "flash_attn_3 (cu128_torch291 wheel)", + "compute_provenance": { + "platform": "RunPod", + "node_type": "8xH100 80GB SXM", + "billing_email": "csramineni@gmail.com (invoices available privately on request)", + "approx_total_compute_usd": 245, + "submitted_runs_compute_usd": 60, + "negative_result_compute_usd": 12, + "sweep_compute_usd": 173 + }, + "technique_summary": "Independent end-to-end reproduction of PR #1874 (full stack: SmearGate, AttnOutGate width=24, LoRA-TTT, Phased Global SGD TTT, Polar Express NS, MIN_LR=0.10, LQER int4 rank-4 top-3) with one additional hyperparameter change: TTT_LORA_RANK raised from 128 to 192. LZMA-wrapped train_gpt.py at 32,353 bytes on disk; the script's own _compressed_code_size() reports 33,710 bytes for byte-budget purposes.", + "single_delta_vs_pr_1874": { + "code_change": "ttt_lora_rank default 128 -> 192", + "single_seed_sweep_evidence": "1.06888 (rank=192, seed=42, sweep run) vs 1.06907 (rank=128, seed=42, sweep run) = -0.00019 nat", + "three_seed_seed42_replication": "1.06928 (rank=192, seed=42, 3-seed run)", + "honest_assessment": "The rank=192 effect measured in the sweep (~0.0002 nat) is comparable to inter-seed noise. The seed=42 number from the 3-seed run is slightly worse than the rank=128 sweep baseline, indicating run-to-run kernel-scheduling nondeterminism dominates this signal at our scale. The 3-seed mean of 1.06996 passes the 0.005-nat threshold over SOTA 1.0810 by 0.00604 nats and t=17.67 even attributing zero credit to the rank change. The primary contribution of this submission is the independent 3-seed reproduction of PR #1874's stack with full unedited logs and reload-ready artifacts." 
+ }, + "attribution": { + "full_stack_reproduced": "@AjAnubolu (PR #1874)", + "smeargate_attnoutgate_lora_ttt_phased_ttt_base": "@dexhunter (PR #1790)", + "lqer": "PR #1530 (original), PR #1797 (SP8192 port), PR #1874 (current asym int4 rank-4 packing)", + "polar_express_ns": "PR #1667", + "min_lr_for_qat": "PR #1787", + "score_first_ttt_framework": "@abaybektursun (PR #549), @dexhunter (PR #1413)", + "sp8192_gptq_sdclip_muoneq_r": "@clarkkev (PR #1394)", + "depth_recurrence": "@dexhunter (PR #1331, #1437)", + "this_submission_unique_contribution": "TTT_LORA_RANK 128 -> 192 (~0.0002 nat in the sweep, in the noise on the 3-seed evaluation), plus an independent 3-seed reproduction of PR #1874 with full logs and 5 reload-ready quantized artifacts" + }, + "notes_on_pr_1900_provenance_review": { + "acknowledgement": "PR #1900 (admin leaderboard maintenance by @regina-openai) flagged validity/provenance concerns on PR #1787 (MIN_LR) and PR #1797 (LQER), both upstream of PR #1874 and therefore upstream of this submission.", + "what_we_did_about_it": [ + "Every reported BPB number is from a run we executed on our own pod. The corresponding logs are in this folder. The model artifacts are in models/ (one per seed plus the rank=128 baseline and the rank=192 sweep run for direct A/B verification).", + "We did not copy any blocked submission's numerical claims. Every line in submission.json maps to a 'quantized_ttt_phased val_loss:... val_bpb:...' line in one of the included logs.", + "We are not aware of any path to score in the 1.067-1.070 BPB band on the SP8192 track without these techniques in some form. We are open to being corrected on that." + ], + "fallback_offer": "If admin policy is that derivative submissions inherit a parent's blocked status, this PR can be closed without merge and we will not contest it. We will gladly submit a variant with MIN_LR=0.0 and LQER_ENABLED=0 (one-line env-var changes against the same train_gpt.py) on request. 
Estimated 3-seed mean for that variant: 1.077-1.079 BPB (still above the 0.005-nat threshold, with tighter margin and no blocked-parent dependencies). ~$45 / ~3 hours of pod time to produce." + }, + "files_in_submission": [ + "README.md", + "submission.json", + "requirements.txt", + "train_gpt.py", + "train_seed42.log", + "train_seed314.log", + "train_seed999.log", + "models/champion_3seed_42.int6.ptz", + "models/champion_3seed_314.int6.ptz", + "models/champion_3seed_999.int6.ptz", + "models/pr1874_baseline_rank128_seed42.int6.ptz", + "models/sweep_rank192_seed42.int6.ptz" + ] +} diff --git a/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/train_gpt.py b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/train_gpt.py new file mode 100644 index 0000000000..1954f47864 --- /dev/null +++ b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/train_gpt.py @@ -0,0 +1,2 @@ +import lzma as L,base64 as B +exec(L.decompress(B.b85decode(b';rd-7*9X?;Iq}Z_LW|y7TL&VA+w*HJF8+d(x^`on(EEI8AxJ`9@%CIQ+*n#Ev)%TL=J)M1$4gXanYH?YGbjKEZ4W7r4pR1e(HfkH&nT!6DC7lPUHu)tn%K6?ATwT{Y)24wUn3gT2DIlcIL2a^LAMPQ}A*}Hu6Xm-O66MhR4e5spc0G<2XSF{k&7nIrK>TfSE!!vBZYqpp&A*ZW1MEJA5`dTOG|}c$yFe=BrX)Ix$oT@O)LpOI8ux8$>E9cOPfpej!JeCF2ITVH>Fpj9HF20c+g!NW9aI0;b!GadTEfQHnUoe-ku;w!3UG^?rH!pqMtTs#XP_A~<<9*Wo66%-lnGfi))pLetQsapT<$D1f6?1p8p+7JIj(7yOJcDjlatx+b8(4bYwtPD|S-jg>;1mMC@>rp#^&E@jPHjG8Jdts>F1`$sV6g}%V^7D)kp@mkAh=ljC2_)UV65Z}5qembt>E#Ii+=y;x02+AZVZTKQLF_ffR0+8`ZTj-+x{6`Hw}}deBCOd+vBqL%gB2!31Z*Qcp31vR%oKz16};;FPX>6iO~&QulvAXQ^{{)r3=72o26WI7iV&ppW+R0JUeu_amu{+nR{U(M+;st7ALITAi@Wi9Q{VzEmt1T`587iy`<4GDkQUW%N@7c`wRipNrj$K_T;*Pbf4LYddu0R<$?9jU2vw+iJ}uRJJOEkx(T0)?gM7Tlk=(m%RGHk65`*U5)YB3(QtV-R2gxRn?4zAf4g(Oc@WIyy?L^TdrPO9R4iI10Eo=s2Ofd%-(eDO_6&mfRnqBV@$I$1V7)HQ%sm~h=R@l^(Ui5$L2oG77c5q_3dRclFxN)}j_1^v%94*%rO}-S68w|wENLi&(A0Ckon{0c=_KoXR$e+^Pd!`$QCoSFeh^nV3mto?nMCorp(@49ZfMiLP)Qf(_=ep4_%BTAou~)vE95d)x`w2+Z7J4X8XU^>dl*2j3*OJ~?+Mq3_z-0QQ1XT`Z|*d#kx;M-K$Z8rzbgZ*b;;fup6KN8Ep$bO94#&329T^Kd
<1KlE8{>e$mVNMeq50~tBi|8bEz7cKiE?KXV7?kzh4}Op8{R9dw?P}coJgqOgx(R*|5(TEMe?_f_2?$To#Sm1@-R3Q+nZ;8h+gVUjG;H5+K1K_}GMZ(bUg-FX>h)0@ypk9B-&l2%X15qL}0QNNrd1AXRt$32(G6+?XM{L)K)D8Sd1x#(~>talASE4yk3O_g-oDLYyfDQ3CSamCEgOOf@4_76cdD5hb`%?d*g-s)n)!zVYCN0MT)W6Zqc_{%EKH_J>@hz92i8NQUx~SlGwR#i}Z#q+$mfZO0e}@BZr}Wv(X0FllxxKlgKO@N17uJOgqNMfuGQqYy+TMdN%IQ|fE$C?hR=j6?n|s0rzQNko)^(G@omTC0f6Sc~r2)=B#lK@~kMN+^pMc~sMMLnFdwW*J17lyI(%F-VcD{#07LN)(+`)H3*7XFO=P6AeKB}dpvM_&tohbdpMgs`3ZDzN~$2gj!<@;h%Igze)sN)!KvVjQA{PKCjh^U_JUpGkt7Z^a^vg|?LAXin^lmFr3FUiNLy*7UL`V0|At!jI1HuDpEv;F;1p=e&D;ZlgD4GXuCbubmYf0X2AoKKICj_)D;%r3!96%9#S5>R!wS1@MWU*Rf;>v^CrE&|u}m$uIzVj!9SZO=+A1leld-M1qrh_9Q+v6?)==PGajjF#5?1Ebs8Vrfa+fnn(2n=V|@`RWod)8-QXNWQ#G%jcs55HQ3*LF<#T)UZ1j@~T9=C_TK6&mQ$DfA|plZT>C5w#;os>EtFSga@(1Y{Z;~P2oYiTeidj4^`MmA#E4ym8D++*cO=c0^o8z^2w$SC`F3Fwkp)v9Z?aPUKzB(48pAyS8l3x>v+hleb-@Ay3^OghE2E@K`}GaZZ!WZSH08we91A?bD?ulGMTmHef#e9E$*&ZvF6A8|^sIFyUyyGy916#di-eoSTFC!R-_xbW??m9m|d`$Qa_aI=VBr`VoAZlhx$8>CT{gWXO^XHQUQz13F@XYgR0>%XkphC*bY$%iR=;Okn?vn-7rx`HUjgEo5)nxObyq_E1%DMC_IxY}o?JhfL&L@G4Iz^3OtK${`Ov`L8EDLv$sAY|5=v^38xuMnfruSfNu8(oK@rNqFpt}UWJyUwHgN+u4llw%3=~C-AM1?!F8!5yXkW+v+fYGifu0c+TR+sNxGAsM30-{aKMTa_h{k|GXM!p~D#lr-VVI1-j6+O2n1pEArcezF3WCKP1>%KE_Tkx2;;v+!liSJHx6><2u-g~`gVGSq@z{5&gmRa(zcvQ~@7UokQ)ir&5IKJ(N83AVhNVgwB$$j?e5eEkcsy~{;pRYSG3U*6zf@Das$EEq8s_FSM2Jk|$xpHNKVNfo`>zMmF&Fxk_-h4ly)cEJP6DsSzu|Jt?@dcEsc`WljFI@(FMkX@pgWbRb1*jz+KY-X(h$&DB%J3iMS*PzV5(C*CG3VUuYk;D2WEK8#IH3E{N!zpu=`Gko7F0DNMp4v{X{G{+#GG6ZOOO?M`z-m59m*Z~TjuZ~llV(`S!$9MAGPJ0bPOdvTBesQdpA^3O@GhGU&{zEeYyb`^$skM;48^8%ukEISl9zEIliLX|1IU$QIoADOHJd*)Ig0b8PCG%kFgb+F1Kn;*_POwH%h1t=4OnXN)hqMTy|40^*GOsyKuNsj{AMN|5c<6N#fCo^q?*xb!N>{Ntf||=u7~cK{%;m;3ODb-W@}(5xpKC|-!IsNi^Gbn7lCW!;VMOce-lM>y))8bRS;M_)joc3>-IBLj+^;a4jl4Oocs=>6LIYSP3~pz1^eANf#(%YoYJCzKy&j3P{EkrNcham^twkwI{>T1mi}9azcj&cBjLW!LZ`BloaM+PPhY@llyt;vVw!R7XSgxg|osVt!vO#&YTG)Jet1Lj!XdRn#djR_TCUaRYj%~uY+VPg%b$)4y5(gY|7p{?}=I82V4F|h9jA9#{y6t><5N4Wh;2@MP=m7msZ$sp^$@hZz^ts2x@d_uBz;<6<4^^S*^X2B4C&RoN&5lMW5O)EB9AOU(3r+~
tZ2->fWgC}8zk+T_Aq3*)tPC{(68Ro((%&Cmv6W(i=(K5U?P(35^mn^HlSE7M0Ux`*3XhxeFLM*#!MRRWWcbM%zn)$Y6NW``=a@88et8Y`}duj9K+(9osv47!l~;6SGESWxu8Jr9s#g6t!eOFE!JkOFzq%IL%Rn5~Qe6JmjYvg8Z&!}BFm2FNzW!1*frF+vo#gh6i#ijUE;e0BXfr#phm7hHw?32)9oxS8}UY4E6NsLLW>3fC~12q>{sGcxbEPN;FaK*6Yp3JqL!a671C_!|^u6r%W>i23JQa_^PenZaa_{%(=bLK8Pc!Kr}vw_q`776}?1+GB?sk0O}uWF&TM+c1~YCyVwh#$d~PxD*fGhPAsy<$#9)QSB7}L5xWYxZA^I$u152C3S)3WQgKZi8siNi&`d?J4JhS|2?Nl6k8junzdp!FnFfLZ|UaPxu20(4_OcD(;;@cMK$sK(Y`AsU^?)q;Am4q5PKqo;n7g~OyByE3y=80E_ekLJ3cI=noJ`ZRZ{a3=dL_oAq*l``7(x%o{Y`}=yPll%hy_$L23s@LCYA)EW1$WUTKmJvASXk>lIDMl39HZRm@+S%qLLspjMEi_qs5j-Nr~SWGI`Z8+a6j!Cx5kJ)%(e1*FRHxp!xtuEgCO$=7C}f^AjY;_=XvYcT$Wzcyl?EV;Ow$Uq*2j}MoYhzU}|4x%F^o3_y=2to&p#^#hc|uqnVlFk*&NK%{(+xg}XhLl~?>0Oq#2KbU}L_WA>w;^;p{T)zNlcR9vxuFd3eW2@VA@0BiHC5wU`J7A}5JMy|)_Y__?9n_gX@L=e@4RIQ`YKtc*kntX=I%JkGJhc3<=jG|OxkJd|cmQF1rd#AaDT93O(&$zZg9otwz(*1kt0fLDyy7RFq5yr@DcU!Jns0q+@Zje0}xvO}GiWo)(JJj>Qu*8%ZL4{QCHqSthoqLd^A$rUX*QbwEMmzCi6!(r+6r|DL>Nu=o+V@pRX<2@G(Z3*>g{fut6J;`D%l^v=9qX3J_O1-L%B+Hj^&)O1V;}HK|mKBJ>Vq>2%d`GRjLqMVx~kzC+-z1e}$Z^RBSAfOBa|Q#2bXQ;cgLU6uzD|59lFyCvwrY*WcPy&Lhb#zw74VkK1%@q!)d7{ND=IYBZuHO((nZ*N-L#WbaZy()Ye)?h$;ecgx{hL5;AWHVMVBnt1W$Q*tsR=``lQvJdJb#rw@);IheTU8CP7OxENI0{`$=lbMry!8|{ddV&6`9EHZ&SQ>5_POTi4(;LGJ=_|(OPX~+1!HSLOR+nNKPx>)=maag@5@3S}dT2rXug^?#fSmEc0y2N>Xf*I~nemAytZbv&?2zCacL7aovmo(6DRVr9Ct`Db4D+UywIi)S1bCFGG-sE9&?ywc+x;rSmdHeggnhp6&WIv1dr<`l4U4yd96lC!rSxDBW|NH9!>hw~kR<)AA6&!Fsc4>z&58(K|O@jx2U#VeqmSkvpu?;$M7VfFhj2EWviectGRQ(Qnfvh2ZL0mZ-#{D8o*`Ye_j{(r1{r>z1fWjgf_}cSaAfK)Y0DOL1y`8h9H%8a=lmQ?=~#UT5i7rciUn;ZxEhltpE`U_6~_KQ5Lt_d|vfwR1trP_Z3G-rcRLV0Wz%b$sjo$cb5Gn#lc*HzAE(Kpf9Ligi8(jkP2-*dpW4VFi*n$;jvhy{+2iE7H6;yf~k~RGFOM18WQ<(GY>tghvWAe-!iao^QLrY?+`<8EUD$sx3%%W&+DQ4LeKafy*a?H_rSBT%SQacU4?`X%~vl$rGW4BSznyy2np#^x}i!h6Xm$w_gsW4r9*w&&|X`%50h+{+RhJf5A`v2?dNtF(yuJPx|ln_>T1uR2m|S7iiw9o#z&@{1&kU2l#R9kmRL}0HAwd+0cjQTwR5b}?BM}+0|LS-)F2V%K&N~T~<}I%$JWTnW`KgAL$9g&G|4O)pa4yyuHbQKVm0m^IiTV=7VY1tkP{@7_BeCB#SbhOD{lIB3-hXpi3d7d=x&l3%3Dc2HC0#ft7Z0J3SKqSV3o0$_%%G)x#
$|(Q1J*elS&m(*;Z)19F5RnIrW400V63IoHvz=qne+D(xOQo2w0VLJ1H>Ys0^zzHI$j-sDd`GkM0;AtksGceu%3mlbTL`T|#V;=Q0pwZ(tL6Kz2Mg-Jm*Ho$USMG}O1*K|P%q91E-@5+M+zS5+pBb1}ow#TA+sJ39ku8+&!4drsM1JrCcj?&r=vtg=3RUAdRj7;_PyEBN`08jPjRHL_)pJ7n83dOQ(Cj8cNLDws)uy$H%R=?xk&h^~$s4(bORXY&$v4ySKO%QKWj&4ue#E&6yY4>OkEGQYzY`Y@_tf6Uo>>muXuDvBtF*73a*C}*syqxQl#UFnr@gfa@{^pQEPZUB-~}4eCZYiO}c#hZ^mK_13gZf*_cfcCBliLF?gMe&#=;OE=tPtT^K;DMk25~8=|;@?J!Al*k~(Z31D5U(0=Hfqjeg&h5L1Zf7F1EmT|P?u+TjT`uQ@jT*T%^B#4e=*Y1$>d%aBQokUDE!i6(NtcXs$)NWc1VPedY;hdVr9%VkCPK?Kp-#ZwFUnx^Fb^hDoVnU*51Yf$1P^#+1qnZL?~~HixMK1p1lw4Ky#5CNE;W6^Z!k&`3I^Ity*&}XBDPYkUwn@qablY)cFg=lxgsTi7Cp{^VAVpNy`*8v-c|Mb}RSZdtsiiMbEPc&!>Kt;lOwpiH1;Y9q?D62)J8CB0MdnQHwR1H4|kmC4zfIDj^4e6SUFJ3d!?j@Iu7m@_!|Jh=WaKJ<8VBU2xx|vUw9M47J6LDF993C1lDwzLoH5bC)dB55t)&FZi1c4<0uo_NKDp>^7|qR;gJ5#eXumKgsP8ACtHRve>CXsYkpbNV$bGUc?#^~J0FXXen4$J0e2^9c+%#00`l71<7iQfmdf)bSnDSTDHP``LzBA_Gg9SDM>k{)gX+LuBf$DrnL}g5R%y2x~_w6}ybQAOp?ohmtXZ1%TI`#Zn6s1-h2A-0F<5IN|X@I_B0YmdywKU&;QJn%$Z8JUV%t(yz%YMEHgQ3=Ga*%=FndhS=<>+JcmFc~lnewk0y&m3nzmCnbD_N(gS#)E$HlxnciC#wegISI-+&uaF8Q@`Zvo0sV_fhSstNsL;Qk4-wCTccy1MukfC@Dw0ApbI6MuL&=e!?qbt+v#z9tdgBSGum{0s6gg10eP1`Irket#dS?iv3ZplD;wz=iqJw@Qu#O9#28^O!b3#UYp>#(`L)SJ=q;(3Qdi6GSw>rv#qE#y}r0wKn|vS?_&f&+P2`keUYYL%N^)7mb>3lX(7czgYo*s%onpIu}!m-26lzKnf4NY_@E~PFz{!UtFk5wox^mM$?&pi5)_w+uP2#K$h1{7ehp#rYv33xx!Ar;g)EmKWL$z?iEI7>67W>vy~{Mf13*p2wdtUp9RT{IPeRLrV4+fS7#bNW$+!E1FWF0<~kPd>z^G(!~5bQPDRWF2~Y$9a$|K&qY1@j^oUyNo}HCZ-JHK+XDs$>^fVt`7Jc*OyCeSu2Mg1P_;(EWo6eWIwN=Dre=^+p}L;k~Hve3-2_(V6+>s1DmZzpyNe9sSC-oaS)A^qx#a1$EaAoGdJ-h=wOOn*`3Q@iHk@H9Vlj(eynl$MKC_zlx5_?tchOYI#?ipq!UzkftrWFS*+9~}snJ4iT8rRXwaBYO^^Xx@d6>G12?d|k(_DH(-pL>eJ!QNJ1&93W--^*<~SiXx*Etsyc;7yIS9ewO1In&ce|LUl-!k_8fpBlK6N}jegoEnzr^|didVqAV)YgVVpgJ3E9Achene9N*qGKIm38Q-_}wvu6K144a#lt#Qr%*W$`>!$8i5-?(}h|-y3j@DFd2;=O*X2#-PO8-7n+2ZkAezrUS{MG0Cc>(^QPy`U<~N;q1f_TK_EeBAJU|97l!F4kZ>G)$bAP|s2P*JWG@T{v#R}Xzd-7jc7=P;D|E<0I2cQz!#HtriRmoIwl%Q@b$JuLzemZ5{7U640N?FqQRJ~M>fuL)MmjpMo&^KjRzN|t|Cx#eDPNfqc=y(9E131&;e&M%)i}D&xTBI3do(liZpP4~d50
|x-QN^QfMweCk{HHHnY}Ss}J;z(k)&;o2H>}H0f1F4QJvVy^qLPU+D2Mh?MRS9ARCX@SO#7SEiqJl}4fyk%zu(=}oC(g_45+P<$S^KTdpZ%ct?V-oN!(TtCc4zE{pZ_2v?8m^5?iluQ!OtQC#8LyWn^BuLEK05qRsz0993UKW+PZC`M5lVvPYNQTFf}WBRyVb>BgBcbq8(VVdddX;B0y5Pi>$;Q)2_~Uve#qxbD$VrIR7ZPInka0#15l+Fn;=eU?ou7z;G=W5uMzq`#5Z?5JnPV$IJ`LK2IJgX*rn?&7QC+4qx>d0^O||_jp@DE1v*ET^m0UE?dEC7~oksh{zBTY+aHG<*=^Wg~R6|9~(&NG`E`@Q(ed$FZ!0xgcPPV9>*hvG2#(g#MIt)A^m%>h%6x34-j}%}r%o4_n%<)T1Wv6ZMhx@lMt*Ag*^o@A&QEujU4Ne<(20jF!JFwRoKHj7VLI5+xh$gAHBHk0~-uO@=3w?e%F_9-dCB<5R(khQ24;#L396*(qfzIJZ>8KDc&=q?;zv^){<=mH}QOb_Eg3s6XK2NL;cR&8{O#euxdP&7u0+8cD&P2dM+jeS3+)8S-EfIB%KjFm}rV3RdE59{s7or^YM5A<0k~E>3z;Sqj!k1~3Hrs5I$dW`irzYSn_1wolU>-N}Z1cya8@n4Bpd{Ql2AxYqVxkVT<6^XwW9Yr!Ja%%lAc$@BJPzwbobsZNU3@IFTHxv&gN^2eAR(z~{Q;SkGb74DkpDiL7B>7^bjX;zv!RL1fY+?Xq(6jmc@f{g^=@EiVKpTOkQU*P=dVwZM;No!J_y}2PRI@gu#J!n=THUOS?DV=)2d25>uYKjH?a6ooU6$n+aEObctmr+%N_4iP#+a^?S?0RSe%5xd>K7%el*1tFWOqTN`IRg>+`9o)K5{Drd6x~aBdOzJG^Kx6zw9}Bcu#yr^-Lqc;odCa(WN3gmT@c3gMXAp%rKu-QBgg%8byUfNLgSA#p}Ky2WAtG|B=OJ0@496(?8j<8IVkfnTQ1i{&H$|Y7DANII^EsimHCclZ)>WpLZ(lXcN*otv+Y2QuUklWXe$=zEV5iF5_O-rV6@vH-%V}OKQmzWq@B1jo7Dafgp|@6?pq}?GO8ipz$V^yX}9_96TCL^VesWblIe}y+g;`NKZd-|a>j~5;!$D|IoF*!vQU`9Uba*Lk_<3gMn&HPs!t9l1Xi<7%%{NJZTsz>3X+RZ_YTl|2Gl$aLQCzJ%PKQP}`+5{8Wm+lZ~1Jh;D`RKO8j=j4G%r76}rtzmhheQ6rtA7qRd9g4>r|FeAbtLNRD|9~-V`LyG5m9qR_}_fKAP`=Ko+7K21{HsFyRDMw$iwbLn~g2<{8m$HaGdf5yjdm8JzAlYd?Xgn!7X|s^L4`^@1vy=8``Cn4+62>R9Yat&X|kjv896@0jjIP*bf2YDwQn+DF(iw+~)H&=NixUVCx(SVurswL2b8rrb9u0^cBCc)rbj)zZvW(ctnB%SD0lomz8n2pSQgOjdCUZ1YnMBy8;Cn>%>vFDyf7S#FWs_GL*L2C#)J*fa-aR_WGfKsNCE&OOnfcaEE3bp`cY>yo95J%ab;II*r~zSp_h9~^F9v3&TNH02s26n^TA?r5YcBOd+${ZGw1t1H4hkv-8pYJdpYr12ORk;dngG`y(g6aKmyZg0>sFJ-eYl#VekmOWIv*d>Yug^nMa6hNye^0Njf_$>xYRKM&AvC3CMt|3@*+~aC2WL$Vq{MBRP$bs&~)&DDwqeHN{X5?`)cS$Psl5J9xflBA{1#5KP%TlJOPxh+b%}ZB@?HQ!v!0I&Qvs1r!;(b6vz5Duto4R_%o9GHRanY?N89+hFdO!4~RS-lniQ{hP_@%wzPfRvuuqiyvA0gpy`iJfn80;m03GjJPjadRDZ!$k|>gfWVOzjY@D2On2yA!#JVvV^^w|H1H1^7Ti*DI#!s;kc|L}PPtGboui&mv=?rw?!rVx%LFt-hWn`h5WLL8R%UFnLZq2qL+4e4Kps?NkbPJ*-DAM
vcr^EKQ|_VGE5zZ(9CS|$Z?l7lww`o;Y0bBaUZfP+ChcQkC^yZUGVJ)WG_l%hI@N01=}l8}}~AKAf_(!RuvmexyL!*k$XCX%TZ@L2%M;_NIVe<>w#=b!uBBLpuKRDIEe|yIjGA~<-H8HshR4#_@wxsrQa8ll}*nqk!!)A#sz```-ICP<*cR9A!lskgg#GuWBd4D;CKq)Zk1#**^I=04AMH5BElml}`A13|SaxZLkXimlyVI&2+_)+9D;P@2^=+U6Hak+RKFZIUh>5sng3qF7)&bz^kLK%!PCr$%DOafsfyv%o2fNJFGU_3$a58g4B~6ZLAvoT79^CSiAEuFj8n6k}j@Jj}LSDs_JIID*iFe@+zJ8{rv?eGZTTylIYAAzVmfm*#8N};Fi_ny49UiugLbrcljTN1v1Hn44RsoDm`gRhAujd)j#nuR|ansJ_s9|Re*;MyM1#yV=07I^|3?FiISH>XX#}b|yiV$J-DceP!=;m2T$y5IsS|tBQn?PP7Z%w(vkF6@)?lELBjmM}(eO7xE`Okd?ps_vNTAg$$`&96GYC}Y1V7okpnWe)8_pl-|hqkrcfVQf<{vny0L<|xvd09sPLP8XI6w=OA9aaV9@`E&6-!O&u7&-3z_Ik@A{+JIy=Q3&~f^o+Aevw%$Md^7gBN@v~sdP6b_KvOtoj9pP1R1*ZWKBG4zYQbYHeWdLc!a89q5YHsnkE%x=&T43;g!Q7F}c56Sjj14HBLEr+J%*|>(+|AB)#zPk8bKrPliRR!BB>I71?JZYgNHg{Qu%eKsda(fx)6+sVWv`=+U0jjis6%ew!0tcCzD0`QVgy#~ig-+$Z(uDD1-p-s9c_`pkobPZb>y6VYxV+%;|T`kG7ZSwf%ceMKqxX9l*Cq$Nz=TgqUU+2^Yy=W>1%^9$I2MBQbV2zQqUIpUS0u-`}2f02E?s&oIr)O<%l6Z9&xy0zxvq2&Gc%Jo!BWNM^`#^`SH5fXdl(UrQ*L1OH+phe3zJPLito${KSpzybQRor&v$_p&%F5iDW(^0%k+wXZ@yWMm341O+s!S?62cAiuHo2JwzkOzV^GK}nz`YwbIQ-O_3_VG25wD)wT)8vjNBB;&AT49W`MH4M#Dhe3>u=D`Bg#Qwdhgk#0Bg$A5h+D9sUr-i4PV@5^`DIa0iFUVfW@#(!7Cu!TTbi)SdyM=v-?#uEc>&#!P6yV#)OG!^c^yozE2T2wZraDYbC}}?ao?oJ^szHj|p;}bo54}W>GnlZKIu7tn@ZvGi%kDcD{YlE|C*{l>@iVDI={~>>(OYf1gQprTVTt}rAHS62txe_7Id~o+RgiGs=tGJbAlPy9EOyhrPfB)O{L%l#~Dn*fBrCyue!kB?9b;phBQ30#iCLcDP;R|p&?6|re<-(cAzMJhV@e%sT_9Ttg1uj&wND1$W_xru-IB(DZ|>=mLVDYdj0neKzilY@xe+F_i<)mI>Q@)hH%5*_?O73(tu6wqN(M+f5?MtW7vo4G2AL?f!x6Sjni01dQbq$6>bOYl4JlLz#N&l%Zdl=?8hIsy+VJ0?*ops^TDi%y>bX;=Dlv|T@=5iCzLXP(Ot#@)ufiYB5N*p0hlneXxOEtwZ?H+?dIVV1rbfa*%d(ADso@%WSGSa1P6zRV7xqGxp7^n}CGAuvbpk&VT2*ZOvRR^dJxABdyNU04JWn`n0maq~(&41`shHN6mw9H)>r%5aXor$f^Hj!;gdG`9q`kH7EeLb3aPunnO8I#D46%7f6*MKi5kNh4})4=Adwl_nikYd>cKz1~1OAki&P}{zNny;n&H8KVqQ$uP+ZI443H5I(d$=S*Zni9S0Fe1N{xb}NFDz;iQx7JMJvnf*e+R|Au>!~|{*iW;L1n(~*2Vq-=$<0S$;1zCp{Hee37*v*^wc`eGe#)$Xl`3PQfa}&S%h~FkHvGLzG}bpgGc9b>3RSDf6G)JD;+l3!!TQ{ikNyS;9@VB3kM;>*YA2hLQgVKKv=K-Ap{QmnY=hUU5OeEC4P;^{+{jO-G%<3@N(
ClUB|v_T9yE|PH-$P1NfYQdf4k&UEi7Fatm&kG)Jxnv(0hAk7eK4i#)kAmV;FQZfh+n>v#H)!6i!iP-%}}ppm~9k}Sh;Yrn7pJ1yFB1X86k$fDqF^j$TIf(aG4U74w{ld2KBNA#^RWxSG3%Ov4!ejZdRvPAlZ9@LefPm^>ysfqk}zYAyoR*#-nUz_|FCvn9mQ>Sv#b}1)guwpnZBm-4vL7uv@Jy+795tCwIB?hb41sL_#@ICMg<#Ic$06kABU5c3?7!wvbV(0P5vg}xf86yLJeu$$bylk{^Bm(Vk*DY%yZm4cdh$5)$X^9eJIzO;zBJf7r#h`z|(P9^+FEq_qCv7+hx*3CcK%fL>0F5S#!-2>j?ua=16b&*?x`em|)@{hW_wYdZOf%$@g91?cmUh1G<-;tBvYv7Hh9W11(sm>NYsfj=+3qd!a?Xg5aP8K}P@#aeNlWXdr8kcZB2IK)_7$D2br)UK6hMI8tJsmqO6fAWRgxG3np4=h-4~dueyosK%Be4DVlFIDdW^;{)_fycoS9f_R7lOThN~aePnS}Ohp?28cm)(%OsqaGXe(7%=$3K2(t7%s5}1HPfg&fd)!Sv)@kK8W#n}dU~N&*djFkQT7^}s8M@^r{~`ya<-{6WoXz$PpEN+;vi?_Ag9jiaOm$&uJ)RGN}8INo*Ma&MdCJjyL7|8X7h>@5D2pMSk7Dv*i2T~H=EF8srPMEWhzAdfeUWqMelLgU)wg&x5*R86A`=FTNNHGPGOJp4valsaA2!axt;-dYzdCXx;4(dRjx7vbIN`O#;&(lJN73nJGaAW&*zOQ^J$z38)c}yizM+ln^zhaCWfBx1xp~A-W-e2WA5F3FkAEnsn+WA}gW8wSI?K{N^lki0E0hy{j!FxBGHfZ0dnR&Cm{MW8-Ja>J9cPT-9p4;jMu~}G%>(0=moMOdP#fK|o88qDuJJhvUP?h=2qUR+hOU^pSiodoa;Tc>EIsR3F_WPEM$FG1+l!7*x(X*Y|n6PQv{b-6r+4U5Y1Ikdo7@8bVwm0d)nit~k7vO(AR+rd9B2dy2s4=#VqgLn8IcD7Ay!)O2f)uUbGTI0U3ThTvNw4T8f_YhRLjcyEM4RJCOvu?NaCV+($nU4?XC(G7J52D^FY#U#pP!lw6kf~eOGr}wF?)`QZ1!f$L*;1Ah-Ui#1OvdEZKTQ+qTr6xIgLE5V5660f#-N*6%^PxYL#NuE4ad?S(X1g&Sk#L1`T_)6o-3}T5otBX{HG5)yYX7r~BD;i097fd}desGHiaYRZ`kXOM!lh29vP*VVB?chqh_VOvh{^TL%T{&3RoR)l-;1Th4ip(%++(IYv0-P)FseeV#)in6Nhx}6);liR5=!%0OV;n#TvM;FG^Y0)i$h;aBOr83ybcSLo6l|eF1-OJF#LPC+Wdhtai~~W1t8=@baHorVCd{N?3VKmtb#BU(?$Vh^E%u@T=)ZC6AD*z(XY1`1D3tLT0JCI>Oz`f?2zd#%~gyr`D2W5Wq!?BI1Sz3p_|0^VS!ldV!R?Tbyk)NM-Nqxl4LDS*?{P4Vg_qbPg@FB?Q?CapGb{m5Qxou%TaoQI9g|6r~E)=`gq@~N3+6Em-CMWBA^1RV@JMLrRna@;x8EJl56;{dm73QDQ{*p9lju#<{BVwPKC{t5Dc-istn5*0_gW2$<>1AO$Zv+X}eKWIHz+V0md_kSRP|)IIR>JIF#Lf`caex-unE=$B1gJkLWh@9#Nw>5IH&NU-H9>h=)gOiRG<-Ie+#%ylyCng?IK1tz(DI)TLv^8)ptSz6yo2(FBZGok!xl!EVTWIsay%#>u3B2q(Sc9=BXdRq;zE!!!)wI?@FHiOx~X+a%K4dP2zkLThA40&pkc2Xj5Z7OnL-qR}*bX^Ou0;$H!H}it9A{Q8s`WZ&(bVRXN|44CCo#=?0Q~XZt90ATXeEu}>^o+W=y}6HvO6w+6pBb4lXn3F5|&Z%4>@xNiyYX@ideQk6l{pErL&fh{Nx%rY!VkTvMYX*_Uz*G`tGHa)Myh1yZC
tebB30k>7Z3oAdM3dr*B=Q;5dOHj{)+QJRsXe^tnj<`R0i+n@l$h7>cd%I@sRR_7dD-cl@nU?LG2ecV-v~X8IUQ_>mh)H83??U~uC|*MXy&YAHd=L$5=0|Ifx2F0g}}75^D}`XfxKm=OQeORv2!E|P;Vw677ey}x-*V?L1;gabPs`8p`m4xv00uBE^x7azwjd|0E6NEP!y^K3k^obd3u^4;o=MH87YvFP;>j<%wiN-A@;_u|W{cWnCSd?0q{t&Dbj_vY)C5#6096{@LW8h>>i+OCvqvHkVLJ9{+e03ap)6=oXh_NA7{bj+7gmJ;6jcArEWJ&i3@qgr9JZ(NHvfFM7bC=%C@S%jtc42DZZ(HK3wQzbwGEB8vgPR=2rf^dw80A2Z_{T7sQ~&lF_^*szjM(qD0>o7}>3{<5sp#D?;W7Q7$O681|Pb?NiIBB*aCE^jr*LQ!i}tv6UfsX7`?Wo13YLroUN6XYpfDlnoG#b#thINhMp?&^>D5PNO4NP=4qK+ozJjcW{dPtfhlV7b6BnekBGq8>jb+^3k{*k=mvS-$PkJ8UFqji{SF8E;jxiI`@jBvExJ$$!(M)2mGb$^mr1JGTUT?KNv&~kc%u1x7g8=*An-K%NOekm#M;HFD%FceDv(d7#?@RW@zK~KDk;;u$wcbz|gi{s!l6B0GDOcN&Se5Q`xB4Ct~FbYOTt9E0ep)BWfRon9MR^zZ#d(KI3wwTmaH5*Ayc>b&qMR%`izndljlg+Ft!pHe1^3+S)Qwf_ctJxwFs;;XOk-|-@vCgX{09a=d^DGkOOHnxOT*53!5WPL!a=)jtu(R2uIeebH?nN#ErZ!{x|qN^AEIvDUt-Q~`lQl6=UqZdPOH(T;YX@w0cXk}jb@@ISkO4-Ug(^rI2%KtE58{ue$z0mdu6gZO9rg@vmQC;PLLp;~d~5qB6Nf|4}z=`ItNYfZk5^JFR5ssYiBAyreGgn&4mnt$O#3W*JjTvgHdU{cDX(MlC50gzYT4aV+cB@2;8z>(qcEX>Ax`ot6sg)~Xv9Bog(mO`M0(v2peCA0(1DcusjG;TlsHIi_Wx9=#f{QyEgw;aVYEcrB<>Mqm9u*~=w>3;Fat|P!p69e%7_f#pGnpi7TPi9XuT;ZoDfOgR*#NDUM?RRqUlHMRyPM95R(6AgldsoQ=XuB&rs}}v8EFf;&26FjI^%@`d670?ajl2aipQuGl_4WsmcOLJKB)1;?k~UgmRmhG8q1Ze%2qYhH6$8h!boD&X)4h6Iz+rgsci*Fk58m?DE*P|=<2_kDik6?H{OMbj0d2`%SE5HNS(D-+z06TB&jO*>VIHH1sqoqmkm%`&9O$TY7loI8afX*lKVp@9yr~5}i*Urzov-hj@nGX)>6{7=D=!$QY-bzq*GJQ^&Jk$9GFQG-`F64f7o2&Subq0}DHxp(7S%0mD-mMG2pUKBTg6z5^uOE4WTAJJG`O+Ew(ytN8c`!07vb*0!Ax5Us+s3`)-Q@6+zEDV%=hjz08p^$?0A2WlEic(u`S&1nt^dfL7~ZkUxoR&S>@~9G{&K&%+UrHu+5Q)EG0Qm($7C7LgLJ=l0-pK5GR-zo>nl=nct8mtSnXn_Q*64twL%p{&}csX0<7?0lNRXCY{Alpk~f)3hT0&iEz8Q@t@arDcw|@d2wp$sUz02iU`~Hsq8BlAbVz5;oE0UNP9mPBRgqhn{Mo_uMUtPwf!u&4}c{`>_Q>N-r8Ra9}~wPKDvNZEKQs(&K$N)qs{2}CHTFD_}}Xx@hMqqw%}VCle8e%>xu8O!nt48eQ1*K9I(>Jeu`J6pgHW8KuAznZIgRK#~})gXK_X7^;A)o!-1NtRW>zA%@4gTZcE{QEuFuP(*NQS+e_a)gRods9`4Yd1WQ4J%=jEprnR2Q=AQIQAT2Xi9~a-sH(AroO9eUmm)5`!$l{7i2K)?}9(=GU{?GfIyWLKCdC%hsHx)c_gs4cbS^+n~a9cw0%x@?87=ut6zYgS(OxHeJBtb`9OyGSbQgVX|m#+dMY&EH-Z
|OAHtZ6&gY+a~vrC>6Miw*r^4P?@K{x=ynu+7}mA|$~i=9ma{HLC39eI0b$hSUOZqe)0fHg`hdAhL?{##3)9=&PbtX|ITccQ<`@JkVE`mXT6)Cr1x6eyu!gzZ6=R4`Ty^&Se>KRIGc^>6|>TzUHt;X%fpV^1sctBvwcsZyY2FQfkF@b=F%+iN!rpGzJ=|n=JuXb$r`Y?SLipok0X;+A5619^eAjF4nis9PpKkLzm*&f(!<$G{Ue~oLf~tuJ0t6vHW`hIT5U$)m7@_6KOcn7VX<*n;C8WdS8O!{KHqe7iva6C*|1c9;U7f{qu+TfXLQBU-|X_08ZGkC>xZnnA_qei!EB!=~;HTgMb>XLq8ffjVkPgecSI!kU?AT`~#J(Gn$o(mlAxG<*(ex_Xr453dof;n$_rMPKz*nop5FaJW4boAhp8!t-s7zZg9FQGQ|E(b-5-MYvZ9zdJ|AHVy*5(7ya)DW8z>+5AE@}RoOj#M}HDs*eo%?bZzMzJQyH;LoX^Hkw>XQ_pa?S#^wHLvbf>m0gMVIPEpGPW@#M`&+Em2~chIpeA~oxN=Y%s^XG$s~-ZEe-W>>*>dzYMNU+p|s*;I?Nd1Qep}S3mwOIgmZ6Tmdc!`ikK6lX@AnCcuNw=oy{4-^huhba?kZVNhEAA=EdcaS5_3*26E~M){@!8NU(uNlx3q-+<%742VJj15Yh`7EX#IHuacH%PdEI(4(>+K+*;XY-~vull(O4|)D%L|z&ZvPF)FdYD3lVYVP%I_rO1+@SUD$3Wq!-4MkcOL#D|*%2walFv^`{O0kbG4k$lcms^m7_o>&RBNDPNcC8TQtlXb#)+E>=dbxw;}9^PiKRWCfnl{PJN@qNreyS!HoE2U0IvbL_|&aUS|;P6WzJdyFeO5y+Fal`w%-qRcl`l4@&rw5>4`}QT~G*Q=ULJt*_cpZ-y)C^#@x^+EQia0mLlEn|1X65niP`G>3=p4})8ZN&g)oGG|KgTxqhUaeJ`e0fpsZZ8N3muUd`)?AyD6JdvJjbf5$ISku{<;yrtcQEfGpv2~H57Hmo-DAKnbSct=U(M-!{~|yQ#6lCZx~}xmb_0S<&O2>;O^qU_7sZVg8v+vJZu1nmn<9kAS+GF$H?c}%rUS4r8|Qdss)wL4+%-l%25~Ax?GQ&M1xEiQ#R!8UgJ#JAUETN|;211R0go{nt4H@U(Hcm9qlD+wWo3MW&-NPOOh(Ow^S9%^7=`qBjf9+aYA#$W|j<1IV+--H^Yff|TDq%)9TQ7(6$DT}fq+D^fFLzBK#8lHn?Mx(0RDS=A1lu}RoOEEm=@wGO2PvCE(BqO+=$(Yz^}DdH>qO8GVLq@om!_on6{RP?CNC<$EF$s~{+nGBBh}9>&$=eWb4Mj3@bZwhSIjqpRmT;@sSVZ_&LfBU)2(1lB5|8!p`V=mFTs?Kq8f#M)tKj3>}SAPP_dB$CFB;ItC+3;>gU06^}TK;rY7^pHcNAYchBR*)6bzDse4gs|Z&FXg{A_grI&o~7KCT8BmcSvf~dGg7i`2{`ZgEze(d`2jiXsU~Zq-zEN2^^iO4bGoTXBCMh^=WxXIQGz4HPNOs6+qdxGe-Gsj=LE_39PQHiJ-?zjcQEuH3hxx=MS*+c9qMMOW>p^%W<$$bAyh)K^2zOwT+9k#41hLbNH`r*!I#))b{%@b*V-+wDgv;e{WP0d3U%;0J|l&3=yU*B(55XRikcTlL8SC@`RtLxyW4u1*<7JT2tO;tmzda0uNj1Vm_e0lw@jxDE0+34I^IB}M$7a2@ipG0(^974WacqjAU<`5j~){LKWcYmbbjw2^3A-@X^|QcbR7Qg&s9EEetJ5&r{?oB^6o8$CF5mc}y3`;6rN5+bK9Vq?u1MB-5LOBQV%4XRYFp;J^eA~eayaI89?e(ZKvIQ~&cFdHK1t*^60hbX_S@=(AFr9pm<|0>pzMt+XovxG_Ihp_diA@Up1$wJxOaI&?>6>5!I^M?ZDeXD8J0VbQmt1|H?kzv+uQj8W_t7G{(45Su%#Bna`
{jqfJh0O5HQxV^WT`u(zP-lfiQG0@Ps92*UZ=*6p^ZXvJT=}|cmSKvA>n()k!0iBbS3k!qZaR!ti8uWKIDF^JIM?Dhl*@h0*d=m_i#ylgquU~R*EpJ_8{@kr^eRvy5Jgmkp4hHo5if-cSpb|x>(4XxfWg~QNcr190(TjgQ(Sfbp&neRI8>J1eX8i`5i6rhxS=hzIAkwiA4iLHK46oRslO$6NRAZ1?qtzGQdTB}erE9C0MvH$ScwzXYs&#&SE@$^-L5`@PIoG}gGBwn0MEXbJ**?~8XU%MK%~k7Ro%6-6i1j@t!%(qF{tjkrtd3Nm$?(oTv|z`jCkt@-I~$g#|F1!VgMRdNl1Z4XL6!HFLOS)R>>jy5R9i_xAfR`MMC?XsnV{xp*p9B)P?W&Eog6d*qcU+ep|!N$Y10a6!DrmTQcBEvEyxEX0LX1oweKJ+AA|Z>!(A_L8yP}|mHUj{kjU+j@mc$9ip@y3HOwd8zF_X}TQI+6KGqj=`5gjBS~b@6PrDEX(w-qC5j?wp@p@kQV={=5Hx#Rm3o70!+7)5_GzTk)y5Ok-VBMH7iDazhV9AH8tpK@V>cPDcQ>_YZ5&ZnOXk5!c1+7=oxOXMi$lIqq-jDX@6YJ4YcJ@L=;i%AIN+$PS>e{tQguzav!28QzcqlpIuW*bMrLZIJehh?xyD&+%x=z)UE!-1e2qK#=FjlZ%TVV$5ukv%I8XnlXeWXZbO&ITISA=_q6k<{A94V&dV+&{rqJt*@NA9Di5*D{5Kb(c&QYchP&`e%mi|O1vtaw(Ycm-Buf+^D;PG2Y|RDl%J2KnY-Oj!m$!9{?WH+$t_|$yD`s=XSECc=1FplQcva8mS@|opkl6$GidFSDcTB%xp_QV$hk4u%$8Qd*SX{v(J~IaN>e0IhfKdla;Nb7-QD5lMiUNmrewh*TSUO_5PVo0aH<9>o7ZIY9io3Tc|-!f35mzwp@btt4vUajb0cmnP5cT9O;Hq2xt-#j*L5P*OWQC?bgCRyIA#969B><;p>>9tubXZfb)WuM_hdmQexo~ow2?6q8vmk&4|o~8{LJ&qQe0jaQEv6zF9w%HJKfXRZ-kL}kIGk4rotMHLo^wdoo8TNT}osR;%FX)A#R`ZW5BfKDQ1W!pa?fn*DxWW%>VqvjGz5E#P`c<9dv4o#4gmjL(;k#KjF3wJ*s?b28xqTJZmrxZVG==_O-C(}c|xEy0k6BLimVX$dg*z87M)LqV4?+Q`w>->*fUo|qfEQ`KgY*8+M7o41xtP@jU7xU|NToIcDFDe-_&C0Fcet-qDQ9Nyl30WhxN10fVO(Y>>#%}|Ub|Bc@3SgUN18$J1#NEN&6r5oLuSoCxHs-D_&lW;N^-kEVyBT3|bcF%Pi;eqa7kMLv8|D*Fe&qOEsL=-dP_LQ|(w;UaM%PgFgUa1=Uoq87*hgI3V|w;#xnWr+LDHmCEC-B%_L7qYwD!gc)gLrlgdS8BL`mRn{=KtwHpDlIdDzg#H3K_N8k?AIUTgLYD_<6J4S(gIJydav}e-3fFY1152EH7DpEoC@m@u~6#s{E&b(Uvrw$X-dpiTe{!^gOM^n+<2oIbtcZh$qmN_ZSl~2likZP?GlMo)boD!Wt9_>}KR>G`$a_>y@tl8|2^`Y)&h;k633r5ndyaQb%wsqOgDIk8bsnl$80vJv+KMgSkMCvszYe(}77BZS8K7@ttr)#L`WeG2&rIVAvN=Moa{>XG(jo88U?clpv?9sL(Lry+&Pa2iWGB)wtwPWhr-0Uyyq!pN&hgMb=*;8zW}9*L5mHm#I66|dqjb0VdlviviWlF)&ogGto@>X{)dq?-EtdW?oj$+XG{G5msuGsg~=NM8=PdN)qv0D&q9;>B%dp$D>FGoouYzf&v!+jT{iGVOk8;0(SCR}z(qMFh6Sy~g7)tR6sbzj!d8e^)czVzs#40Tlut+6A)xTTVT_==KMN{hHnE%PQ;`wB)TE&p;NoWn#%Op>5+ibG*xui8}N(zI^ZN&4V2`2VCW;W)O3R=?wd;`r++
#v8N?$IfP9o?KQ2&4aJ5U2Zeeh^IO11H>U075hy06XU49UxP-HTsnd&nXDNm`{K1h?CPNf3XajQ?Rc1(T@7>Qu9#)E^}#A$MJhZ74ECwR0TU?Se_9O{A;ZBD$a5gsGzD`)8@2bExE*o^fWZ;m2+ld-fW$0nX-L{4KkK0J9-)A)Mi;!{ab5OstzsP}nIl*G^5G6NdPw(be`@=xY3rw#L&cz#qec^kPaa9RQKCqui7(VY;RvYR5JOjhlw#W=of-T`3QwV6+m2riyU}lMJZqP7_SKiJDuP&rhRdy>9o>@Kp=v4cAnxsRZxh78moKE=<3{*VfRV+$`yZ@eiOXwdf69`gp+3UZgQl-n1VRVenID`{_W}K4LSe3XY!*y;6IolUn1t4bG?n(z|PmjT!3;W!VesH-Azkav>(Jmo@_QR3oT|UwyTdtq!x{)Zk%JATc`ILhV6g6K!4#!wqXQ^qqP5NM{B?uV{OJ(OH6v+99ycGE$$m8^2SxlS01@o70a6iI(*k#-POXXxWl?3kX+V}{dAjoOsKeJ%?vWU)tD6}`$7Xz6l&hjq^5gakU6&y(;02*&$!`w$B8`?$&msd_1`3X&kI@OHj^|K&-dj@zo)`nR;sW>jK*v#Yv9V;dZLl~^j`*5dFD|9t70_3CO*O?cY4Ygmy6cO}nr1n)^&r{6bkki&Vg_9Gq;((6YktLP7MxD#bmyLOSsSa4fDLeOfyUst`kCfF=JzE3JKc$S1u#XZvQ|zZ0t?4l?6`i%0>h4ziOP^Dz+Tw@yBzi{vLS&ke8cAtI5}G*d86l?v!J@We$N5}B&aW646z0_J@h6tQ*~*hl2`YCxpB!LV>WeGcd&I1OY-n@(vDn<>y@%rE_4HE8K*TW=XGvS_8bTDDPweTjHUY?BGLG9QwL&1dDCbSHo1#Nz({t&;&YiS0nl%A)@wmA-5^e#5TSrzK*J?|1XFzH*>pc1_Uhq_u8*pXIYwoAus4&<;J$K3@LfxJ^2Q<{Cf@unoWXgqL^hn%l1q;Wm}RcHq3@lWZP{rXe+`*ZVyz+>fsFEYDBTy9{_@M{p~oXE8rsy*awM$&0bC=PlqLu(OhI$Xq2#AadT4$X~MrPzlUa0jnyX6qv_kXe#;_qkVd73^$cNGvl0xe-B1{sKTuM%Ee2SK!ZeCFpN}nqB`~g*g~DSk7WIm+c|Eeld^t@HxqFlXFvhN~w@B&&LN~HbUDuCeygT#dnH_#&>w)oAPF4>!mx~1$-n2%qVLyXu>{T#>fC@x4t@i>#GES+b4PD;Gwm7FS^gGYrt`cbo*WzZ0WuT1v0J+TLs4nfeJU|jKs(^Si_nsZ6C~3gBvjNKD_$y;tN%TMMgFF){fyubVtLiagb))v|;}+DIA?Mn!4@(mO'),format=L.FORMAT_RAW,filters=[{"id":L.FILTER_LZMA2}])) diff --git a/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/train_seed314.log b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/train_seed314.log new file mode 100644 index 0000000000..b3a66b9552 --- /dev/null +++ b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/train_seed314.log @@ -0,0 +1,760 @@ +W0428 10:03:38.244000 572270 torch/distributed/run.py:803] +W0428 10:03:38.244000 572270 torch/distributed/run.py:803] ***************************************** +W0428 10:03:38.244000 572270 torch/distributed/run.py:803] Setting 
OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
+W0428 10:03:38.244000 572270 torch/distributed/run.py:803] *****************************************
+Hyperparameters:
+ adam_eps: 1e-08
+ adam_wd: 0.02
+ artifact_dir:
+ attn_clip_sigmas: 13.0
+ beta1: 0.9
+ beta2: 0.95
+ compressor: brotli
+ data_dir: ./data/
+ datasets_dir: ./data/datasets/fineweb10B_sp8192
+ distributed: True
+ ema_decay: 0.9965
+ embed_bits: 7
+ embed_clip_sigmas: 15.0
+ embed_lr: 0.6
+ embed_wd: 0.085
+ enable_looping_at: 0.35
+ eval_seq_len: 2048
+ eval_stride: 64
+ gate_attn_out: True
+ gate_attn_width: 24
+ global_ttt_batch_seqs: 32
+ global_ttt_chunk_tokens: 32768
+ global_ttt_epochs: 1
+ global_ttt_grad_clip: 1.0
+ global_ttt_lr: 0.001
+ global_ttt_momentum: 0.9
+ global_ttt_respect_doc_boundaries: True
+ global_ttt_warmup_chunks: 0
+ global_ttt_warmup_start_lr: 0.0
+ gptq_calibration_batches: 16
+ gptq_reserve_seconds: 4.0
+ grad_accum_steps: 1
+ grad_clip_norm: 0.3
+ is_main_process: True
+ iterations: 20000
+ ln_scale: True
+ local_rank: 0
+ logfile: logs/champion_3seed_314.txt
+ logit_softcap: 30.0
+ loop_end: 5
+ loop_start: 3
+ lqer_asym_enabled: True
+ lqer_asym_group: 64
+ lqer_enabled: True
+ lqer_factor_bits: 4
+ lqer_rank: 4
+ lqer_top_k: 3
+ matrix_bits: 6
+ matrix_clip_sigmas: 12.85
+ matrix_lr: 0.026
+ max_wallclock_seconds: 600.0
+ min_lr: 0.0
+ mlp_clip_sigmas: 12.0
+ mlp_mult: 4.0
+ model_dim: 512
+ model_path: final_model.pt
+ muon_backend_steps: 5
+ muon_momentum: 0.97
+ muon_momentum_warmup_start: 0.92
+ muon_momentum_warmup_steps: 1500
+ muon_row_normalize: True
+ muon_wd: 0.095
+ newton_muon_all_reduce_k: True
+ newton_muon_beta: 0.95
+ newton_muon_capture_every: 4
+ newton_muon_enabled: False
+ newton_muon_gamma: 0.2
+ newton_muon_k_refresh: 32
+ newton_muon_warmup: 100
+ num_heads: 8
+ num_kv_heads: 4
+ num_layers: 11
+ num_loops: 2
+ parallel_final_lane: mean
+ parallel_start_layer: 8
+ phased_ttt_enabled: True
+ phased_ttt_num_phases: 3
+ phased_ttt_prefix_docs: 2000
+ polar_express_ns: True
+ qk_gain_init: 5.25
+ quantized_model_path: final_model.int6.ptz
+ rank: 0
+ rope_base: 10000.0
+ rope_dims: 16
+ rope_train_seq_len: 2048
+ rope_yarn: False
+ run_id: champion_3seed_314
+ scalar_lr: 0.02
+ seed: 314
+ skip_gates_enabled: True
+ sliding_window_enabled: False
+ smear_gate_enabled: True
+ smear_gate_width: 12
+ tie_embeddings: True
+ tied_embed_init_std: 0.005
+ tied_embed_lr: 0.03
+ tokenizer_path: ./data/tokenizers/fineweb_8192_bpe.model
+ train_batch_tokens: 786432
+ train_files: ./data/datasets/fineweb10B_sp8192/fineweb_train_*.bin
+ train_log_every: 500
+ train_seq_len: 2048
+ ttt_batch_size: 64
+ ttt_beta1: 0.0
+ ttt_beta2: 0.999
+ ttt_chunk_size: 48
+ ttt_enabled: True
+ ttt_eval_batches:
+ ttt_eval_seq_len: 2048
+ ttt_grad_steps: 1
+ ttt_k_lora: True
+ ttt_lora_lr: 0.0001
+ ttt_lora_rank: 192
+ ttt_mlp_lora: True
+ ttt_o_lora: True
+ ttt_optimizer: adam
+ ttt_weight_decay: 1.0
+ val_batch_tokens: 524288
+ val_doc_fraction: 1.0
+ val_files: ./data/datasets/fineweb10B_sp8192/fineweb_val_*.bin
+ val_loss_every: 4000
+ vocab_size: 8192
+ warmdown_frac: 0.75
+ warmup_steps: 20
+ world_size: 8
+ xsa_last_n: 11
+train_shards: 80
+val_tokens: 40540160
+model_params:35946727
+gptq:reserving 4s, effective=596000ms
+warmup_cu_buckets:64,128,192,256 iters_each:3
+warmup_step: 1/20
+warmup_step: 2/20
+warmup_step: 3/20
+warmup_step: 4/20
+warmup_step: 5/20
+warmup_step: 6/20
+warmup_step: 10/20
+warmup_step: 20/20
+loop_warmup:enabled encoder:[0, 1, 2, 3, 4, 5, 3, 4] decoder:[5, 3, 4, 5, 6, 7, 8, 9, 10]
+loop_warmup_step: 1/20
+loop_warmup_step: 2/20
+loop_warmup_step: 3/20
+loop_warmup_step: 4/20
+loop_warmup_step: 5/20
+loop_warmup_step: 6/20
+loop_warmup_step: 10/20
+loop_warmup_step: 20/20
+0/20000 val_loss: 9.0071 val_bpb: 3.4868
+1/20000 train_loss: 9.0069 train_time: 0.0m tok/s: 12258398
+2/20000 train_loss: 12.1748 train_time: 0.0m tok/s: 11455388
+3/20000 train_loss: 11.2501 train_time: 0.0m tok/s: 10257369
+4/20000 train_loss: 9.6402 train_time: 0.0m tok/s: 9641325
+5/20000 train_loss: 8.1913 train_time: 0.0m tok/s: 9351822
+500/20000 train_loss: 3.2524 train_time: 0.8m tok/s: 8092439
+1000/20000 train_loss: 3.0116 train_time: 1.6m tok/s: 8045044
+1500/20000 train_loss: 3.0224 train_time: 2.4m tok/s: 8036101
+2000/20000 train_loss: 2.9731 train_time: 3.3m tok/s: 8036881
+layer_loop:enabled step:2132 frac:0.350 encoder:[0, 1, 2, 3, 4, 5, 3, 4] decoder:[5, 3, 4, 5, 6, 7, 8, 9, 10]
+2500/20000 train_loss: 3.0614 train_time: 4.4m tok/s: 7492688
+3000/20000 train_loss: 2.9019 train_time: 5.6m tok/s: 7045827
+3500/20000 train_loss: 2.9652 train_time: 6.8m tok/s: 6760279
+4000/20000 train_loss: 2.8987 train_time: 8.0m tok/s: 6577144
+4000/20000 val_loss: 2.8729 val_bpb: 1.1121
+4500/20000 train_loss: 2.8459 train_time: 9.2m tok/s: 6441850
+4827/20000 val_loss: 2.7693 val_bpb: 1.0720
+stopping_early: wallclock_cap train_time: 596063ms step: 4827/20000
+peak memory allocated: 40141 MiB reserved: 44206 MiB
+ema:applying EMA weights
+diagnostic pre-quantization post-ema val_loss:2.76820294 val_bpb:1.07162309 eval_time:6808ms
+Serialized model: 135422397 bytes
+Code size (uncompressed): 134706 bytes
+Code size (compressed): 33710 bytes
+GPTQ:collecting Hessians from calibration data...
+GPTQ:collected 67 Hessians in 3.5s
+Quantized weights:
+ gptq (int6): blocks.attn.c_k.weight, blocks.attn.c_q.weight, blocks.attn.c_v.weight, blocks.attn.proj.weight, blocks.mlp.fc.weight, blocks.mlp.proj.weight
+ gptq (int6)+lqer_asym: blocks.mlp.fc.weight
+ gptq (int7)+lqer_asym: tok_emb.weight
+ passthrough (float16): blocks.attn.attn_gate_proj.weight, blocks.attn.q_gain, blocks.attn_scale, blocks.mlp_scale, blocks.resid_mix, parallel_post_lambdas, parallel_resid_lambdas, skip_gates, skip_weights, smear_gate.weight, smear_lambda
+Serialized model quantized+brotli: 15921214 bytes
+Total submission size quantized+brotli: 15954924 bytes
+diagnostic quantized val_loss:2.79339539 val_bpb:1.08137555 eval_time:10591ms
+ttt_lora:warming up compile (random tokens, no val data)
+ttt_lora:compile warmup done (103.7s)
+
+beginning TTT eval timer
+ttt_phased: total_docs:50000 prefix_docs:2000 suffix_docs:48000 num_phases:3 boundaries:[666, 1333, 2000]
+ttp: b781/782 bl:2.5595 bb:1.0566 rl:2.5595 rb:1.0566 dl:14510-25988 gd:0
+ttpp: phase:1/3 pd:1104 gd:666 t:201.3s
+tttg: c1/95 lr:0.001000 t:0.3s
+tttg: c2/95 lr:0.001000 t:0.4s
+tttg: c3/95 lr:0.000999 t:0.5s
+tttg: c4/95 lr:0.000997 t:0.6s
+tttg: c5/95 lr:0.000996 t:0.6s
+tttg: c6/95 lr:0.000993 t:0.7s
+tttg: c7/95 lr:0.000990 t:0.8s
+tttg: c8/95 lr:0.000986 t:2.4s
+tttg: c9/95 lr:0.000982 t:2.5s
+tttg: c10/95 lr:0.000978 t:2.6s
+tttg: c11/95 lr:0.000972 t:2.7s
+tttg: c12/95 lr:0.000967 t:2.7s
+tttg: c13/95 lr:0.000960 t:2.8s
+tttg: c14/95 lr:0.000954 t:2.9s
+tttg: c15/95 lr:0.000946 t:3.0s
+tttg: c16/95 lr:0.000938 t:3.0s
+tttg: c17/95 lr:0.000930 t:3.1s
+tttg: c18/95 lr:0.000921 t:3.2s
+tttg: c19/95 lr:0.000912 t:3.3s
+tttg: c20/95 lr:0.000903 t:3.3s
+tttg: c21/95 lr:0.000892 t:3.4s
+tttg: c22/95 lr:0.000882 t:3.5s
+tttg: c23/95 lr:0.000871 t:3.6s
+tttg: c24/95 lr:0.000859 t:3.7s
+tttg: c25/95 lr:0.000848 t:3.7s
+tttg: c26/95 lr:0.000835 t:3.8s
+tttg: c27/95 lr:0.000823 t:3.9s
+tttg: c28/95 lr:0.000810 t:3.9s
+tttg: c29/95 lr:0.000797 t:4.0s
+tttg: c30/95 lr:0.000783 t:4.1s
+tttg: c31/95 lr:0.000769 t:4.2s
+tttg: c32/95 lr:0.000755 t:4.2s
+tttg: c33/95 lr:0.000740 t:4.3s
+tttg: c34/95 lr:0.000726 t:4.4s
+tttg: c35/95 lr:0.000710 t:4.5s
+tttg: c36/95 lr:0.000695 t:4.6s
+tttg: c37/95 lr:0.000680 t:4.7s
+tttg: c38/95 lr:0.000664 t:4.7s
+tttg: c39/95 lr:0.000648 t:4.8s
+tttg: c40/95 lr:0.000632 t:4.9s
+tttg: c41/95 lr:0.000616 t:5.0s
+tttg: c42/95 lr:0.000600 t:5.0s
+tttg: c43/95 lr:0.000583 t:5.1s
+tttg: c44/95 lr:0.000567 t:5.2s
+tttg: c45/95 lr:0.000550 t:5.3s
+tttg: c46/95 lr:0.000533 t:5.4s
+tttg: c47/95 lr:0.000517 t:5.4s
+tttg: c48/95 lr:0.000500 t:5.5s
+tttg: c49/95 lr:0.000483 t:5.6s
+tttg: c50/95 lr:0.000467 t:5.7s
+tttg: c51/95 lr:0.000450 t:5.7s
+tttg: c52/95 lr:0.000433 t:5.8s
+tttg: c53/95 lr:0.000417 t:5.9s
+tttg: c54/95 lr:0.000400 t:6.0s
+tttg: c55/95 lr:0.000384 t:6.0s
+tttg: c56/95 lr:0.000368 t:6.1s
+tttg: c57/95 lr:0.000352 t:6.2s
+tttg: c58/95 lr:0.000336 t:6.3s
+tttg: c59/95 lr:0.000320 t:6.4s
+tttg: c60/95 lr:0.000305 t:6.4s
+tttg: c61/95 lr:0.000290 t:6.5s
+tttg: c62/95 lr:0.000274 t:6.6s
+tttg: c63/95 lr:0.000260 t:6.6s
+tttg: c64/95 lr:0.000245 t:6.7s
+tttg: c65/95 lr:0.000231 t:6.8s
+tttg: c66/95 lr:0.000217 t:6.9s
+tttg: c67/95 lr:0.000203 t:6.9s
+tttg: c68/95 lr:0.000190 t:7.0s
+tttg: c69/95 lr:0.000177 t:7.1s
+tttg: c70/95 lr:0.000165 t:7.2s
+tttg: c71/95 lr:0.000152 t:7.3s
+tttg: c72/95 lr:0.000141 t:7.3s
+tttg: c73/95 lr:0.000129 t:7.4s
+tttg: c74/95 lr:0.000118 t:7.5s
+tttg: c75/95 lr:0.000108 t:7.6s
+tttg: c76/95 lr:0.000097 t:7.7s
+tttg: c77/95 lr:0.000088 t:7.7s
+tttg: c78/95 lr:0.000079 t:7.8s
+tttg: c79/95 lr:0.000070 t:7.9s
+tttg: c80/95 lr:0.000062 t:8.0s
+tttg: c81/95 lr:0.000054 t:8.0s
+tttg: c82/95 lr:0.000046 t:8.1s
+tttg: c83/95 lr:0.000040 t:8.2s
+tttg: c84/95 lr:0.000033 t:8.3s
+tttg: c85/95 lr:0.000028 t:8.4s
+tttg: c86/95 lr:0.000022 t:8.4s
+tttg: c87/95 lr:0.000018 t:8.5s
+tttg: c88/95 lr:0.000014 t:8.6s
+tttg: c89/95 lr:0.000010 t:8.7s
+tttg: c90/95 lr:0.000007 t:8.7s
+tttg: c91/95 lr:0.000004 t:8.8s
+tttg: c92/95 lr:0.000003 t:8.9s
+tttg: c93/95 lr:0.000001 t:9.0s
+tttg: c94/95 lr:0.000000 t:9.1s
+ttpr: phase:1/3 t:214.5s
+ttp: b762/782 bl:2.8210 bb:1.0737 rl:2.6004 rb:1.0595 dl:3431-3533 gd:0
+ttpp: phase:2/3 pd:1808 gd:1333 t:284.0s
+tttg: c1/158 lr:0.001000 t:0.1s
+tttg: c2/158 lr:0.001000 t:0.2s
+tttg: c3/158 lr:0.001000 t:0.2s
+tttg: c4/158 lr:0.000999 t:0.3s
+tttg: c5/158 lr:0.000998 t:0.4s
+tttg: c6/158 lr:0.000997 t:0.5s
+tttg: c7/158 lr:0.000996 t:0.6s
+tttg: c8/158 lr:0.000995 t:0.6s
+tttg: c9/158 lr:0.000994 t:0.7s
+tttg: c10/158 lr:0.000992 t:0.8s
+tttg: c11/158 lr:0.000990 t:0.9s
+tttg: c12/158 lr:0.000988 t:0.9s
+tttg: c13/158 lr:0.000986 t:1.0s
+tttg: c14/158 lr:0.000983 t:1.1s
+tttg: c15/158 lr:0.000981 t:1.2s
+tttg: c16/158 lr:0.000978 t:1.2s
+tttg: c17/158 lr:0.000975 t:1.3s
+tttg: c18/158 lr:0.000971 t:1.4s
+tttg: c19/158 lr:0.000968 t:1.5s
+tttg: c20/158 lr:0.000964 t:1.6s
+tttg: c21/158 lr:0.000960 t:1.6s
+tttg: c22/158 lr:0.000957 t:1.7s
+tttg: c23/158 lr:0.000952 t:1.8s
+tttg: c24/158 lr:0.000948 t:1.9s
+tttg: c25/158 lr:0.000943 t:1.9s
+tttg: c26/158 lr:0.000939 t:2.0s
+tttg: c27/158 lr:0.000934 t:2.1s
+tttg: c28/158 lr:0.000929 t:2.2s
+tttg: c29/158 lr:0.000924 t:2.3s
+tttg: c30/158 lr:0.000918 t:2.3s
+tttg: c31/158 lr:0.000913 t:2.4s
+tttg: c32/158 lr:0.000907 t:2.5s
+tttg: c33/158 lr:0.000901 t:2.6s
+tttg: c34/158 lr:0.000895 t:2.6s
+tttg: c35/158 lr:0.000889 t:2.7s
+tttg: c36/158 lr:0.000882 t:2.8s
+tttg: c37/158 lr:0.000876 t:2.9s
+tttg: c38/158 lr:0.000869 t:3.0s
+tttg: c39/158 lr:0.000862 t:3.0s
+tttg: c40/158 lr:0.000855 t:3.1s
+tttg: c41/158 lr:0.000848 t:3.2s
+tttg: c42/158 lr:0.000841 t:3.3s
+tttg: c43/158 lr:0.000834 t:3.3s
+tttg: c44/158 lr:0.000826 t:3.4s
+tttg: c45/158 lr:0.000818 t:3.5s
+tttg: c46/158 lr:0.000811 t:3.6s
+tttg: c47/158 lr:0.000803 t:3.7s
+tttg: c48/158 lr:0.000795 t:3.7s
+tttg: c49/158 lr:0.000787 t:3.8s
+tttg: c50/158 lr:0.000778 t:3.9s
+tttg: c51/158 lr:0.000770 t:4.0s
+tttg: c52/158 lr:0.000761 t:4.0s
+tttg: c53/158 lr:0.000753 t:4.1s
+tttg: c54/158 lr:0.000744 t:4.2s
+tttg: c55/158 lr:0.000735 t:4.3s
+tttg: c56/158 lr:0.000727 t:4.4s
+tttg: c57/158 lr:0.000718 t:4.4s
+tttg: c58/158 lr:0.000709 t:4.5s
+tttg: c59/158 lr:0.000699 t:4.6s
+tttg: c60/158 lr:0.000690 t:4.7s
+tttg: c61/158 lr:0.000681 t:4.7s
+tttg: c62/158 lr:0.000672 t:4.8s
+tttg: c63/158 lr:0.000662 t:4.9s
+tttg: c64/158 lr:0.000653 t:5.0s
+tttg: c65/158 lr:0.000643 t:5.1s
+tttg: c66/158 lr:0.000633 t:5.1s
+tttg: c67/158 lr:0.000624 t:5.2s
+tttg: c68/158 lr:0.000614 t:5.3s
+tttg: c69/158 lr:0.000604 t:5.4s
+tttg: c70/158 lr:0.000594 t:5.4s
+tttg: c71/158 lr:0.000585 t:5.5s
+tttg: c72/158 lr:0.000575 t:5.6s
+tttg: c73/158 lr:0.000565 t:5.7s
+tttg: c74/158 lr:0.000555 t:5.7s
+tttg: c75/158 lr:0.000545 t:5.8s
+tttg: c76/158 lr:0.000535 t:5.9s
+tttg: c77/158 lr:0.000525 t:6.0s
+tttg: c78/158 lr:0.000515 t:6.0s
+tttg: c79/158 lr:0.000505 t:6.1s
+tttg: c80/158 lr:0.000495 t:6.2s
+tttg: c81/158 lr:0.000485 t:6.3s
+tttg: c82/158 lr:0.000475 t:6.4s
+tttg: c83/158 lr:0.000465 t:6.5s
+tttg: c84/158 lr:0.000455 t:6.5s
+tttg: c85/158 lr:0.000445 t:6.6s
+tttg: c86/158 lr:0.000435 t:6.7s
+tttg: c87/158 lr:0.000425 t:6.8s
+tttg: c88/158 lr:0.000415 t:6.8s
+tttg: c89/158 lr:0.000406 t:6.9s
+tttg: c90/158 lr:0.000396 t:7.0s
+tttg: c91/158 lr:0.000386 t:7.1s
+tttg: c92/158 lr:0.000376 t:7.1s
+tttg: c93/158 lr:0.000367 t:7.2s
+tttg: c94/158 lr:0.000357 t:7.3s
+tttg: c95/158 lr:0.000347 t:7.4s
+tttg: c96/158 lr:0.000338 t:7.5s
+tttg: c97/158 lr:0.000328 t:7.5s
+tttg: c98/158 lr:0.000319 t:7.6s
+tttg: c99/158 lr:0.000310 t:7.7s
+tttg: c100/158 lr:0.000301 t:7.8s
+tttg: c101/158 lr:0.000291 t:7.8s
+tttg: c102/158 lr:0.000282 t:7.9s
+tttg: c103/158 lr:0.000273 t:8.0s
+tttg: c104/158 lr:0.000265 t:8.1s
+tttg: c105/158 lr:0.000256 t:8.1s
+tttg: c106/158 lr:0.000247 t:8.2s
+tttg: c107/158 lr:0.000239 t:8.3s
+tttg: c108/158 lr:0.000230 t:8.4s
+tttg: c109/158 lr:0.000222 t:8.5s
+tttg: c110/158 lr:0.000213 t:8.5s
+tttg: c111/158 lr:0.000205 t:8.6s
+tttg: c112/158 lr:0.000197 t:8.7s
+tttg: c113/158 lr:0.000189 t:8.8s
+tttg: c114/158 lr:0.000182 t:8.8s
+tttg: c115/158 lr:0.000174 t:8.9s
+tttg: c116/158 lr:0.000166 t:9.0s
+tttg: c117/158 lr:0.000159 t:9.1s
+tttg: c118/158 lr:0.000152 t:9.1s
+tttg: c119/158 lr:0.000145 t:9.2s
+tttg: c120/158 lr:0.000138 t:9.3s
+tttg: c121/158 lr:0.000131 t:9.4s
+tttg: c122/158 lr:0.000124 t:9.5s
+tttg: c123/158 lr:0.000118 t:9.5s
+tttg: c124/158 lr:0.000111 t:9.6s
+tttg: c125/158 lr:0.000105 t:9.7s
+tttg: c126/158 lr:0.000099 t:9.8s
+tttg: c127/158 lr:0.000093 t:9.9s
+tttg: c128/158 lr:0.000087 t:9.9s
+tttg: c129/158 lr:0.000082 t:10.0s
+tttg: c130/158 lr:0.000076 t:10.1s
+tttg: c131/158 lr:0.000071 t:10.2s
+tttg: c132/158 lr:0.000066 t:10.2s
+tttg: c133/158 lr:0.000061 t:10.3s
+tttg: c134/158 lr:0.000057 t:10.4s
+tttg: c135/158 lr:0.000052 t:10.5s
+tttg: c136/158 lr:0.000048 t:10.5s
+tttg: c137/158 lr:0.000043 t:10.6s
+tttg: c138/158 lr:0.000040 t:10.7s
+tttg: c139/158 lr:0.000036 t:10.8s
+tttg: c140/158 lr:0.000032 t:10.8s
+tttg: c141/158 lr:0.000029 t:10.9s
+tttg: c142/158 lr:0.000025 t:11.0s
+tttg: c143/158 lr:0.000022 t:11.1s
+tttg: c144/158 lr:0.000019 t:11.1s
+tttg: c145/158 lr:0.000017 t:11.2s
+tttg: c146/158 lr:0.000014 t:11.3s
+tttg: c147/158 lr:0.000012 t:11.4s
+tttg: c148/158 lr:0.000010 t:11.5s
+tttg: c149/158 lr:0.000008 t:11.6s
+tttg: c150/158 lr:0.000006 t:11.6s
+tttg: c151/158 lr:0.000005 t:11.7s
+tttg: c152/158 lr:0.000004 t:11.8s
+tttg: c153/158 lr:0.000003 t:11.9s
+tttg: c154/158 lr:0.000002 t:11.9s
+tttg: c155/158 lr:0.000001 t:12.0s
+tttg: c156/158 lr:0.000000 t:12.1s
+tttg: c157/158 lr:0.000000 t:12.2s
+ttpr: phase:2/3 t:300.3s
+ttp: b746/782 bl:2.6717 bb:1.0520 rl:2.6075 rb:1.0587 dl:2459-2501 gd:0
+ttp: b744/782 bl:2.6491 bb:1.0554 rl:2.6112 rb:1.0584 dl:2388-2419 gd:0
+ttpp: phase:3/3 pd:2448 gd:2000 t:317.4s
+tttg: c1/213 lr:0.001000 t:0.1s
+tttg: c2/213 lr:0.001000 t:0.2s
+tttg: c3/213 lr:0.001000 t:0.2s
+tttg: c4/213 lr:0.001000 t:0.3s
+tttg: c5/213 lr:0.000999 t:0.4s
+tttg: c6/213 lr:0.000999 t:0.5s
+tttg: c7/213 lr:0.000998 t:0.5s
+tttg: c8/213 lr:0.000997 t:0.6s
+tttg: c9/213 lr:0.000996 t:0.7s
+tttg: c10/213 lr:0.000996 t:0.8s
+tttg: c11/213 lr:0.000995 t:0.8s
+tttg: c12/213 lr:0.000993 t:0.9s
+tttg: c13/213 lr:0.000992 t:1.0s
+tttg: c14/213 lr:0.000991 t:1.1s
+tttg: c15/213 lr:0.000989 t:1.2s
+tttg: c16/213 lr:0.000988 t:1.2s
+tttg: c17/213 lr:0.000986 t:1.3s
+tttg: c18/213 lr:0.000984 t:1.4s
+tttg: c19/213 lr:0.000982 t:1.5s
+tttg: c20/213 lr:0.000980 t:1.5s
+tttg: c21/213 lr:0.000978 t:1.6s
+tttg: c22/213 lr:0.000976 t:1.7s
+tttg: c23/213 lr:0.000974 t:1.8s
+tttg: c24/213 lr:0.000971 t:1.8s
+tttg: c25/213 lr:0.000969 t:1.9s
+tttg: c26/213 lr:0.000966 t:2.0s
+tttg: c27/213 lr:0.000963 t:2.1s
+tttg: c28/213 lr:0.000961 t:2.2s
+tttg: c29/213 lr:0.000958 t:2.2s
+tttg: c30/213 lr:0.000955 t:2.3s
+tttg: c31/213 lr:0.000951 t:2.4s
+tttg: c32/213 lr:0.000948 t:2.5s
+tttg: c33/213 lr:0.000945 t:2.5s
+tttg: c34/213 lr:0.000941 t:2.6s
+tttg: c35/213 lr:0.000938 t:2.7s
+tttg: c36/213 lr:0.000934 t:2.8s
+tttg: c37/213 lr:0.000931 t:2.9s
+tttg: c38/213 lr:0.000927 t:2.9s
+tttg: c39/213 lr:0.000923 t:3.0s
+tttg: c40/213 lr:0.000919 t:3.1s
+tttg: c41/213 lr:0.000915 t:3.2s
+tttg: c42/213 lr:0.000911 t:3.3s
+tttg: c43/213 lr:0.000906 t:3.3s
+tttg: c44/213 lr:0.000902 t:3.4s
+tttg: c45/213 lr:0.000897 t:3.5s
+tttg: c46/213 lr:0.000893 t:3.6s
+tttg: c47/213 lr:0.000888 t:3.6s
+tttg: c48/213 lr:0.000884 t:3.7s
+tttg: c49/213 lr:0.000879 t:3.8s
+tttg: c50/213 lr:0.000874 t:3.9s
+tttg: c51/213 lr:0.000869 t:3.9s
+tttg: c52/213 lr:0.000864 t:4.0s
+tttg: c53/213 lr:0.000859 t:4.1s
+tttg: c54/213 lr:0.000854 t:4.2s
+tttg: c55/213 lr:0.000848 t:4.2s
+tttg: c56/213 lr:0.000843 t:4.3s
+tttg: c57/213 lr:0.000837 t:4.4s
+tttg: c58/213 lr:0.000832 t:4.5s
+tttg: c59/213 lr:0.000826 t:4.5s
+tttg: c60/213 lr:0.000821 t:4.6s
+tttg: c61/213
lr:0.000815 t:4.7s +tttg: c62/213 lr:0.000809 t:4.8s +tttg: c63/213 lr:0.000803 t:4.9s +tttg: c64/213 lr:0.000797 t:4.9s +tttg: c65/213 lr:0.000791 t:5.0s +tttg: c66/213 lr:0.000785 t:5.1s +tttg: c67/213 lr:0.000779 t:5.2s +tttg: c68/213 lr:0.000773 t:5.2s +tttg: c69/213 lr:0.000767 t:5.3s +tttg: c70/213 lr:0.000761 t:5.4s +tttg: c71/213 lr:0.000754 t:5.5s +tttg: c72/213 lr:0.000748 t:5.6s +tttg: c73/213 lr:0.000741 t:5.6s +tttg: c74/213 lr:0.000735 t:5.7s +tttg: c75/213 lr:0.000728 t:5.8s +tttg: c76/213 lr:0.000722 t:5.9s +tttg: c77/213 lr:0.000715 t:5.9s +tttg: c78/213 lr:0.000708 t:6.0s +tttg: c79/213 lr:0.000702 t:6.1s +tttg: c80/213 lr:0.000695 t:6.2s +tttg: c81/213 lr:0.000688 t:6.3s +tttg: c82/213 lr:0.000681 t:6.3s +tttg: c83/213 lr:0.000674 t:6.4s +tttg: c84/213 lr:0.000667 t:6.5s +tttg: c85/213 lr:0.000660 t:6.6s +tttg: c86/213 lr:0.000653 t:6.6s +tttg: c87/213 lr:0.000646 t:6.7s +tttg: c88/213 lr:0.000639 t:6.8s +tttg: c89/213 lr:0.000632 t:6.9s +tttg: c90/213 lr:0.000625 t:6.9s +tttg: c91/213 lr:0.000617 t:7.0s +tttg: c92/213 lr:0.000610 t:7.1s +tttg: c93/213 lr:0.000603 t:7.2s +tttg: c94/213 lr:0.000596 t:7.3s +tttg: c95/213 lr:0.000588 t:7.3s +tttg: c96/213 lr:0.000581 t:7.4s +tttg: c97/213 lr:0.000574 t:7.5s +tttg: c98/213 lr:0.000566 t:7.6s +tttg: c99/213 lr:0.000559 t:7.6s +tttg: c100/213 lr:0.000552 t:7.7s +tttg: c101/213 lr:0.000544 t:7.8s +tttg: c102/213 lr:0.000537 t:7.9s +tttg: c103/213 lr:0.000530 t:7.9s +tttg: c104/213 lr:0.000522 t:8.0s +tttg: c105/213 lr:0.000515 t:8.1s +tttg: c106/213 lr:0.000507 t:8.2s +tttg: c107/213 lr:0.000500 t:8.3s +tttg: c108/213 lr:0.000493 t:8.3s +tttg: c109/213 lr:0.000485 t:8.4s +tttg: c110/213 lr:0.000478 t:8.5s +tttg: c111/213 lr:0.000470 t:8.6s +tttg: c112/213 lr:0.000463 t:8.6s +tttg: c113/213 lr:0.000456 t:8.7s +tttg: c114/213 lr:0.000448 t:8.8s +tttg: c115/213 lr:0.000441 t:8.9s +tttg: c116/213 lr:0.000434 t:8.9s +tttg: c117/213 lr:0.000426 t:9.0s +tttg: c118/213 lr:0.000419 t:9.1s +tttg: c119/213 
lr:0.000412 t:9.2s +tttg: c120/213 lr:0.000404 t:9.3s +tttg: c121/213 lr:0.000397 t:9.3s +tttg: c122/213 lr:0.000390 t:9.4s +tttg: c123/213 lr:0.000383 t:9.5s +tttg: c124/213 lr:0.000375 t:9.6s +tttg: c125/213 lr:0.000368 t:9.7s +tttg: c126/213 lr:0.000361 t:9.7s +tttg: c127/213 lr:0.000354 t:9.8s +tttg: c128/213 lr:0.000347 t:9.9s +tttg: c129/213 lr:0.000340 t:10.0s +tttg: c130/213 lr:0.000333 t:10.0s +tttg: c131/213 lr:0.000326 t:10.1s +tttg: c132/213 lr:0.000319 t:10.2s +tttg: c133/213 lr:0.000312 t:10.3s +tttg: c134/213 lr:0.000305 t:10.4s +tttg: c135/213 lr:0.000298 t:10.4s +tttg: c136/213 lr:0.000292 t:10.5s +tttg: c137/213 lr:0.000285 t:10.6s +tttg: c138/213 lr:0.000278 t:10.7s +tttg: c139/213 lr:0.000272 t:10.7s +tttg: c140/213 lr:0.000265 t:10.8s +tttg: c141/213 lr:0.000259 t:10.9s +tttg: c142/213 lr:0.000252 t:11.0s +tttg: c143/213 lr:0.000246 t:11.1s +tttg: c144/213 lr:0.000239 t:11.1s +tttg: c145/213 lr:0.000233 t:11.2s +tttg: c146/213 lr:0.000227 t:11.3s +tttg: c147/213 lr:0.000221 t:11.4s +tttg: c148/213 lr:0.000215 t:11.4s +tttg: c149/213 lr:0.000209 t:11.5s +tttg: c150/213 lr:0.000203 t:11.6s +tttg: c151/213 lr:0.000197 t:11.7s +tttg: c152/213 lr:0.000191 t:11.7s +tttg: c153/213 lr:0.000185 t:11.8s +tttg: c154/213 lr:0.000179 t:11.9s +tttg: c155/213 lr:0.000174 t:12.0s +tttg: c156/213 lr:0.000168 t:12.0s +tttg: c157/213 lr:0.000163 t:12.1s +tttg: c158/213 lr:0.000157 t:12.2s +tttg: c159/213 lr:0.000152 t:12.3s +tttg: c160/213 lr:0.000146 t:12.3s +tttg: c161/213 lr:0.000141 t:12.4s +tttg: c162/213 lr:0.000136 t:12.5s +tttg: c163/213 lr:0.000131 t:12.6s +tttg: c164/213 lr:0.000126 t:12.6s +tttg: c165/213 lr:0.000121 t:12.7s +tttg: c166/213 lr:0.000116 t:12.8s +tttg: c167/213 lr:0.000112 t:12.9s +tttg: c168/213 lr:0.000107 t:13.0s +tttg: c169/213 lr:0.000103 t:13.0s +tttg: c170/213 lr:0.000098 t:13.1s +tttg: c171/213 lr:0.000094 t:13.2s +tttg: c172/213 lr:0.000089 t:13.3s +tttg: c173/213 lr:0.000085 t:13.4s +tttg: c174/213 lr:0.000081 t:13.4s +tttg: 
c175/213 lr:0.000077 t:13.5s +tttg: c176/213 lr:0.000073 t:13.6s +tttg: c177/213 lr:0.000069 t:13.7s +tttg: c178/213 lr:0.000066 t:13.7s +tttg: c179/213 lr:0.000062 t:13.8s +tttg: c180/213 lr:0.000059 t:13.9s +tttg: c181/213 lr:0.000055 t:14.0s +tttg: c182/213 lr:0.000052 t:14.0s +tttg: c183/213 lr:0.000049 t:14.1s +tttg: c184/213 lr:0.000045 t:14.2s +tttg: c185/213 lr:0.000042 t:14.3s +tttg: c186/213 lr:0.000039 t:14.3s +tttg: c187/213 lr:0.000037 t:14.4s +tttg: c188/213 lr:0.000034 t:14.5s +tttg: c189/213 lr:0.000031 t:14.6s +tttg: c190/213 lr:0.000029 t:14.7s +tttg: c191/213 lr:0.000026 t:14.7s +tttg: c192/213 lr:0.000024 t:14.8s +tttg: c193/213 lr:0.000022 t:14.9s +tttg: c194/213 lr:0.000020 t:15.0s +tttg: c195/213 lr:0.000018 t:15.0s +tttg: c196/213 lr:0.000016 t:15.1s +tttg: c197/213 lr:0.000014 t:15.2s +tttg: c198/213 lr:0.000012 t:15.3s +tttg: c199/213 lr:0.000011 t:15.4s +tttg: c200/213 lr:0.000009 t:15.4s +tttg: c201/213 lr:0.000008 t:15.5s +tttg: c202/213 lr:0.000007 t:15.6s +tttg: c203/213 lr:0.000005 t:15.7s +tttg: c204/213 lr:0.000004 t:15.7s +tttg: c205/213 lr:0.000004 t:15.8s +tttg: c206/213 lr:0.000003 t:15.9s +tttg: c207/213 lr:0.000002 t:16.0s +tttg: c208/213 lr:0.000001 t:16.0s +tttg: c209/213 lr:0.000001 t:16.1s +tttg: c210/213 lr:0.000000 t:16.2s +tttg: c211/213 lr:0.000000 t:16.3s +tttg: c212/213 lr:0.000000 t:16.3s +ttpr: phase:3/3 t:337.9s +ttp: b739/782 bl:2.8205 bb:1.0710 rl:2.6272 rb:1.0594 dl:2227-2253 gd:1 +ttp: b732/782 bl:2.8143 bb:1.0953 rl:2.6394 rb:1.0618 dl:2041-2062 gd:1 +ttp: b727/782 bl:2.7625 bb:1.0516 rl:2.6466 rb:1.0612 dl:1936-1960 gd:1 +ttp: b714/782 bl:2.7982 bb:1.0650 rl:2.6540 rb:1.0614 dl:1711-1725 gd:1 +ttp: b711/782 bl:2.7648 bb:1.0411 rl:2.6590 rb:1.0604 dl:1673-1683 gd:1 +ttp: b697/782 bl:2.7547 bb:1.0378 rl:2.6628 rb:1.0595 dl:1522-1534 gd:1 +ttp: b693/782 bl:2.8063 bb:1.1009 rl:2.6682 rb:1.0610 dl:1485-1494 gd:1 +ttp: b681/782 bl:2.8080 bb:1.0661 rl:2.6729 rb:1.0612 dl:1383-1393 gd:1 +ttp: b677/782 bl:2.8567 
bb:1.1074 rl:2.6788 rb:1.0627 dl:1353-1360 gd:1 +ttp: b667/782 bl:2.8120 bb:1.1016 rl:2.6827 rb:1.0639 dl:1288-1295 gd:1 +ttp: b663/782 bl:2.7860 bb:1.0574 rl:2.6856 rb:1.0637 dl:1264-1269 gd:1 +ttp: b653/782 bl:2.7494 bb:1.0314 rl:2.6873 rb:1.0628 dl:1203-1209 gd:1 +ttp: b643/782 bl:2.7872 bb:1.0626 rl:2.6897 rb:1.0628 dl:1150-1155 gd:1 +ttp: b634/782 bl:2.6926 bb:1.0395 rl:2.6898 rb:1.0623 dl:1105-1111 gd:1 +ttp: b625/782 bl:2.6576 bb:0.9984 rl:2.6891 rb:1.0608 dl:1064-1068 gd:1 +ttp: b617/782 bl:2.7368 bb:1.0358 rl:2.6900 rb:1.0603 dl:1027-1031 gd:1 +ttp: b610/782 bl:2.8218 bb:1.0594 rl:2.6926 rb:1.0603 dl:999-1004 gd:1 +ttp: b605/782 bl:2.7343 bb:1.0547 rl:2.6934 rb:1.0602 dl:978-982 gd:1 +ttp: b597/782 bl:2.7644 bb:1.0379 rl:2.6946 rb:1.0598 dl:947-950 gd:1 +ttp: b588/782 bl:2.7338 bb:1.0430 rl:2.6953 rb:1.0595 dl:917-921 gd:1 +ttp: b580/782 bl:2.7262 bb:1.0358 rl:2.6958 rb:1.0591 dl:891-894 gd:1 +ttp: b572/782 bl:2.9324 bb:1.1160 rl:2.6994 rb:1.0600 dl:865-868 gd:1 +ttp: b562/782 bl:2.6973 bb:1.0195 rl:2.6994 rb:1.0594 dl:834-837 gd:1 +ttp: b554/782 bl:2.7277 bb:1.0267 rl:2.6998 rb:1.0589 dl:809-812 gd:1 +ttp: b549/782 bl:2.7498 bb:1.0580 rl:2.7005 rb:1.0589 dl:795-798 gd:1 +ttp: b540/782 bl:2.6864 bb:1.0135 rl:2.7003 rb:1.0583 dl:771-774 gd:1 +ttp: b531/782 bl:2.7656 bb:1.0489 rl:2.7011 rb:1.0581 dl:750-752 gd:1 +ttp: b523/782 bl:2.8011 bb:1.0519 rl:2.7023 rb:1.0581 dl:730-732 gd:1 +ttp: b515/782 bl:2.7730 bb:1.0688 rl:2.7031 rb:1.0582 dl:710-713 gd:1 +ttp: b509/782 bl:2.7359 bb:1.0647 rl:2.7035 rb:1.0583 dl:695-698 gd:1 +ttp: b501/782 bl:2.7831 bb:1.0367 rl:2.7043 rb:1.0580 dl:677-680 gd:1 +ttp: b483/782 bl:2.7377 bb:1.0469 rl:2.7047 rb:1.0579 dl:639-641 gd:1 +ttp: b475/782 bl:2.7178 bb:1.0189 rl:2.7048 rb:1.0575 dl:622-623 gd:1 +ttp: b467/782 bl:2.7850 bb:1.0521 rl:2.7055 rb:1.0575 dl:606-608 gd:1 +ttp: b459/782 bl:2.7281 bb:1.0353 rl:2.7057 rb:1.0573 dl:591-593 gd:1 +ttp: b451/782 bl:2.7613 bb:1.0578 rl:2.7062 rb:1.0573 dl:576-579 gd:1 +ttp: b443/782 
bl:2.7562 bb:1.0498 rl:2.7066 rb:1.0572 dl:562-564 gd:1 +ttp: b435/782 bl:2.7232 bb:1.0486 rl:2.7068 rb:1.0571 dl:547-549 gd:1 +ttp: b427/782 bl:2.7385 bb:1.0581 rl:2.7070 rb:1.0571 dl:533-535 gd:1 +ttp: b419/782 bl:2.7842 bb:1.0347 rl:2.7076 rb:1.0570 dl:519-521 gd:1 +ttp: b411/782 bl:2.8016 bb:1.0681 rl:2.7083 rb:1.0570 dl:507-508 gd:1 +ttp: b404/782 bl:2.7803 bb:1.0669 rl:2.7088 rb:1.0571 dl:495-497 gd:1 +ttp: b397/782 bl:2.8813 bb:1.0946 rl:2.7100 rb:1.0574 dl:484-486 gd:1 +ttp: b391/782 bl:2.8053 bb:1.0925 rl:2.7107 rb:1.0576 dl:475-476 gd:1 +ttp: b384/782 bl:2.8364 bb:1.0882 rl:2.7115 rb:1.0578 dl:464-466 gd:1 +ttp: b376/782 bl:2.7097 bb:1.0406 rl:2.7115 rb:1.0577 dl:453-454 gd:1 +ttp: b368/782 bl:2.8432 bb:1.0848 rl:2.7123 rb:1.0579 dl:441-443 gd:1 +ttp: b360/782 bl:2.8310 bb:1.0796 rl:2.7130 rb:1.0580 dl:430-432 gd:1 +ttp: b352/782 bl:2.7426 bb:1.0903 rl:2.7132 rb:1.0582 dl:419-420 gd:1 +ttp: b344/782 bl:2.8795 bb:1.1038 rl:2.7141 rb:1.0585 dl:408-410 gd:1 +ttp: b336/782 bl:2.9400 bb:1.1618 rl:2.7153 rb:1.0590 dl:398-399 gd:1 +ttp: b328/782 bl:2.7817 bb:1.0786 rl:2.7157 rb:1.0591 dl:388-389 gd:1 +ttp: b320/782 bl:2.7501 bb:1.0728 rl:2.7159 rb:1.0592 dl:377-378 gd:1 +ttp: b312/782 bl:2.7207 bb:1.0620 rl:2.7159 rb:1.0592 dl:367-368 gd:1 +ttp: b304/782 bl:2.8937 bb:1.1270 rl:2.7167 rb:1.0595 dl:357-358 gd:1 +ttp: b296/782 bl:2.7985 bb:1.0823 rl:2.7171 rb:1.0596 dl:347-348 gd:1 +ttp: b288/782 bl:2.8066 bb:1.1019 rl:2.7175 rb:1.0598 dl:337-339 gd:1 +ttp: b279/782 bl:2.8438 bb:1.0869 rl:2.7181 rb:1.0599 dl:327-329 gd:1 +ttp: b272/782 bl:2.8577 bb:1.1086 rl:2.7186 rb:1.0601 dl:320-321 gd:1 +ttp: b264/782 bl:2.8842 bb:1.1416 rl:2.7193 rb:1.0605 dl:311-312 gd:1 +ttp: b256/782 bl:2.8776 bb:1.1280 rl:2.7199 rb:1.0607 dl:301-302 gd:1 +ttp: b249/782 bl:2.8887 bb:1.1506 rl:2.7206 rb:1.0611 dl:294-295 gd:1 +ttp: b242/782 bl:2.8891 bb:1.1046 rl:2.7212 rb:1.0612 dl:287-288 gd:1 +ttp: b235/782 bl:2.9295 bb:1.1135 rl:2.7220 rb:1.0614 dl:280-281 gd:1 +ttp: b228/782 bl:2.8578 
bb:1.1309 rl:2.7224 rb:1.0617 dl:273-274 gd:1 +ttp: b220/782 bl:2.8517 bb:1.1036 rl:2.7229 rb:1.0618 dl:265-266 gd:1 +ttp: b212/782 bl:2.9297 bb:1.1466 rl:2.7236 rb:1.0621 dl:257-258 gd:1 +ttp: b204/782 bl:2.8992 bb:1.1278 rl:2.7241 rb:1.0623 dl:250-251 gd:1 +ttp: b196/782 bl:2.8883 bb:1.1573 rl:2.7246 rb:1.0626 dl:243-244 gd:1 +ttp: b189/782 bl:2.9585 bb:1.2007 rl:2.7253 rb:1.0630 dl:237-237 gd:1 +ttp: b181/782 bl:2.8641 bb:1.1509 rl:2.7257 rb:1.0632 dl:230-230 gd:1 +ttp: b172/782 bl:2.9993 bb:1.1796 rl:2.7265 rb:1.0635 dl:222-223 gd:1 +ttp: b164/782 bl:2.9680 bb:1.1478 rl:2.7271 rb:1.0638 dl:215-216 gd:1 +ttp: b156/782 bl:2.8864 bb:1.1068 rl:2.7275 rb:1.0639 dl:208-209 gd:1 +ttp: b148/782 bl:2.9651 bb:1.1524 rl:2.7281 rb:1.0641 dl:202-203 gd:1 +ttp: b140/782 bl:2.9589 bb:1.1683 rl:2.7287 rb:1.0644 dl:195-196 gd:1 +ttp: b131/782 bl:3.0241 bb:1.2020 rl:2.7294 rb:1.0647 dl:188-189 gd:1 +ttp: b124/782 bl:2.8738 bb:1.1495 rl:2.7297 rb:1.0649 dl:183-184 gd:1 +ttp: b117/782 bl:2.8443 bb:1.1400 rl:2.7300 rb:1.0650 dl:178-178 gd:1 +ttp: b108/782 bl:2.8609 bb:1.0987 rl:2.7302 rb:1.0651 dl:171-172 gd:1 +ttp: b100/782 bl:2.9322 bb:1.1510 rl:2.7307 rb:1.0653 dl:165-166 gd:1 +ttp: b93/782 bl:2.9501 bb:1.1834 rl:2.7311 rb:1.0655 dl:160-160 gd:1 +ttp: b85/782 bl:2.9834 bb:1.1978 rl:2.7316 rb:1.0657 dl:154-154 gd:1 +ttp: b77/782 bl:3.0246 bb:1.1687 rl:2.7321 rb:1.0659 dl:148-148 gd:1 +ttp: b67/782 bl:3.0574 bb:1.2358 rl:2.7326 rb:1.0662 dl:140-141 gd:1 +ttp: b58/782 bl:2.9549 bb:1.2188 rl:2.7330 rb:1.0664 dl:133-134 gd:1 +ttp: b51/782 bl:3.0237 bb:1.2086 rl:2.7334 rb:1.0666 dl:127-128 gd:1 +ttp: b44/782 bl:3.1357 bb:1.2219 rl:2.7340 rb:1.0669 dl:122-122 gd:1 +ttp: b34/782 bl:3.0721 bb:1.2437 rl:2.7345 rb:1.0671 dl:114-115 gd:1 +ttp: b27/782 bl:3.0932 bb:1.2349 rl:2.7350 rb:1.0673 dl:107-108 gd:1 +ttp: b19/782 bl:3.1405 bb:1.2266 rl:2.7355 rb:1.0675 dl:100-101 gd:1 +ttp: b9/782 bl:3.2070 bb:1.2709 rl:2.7360 rb:1.0677 dl:87-89 gd:1 +ttp: b2/782 bl:3.1371 bb:1.1635 rl:2.7363 
rb:1.0678 dl:70-75 gd:1
+quantized_ttt_phased val_loss:2.76454012 val_bpb:1.07023963 eval_time:440584ms
+total_eval_time:440.6s
diff --git a/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/train_seed42.log b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/train_seed42.log
new file mode 100644
index 0000000000..214867f35a
--- /dev/null
+++ b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/train_seed42.log
@@ -0,0 +1,760 @@
+W0428 09:39:54.906000 552616 torch/distributed/run.py:803]
+W0428 09:39:54.906000 552616 torch/distributed/run.py:803] *****************************************
+W0428 09:39:54.906000 552616 torch/distributed/run.py:803] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
+W0428 09:39:54.906000 552616 torch/distributed/run.py:803] *****************************************
+Hyperparameters:
+ adam_eps: 1e-08
+ adam_wd: 0.02
+ artifact_dir:
+ attn_clip_sigmas: 13.0
+ beta1: 0.9
+ beta2: 0.95
+ compressor: brotli
+ data_dir: ./data/
+ datasets_dir: ./data/datasets/fineweb10B_sp8192
+ distributed: True
+ ema_decay: 0.9965
+ embed_bits: 7
+ embed_clip_sigmas: 15.0
+ embed_lr: 0.6
+ embed_wd: 0.085
+ enable_looping_at: 0.35
+ eval_seq_len: 2048
+ eval_stride: 64
+ gate_attn_out: True
+ gate_attn_width: 24
+ global_ttt_batch_seqs: 32
+ global_ttt_chunk_tokens: 32768
+ global_ttt_epochs: 1
+ global_ttt_grad_clip: 1.0
+ global_ttt_lr: 0.001
+ global_ttt_momentum: 0.9
+ global_ttt_respect_doc_boundaries: True
+ global_ttt_warmup_chunks: 0
+ global_ttt_warmup_start_lr: 0.0
+ gptq_calibration_batches: 16
+ gptq_reserve_seconds: 4.0
+ grad_accum_steps: 1
+ grad_clip_norm: 0.3
+ is_main_process: True
+ iterations: 20000
+ ln_scale: True
+ local_rank: 0
+ logfile: logs/champion_3seed_42.txt
+ logit_softcap: 30.0
+ loop_end: 5
+ loop_start: 3
+ lqer_asym_enabled: True
+ lqer_asym_group: 64
+ lqer_enabled: True
+ lqer_factor_bits: 4
+ lqer_rank: 4
+ lqer_top_k: 3
+ matrix_bits: 6
+ matrix_clip_sigmas: 12.85
+ matrix_lr: 0.026
+ max_wallclock_seconds: 600.0
+ min_lr: 0.0
+ mlp_clip_sigmas: 12.0
+ mlp_mult: 4.0
+ model_dim: 512
+ model_path: final_model.pt
+ muon_backend_steps: 5
+ muon_momentum: 0.97
+ muon_momentum_warmup_start: 0.92
+ muon_momentum_warmup_steps: 1500
+ muon_row_normalize: True
+ muon_wd: 0.095
+ newton_muon_all_reduce_k: True
+ newton_muon_beta: 0.95
+ newton_muon_capture_every: 4
+ newton_muon_enabled: False
+ newton_muon_gamma: 0.2
+ newton_muon_k_refresh: 32
+ newton_muon_warmup: 100
+ num_heads: 8
+ num_kv_heads: 4
+ num_layers: 11
+ num_loops: 2
+ parallel_final_lane: mean
+ parallel_start_layer: 8
+ phased_ttt_enabled: True
+ phased_ttt_num_phases: 3
+ phased_ttt_prefix_docs: 2000
+ polar_express_ns: True
+ qk_gain_init: 5.25
+ quantized_model_path: final_model.int6.ptz
+ rank: 0
+ rope_base: 10000.0
+ rope_dims: 16
+ rope_train_seq_len: 2048
+ rope_yarn: False
+ run_id: champion_3seed_42
+ scalar_lr: 0.02
+ seed: 42
+ skip_gates_enabled: True
+ sliding_window_enabled: False
+ smear_gate_enabled: True
+ smear_gate_width: 12
+ tie_embeddings: True
+ tied_embed_init_std: 0.005
+ tied_embed_lr: 0.03
+ tokenizer_path: ./data/tokenizers/fineweb_8192_bpe.model
+ train_batch_tokens: 786432
+ train_files: ./data/datasets/fineweb10B_sp8192/fineweb_train_*.bin
+ train_log_every: 500
+ train_seq_len: 2048
+ ttt_batch_size: 64
+ ttt_beta1: 0.0
+ ttt_beta2: 0.999
+ ttt_chunk_size: 48
+ ttt_enabled: True
+ ttt_eval_batches:
+ ttt_eval_seq_len: 2048
+ ttt_grad_steps: 1
+ ttt_k_lora: True
+ ttt_lora_lr: 0.0001
+ ttt_lora_rank: 192
+ ttt_mlp_lora: True
+ ttt_o_lora: True
+ ttt_optimizer: adam
+ ttt_weight_decay: 1.0
+ val_batch_tokens: 524288
+ val_doc_fraction: 1.0
+ val_files: ./data/datasets/fineweb10B_sp8192/fineweb_val_*.bin
+ val_loss_every: 4000
+ vocab_size: 8192
+ warmdown_frac: 0.75
+ warmup_steps: 20
+ world_size: 8
+ xsa_last_n: 11
+train_shards: 80
+val_tokens: 40540160
+model_params:35946727
+gptq:reserving 4s, effective=596000ms
+warmup_cu_buckets:64,128,192,256 iters_each:3
+warmup_step: 1/20
+warmup_step: 2/20
+warmup_step: 3/20
+warmup_step: 4/20
+warmup_step: 5/20
+warmup_step: 6/20
+warmup_step: 10/20
+warmup_step: 20/20
+loop_warmup:enabled encoder:[0, 1, 2, 3, 4, 5, 3, 4] decoder:[5, 3, 4, 5, 6, 7, 8, 9, 10]
+loop_warmup_step: 1/20
+loop_warmup_step: 2/20
+loop_warmup_step: 3/20
+loop_warmup_step: 4/20
+loop_warmup_step: 5/20
+loop_warmup_step: 6/20
+loop_warmup_step: 10/20
+loop_warmup_step: 20/20
+0/20000 val_loss: 9.0076 val_bpb: 3.4870
+1/20000 train_loss: 9.0080 train_time: 0.0m tok/s: 12149258
+2/20000 train_loss: 12.2467 train_time: 0.0m tok/s: 11427590
+3/20000 train_loss: 11.2837 train_time: 0.0m tok/s: 10114722
+4/20000 train_loss: 9.6321 train_time: 0.0m tok/s: 9634568
+5/20000 train_loss: 8.2293 train_time: 0.0m tok/s: 9344944
+500/20000 train_loss: 3.2487 train_time: 0.8m tok/s: 8071161
+1000/20000 train_loss: 3.0106 train_time: 1.6m tok/s: 8026480
+1500/20000 train_loss: 3.0169 train_time: 2.5m tok/s: 8019256
+2000/20000 train_loss: 2.9687 train_time: 3.3m tok/s: 8019331
+layer_loop:enabled step:2127 frac:0.350 encoder:[0, 1, 2, 3, 4, 5, 3, 4] decoder:[5, 3, 4, 5, 6, 7, 8, 9, 10]
+2500/20000 train_loss: 3.0582 train_time: 4.4m tok/s: 7475855
+3000/20000 train_loss: 2.9000 train_time: 5.6m tok/s: 7034521
+3500/20000 train_loss: 2.9632 train_time: 6.8m tok/s: 6750196
+4000/20000 train_loss: 2.8926 train_time: 8.0m tok/s: 6566613
+4000/20000 val_loss: 2.8696 val_bpb: 1.1109
+4500/20000 train_loss: 2.8410 train_time: 9.2m tok/s: 6433042
+4824/20000 val_loss: 2.7668 val_bpb: 1.0711
+stopping_early: wallclock_cap train_time: 596086ms step: 4824/20000
+peak memory allocated: 40141 MiB reserved: 44206 MiB
+ema:applying EMA weights
+diagnostic pre-quantization post-ema val_loss:2.76583281 val_bpb:1.07070556 eval_time:6819ms
+Serialized model: 135422397 bytes
+Code size (uncompressed): 134706 bytes
+Code size (compressed): 33710 bytes
+GPTQ:collecting Hessians from calibration data...
+GPTQ:collected 67 Hessians in 3.5s
+Quantized weights:
+ gptq (int6): blocks.attn.c_k.weight, blocks.attn.c_q.weight, blocks.attn.c_v.weight, blocks.attn.proj.weight, blocks.mlp.fc.weight, blocks.mlp.proj.weight
+ gptq (int6)+lqer_asym: blocks.mlp.fc.weight
+ gptq (int7)+lqer_asym: tok_emb.weight
+ passthrough (float16): blocks.attn.attn_gate_proj.weight, blocks.attn.q_gain, blocks.attn_scale, blocks.mlp_scale, blocks.resid_mix, parallel_post_lambdas, parallel_resid_lambdas, skip_gates, skip_weights, smear_gate.weight, smear_lambda
+Serialized model quantized+brotli: 15921161 bytes
+Total submission size quantized+brotli: 15954871 bytes
+diagnostic quantized val_loss:2.79116183 val_bpb:1.08051090 eval_time:10643ms
+ttt_lora:warming up compile (random tokens, no val data)
+ttt_lora:compile warmup done (104.3s)
+
+beginning TTT eval timer
+ttt_phased: total_docs:50000 prefix_docs:2000 suffix_docs:48000 num_phases:3 boundaries:[666, 1333, 2000]
+ttp: b777/782 bl:2.7181 bb:1.0872 rl:2.7181 rb:1.0872 dl:7190-7938 gd:0
+ttp: b773/782 bl:2.6440 bb:1.0729 rl:2.6874 rb:1.0813 dl:5203-5550 gd:0
+ttp: b769/782 bl:2.7599 bb:1.0922 rl:2.7058 rb:1.0841 dl:4307-4479 gd:0
+ttpp: phase:1/3 pd:1104 gd:666 t:202.0s
+tttg: c1/95 lr:0.001000 t:0.3s
+tttg: c2/95 lr:0.001000 t:0.4s
+tttg: c3/95 lr:0.000999 t:0.5s
+tttg: c4/95 lr:0.000997 t:0.6s
+tttg: c5/95 lr:0.000996 t:0.7s
+tttg: c6/95 lr:0.000993 t:0.8s
+tttg: c7/95 lr:0.000990 t:0.8s
+tttg: c8/95 lr:0.000986 t:0.9s
+tttg: c9/95 lr:0.000982 t:1.0s
+tttg: c10/95 lr:0.000978 t:1.1s
+tttg: c11/95 lr:0.000972 t:1.1s
+tttg: c12/95 lr:0.000967 t:1.2s
+tttg: c13/95 lr:0.000960 t:1.3s
+tttg: c14/95 lr:0.000954 t:1.4s
+tttg: c15/95 lr:0.000946 t:1.4s
+tttg: c16/95 lr:0.000938 t:1.5s
+tttg: c17/95 lr:0.000930 t:1.6s
+tttg: c18/95 lr:0.000921 t:1.7s
+tttg: c19/95 lr:0.000912 t:1.7s
+tttg: c20/95 lr:0.000903 t:1.8s +tttg: c21/95 lr:0.000892 t:1.9s +tttg: c22/95 lr:0.000882 t:2.0s +tttg: c23/95 lr:0.000871 t:2.0s +tttg: c24/95 lr:0.000859 t:2.1s +tttg: c25/95 lr:0.000848 t:2.2s +tttg: c26/95 lr:0.000835 t:2.3s +tttg: c27/95 lr:0.000823 t:2.4s +tttg: c28/95 lr:0.000810 t:2.4s +tttg: c29/95 lr:0.000797 t:2.5s +tttg: c30/95 lr:0.000783 t:2.6s +tttg: c31/95 lr:0.000769 t:2.7s +tttg: c32/95 lr:0.000755 t:2.8s +tttg: c33/95 lr:0.000740 t:2.8s +tttg: c34/95 lr:0.000726 t:2.9s +tttg: c35/95 lr:0.000710 t:3.0s +tttg: c36/95 lr:0.000695 t:3.0s +tttg: c37/95 lr:0.000680 t:3.1s +tttg: c38/95 lr:0.000664 t:3.2s +tttg: c39/95 lr:0.000648 t:3.3s +tttg: c40/95 lr:0.000632 t:3.4s +tttg: c41/95 lr:0.000616 t:3.4s +tttg: c42/95 lr:0.000600 t:3.5s +tttg: c43/95 lr:0.000583 t:3.6s +tttg: c44/95 lr:0.000567 t:3.7s +tttg: c45/95 lr:0.000550 t:3.7s +tttg: c46/95 lr:0.000533 t:3.8s +tttg: c47/95 lr:0.000517 t:3.9s +tttg: c48/95 lr:0.000500 t:4.0s +tttg: c49/95 lr:0.000483 t:4.0s +tttg: c50/95 lr:0.000467 t:4.1s +tttg: c51/95 lr:0.000450 t:4.2s +tttg: c52/95 lr:0.000433 t:4.3s +tttg: c53/95 lr:0.000417 t:4.3s +tttg: c54/95 lr:0.000400 t:4.4s +tttg: c55/95 lr:0.000384 t:4.5s +tttg: c56/95 lr:0.000368 t:4.6s +tttg: c57/95 lr:0.000352 t:4.7s +tttg: c58/95 lr:0.000336 t:4.7s +tttg: c59/95 lr:0.000320 t:4.8s +tttg: c60/95 lr:0.000305 t:4.9s +tttg: c61/95 lr:0.000290 t:5.0s +tttg: c62/95 lr:0.000274 t:5.0s +tttg: c63/95 lr:0.000260 t:5.1s +tttg: c64/95 lr:0.000245 t:5.2s +tttg: c65/95 lr:0.000231 t:5.3s +tttg: c66/95 lr:0.000217 t:5.3s +tttg: c67/95 lr:0.000203 t:5.4s +tttg: c68/95 lr:0.000190 t:5.5s +tttg: c69/95 lr:0.000177 t:5.6s +tttg: c70/95 lr:0.000165 t:5.6s +tttg: c71/95 lr:0.000152 t:5.7s +tttg: c72/95 lr:0.000141 t:5.8s +tttg: c73/95 lr:0.000129 t:5.9s +tttg: c74/95 lr:0.000118 t:5.9s +tttg: c75/95 lr:0.000108 t:6.0s +tttg: c76/95 lr:0.000097 t:6.1s +tttg: c77/95 lr:0.000088 t:6.2s +tttg: c78/95 lr:0.000079 t:6.2s +tttg: c79/95 lr:0.000070 t:6.3s +tttg: c80/95 
lr:0.000062 t:6.4s +tttg: c81/95 lr:0.000054 t:6.5s +tttg: c82/95 lr:0.000046 t:6.5s +tttg: c83/95 lr:0.000040 t:6.6s +tttg: c84/95 lr:0.000033 t:6.7s +tttg: c85/95 lr:0.000028 t:6.8s +tttg: c86/95 lr:0.000022 t:6.9s +tttg: c87/95 lr:0.000018 t:6.9s +tttg: c88/95 lr:0.000014 t:7.0s +tttg: c89/95 lr:0.000010 t:7.1s +tttg: c90/95 lr:0.000007 t:7.2s +tttg: c91/95 lr:0.000004 t:7.2s +tttg: c92/95 lr:0.000003 t:7.3s +tttg: c93/95 lr:0.000001 t:7.4s +tttg: c94/95 lr:0.000000 t:7.5s +ttpr: phase:1/3 t:213.7s +ttp: b760/782 bl:2.8350 bb:1.1134 rl:2.7265 rb:1.0889 dl:3255-3334 gd:0 +ttp: b754/782 bl:2.6758 bb:1.0499 rl:2.7203 rb:1.0840 dl:2839-2899 gd:0 +ttpp: phase:2/3 pd:1808 gd:1333 t:284.2s +tttg: c1/158 lr:0.001000 t:0.1s +tttg: c2/158 lr:0.001000 t:0.2s +tttg: c3/158 lr:0.001000 t:0.2s +tttg: c4/158 lr:0.000999 t:0.3s +tttg: c5/158 lr:0.000998 t:0.4s +tttg: c6/158 lr:0.000997 t:0.5s +tttg: c7/158 lr:0.000996 t:0.5s +tttg: c8/158 lr:0.000995 t:0.6s +tttg: c9/158 lr:0.000994 t:0.7s +tttg: c10/158 lr:0.000992 t:0.8s +tttg: c11/158 lr:0.000990 t:0.9s +tttg: c12/158 lr:0.000988 t:0.9s +tttg: c13/158 lr:0.000986 t:1.0s +tttg: c14/158 lr:0.000983 t:1.1s +tttg: c15/158 lr:0.000981 t:1.2s +tttg: c16/158 lr:0.000978 t:1.3s +tttg: c17/158 lr:0.000975 t:1.3s +tttg: c18/158 lr:0.000971 t:1.4s +tttg: c19/158 lr:0.000968 t:1.5s +tttg: c20/158 lr:0.000964 t:1.6s +tttg: c21/158 lr:0.000960 t:1.6s +tttg: c22/158 lr:0.000957 t:1.7s +tttg: c23/158 lr:0.000952 t:1.8s +tttg: c24/158 lr:0.000948 t:1.9s +tttg: c25/158 lr:0.000943 t:1.9s +tttg: c26/158 lr:0.000939 t:2.0s +tttg: c27/158 lr:0.000934 t:2.1s +tttg: c28/158 lr:0.000929 t:2.2s +tttg: c29/158 lr:0.000924 t:2.2s +tttg: c30/158 lr:0.000918 t:2.3s +tttg: c31/158 lr:0.000913 t:2.4s +tttg: c32/158 lr:0.000907 t:2.5s +tttg: c33/158 lr:0.000901 t:2.5s +tttg: c34/158 lr:0.000895 t:2.6s +tttg: c35/158 lr:0.000889 t:2.7s +tttg: c36/158 lr:0.000882 t:2.8s +tttg: c37/158 lr:0.000876 t:2.8s +tttg: c38/158 lr:0.000869 t:2.9s +tttg: c39/158 
lr:0.000862 t:3.0s +tttg: c40/158 lr:0.000855 t:3.1s +tttg: c41/158 lr:0.000848 t:3.1s +tttg: c42/158 lr:0.000841 t:3.2s +tttg: c43/158 lr:0.000834 t:3.3s +tttg: c44/158 lr:0.000826 t:3.4s +tttg: c45/158 lr:0.000818 t:3.4s +tttg: c46/158 lr:0.000811 t:3.5s +tttg: c47/158 lr:0.000803 t:3.6s +tttg: c48/158 lr:0.000795 t:3.7s +tttg: c49/158 lr:0.000787 t:3.8s +tttg: c50/158 lr:0.000778 t:3.8s +tttg: c51/158 lr:0.000770 t:3.9s +tttg: c52/158 lr:0.000761 t:4.0s +tttg: c53/158 lr:0.000753 t:4.1s +tttg: c54/158 lr:0.000744 t:4.1s +tttg: c55/158 lr:0.000735 t:4.2s +tttg: c56/158 lr:0.000727 t:4.3s +tttg: c57/158 lr:0.000718 t:4.4s +tttg: c58/158 lr:0.000709 t:4.5s +tttg: c59/158 lr:0.000699 t:4.5s +tttg: c60/158 lr:0.000690 t:4.6s +tttg: c61/158 lr:0.000681 t:4.7s +tttg: c62/158 lr:0.000672 t:4.8s +tttg: c63/158 lr:0.000662 t:4.8s +tttg: c64/158 lr:0.000653 t:4.9s +tttg: c65/158 lr:0.000643 t:5.0s +tttg: c66/158 lr:0.000633 t:5.1s +tttg: c67/158 lr:0.000624 t:5.1s +tttg: c68/158 lr:0.000614 t:5.2s +tttg: c69/158 lr:0.000604 t:5.3s +tttg: c70/158 lr:0.000594 t:5.4s +tttg: c71/158 lr:0.000585 t:5.5s +tttg: c72/158 lr:0.000575 t:5.5s +tttg: c73/158 lr:0.000565 t:5.6s +tttg: c74/158 lr:0.000555 t:5.7s +tttg: c75/158 lr:0.000545 t:5.8s +tttg: c76/158 lr:0.000535 t:5.8s +tttg: c77/158 lr:0.000525 t:5.9s +tttg: c78/158 lr:0.000515 t:6.0s +tttg: c79/158 lr:0.000505 t:6.1s +tttg: c80/158 lr:0.000495 t:6.2s +tttg: c81/158 lr:0.000485 t:6.2s +tttg: c82/158 lr:0.000475 t:6.3s +tttg: c83/158 lr:0.000465 t:6.4s +tttg: c84/158 lr:0.000455 t:6.5s +tttg: c85/158 lr:0.000445 t:6.5s +tttg: c86/158 lr:0.000435 t:6.6s +tttg: c87/158 lr:0.000425 t:6.7s +tttg: c88/158 lr:0.000415 t:6.8s +tttg: c89/158 lr:0.000406 t:6.8s +tttg: c90/158 lr:0.000396 t:6.9s +tttg: c91/158 lr:0.000386 t:7.0s +tttg: c92/158 lr:0.000376 t:7.1s +tttg: c93/158 lr:0.000367 t:7.1s +tttg: c94/158 lr:0.000357 t:7.2s +tttg: c95/158 lr:0.000347 t:7.3s +tttg: c96/158 lr:0.000338 t:7.4s +tttg: c97/158 lr:0.000328 t:7.5s +tttg: 
c98/158 lr:0.000319 t:7.5s +tttg: c99/158 lr:0.000310 t:7.6s +tttg: c100/158 lr:0.000301 t:7.7s +tttg: c101/158 lr:0.000291 t:7.8s +tttg: c102/158 lr:0.000282 t:7.9s +tttg: c103/158 lr:0.000273 t:7.9s +tttg: c104/158 lr:0.000265 t:8.0s +tttg: c105/158 lr:0.000256 t:8.1s +tttg: c106/158 lr:0.000247 t:8.2s +tttg: c107/158 lr:0.000239 t:8.2s +tttg: c108/158 lr:0.000230 t:8.3s +tttg: c109/158 lr:0.000222 t:8.4s +tttg: c110/158 lr:0.000213 t:8.5s +tttg: c111/158 lr:0.000205 t:8.5s +tttg: c112/158 lr:0.000197 t:8.6s +tttg: c113/158 lr:0.000189 t:8.7s +tttg: c114/158 lr:0.000182 t:8.8s +tttg: c115/158 lr:0.000174 t:8.8s +tttg: c116/158 lr:0.000166 t:8.9s +tttg: c117/158 lr:0.000159 t:9.0s +tttg: c118/158 lr:0.000152 t:9.1s +tttg: c119/158 lr:0.000145 t:9.1s +tttg: c120/158 lr:0.000138 t:9.2s +tttg: c121/158 lr:0.000131 t:9.3s +tttg: c122/158 lr:0.000124 t:9.4s +tttg: c123/158 lr:0.000118 t:9.4s +tttg: c124/158 lr:0.000111 t:9.5s +tttg: c125/158 lr:0.000105 t:9.6s +tttg: c126/158 lr:0.000099 t:9.7s +tttg: c127/158 lr:0.000093 t:9.8s +tttg: c128/158 lr:0.000087 t:9.8s +tttg: c129/158 lr:0.000082 t:9.9s +tttg: c130/158 lr:0.000076 t:10.0s +tttg: c131/158 lr:0.000071 t:10.1s +tttg: c132/158 lr:0.000066 t:10.1s +tttg: c133/158 lr:0.000061 t:10.2s +tttg: c134/158 lr:0.000057 t:10.3s +tttg: c135/158 lr:0.000052 t:10.4s +tttg: c136/158 lr:0.000048 t:10.5s +tttg: c137/158 lr:0.000043 t:10.6s +tttg: c138/158 lr:0.000040 t:10.6s +tttg: c139/158 lr:0.000036 t:10.7s +tttg: c140/158 lr:0.000032 t:10.8s +tttg: c141/158 lr:0.000029 t:10.9s +tttg: c142/158 lr:0.000025 t:10.9s +tttg: c143/158 lr:0.000022 t:11.0s +tttg: c144/158 lr:0.000019 t:11.1s +tttg: c145/158 lr:0.000017 t:11.2s +tttg: c146/158 lr:0.000014 t:11.2s +tttg: c147/158 lr:0.000012 t:11.3s +tttg: c148/158 lr:0.000010 t:11.4s +tttg: c149/158 lr:0.000008 t:11.5s +tttg: c150/158 lr:0.000006 t:11.5s +tttg: c151/158 lr:0.000005 t:11.6s +tttg: c152/158 lr:0.000004 t:11.7s +tttg: c153/158 lr:0.000003 t:11.8s +tttg: c154/158 
lr:0.000002 t:11.8s +tttg: c155/158 lr:0.000001 t:11.9s +tttg: c156/158 lr:0.000000 t:12.0s +tttg: c157/158 lr:0.000000 t:12.1s +ttpr: phase:2/3 t:300.5s +ttp: b751/782 bl:2.7826 bb:1.0682 rl:2.7268 rb:1.0823 dl:2689-2740 gd:0 +ttpp: phase:3/3 pd:2448 gd:2000 t:315.7s +tttg: c1/213 lr:0.001000 t:0.1s +tttg: c2/213 lr:0.001000 t:0.2s +tttg: c3/213 lr:0.001000 t:0.2s +tttg: c4/213 lr:0.001000 t:0.3s +tttg: c5/213 lr:0.000999 t:0.4s +tttg: c6/213 lr:0.000999 t:0.5s +tttg: c7/213 lr:0.000998 t:0.5s +tttg: c8/213 lr:0.000997 t:0.6s +tttg: c9/213 lr:0.000996 t:0.7s +tttg: c10/213 lr:0.000996 t:0.8s +tttg: c11/213 lr:0.000995 t:0.8s +tttg: c12/213 lr:0.000993 t:0.9s +tttg: c13/213 lr:0.000992 t:1.0s +tttg: c14/213 lr:0.000991 t:1.1s +tttg: c15/213 lr:0.000989 t:1.2s +tttg: c16/213 lr:0.000988 t:1.2s +tttg: c17/213 lr:0.000986 t:1.3s +tttg: c18/213 lr:0.000984 t:1.4s +tttg: c19/213 lr:0.000982 t:1.5s +tttg: c20/213 lr:0.000980 t:1.5s +tttg: c21/213 lr:0.000978 t:1.6s +tttg: c22/213 lr:0.000976 t:1.7s +tttg: c23/213 lr:0.000974 t:1.8s +tttg: c24/213 lr:0.000971 t:1.8s +tttg: c25/213 lr:0.000969 t:1.9s +tttg: c26/213 lr:0.000966 t:2.0s +tttg: c27/213 lr:0.000963 t:2.1s +tttg: c28/213 lr:0.000961 t:2.1s +tttg: c29/213 lr:0.000958 t:2.2s +tttg: c30/213 lr:0.000955 t:2.3s +tttg: c31/213 lr:0.000951 t:2.4s +tttg: c32/213 lr:0.000948 t:2.5s +tttg: c33/213 lr:0.000945 t:2.5s +tttg: c34/213 lr:0.000941 t:2.6s +tttg: c35/213 lr:0.000938 t:2.7s +tttg: c36/213 lr:0.000934 t:2.7s +tttg: c37/213 lr:0.000931 t:2.8s +tttg: c38/213 lr:0.000927 t:2.9s +tttg: c39/213 lr:0.000923 t:3.0s +tttg: c40/213 lr:0.000919 t:3.1s +tttg: c41/213 lr:0.000915 t:3.1s +tttg: c42/213 lr:0.000911 t:3.2s +tttg: c43/213 lr:0.000906 t:3.3s +tttg: c44/213 lr:0.000902 t:3.4s +tttg: c45/213 lr:0.000897 t:3.4s +tttg: c46/213 lr:0.000893 t:3.5s +tttg: c47/213 lr:0.000888 t:3.6s +tttg: c48/213 lr:0.000884 t:3.7s +tttg: c49/213 lr:0.000879 t:3.7s +tttg: c50/213 lr:0.000874 t:3.8s +tttg: c51/213 lr:0.000869 t:3.9s 
+tttg: c52/213 lr:0.000864 t:4.0s +tttg: c53/213 lr:0.000859 t:4.0s +tttg: c54/213 lr:0.000854 t:4.1s +tttg: c55/213 lr:0.000848 t:4.2s +tttg: c56/213 lr:0.000843 t:4.3s +tttg: c57/213 lr:0.000837 t:4.3s +tttg: c58/213 lr:0.000832 t:4.4s +tttg: c59/213 lr:0.000826 t:4.5s +tttg: c60/213 lr:0.000821 t:4.6s +tttg: c61/213 lr:0.000815 t:4.7s +tttg: c62/213 lr:0.000809 t:4.7s +tttg: c63/213 lr:0.000803 t:4.8s +tttg: c64/213 lr:0.000797 t:4.9s +tttg: c65/213 lr:0.000791 t:5.0s +tttg: c66/213 lr:0.000785 t:5.0s +tttg: c67/213 lr:0.000779 t:5.1s +tttg: c68/213 lr:0.000773 t:5.2s +tttg: c69/213 lr:0.000767 t:5.3s +tttg: c70/213 lr:0.000761 t:5.4s +tttg: c71/213 lr:0.000754 t:5.4s +tttg: c72/213 lr:0.000748 t:5.5s +tttg: c73/213 lr:0.000741 t:5.6s +tttg: c74/213 lr:0.000735 t:5.7s +tttg: c75/213 lr:0.000728 t:5.7s +tttg: c76/213 lr:0.000722 t:5.8s +tttg: c77/213 lr:0.000715 t:5.9s +tttg: c78/213 lr:0.000708 t:6.0s +tttg: c79/213 lr:0.000702 t:6.0s +tttg: c80/213 lr:0.000695 t:6.1s +tttg: c81/213 lr:0.000688 t:6.2s +tttg: c82/213 lr:0.000681 t:6.3s +tttg: c83/213 lr:0.000674 t:6.3s +tttg: c84/213 lr:0.000667 t:6.4s +tttg: c85/213 lr:0.000660 t:6.5s +tttg: c86/213 lr:0.000653 t:6.6s +tttg: c87/213 lr:0.000646 t:6.7s +tttg: c88/213 lr:0.000639 t:6.7s +tttg: c89/213 lr:0.000632 t:6.8s +tttg: c90/213 lr:0.000625 t:6.9s +tttg: c91/213 lr:0.000617 t:7.0s +tttg: c92/213 lr:0.000610 t:7.0s +tttg: c93/213 lr:0.000603 t:7.1s +tttg: c94/213 lr:0.000596 t:7.2s +tttg: c95/213 lr:0.000588 t:7.3s +tttg: c96/213 lr:0.000581 t:7.3s +tttg: c97/213 lr:0.000574 t:7.4s +tttg: c98/213 lr:0.000566 t:7.5s +tttg: c99/213 lr:0.000559 t:7.6s +tttg: c100/213 lr:0.000552 t:7.7s +tttg: c101/213 lr:0.000544 t:7.7s +tttg: c102/213 lr:0.000537 t:7.8s +tttg: c103/213 lr:0.000530 t:7.9s +tttg: c104/213 lr:0.000522 t:8.0s +tttg: c105/213 lr:0.000515 t:8.0s +tttg: c106/213 lr:0.000507 t:8.1s +tttg: c107/213 lr:0.000500 t:8.2s +tttg: c108/213 lr:0.000493 t:8.3s +tttg: c109/213 lr:0.000485 t:8.4s +tttg: c110/213 
lr:0.000478 t:8.4s +tttg: c111/213 lr:0.000470 t:8.5s +tttg: c112/213 lr:0.000463 t:8.6s +tttg: c113/213 lr:0.000456 t:8.7s +tttg: c114/213 lr:0.000448 t:8.7s +tttg: c115/213 lr:0.000441 t:8.8s +tttg: c116/213 lr:0.000434 t:8.9s +tttg: c117/213 lr:0.000426 t:9.0s +tttg: c118/213 lr:0.000419 t:9.0s +tttg: c119/213 lr:0.000412 t:9.1s +tttg: c120/213 lr:0.000404 t:9.2s +tttg: c121/213 lr:0.000397 t:9.3s +tttg: c122/213 lr:0.000390 t:9.3s +tttg: c123/213 lr:0.000383 t:9.4s +tttg: c124/213 lr:0.000375 t:9.5s +tttg: c125/213 lr:0.000368 t:9.6s +tttg: c126/213 lr:0.000361 t:9.6s +tttg: c127/213 lr:0.000354 t:9.7s +tttg: c128/213 lr:0.000347 t:9.8s +tttg: c129/213 lr:0.000340 t:9.9s +tttg: c130/213 lr:0.000333 t:10.0s +tttg: c131/213 lr:0.000326 t:10.0s +tttg: c132/213 lr:0.000319 t:10.1s +tttg: c133/213 lr:0.000312 t:10.2s +tttg: c134/213 lr:0.000305 t:10.3s +tttg: c135/213 lr:0.000298 t:10.3s +tttg: c136/213 lr:0.000292 t:10.4s +tttg: c137/213 lr:0.000285 t:10.5s +tttg: c138/213 lr:0.000278 t:10.6s +tttg: c139/213 lr:0.000272 t:10.6s +tttg: c140/213 lr:0.000265 t:10.7s +tttg: c141/213 lr:0.000259 t:10.8s +tttg: c142/213 lr:0.000252 t:10.9s +tttg: c143/213 lr:0.000246 t:10.9s +tttg: c144/213 lr:0.000239 t:11.0s +tttg: c145/213 lr:0.000233 t:11.1s +tttg: c146/213 lr:0.000227 t:11.2s +tttg: c147/213 lr:0.000221 t:11.3s +tttg: c148/213 lr:0.000215 t:11.3s +tttg: c149/213 lr:0.000209 t:11.4s +tttg: c150/213 lr:0.000203 t:11.5s +tttg: c151/213 lr:0.000197 t:11.6s +tttg: c152/213 lr:0.000191 t:11.7s +tttg: c153/213 lr:0.000185 t:11.7s +tttg: c154/213 lr:0.000179 t:11.8s +tttg: c155/213 lr:0.000174 t:11.9s +tttg: c156/213 lr:0.000168 t:12.0s +tttg: c157/213 lr:0.000163 t:12.0s +tttg: c158/213 lr:0.000157 t:12.1s +tttg: c159/213 lr:0.000152 t:12.2s +tttg: c160/213 lr:0.000146 t:12.3s +tttg: c161/213 lr:0.000141 t:12.3s +tttg: c162/213 lr:0.000136 t:12.4s +tttg: c163/213 lr:0.000131 t:12.5s +tttg: c164/213 lr:0.000126 t:12.6s +tttg: c165/213 lr:0.000121 t:12.6s +tttg: c166/213 
lr:0.000116 t:12.7s +tttg: c167/213 lr:0.000112 t:12.8s +tttg: c168/213 lr:0.000107 t:12.9s +tttg: c169/213 lr:0.000103 t:12.9s +tttg: c170/213 lr:0.000098 t:13.0s +tttg: c171/213 lr:0.000094 t:13.1s +tttg: c172/213 lr:0.000089 t:13.2s +tttg: c173/213 lr:0.000085 t:13.3s +tttg: c174/213 lr:0.000081 t:13.4s +tttg: c175/213 lr:0.000077 t:13.4s +tttg: c176/213 lr:0.000073 t:13.5s +tttg: c177/213 lr:0.000069 t:13.6s +tttg: c178/213 lr:0.000066 t:13.7s +tttg: c179/213 lr:0.000062 t:13.7s +tttg: c180/213 lr:0.000059 t:13.8s +tttg: c181/213 lr:0.000055 t:13.9s +tttg: c182/213 lr:0.000052 t:14.0s +tttg: c183/213 lr:0.000049 t:14.0s +tttg: c184/213 lr:0.000045 t:14.1s +tttg: c185/213 lr:0.000042 t:14.2s +tttg: c186/213 lr:0.000039 t:14.3s +tttg: c187/213 lr:0.000037 t:14.4s +tttg: c188/213 lr:0.000034 t:14.4s +tttg: c189/213 lr:0.000031 t:14.5s +tttg: c190/213 lr:0.000029 t:14.6s +tttg: c191/213 lr:0.000026 t:14.7s +tttg: c192/213 lr:0.000024 t:14.7s +tttg: c193/213 lr:0.000022 t:14.8s +tttg: c194/213 lr:0.000020 t:14.9s +tttg: c195/213 lr:0.000018 t:15.0s +tttg: c196/213 lr:0.000016 t:15.0s +tttg: c197/213 lr:0.000014 t:15.1s +tttg: c198/213 lr:0.000012 t:15.2s +tttg: c199/213 lr:0.000011 t:15.3s +tttg: c200/213 lr:0.000009 t:15.4s +tttg: c201/213 lr:0.000008 t:15.4s +tttg: c202/213 lr:0.000007 t:15.5s +tttg: c203/213 lr:0.000005 t:15.6s +tttg: c204/213 lr:0.000004 t:15.7s +tttg: c205/213 lr:0.000004 t:15.7s +tttg: c206/213 lr:0.000003 t:15.8s +tttg: c207/213 lr:0.000002 t:15.9s +tttg: c208/213 lr:0.000001 t:16.0s +tttg: c209/213 lr:0.000001 t:16.0s +tttg: c210/213 lr:0.000000 t:16.1s +tttg: c211/213 lr:0.000000 t:16.2s +tttg: c212/213 lr:0.000000 t:16.3s +ttpr: phase:3/3 t:336.1s +ttp: b736/782 bl:2.6672 bb:1.0396 rl:2.7223 rb:1.0790 dl:2140-2165 gd:1 +ttp: b734/782 bl:2.7618 bb:1.0532 rl:2.7250 rb:1.0772 dl:2091-2115 gd:1 +ttp: b721/782 bl:2.7378 bb:1.0219 rl:2.7257 rb:1.0738 dl:1832-1846 gd:1 +ttp: b718/782 bl:2.7655 bb:1.0660 rl:2.7278 rb:1.0734 dl:1773-1792 gd:1 +ttp: 
b705/782 bl:2.7714 bb:1.0675 rl:2.7298 rb:1.0732 dl:1606-1617 gd:1 +ttp: b702/782 bl:2.7917 bb:1.0619 rl:2.7324 rb:1.0727 dl:1572-1581 gd:1 +ttp: b695/782 bl:2.7735 bb:1.0753 rl:2.7340 rb:1.0728 dl:1504-1513 gd:1 +ttp: b684/782 bl:2.7837 bb:1.0702 rl:2.7357 rb:1.0727 dl:1407-1414 gd:1 +ttp: b675/782 bl:2.8286 bb:1.0619 rl:2.7388 rb:1.0723 dl:1341-1347 gd:1 +ttp: b665/782 bl:2.7304 bb:1.0289 rl:2.7385 rb:1.0710 dl:1275-1282 gd:1 +ttp: b661/782 bl:2.7118 bb:1.0167 rl:2.7377 rb:1.0694 dl:1251-1258 gd:1 +ttp: b650/782 bl:2.7803 bb:1.0705 rl:2.7389 rb:1.0694 dl:1188-1193 gd:1 +ttp: b646/782 bl:2.7613 bb:1.0692 rl:2.7394 rb:1.0694 dl:1166-1171 gd:1 +ttp: b636/782 bl:2.7486 bb:1.0662 rl:2.7396 rb:1.0693 dl:1116-1120 gd:1 +ttp: b627/782 bl:2.7237 bb:1.0310 rl:2.7393 rb:1.0684 dl:1073-1077 gd:1 +ttp: b620/782 bl:2.7688 bb:1.0377 rl:2.7399 rb:1.0678 dl:1041-1046 gd:1 +ttp: b615/782 bl:2.8267 bb:1.0614 rl:2.7417 rb:1.0676 dl:1020-1023 gd:1 +ttp: b604/782 bl:2.7173 bb:1.0330 rl:2.7412 rb:1.0670 dl:974-978 gd:1 +ttp: b596/782 bl:2.7667 bb:1.0596 rl:2.7417 rb:1.0668 dl:943-947 gd:1 +ttp: b589/782 bl:2.7406 bb:1.0491 rl:2.7416 rb:1.0665 dl:921-924 gd:1 +ttp: b582/782 bl:2.8517 bb:1.0877 rl:2.7435 rb:1.0669 dl:897-901 gd:1 +ttp: b574/782 bl:2.7689 bb:1.0343 rl:2.7439 rb:1.0663 dl:871-874 gd:1 +ttp: b564/782 bl:2.8583 bb:1.1059 rl:2.7456 rb:1.0669 dl:840-843 gd:1 +ttp: b555/782 bl:2.7517 bb:1.0501 rl:2.7457 rb:1.0667 dl:812-815 gd:1 +ttp: b552/782 bl:2.7868 bb:1.0386 rl:2.7462 rb:1.0663 dl:804-806 gd:1 +ttp: b544/782 bl:2.7418 bb:1.0383 rl:2.7462 rb:1.0659 dl:782-785 gd:1 +ttp: b535/782 bl:2.7800 bb:1.0540 rl:2.7466 rb:1.0657 dl:759-762 gd:1 +ttp: b527/782 bl:2.7252 bb:1.0356 rl:2.7463 rb:1.0654 dl:739-742 gd:1 +ttp: b519/782 bl:2.7256 bb:1.0336 rl:2.7461 rb:1.0650 dl:720-723 gd:1 +ttp: b506/782 bl:2.7990 bb:1.0722 rl:2.7467 rb:1.0651 dl:688-690 gd:1 +ttp: b498/782 bl:2.6658 bb:1.0320 rl:2.7458 rb:1.0647 dl:671-673 gd:1 +ttp: b496/782 bl:2.8226 bb:1.0461 rl:2.7466 rb:1.0645 
dl:666-668 gd:1 +ttp: b488/782 bl:2.8097 bb:1.0471 rl:2.7473 rb:1.0643 dl:649-651 gd:1 +ttp: b480/782 bl:2.7803 bb:1.0496 rl:2.7476 rb:1.0642 dl:632-635 gd:1 +ttp: b472/782 bl:2.7905 bb:1.0667 rl:2.7480 rb:1.0642 dl:616-618 gd:1 +ttp: b465/782 bl:2.7999 bb:1.0562 rl:2.7485 rb:1.0641 dl:602-604 gd:1 +ttp: b458/782 bl:2.8041 bb:1.0629 rl:2.7490 rb:1.0641 dl:589-591 gd:1 +ttp: b450/782 bl:2.7490 bb:1.0260 rl:2.7490 rb:1.0638 dl:575-576 gd:1 +ttp: b442/782 bl:2.7922 bb:1.0487 rl:2.7494 rb:1.0636 dl:560-562 gd:1 +ttp: b434/782 bl:2.7183 bb:1.0387 rl:2.7491 rb:1.0634 dl:545-547 gd:1 +ttp: b426/782 bl:2.7054 bb:1.0587 rl:2.7488 rb:1.0634 dl:532-533 gd:1 +ttp: b414/782 bl:2.8084 bb:1.0826 rl:2.7492 rb:1.0635 dl:511-513 gd:1 +ttp: b407/782 bl:2.7650 bb:1.0526 rl:2.7493 rb:1.0634 dl:500-501 gd:1 +ttp: b398/782 bl:2.8740 bb:1.0916 rl:2.7502 rb:1.0636 dl:486-487 gd:1 +ttp: b390/782 bl:2.8020 bb:1.0868 rl:2.7505 rb:1.0638 dl:473-475 gd:1 +ttp: b383/782 bl:2.8200 bb:1.0800 rl:2.7510 rb:1.0639 dl:463-464 gd:1 +ttp: b375/782 bl:2.7990 bb:1.1030 rl:2.7513 rb:1.0642 dl:452-453 gd:1 +ttp: b368/782 bl:2.8367 bb:1.0823 rl:2.7518 rb:1.0643 dl:441-443 gd:1 +ttp: b360/782 bl:2.8265 bb:1.0779 rl:2.7523 rb:1.0644 dl:430-432 gd:1 +ttp: b353/782 bl:2.7879 bb:1.0925 rl:2.7525 rb:1.0645 dl:420-422 gd:1 +ttp: b345/782 bl:2.8509 bb:1.1056 rl:2.7530 rb:1.0647 dl:410-412 gd:1 +ttp: b337/782 bl:2.8210 bb:1.0741 rl:2.7534 rb:1.0648 dl:399-400 gd:1 +ttp: b329/782 bl:2.8198 bb:1.0999 rl:2.7538 rb:1.0650 dl:389-390 gd:1 +ttp: b320/782 bl:2.7532 bb:1.0740 rl:2.7538 rb:1.0650 dl:377-378 gd:1 +ttp: b313/782 bl:2.8172 bb:1.0850 rl:2.7541 rb:1.0651 dl:368-369 gd:1 +ttp: b307/782 bl:2.8894 bb:1.1045 rl:2.7547 rb:1.0653 dl:361-362 gd:1 +ttp: b301/782 bl:2.7829 bb:1.0822 rl:2.7549 rb:1.0654 dl:353-354 gd:1 +ttp: b293/782 bl:2.7542 bb:1.0640 rl:2.7549 rb:1.0654 dl:343-345 gd:1 +ttp: b286/782 bl:2.8757 bb:1.0924 rl:2.7554 rb:1.0655 dl:335-336 gd:1 +ttp: b279/782 bl:2.8394 bb:1.0852 rl:2.7558 rb:1.0656 dl:327-329 
gd:1 +ttp: b272/782 bl:2.8573 bb:1.1084 rl:2.7562 rb:1.0658 dl:320-321 gd:1 +ttp: b264/782 bl:2.8850 bb:1.1419 rl:2.7567 rb:1.0661 dl:311-312 gd:1 +ttp: b255/782 bl:2.8663 bb:1.1311 rl:2.7571 rb:1.0663 dl:300-301 gd:1 +ttp: b247/782 bl:2.7867 bb:1.0767 rl:2.7572 rb:1.0664 dl:292-293 gd:1 +ttp: b239/782 bl:2.8831 bb:1.1307 rl:2.7577 rb:1.0666 dl:284-285 gd:1 +ttp: b198/782 bl:2.9541 bb:1.1425 rl:2.7583 rb:1.0668 dl:245-246 gd:1 +ttp: b190/782 bl:2.8603 bb:1.0873 rl:2.7586 rb:1.0669 dl:237-238 gd:1 +ttp: b182/782 bl:2.8380 bb:1.1290 rl:2.7589 rb:1.0671 dl:230-231 gd:1 +ttp: b173/782 bl:2.9579 bb:1.1499 rl:2.7594 rb:1.0673 dl:223-224 gd:1 +ttp: b167/782 bl:2.9464 bb:1.1778 rl:2.7600 rb:1.0676 dl:218-218 gd:1 +ttp: b158/782 bl:2.8747 bb:1.1380 rl:2.7603 rb:1.0678 dl:210-211 gd:1 +ttp: b149/782 bl:2.9599 bb:1.1669 rl:2.7608 rb:1.0680 dl:203-204 gd:1 +ttp: b142/782 bl:2.9537 bb:1.1580 rl:2.7613 rb:1.0683 dl:197-198 gd:1 +ttp: b134/782 bl:3.0194 bb:1.2076 rl:2.7619 rb:1.0686 dl:190-191 gd:1 +ttp: b128/782 bl:2.8411 bb:1.0907 rl:2.7621 rb:1.0686 dl:186-187 gd:1 +ttp: b120/782 bl:2.9612 bb:1.1635 rl:2.7625 rb:1.0688 dl:180-181 gd:1 +ttp: b114/782 bl:2.9907 bb:1.1840 rl:2.7630 rb:1.0691 dl:176-176 gd:1 +ttp: b105/782 bl:3.0612 bb:1.2263 rl:2.7636 rb:1.0694 dl:169-170 gd:1 +ttp: b98/782 bl:2.9635 bb:1.1762 rl:2.7640 rb:1.0696 dl:164-164 gd:1 +ttp: b90/782 bl:3.0098 bb:1.1872 rl:2.7645 rb:1.0699 dl:158-158 gd:1 +ttp: b78/782 bl:2.8959 bb:1.1236 rl:2.7648 rb:1.0699 dl:148-149 gd:1 +ttp: b70/782 bl:3.0556 bb:1.1611 rl:2.7653 rb:1.0701 dl:142-143 gd:1 +ttp: b64/782 bl:2.9938 bb:1.2409 rl:2.7657 rb:1.0704 dl:138-139 gd:1 +ttp: b59/782 bl:3.0410 bb:1.1878 rl:2.7661 rb:1.0706 dl:134-134 gd:1 +ttp: b51/782 bl:3.0265 bb:1.2097 rl:2.7665 rb:1.0708 dl:127-128 gd:1 +ttp: b44/782 bl:3.1242 bb:1.2174 rl:2.7671 rb:1.0710 dl:122-122 gd:1 +ttp: b35/782 bl:3.0135 bb:1.1967 rl:2.7674 rb:1.0712 dl:115-115 gd:1 +ttp: b25/782 bl:3.2810 bb:1.3003 rl:2.7681 rb:1.0715 dl:106-107 gd:1 +ttp: b18/782 
bl:3.1400 bb:1.2720 rl:2.7685 rb:1.0717 dl:99-100 gd:1 +ttp: b9/782 bl:3.1890 bb:1.2637 rl:2.7690 rb:1.0719 dl:87-89 gd:1 +ttp: b2/782 bl:3.1434 bb:1.1659 rl:2.7693 rb:1.0720 dl:70-75 gd:1 +quantized_ttt_phased val_loss:2.76205552 val_bpb:1.06927777 eval_time:438339ms +total_eval_time:438.3s diff --git a/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/train_seed999.log b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/train_seed999.log new file mode 100644 index 0000000000..c3ed18aaee --- /dev/null +++ b/records/track_10min_16mb/2026-04-28_TTT_LORA_RANK_192_on_PR1874/train_seed999.log @@ -0,0 +1,763 @@ +W0428 10:27:18.286000 591851 torch/distributed/run.py:803] +W0428 10:27:18.286000 591851 torch/distributed/run.py:803] ***************************************** +W0428 10:27:18.286000 591851 torch/distributed/run.py:803] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed. 
+W0428 10:27:18.286000 591851 torch/distributed/run.py:803] ***************************************** +Hyperparameters: + adam_eps: 1e-08 + adam_wd: 0.02 + artifact_dir: + attn_clip_sigmas: 13.0 + beta1: 0.9 + beta2: 0.95 + compressor: brotli + data_dir: ./data/ + datasets_dir: ./data/datasets/fineweb10B_sp8192 + distributed: True + ema_decay: 0.9965 + embed_bits: 7 + embed_clip_sigmas: 15.0 + embed_lr: 0.6 + embed_wd: 0.085 + enable_looping_at: 0.35 + eval_seq_len: 2048 + eval_stride: 64 + gate_attn_out: True + gate_attn_width: 24 + global_ttt_batch_seqs: 32 + global_ttt_chunk_tokens: 32768 + global_ttt_epochs: 1 + global_ttt_grad_clip: 1.0 + global_ttt_lr: 0.001 + global_ttt_momentum: 0.9 + global_ttt_respect_doc_boundaries: True + global_ttt_warmup_chunks: 0 + global_ttt_warmup_start_lr: 0.0 + gptq_calibration_batches: 16 + gptq_reserve_seconds: 4.0 + grad_accum_steps: 1 + grad_clip_norm: 0.3 + is_main_process: True + iterations: 20000 + ln_scale: True + local_rank: 0 + logfile: logs/champion_3seed_999.txt + logit_softcap: 30.0 + loop_end: 5 + loop_start: 3 + lqer_asym_enabled: True + lqer_asym_group: 64 + lqer_enabled: True + lqer_factor_bits: 4 + lqer_rank: 4 + lqer_top_k: 3 + matrix_bits: 6 + matrix_clip_sigmas: 12.85 + matrix_lr: 0.026 + max_wallclock_seconds: 600.0 + min_lr: 0.0 + mlp_clip_sigmas: 12.0 + mlp_mult: 4.0 + model_dim: 512 + model_path: final_model.pt + muon_backend_steps: 5 + muon_momentum: 0.97 + muon_momentum_warmup_start: 0.92 + muon_momentum_warmup_steps: 1500 + muon_row_normalize: True + muon_wd: 0.095 + newton_muon_all_reduce_k: True + newton_muon_beta: 0.95 + newton_muon_capture_every: 4 + newton_muon_enabled: False + newton_muon_gamma: 0.2 + newton_muon_k_refresh: 32 + newton_muon_warmup: 100 + num_heads: 8 + num_kv_heads: 4 + num_layers: 11 + num_loops: 2 + parallel_final_lane: mean + parallel_start_layer: 8 + phased_ttt_enabled: True + phased_ttt_num_phases: 3 + phased_ttt_prefix_docs: 2000 + polar_express_ns: True + qk_gain_init: 
5.25 + quantized_model_path: final_model.int6.ptz + rank: 0 + rope_base: 10000.0 + rope_dims: 16 + rope_train_seq_len: 2048 + rope_yarn: False + run_id: champion_3seed_999 + scalar_lr: 0.02 + seed: 999 + skip_gates_enabled: True + sliding_window_enabled: False + smear_gate_enabled: True + smear_gate_width: 12 + tie_embeddings: True + tied_embed_init_std: 0.005 + tied_embed_lr: 0.03 + tokenizer_path: ./data/tokenizers/fineweb_8192_bpe.model + train_batch_tokens: 786432 + train_files: ./data/datasets/fineweb10B_sp8192/fineweb_train_*.bin + train_log_every: 500 + train_seq_len: 2048 + ttt_batch_size: 64 + ttt_beta1: 0.0 + ttt_beta2: 0.999 + ttt_chunk_size: 48 + ttt_enabled: True + ttt_eval_batches: + ttt_eval_seq_len: 2048 + ttt_grad_steps: 1 + ttt_k_lora: True + ttt_lora_lr: 0.0001 + ttt_lora_rank: 192 + ttt_mlp_lora: True + ttt_o_lora: True + ttt_optimizer: adam + ttt_weight_decay: 1.0 + val_batch_tokens: 524288 + val_doc_fraction: 1.0 + val_files: ./data/datasets/fineweb10B_sp8192/fineweb_val_*.bin + val_loss_every: 4000 + vocab_size: 8192 + warmdown_frac: 0.75 + warmup_steps: 20 + world_size: 8 + xsa_last_n: 11 +train_shards: 80 +val_tokens: 40540160 +model_params:35946727 +gptq:reserving 4s, effective=596000ms +warmup_cu_buckets:64,128,192,256 iters_each:3 +warmup_step: 1/20 +warmup_step: 2/20 +warmup_step: 3/20 +warmup_step: 4/20 +warmup_step: 5/20 +warmup_step: 6/20 +warmup_step: 10/20 +warmup_step: 20/20 +loop_warmup:enabled encoder:[0, 1, 2, 3, 4, 5, 3, 4] decoder:[5, 3, 4, 5, 6, 7, 8, 9, 10] +loop_warmup_step: 1/20 +loop_warmup_step: 2/20 +loop_warmup_step: 3/20 +loop_warmup_step: 4/20 +loop_warmup_step: 5/20 +loop_warmup_step: 6/20 +loop_warmup_step: 10/20 +loop_warmup_step: 20/20 +0/20000 val_loss: 9.0098 val_bpb: 3.4879 +1/20000 train_loss: 9.0099 train_time: 0.0m tok/s: 12386149 +2/20000 train_loss: 12.2482 train_time: 0.0m tok/s: 11396055 +3/20000 train_loss: 11.2760 train_time: 0.0m tok/s: 10211786 +4/20000 train_loss: 9.5976 train_time: 0.0m tok/s: 
9662630 +5/20000 train_loss: 8.1260 train_time: 0.0m tok/s: 9367847 +500/20000 train_loss: 3.2543 train_time: 0.8m tok/s: 8084855 +1000/20000 train_loss: 3.0212 train_time: 1.6m tok/s: 8041318 +1500/20000 train_loss: 3.0272 train_time: 2.4m tok/s: 8031538 +2000/20000 train_loss: 2.9714 train_time: 3.3m tok/s: 8031330 +layer_loop:enabled step:2130 frac:0.350 encoder:[0, 1, 2, 3, 4, 5, 3, 4] decoder:[5, 3, 4, 5, 6, 7, 8, 9, 10] +2500/20000 train_loss: 3.0622 train_time: 4.4m tok/s: 7485709 +3000/20000 train_loss: 2.8979 train_time: 5.6m tok/s: 7042430 +3500/20000 train_loss: 2.9663 train_time: 6.8m tok/s: 6756891 +4000/20000 train_loss: 2.8994 train_time: 8.0m tok/s: 6574085 +4000/20000 val_loss: 2.8727 val_bpb: 1.1121 +4500/20000 train_loss: 2.8477 train_time: 9.2m tok/s: 6438724 +4827/20000 val_loss: 2.7697 val_bpb: 1.0722 +stopping_early: wallclock_cap train_time: 596285ms step: 4827/20000 +peak memory allocated: 40141 MiB reserved: 44206 MiB +ema:applying EMA weights +diagnostic pre-quantization post-ema val_loss:2.76861383 val_bpb:1.07178215 eval_time:6873ms +Serialized model: 135422397 bytes +Code size (uncompressed): 134706 bytes +Code size (compressed): 33710 bytes +GPTQ:collecting Hessians from calibration data... 
+GPTQ:collected 67 Hessians in 3.5s +Quantized weights: + gptq (int6): blocks.attn.c_k.weight, blocks.attn.c_q.weight, blocks.attn.c_v.weight, blocks.attn.proj.weight, blocks.mlp.fc.weight, blocks.mlp.proj.weight + gptq (int6)+lqer_asym: blocks.mlp.fc.weight + gptq (int7)+lqer_asym: tok_emb.weight + passthrough (float16): blocks.attn.attn_gate_proj.weight, blocks.attn.q_gain, blocks.attn_scale, blocks.mlp_scale, blocks.resid_mix, parallel_post_lambdas, parallel_resid_lambdas, skip_gates, skip_weights, smear_gate.weight, smear_lambda +Serialized model quantized+brotli: 15914086 bytes +Total submission size quantized+brotli: 15947796 bytes +diagnostic quantized val_loss:2.79392202 val_bpb:1.08157942 eval_time:10776ms +ttt_lora:warming up compile (random tokens, no val data) +ttt_lora:compile warmup done (110.9s) + +beginning TTT eval timer +ttt_phased: total_docs:50000 prefix_docs:2000 suffix_docs:48000 num_phases:3 boundaries:[666, 1333, 2000] +ttp: b776/782 bl:2.7141 bb:1.0860 rl:2.7141 rb:1.0860 dl:6364-7180 gd:0 +ttp: b773/782 bl:2.6477 bb:1.0743 rl:2.6847 rb:1.0808 dl:5203-5550 gd:0 +ttp: b768/782 bl:2.6946 bb:1.0814 rl:2.6873 rb:1.0810 dl:4128-4306 gd:0 +ttpp: phase:1/3 pd:1104 gd:666 t:202.9s +tttg: c1/95 lr:0.001000 t:0.3s +tttg: c2/95 lr:0.001000 t:0.4s +tttg: c3/95 lr:0.000999 t:0.5s +tttg: c4/95 lr:0.000997 t:0.6s +tttg: c5/95 lr:0.000996 t:0.7s +tttg: c6/95 lr:0.000993 t:0.8s +tttg: c7/95 lr:0.000990 t:0.9s +tttg: c8/95 lr:0.000986 t:2.5s +tttg: c9/95 lr:0.000982 t:2.6s +tttg: c10/95 lr:0.000978 t:2.6s +tttg: c11/95 lr:0.000972 t:2.7s +tttg: c12/95 lr:0.000967 t:2.8s +tttg: c13/95 lr:0.000960 t:2.9s +tttg: c14/95 lr:0.000954 t:3.0s +tttg: c15/95 lr:0.000946 t:3.0s +tttg: c16/95 lr:0.000938 t:3.1s +tttg: c17/95 lr:0.000930 t:3.2s +tttg: c18/95 lr:0.000921 t:3.3s +tttg: c19/95 lr:0.000912 t:3.4s +tttg: c20/95 lr:0.000903 t:3.4s +tttg: c21/95 lr:0.000892 t:3.5s +tttg: c22/95 lr:0.000882 t:3.6s +tttg: c23/95 lr:0.000871 t:3.7s +tttg: c24/95 lr:0.000859 t:3.7s 
+tttg: c25/95 lr:0.000848 t:3.8s +tttg: c26/95 lr:0.000835 t:3.9s +tttg: c27/95 lr:0.000823 t:4.0s +tttg: c28/95 lr:0.000810 t:4.1s +tttg: c29/95 lr:0.000797 t:4.1s +tttg: c30/95 lr:0.000783 t:4.2s +tttg: c31/95 lr:0.000769 t:4.3s +tttg: c32/95 lr:0.000755 t:4.4s +tttg: c33/95 lr:0.000740 t:4.5s +tttg: c34/95 lr:0.000726 t:4.5s +tttg: c35/95 lr:0.000710 t:4.6s +tttg: c36/95 lr:0.000695 t:4.7s +tttg: c37/95 lr:0.000680 t:4.8s +tttg: c38/95 lr:0.000664 t:4.9s +tttg: c39/95 lr:0.000648 t:4.9s +tttg: c40/95 lr:0.000632 t:5.0s +tttg: c41/95 lr:0.000616 t:5.1s +tttg: c42/95 lr:0.000600 t:5.2s +tttg: c43/95 lr:0.000583 t:5.2s +tttg: c44/95 lr:0.000567 t:5.3s +tttg: c45/95 lr:0.000550 t:5.4s +tttg: c46/95 lr:0.000533 t:5.5s +tttg: c47/95 lr:0.000517 t:5.6s +tttg: c48/95 lr:0.000500 t:5.6s +tttg: c49/95 lr:0.000483 t:5.7s +tttg: c50/95 lr:0.000467 t:5.8s +tttg: c51/95 lr:0.000450 t:5.9s +tttg: c52/95 lr:0.000433 t:6.0s +tttg: c53/95 lr:0.000417 t:6.0s +tttg: c54/95 lr:0.000400 t:6.1s +tttg: c55/95 lr:0.000384 t:6.2s +tttg: c56/95 lr:0.000368 t:6.3s +tttg: c57/95 lr:0.000352 t:6.4s +tttg: c58/95 lr:0.000336 t:6.4s +tttg: c59/95 lr:0.000320 t:6.5s +tttg: c60/95 lr:0.000305 t:6.6s +tttg: c61/95 lr:0.000290 t:6.7s +tttg: c62/95 lr:0.000274 t:6.7s +tttg: c63/95 lr:0.000260 t:6.8s +tttg: c64/95 lr:0.000245 t:6.9s +tttg: c65/95 lr:0.000231 t:7.0s +tttg: c66/95 lr:0.000217 t:7.1s +tttg: c67/95 lr:0.000203 t:7.1s +tttg: c68/95 lr:0.000190 t:7.2s +tttg: c69/95 lr:0.000177 t:7.3s +tttg: c70/95 lr:0.000165 t:7.4s +tttg: c71/95 lr:0.000152 t:7.4s +tttg: c72/95 lr:0.000141 t:7.5s +tttg: c73/95 lr:0.000129 t:7.6s +tttg: c74/95 lr:0.000118 t:7.7s +tttg: c75/95 lr:0.000108 t:7.8s +tttg: c76/95 lr:0.000097 t:7.8s +tttg: c77/95 lr:0.000088 t:7.9s +tttg: c78/95 lr:0.000079 t:8.0s +tttg: c79/95 lr:0.000070 t:8.1s +tttg: c80/95 lr:0.000062 t:8.2s +tttg: c81/95 lr:0.000054 t:8.2s +tttg: c82/95 lr:0.000046 t:8.3s +tttg: c83/95 lr:0.000040 t:8.4s +tttg: c84/95 lr:0.000033 t:8.5s +tttg: c85/95 
lr:0.000028 t:8.5s +tttg: c86/95 lr:0.000022 t:8.6s +tttg: c87/95 lr:0.000018 t:8.7s +tttg: c88/95 lr:0.000014 t:8.8s +tttg: c89/95 lr:0.000010 t:8.9s +tttg: c90/95 lr:0.000007 t:8.9s +tttg: c91/95 lr:0.000004 t:9.0s +tttg: c92/95 lr:0.000003 t:9.1s +tttg: c93/95 lr:0.000001 t:9.2s +tttg: c94/95 lr:0.000000 t:9.2s +ttpr: phase:1/3 t:216.3s +ttp: b760/782 bl:2.8394 bb:1.1151 rl:2.7129 rb:1.0868 dl:3255-3334 gd:0 +ttpp: phase:2/3 pd:1808 gd:1333 t:286.2s +tttg: c1/158 lr:0.001000 t:0.1s +tttg: c2/158 lr:0.001000 t:0.2s +tttg: c3/158 lr:0.001000 t:0.2s +tttg: c4/158 lr:0.000999 t:0.3s +tttg: c5/158 lr:0.000998 t:0.4s +tttg: c6/158 lr:0.000997 t:0.5s +tttg: c7/158 lr:0.000996 t:0.6s +tttg: c8/158 lr:0.000995 t:0.6s +tttg: c9/158 lr:0.000994 t:0.7s +tttg: c10/158 lr:0.000992 t:0.8s +tttg: c11/158 lr:0.000990 t:0.9s +tttg: c12/158 lr:0.000988 t:0.9s +tttg: c13/158 lr:0.000986 t:1.0s +tttg: c14/158 lr:0.000983 t:1.1s +tttg: c15/158 lr:0.000981 t:1.2s +tttg: c16/158 lr:0.000978 t:1.3s +tttg: c17/158 lr:0.000975 t:1.3s +tttg: c18/158 lr:0.000971 t:1.4s +tttg: c19/158 lr:0.000968 t:1.5s +tttg: c20/158 lr:0.000964 t:1.6s +tttg: c21/158 lr:0.000960 t:1.7s +tttg: c22/158 lr:0.000957 t:1.7s +tttg: c23/158 lr:0.000952 t:1.8s +tttg: c24/158 lr:0.000948 t:1.9s +tttg: c25/158 lr:0.000943 t:2.0s +tttg: c26/158 lr:0.000939 t:2.1s +tttg: c27/158 lr:0.000934 t:2.1s +tttg: c28/158 lr:0.000929 t:2.2s +tttg: c29/158 lr:0.000924 t:2.3s +tttg: c30/158 lr:0.000918 t:2.4s +tttg: c31/158 lr:0.000913 t:2.5s +tttg: c32/158 lr:0.000907 t:2.6s +tttg: c33/158 lr:0.000901 t:2.7s +tttg: c34/158 lr:0.000895 t:2.8s +tttg: c35/158 lr:0.000889 t:2.8s +tttg: c36/158 lr:0.000882 t:2.9s +tttg: c37/158 lr:0.000876 t:3.0s +tttg: c38/158 lr:0.000869 t:3.1s +tttg: c39/158 lr:0.000862 t:3.2s +tttg: c40/158 lr:0.000855 t:3.3s +tttg: c41/158 lr:0.000848 t:3.3s +tttg: c42/158 lr:0.000841 t:3.4s +tttg: c43/158 lr:0.000834 t:3.5s +tttg: c44/158 lr:0.000826 t:3.6s +tttg: c45/158 lr:0.000818 t:3.6s +tttg: c46/158 
lr:0.000811 t:3.7s +tttg: c47/158 lr:0.000803 t:3.8s +tttg: c48/158 lr:0.000795 t:3.9s +tttg: c49/158 lr:0.000787 t:4.0s +tttg: c50/158 lr:0.000778 t:4.0s +tttg: c51/158 lr:0.000770 t:4.1s +tttg: c52/158 lr:0.000761 t:4.2s +tttg: c53/158 lr:0.000753 t:4.3s +tttg: c54/158 lr:0.000744 t:4.4s +tttg: c55/158 lr:0.000735 t:4.4s +tttg: c56/158 lr:0.000727 t:4.5s +tttg: c57/158 lr:0.000718 t:4.6s +tttg: c58/158 lr:0.000709 t:4.7s +tttg: c59/158 lr:0.000699 t:4.7s +tttg: c60/158 lr:0.000690 t:4.8s +tttg: c61/158 lr:0.000681 t:4.9s +tttg: c62/158 lr:0.000672 t:5.0s +tttg: c63/158 lr:0.000662 t:5.1s +tttg: c64/158 lr:0.000653 t:5.2s +tttg: c65/158 lr:0.000643 t:5.2s +tttg: c66/158 lr:0.000633 t:5.3s +tttg: c67/158 lr:0.000624 t:5.4s +tttg: c68/158 lr:0.000614 t:5.5s +tttg: c69/158 lr:0.000604 t:5.5s +tttg: c70/158 lr:0.000594 t:5.6s +tttg: c71/158 lr:0.000585 t:5.7s +tttg: c72/158 lr:0.000575 t:5.8s +tttg: c73/158 lr:0.000565 t:5.9s +tttg: c74/158 lr:0.000555 t:5.9s +tttg: c75/158 lr:0.000545 t:6.0s +tttg: c76/158 lr:0.000535 t:6.1s +tttg: c77/158 lr:0.000525 t:6.2s +tttg: c78/158 lr:0.000515 t:6.3s +tttg: c79/158 lr:0.000505 t:6.3s +tttg: c80/158 lr:0.000495 t:6.4s +tttg: c81/158 lr:0.000485 t:6.5s +tttg: c82/158 lr:0.000475 t:6.6s +tttg: c83/158 lr:0.000465 t:6.7s +tttg: c84/158 lr:0.000455 t:6.7s +tttg: c85/158 lr:0.000445 t:6.8s +tttg: c86/158 lr:0.000435 t:6.9s +tttg: c87/158 lr:0.000425 t:7.0s +tttg: c88/158 lr:0.000415 t:7.1s +tttg: c89/158 lr:0.000406 t:7.1s +tttg: c90/158 lr:0.000396 t:7.2s +tttg: c91/158 lr:0.000386 t:7.3s +tttg: c92/158 lr:0.000376 t:7.4s +tttg: c93/158 lr:0.000367 t:7.5s +tttg: c94/158 lr:0.000357 t:7.5s +tttg: c95/158 lr:0.000347 t:7.6s +tttg: c96/158 lr:0.000338 t:7.7s +tttg: c97/158 lr:0.000328 t:7.8s +tttg: c98/158 lr:0.000319 t:7.8s +tttg: c99/158 lr:0.000310 t:7.9s +tttg: c100/158 lr:0.000301 t:8.0s +tttg: c101/158 lr:0.000291 t:8.1s +tttg: c102/158 lr:0.000282 t:8.2s +tttg: c103/158 lr:0.000273 t:8.2s +tttg: c104/158 lr:0.000265 t:8.3s 
+tttg: c105/158 lr:0.000256 t:8.4s +tttg: c106/158 lr:0.000247 t:8.5s +tttg: c107/158 lr:0.000239 t:8.6s +tttg: c108/158 lr:0.000230 t:8.6s +tttg: c109/158 lr:0.000222 t:8.7s +tttg: c110/158 lr:0.000213 t:8.8s +tttg: c111/158 lr:0.000205 t:8.9s +tttg: c112/158 lr:0.000197 t:8.9s +tttg: c113/158 lr:0.000189 t:9.0s +tttg: c114/158 lr:0.000182 t:9.1s +tttg: c115/158 lr:0.000174 t:9.2s +tttg: c116/158 lr:0.000166 t:9.3s +tttg: c117/158 lr:0.000159 t:9.3s +tttg: c118/158 lr:0.000152 t:9.4s +tttg: c119/158 lr:0.000145 t:9.5s +tttg: c120/158 lr:0.000138 t:9.6s +tttg: c121/158 lr:0.000131 t:9.7s +tttg: c122/158 lr:0.000124 t:9.7s +tttg: c123/158 lr:0.000118 t:9.8s +tttg: c124/158 lr:0.000111 t:9.9s +tttg: c125/158 lr:0.000105 t:10.0s +tttg: c126/158 lr:0.000099 t:10.1s +tttg: c127/158 lr:0.000093 t:10.2s +tttg: c128/158 lr:0.000087 t:10.2s +tttg: c129/158 lr:0.000082 t:10.3s +tttg: c130/158 lr:0.000076 t:10.4s +tttg: c131/158 lr:0.000071 t:10.5s +tttg: c132/158 lr:0.000066 t:10.5s +tttg: c133/158 lr:0.000061 t:10.6s +tttg: c134/158 lr:0.000057 t:10.7s +tttg: c135/158 lr:0.000052 t:10.8s +tttg: c136/158 lr:0.000048 t:10.9s +tttg: c137/158 lr:0.000043 t:10.9s +tttg: c138/158 lr:0.000040 t:11.0s +tttg: c139/158 lr:0.000036 t:11.1s +tttg: c140/158 lr:0.000032 t:11.2s +tttg: c141/158 lr:0.000029 t:11.2s +tttg: c142/158 lr:0.000025 t:11.3s +tttg: c143/158 lr:0.000022 t:11.4s +tttg: c144/158 lr:0.000019 t:11.5s +tttg: c145/158 lr:0.000017 t:11.6s +tttg: c146/158 lr:0.000014 t:11.6s +tttg: c147/158 lr:0.000012 t:11.7s +tttg: c148/158 lr:0.000010 t:11.8s +tttg: c149/158 lr:0.000008 t:11.9s +tttg: c150/158 lr:0.000006 t:12.0s +tttg: c151/158 lr:0.000005 t:12.0s +tttg: c152/158 lr:0.000004 t:12.1s +tttg: c153/158 lr:0.000003 t:12.2s +tttg: c154/158 lr:0.000002 t:12.3s +tttg: c155/158 lr:0.000001 t:12.4s +tttg: c156/158 lr:0.000000 t:12.4s +tttg: c157/158 lr:0.000000 t:12.5s +ttpr: phase:2/3 t:302.8s +ttp: b749/782 bl:2.8239 bb:1.0865 rl:2.7260 rb:1.0868 dl:2580-2638 gd:0 +ttpp: 
What follows is the tail of the phase-3 run log (its final `val_bpb:1.07035739` and `total_eval_time:434.3s` match the seed-999 row of the results table; the unabridged version is [`train_seed999.log`](train_seed999.log)). The per-chunk TTT global-SGD lines (`tttg`) and per-batch phased-eval lines (`ttp`) are abridged here for readability; the closing summary lines are reproduced verbatim:

```
phase:3/3 pd:2448 gd:2000 t:318.0s
tttg: c1/213 lr:0.001000 t:0.1s
tttg: c2/213 lr:0.001000 t:0.2s
tttg: c3/213 lr:0.001000 t:0.2s
tttg: c4/213 lr:0.001000 t:0.3s
tttg: c5/213 lr:0.000999 t:0.4s
[... tttg c6/213 through c209/213 elided: lr decays smoothly from 0.000999 to 0.000000 ...]
tttg: c210/213 lr:0.000000 t:16.9s
tttg: c211/213 lr:0.000000 t:17.0s
tttg: c212/213 lr:0.000000 t:17.1s
ttpr: phase:3/3 t:339.1s
ttp: b740/782 bl:2.7300 bb:1.0307 rl:2.7264 rb:1.0813 dl:2254-2285 gd:1
ttp: b731/782 bl:2.7681 bb:1.0565 rl:2.7296 rb:1.0794 dl:2017-2041 gd:1
ttp: b720/782 bl:2.8121 bb:1.0741 rl:2.7349 rb:1.0790 dl:1816-1832 gd:1
[... ttp b712/782 through b26/782 elided: running quantized loss/bpb (rl/rb) accumulate over the remaining batches ...]
ttp: b16/782 bl:3.0433 bb:1.2134 rl:2.7764 rb:1.0736 dl:97-98 gd:1
ttp: b6/782 bl:3.2782 bb:1.2790 rl:2.7769 rb:1.0739 dl:82-84 gd:1
quantized_ttt_phased val_loss:2.76484430 val_bpb:1.07035739 eval_time:434330ms
total_eval_time:434.3s
```
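The `lr` values in the `tttg` chunk lines are consistent with a plain cosine decay from a base lr of 1e-3 down to 0 across the 213 chunks. The sketch below is an inference from the logged values, not the authoritative schedule (that lives in `train_gpt.py`); `tttg_lr` is a hypothetical helper name:

```python
import math

def tttg_lr(c: int, n_chunks: int = 213, base_lr: float = 1e-3) -> float:
    """Hypothetical cosine decay matching the tttg log lines:
    lr(c) = base_lr/2 * (1 + cos(pi * (c-1) / (n_chunks-1))) for c = 1..n_chunks-1.
    """
    return 0.5 * base_lr * (1 + math.cos(math.pi * (c - 1) / (n_chunks - 1)))

# Spot-check against values printed in the log (6 decimal places):
for c, logged in [(1, 0.001000), (5, 0.000999), (107, 0.000500),
                  (172, 0.000089), (212, 0.000000)]:
    assert round(tttg_lr(c), 6) == logged, (c, tttg_lr(c), logged)
```

The midpoint is a useful sanity check: chunk c107 sits exactly halfway through the 212 logged steps, and the log shows `lr:0.000500`, i.e. exactly half the base lr, which is what a symmetric cosine schedule predicts.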
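The final `quantized_ttt_phased` line is the one the results table is built from. A minimal sketch of pulling the three metrics out of a log with a regex (`parse_summary` is a hypothetical helper; the field names are exactly as they appear in the log):

```python
import re

# Verbatim summary line from the log tail above.
LINE = "quantized_ttt_phased val_loss:2.76484430 val_bpb:1.07035739 eval_time:434330ms"

def parse_summary(line: str) -> dict:
    """Extract val_loss, val_bpb, and eval time (in seconds) from a
    quantized_ttt_phased summary line."""
    m = re.search(
        r"quantized_ttt_phased val_loss:(?P<val_loss>[\d.]+) "
        r"val_bpb:(?P<val_bpb>[\d.]+) eval_time:(?P<eval_ms>\d+)ms",
        line,
    )
    if m is None:
        raise ValueError("no quantized_ttt_phased summary line found")
    return {
        "val_loss": float(m["val_loss"]),
        "val_bpb": float(m["val_bpb"]),
        "eval_s": int(m["eval_ms"]) / 1000.0,
    }

print(parse_summary(LINE))
# → {'val_loss': 2.7648443, 'val_bpb': 1.07035739, 'eval_s': 434.33}
```

Run against the three included logs, this recovers the per-seed `val_bpb` values in the results table; note the per-line `total_eval_time:434.3s` is just the rounded form of `eval_time:434330ms`.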