diff --git a/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/README.md b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/README.md new file mode 100644 index 0000000000..5278898fd9 --- /dev/null +++ b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/README.md @@ -0,0 +1,197 @@ +# Record: Leaky ReLU Slope + GPTQ Reverse-Cholesky Speedup + PR #1938 (val_bpb = 1.06242) + +> **Note:** This README captures only the bare submission record. The full +> set of insights from our parameter-golf run — every PR iteration we tried, +> the hyperparameter-tuning experiments behind each design choice, and the +> ablation results that drove our decisions — is being compiled into a +> detailed write-up. A more detailed write-up is at: https://www.junchengbillyli.com/llm-notes.html + +**val_bpb (3-seed mean) = 1.06242** | σ ≈ 0.00013 | **~15.95 MB** | 8×H100 SXM | 600 s training + 600 s eval + +A joint effort by **Tim Shen ([@TimS-ml](https://github.com/TimS-ml))** and **Billy Li ([@lijuncheng16](https://github.com/lijuncheng16))**, with thanks to **Prof. Lin Hao (Fordham University)** for sponsoring the **8×H100 SXM** and **4×RTX 4090** compute used in this submission, **[Xingyuan Ding](https://www.linkedin.com/in/xingyuan-ding-1b1349130)** for additional experiments, **Bill (Yiyuan) Li** for meaningful discussions on tokenizers, **Lijun Yu ([@Lijun-Yu](https://github.com/Lijun-Yu))** for his invaluable insights, and **Hang Zhou ([@greyjoeyzhou](https://github.com/greyjoeyzhou))** for project discussions. + +## TL;DR + +Extends [PR #1938](https://github.com/openai/parameter-golf/pull/1938) (Billy Li & Tim Shen's *S0/PR1851 + Cap Tokenizer + LQER + Global TTT*, val_bpb=1.0713) with two algorithmically free wins: + +1. **Leaky ReLU squared slope 0.5 → 0.3** — `−0.00073` BPB free win; size-neutral, wallclock-neutral. (4-point sweep confirms 0.3 is the minimum — see Key Change 1.) +2. **GPTQ reverse-Cholesky + triangular solve** instead of the standard `chol → cholesky_inverse → chol(upper)` — mathematically equivalent within fp32 ULP, **2.07–2.24× faster on RTX 4090 cuSOLVER microbench** at the GPTQ workload range. (Key Change 2.) + +Both are hardcoded inside `train_gpt.py` (the variant from [PR #1867](https://github.com/openai/parameter-golf/pull/1867)), which also ships **this PR's compliance-tuned defaults on top of PR #1938**: `LQER_TOP_K=1`, `GATED_ATTN_QUANT_GATE=1`, `TTT_BATCH_SIZE=16`, `PHASED_TTT_NUM_PHASES=3`, `GPTQ_RESERVE_SECONDS=16`. + +## Result + +| Seed | **Post-TTT val_bpb (final)** | Artifact bytes | +|------|-----------------------------:|---------------:| +| 1334 | **1.06257** | 15,947,664 | +| 42 | **1.06232** | 15,945,920 | +| 999 | **1.06237** | 15,946,532 | +| **Mean** | **1.06242** (σ ≈ 0.00013) | **15,946,705** | + + +## GPTQ reserve-time accounting +> **(04-30):** We've noticed that several +> leaderboard submissions appear to exceed the 10-minute training cap once the +> full GPTQ pipeline (Hessian collection, quantization, serialize, compress) is +> accounted for. From our own measurements, `gptq_reserve_seconds=0.5s` is +> **far insufficient**: GPTQ Hessian collection takes **~3.5-4 s** (depending +> on calibration batch size), GPTQ quantization itself **~10 s**, and the +> serialize+compress step adds another **~60-70 s for Brotli** or **~90-100 s +> for lrzip pergroup**. 
Among the top leaderboard PRs we surveyed, observed +> `gptq_reserve_seconds` values range across **0.5 / 4 / 8 s**; this submission +> uses **16 s** so that the full pipeline completes inside the 600 s training +> cap with margin. The few-second discrepancy is unlikely to be large enough +> to materially change the leaderboard score or ranking, but we think it's +> worth flagging. + +## Key Change 1: Leaky ReLU² slope = 0.3 + +4-point sweep at fixed seed=42 / 1.0× batch / 600 s wallclock: + +| slope | TTT BPB | Δ vs 0.30 | +|------:|--------:|----------:| +| 0.25 | 1.06151 | +0.00012 | +| **0.30** | **1.06139** | 0 | +| 0.35 | 1.06192 | +0.00053 | +| 0.50 (prior baseline) | 1.06212 | +0.00073 | +| 0.70 | 1.06267 | +0.00128 | + +Shallow V minimum at 0.3, size-neutral, no wallclock cost. Hardcoded in `train_gpt.py` lines 694-695 (Triton kernel) and line 910 (eager fallback). + +## Key Change 2: GPTQ reverse-Cholesky Hinv path + +Replaces + +```python +Hinv = torch.cholesky_inverse(torch.linalg.cholesky(H)) # 1 chol + 2 tri-solve +Hinv = torch.linalg.cholesky(Hinv, upper=True) # 1 chol on dense H^{-1} +``` + +with the mathematically equivalent single-pass + +```python +H_flip = torch.flip(H, dims=(0, 1)) +L_flip = torch.linalg.cholesky(H_flip) +U = torch.flip(L_flip, dims=(0, 1)) +Hinv = torch.linalg.solve_triangular(U, eye, upper=True) +``` + +(The proof uses `chol(H^{-1}, upper)` uniqueness under the positive-diagonal constraint; full derivation in the authors' Stage 7 ablation note.) + +**RTX 4090 cuSOLVER fp32 microbench:** + +| n | baseline | reverse_cholesky | speedup | +|--:|---------:|-----------------:|--------:| +| 512 | 0.78 ms | 0.38 ms | **2.07×** | +| 1024 | 1.80 ms | 0.82 ms | **2.18×** | +| 2048 | 3.91 ms | 1.75 ms | **2.23×** | +| 4096 | 12.99 ms | 5.81 ms | **2.24×** | + +Numerics: max relative error ≤ 5.3e-7 across `n=64..2048`; artifact bytes byte-equivalent within brotli noise. Hardcoded in `train_gpt.py` lines 1870-1874. + +## Compliance-tuned defaults (this PR vs PR #1938) + +| Hparam | PR #1938 | This | Reason | +|--------|---------:|-----:|--------| +| `LQER_TOP_K` | 3 | **1** | top-error matrix (`tok_emb`) only; −0.00044 BPB, saves bytes | +| `GATED_ATTN_QUANT_GATE` | 0 | **1** | int8 row-quant for `attn_gate_w`; −0.00011 BPB | +| `TTT_BATCH_SIZE` | 64 | **16** | smaller phased batch | +| `PHASED_TTT_NUM_PHASES` | 1 | **3** | −0.00118 BPB | +| `GPTQ_RESERVE_SECONDS` | 4 | **16** | observed Hessian (3.5 s) + quantize (12.2 s) ≈ 16 s; required for `train+GPTQ ≤ 600 s` | +| `LEAKY_RELU_SQ_SLOPE` (in script) | 0.5 | **0.3** | Key Change 1 | +| GPTQ Hinv path (in script) | `cholesky_inverse + chol(upper)` | **reverse Cholesky + tri-solve** | Key Change 2 | + +All other hparams inherit from `train_gpt.py`'s `Hyperparameters` defaults, which match the PR #1938 envelope. + +## Architecture + +11L × 512d × 8H / 4KV, MLP 4× (2048 hidden), **LeakyReLU(0.3)²**. Partial RoPE (16/64 dims), layerwise LN scale, tied embeddings (vocab 8192, caseops-augmented), logit softcap=30.0. Depth recurrence (loops layers 3-5, ×2, activated at frac=0.35). Parallel residuals from layer 8. Skip gates. SmearGate with BOS mask. Sparse attention gates. `model_params = 35,945,671`. 
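+
+For reference, a minimal eager-mode sketch of the **LeakyReLU(0.3)²** activation as the name is read here. This is only the literal reading of the name; the canonical implementation is the Triton kernel hardcoded in `train_gpt.py` (lines 694-695, eager fallback at line 910), so treat the snippet as illustrative rather than a copy of that kernel:
+
+```python
+import torch
+import torch.nn.functional as F
+
+def leaky_relu_squared(x: torch.Tensor, slope: float = 0.3) -> torch.Tensor:
+    # Literal reading of "LeakyReLU(slope) squared": scale negative inputs by
+    # `slope`, then square elementwise (an assumption; see train_gpt.py for the
+    # hardcoded kernel this record actually runs).
+    return F.leaky_relu(x, negative_slope=slope).square()
+```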
+ +## Quantization + +Full-Hessian GPTQ + SDClip, on the reverse-Cholesky Hinv path: + +- **GPTQ int6** (clip_sigmas=12.85): all attn (`c_q`, `c_k`, `c_v`, `proj`) and MLP (`fc`, `proj`) weights +- **GPTQ int7 + LQER asymmetric** (rank=4, factor int4, group_size=64): `tok_emb.weight` only (`LQER_TOP_K=1`) +- **Dedicated int8 row-quant**: `attn_gate_w` (`GATED_ATTN_QUANT_GATE=1`) +- **fp16 passthrough**: scalar params + small parameter weights +- **Brotli-11** final compression → artifact ≈ 15.95 MB + +## TTT + +Phased TTT, 3 phases × 2000 prefix docs, score-first, Adam optimizer, cosine LR (peak 1e-4). LoRA rank=96 over K, MLP, O projections. `TTT_BATCH_SIZE=16`. The script's `total_eval_time` is the canonical eval timer (matches the convention used by past SOTA records). + +## Compliance + +| Cap | Limit | Observed | Margin | +|-----|------:|---------:|-------:| +| Artifact (decimal) | 16,000,000 bytes | 15,947,664 (max of 3 seeds) | 52,336 bytes | +| `train + GPTQ` | 600 s | 584.1 s + 15.6 s ≈ 599.7 s | ~0.3 s | +| `total_eval_time` | 600 s | 482.6 s / 485.6 s / 587.7 s | 12–118 s | + +## Dataset + +This submission uses the **pre-built case-op augmented FineWeb-10B** tokenization from +[`romeerp/parameter-golf-caseops-v1`](https://huggingface.co/datasets/romeerp/parameter-golf-caseops-v1) +(pre-built shards), the same dataset that PR #1729 / PR #1736 / PR #1851 use. +The bijective case-op tokenizer (`fineweb_8192_bpe_lossless_caps_caseops_v1_reserved.model`, +shipped in `tokenizers/`) and the build script (`prepare_caseops_data.py` + +`lossless_caps.py`) are included for byte-exact rebuild, but **using the +pre-built shards from `romeerp/parameter-golf-caseops-v1` is the recommended +path**. + +## Reproducing + +```bash +# Option A (recommended): use pre-built shards from HF. +huggingface-cli download romeerp/parameter-golf-caseops-v1 \ + --repo-type dataset \ + --local-dir ./data/datasets/fineweb10B_sp8192_caseops/ + +# Option B: rebuild locally with the shipped scripts: prepare_caseops_data.py + +# Either way, the script expects shards at +# ./data/datasets/fineweb10B_sp8192_caseops/datasets/datasets/fineweb10B_sp8192_lossless_caps_caseops_v1_reserved/ +# (the path layout is preserved across both options). + +export RUN_ID=repro_seed42 +export SEED=42 +torchrun --nproc_per_node=8 --standalone train_gpt.py +``` + +`Hyperparameters` defaults already encode this PR's compliance-tuned envelope (this PR + b-series, on top of PR #1938); no other env exports are needed. 
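+
+To sanity-check Key Change 2's reverse-Cholesky Hinv path in isolation, the two factorizations can be compared on a random SPD matrix. This is a standalone sketch, not an excerpt from `train_gpt.py`; the matrix size and tolerances are illustrative:
+
+```python
+import torch
+
+torch.manual_seed(0)
+n = 512
+A = torch.randn(n, n, dtype=torch.float32)
+H = A @ A.T + n * torch.eye(n)  # SPD stand-in for a GPTQ Hessian
+eye = torch.eye(n)
+
+# Baseline: chol -> cholesky_inverse -> chol(upper) on the dense inverse.
+Hinv_ref = torch.linalg.cholesky(torch.cholesky_inverse(torch.linalg.cholesky(H)), upper=True)
+
+# Reverse Cholesky: factor the flipped H once, then one triangular solve.
+U = torch.flip(torch.linalg.cholesky(torch.flip(H, dims=(0, 1))), dims=(0, 1))
+Hinv_new = torch.linalg.solve_triangular(U, eye, upper=True)
+
+print(torch.allclose(Hinv_ref, Hinv_new, rtol=1e-4, atol=1e-6))  # expect True at fp32
+```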
+ +## Builds On + +| Layer | Origin | +|-------|--------| +| **PR #1938** ([@lijuncheng16](https://github.com/lijuncheng16) & [@TimS-ml](https://github.com/TimS-ml) — *S0/PR1851 + Cap Tokenizer + LQER + Global TTT*, val_bpb=1.0713) | base submission stack | +| **PR #1867** ([@lijuncheng16](https://github.com/lijuncheng16) & [@TimS-ml](https://github.com/TimS-ml)) | training script | +| **PR #1851** ([@aquariouseworkman](https://github.com/aquariouseworkman) — SmearGate BOS fix + LQER asymmetric + phased TTT) | architecture / quantization | +| PR #1797 ([@dexhunter](https://github.com/dexhunter), audit by [@cocohearts](https://github.com/cocohearts)) | SmearGate, LQER asym | +| PR #1787 ([@nprime06](https://github.com/nprime06)) | SparseAttnGate, FusedCE, MIN_LR | +| PR #1729 / PR #1736 ([@romeerp](https://github.com/romeerp)) | CaseOps tokenizer + phased TTT | +| PR #1394 ([@clarkkev](https://github.com/clarkkev)) | GPTQ + SDClip + SP8192 | +| PR #549 ([@abaybektursun](https://github.com/abaybektursun)) | Score-first TTT framework | + +## Acknowledgments + +A joint effort by **Tim Shen ([@TimS-ml](https://github.com/TimS-ml))** and **Billy Li ([@lijuncheng16](https://github.com/lijuncheng16))**. + +With thanks to: + +- **Prof. Lin Hao (Fordham University)** — for sponsoring the 8×H100 SXM and 4×RTX 4090 compute used to produce all sweep, training, and microbench results in this record. +- **[Xingyuan Ding](https://www.linkedin.com/in/xingyuan-ding-1b1349130)** — for experiments and A100 support. +- **Bill (Yiyuan) Li** — for meaningful discussions on tokenizers. +- **Lijun Yu ([@Lijun-Yu](https://github.com/Lijun-Yu))** - for his invaluable insights. +- **Hang Zhou ([@greyjoeyzhou](https://github.com/greyjoeyzhou))** — for project discussions and for the concurrent auto-research agent infrastructure. + +Additional credits (technique stack): + +- [@aquariouseworkman](https://github.com/aquariouseworkman) — PR #1851 SmearGate BOS-fix base stack +- [@cocohearts](https://github.com/cocohearts) — SmearGate BOS audit (PR #1797) +- [@dexhunter](https://github.com/dexhunter) — SmearGate + LQER asymmetric, phased TTT (PR #1797 / PR #1736) +- [@romeerp](https://github.com/romeerp) — CaseOps tokenizer (PR #1729 / PR #1736) +- [@nprime06](https://github.com/nprime06) — SparseAttnGate / FusedCE / MIN_LR (PR #1787) +- [@abaybektursun](https://github.com/abaybektursun) — Score-first TTT framework (PR #549) +- [@clarkkev](https://github.com/clarkkev) — GPTQ + SDClip + SP8192 (PR #1394) diff --git a/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/lossless_caps.py b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/lossless_caps.py new file mode 100644 index 0000000000..98e472f824 --- /dev/null +++ b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/lossless_caps.py @@ -0,0 +1,833 @@ +"""Lossless capitalization pre-encoding helpers. + +This module provides a narrow, reversible transform that only touches +ASCII capital letters `A-Z`. Each uppercase ASCII letter is rewritten as +``, where `sentinel` is a private-use Unicode +character that is escaped by doubling if it appears literally in the +input text. 
+ +Example with the default sentinel `\\uE000`: + + "The NASA Launch" -> "\\uE000the \\uE000n\\uE000a\\uE000s\\uE000a \\uE000launch" + +The transform is intentionally simple for v1: + +- lowercase ASCII letters are unchanged +- uppercase ASCII letters become sentinel + lowercase letter +- non-ASCII characters are left untouched +- literal sentinel characters are escaped as sentinel + sentinel + +This makes the transform exactly invertible while allowing a downstream +tokenizer to reuse lowercase subwords across case variants. +""" + +from __future__ import annotations + +import json +from pathlib import Path +from typing import Callable, Iterable + +LOSSLESS_CAPS_V1 = "lossless_caps_v1" +LOSSLESS_CAPS_V2 = "lossless_caps_v2" +LOSSLESS_CAPS_V3 = "lossless_caps_v3" +LOSSLESS_CAPS_V4 = "lossless_caps_v4" +LOSSLESS_CAPS_V5 = "lossless_caps_v5" +LOSSLESS_CAPS_V6 = "lossless_caps_v6" +LOSSLESS_CAPS_V7 = "lossless_caps_v7" +LOSSLESS_CAPS_CASEOPS_V1 = "lossless_caps_caseops_v1" +IDENTITY = "identity" +DEFAULT_SENTINEL = "\uE000" +DEFAULT_V2_TITLE = "\uE001" +DEFAULT_V2_ALLCAPS = "\uE002" +DEFAULT_V2_CAPNEXT = "\uE003" +DEFAULT_V2_ESC = "\uE004" +DEFAULT_V5_TITLE_MIN_LEN = 7 +DEFAULT_V6_ALLCAPS_MIN_LEN = 3 +DEFAULT_V7_ALLCAPS_MIN_LEN = 4 + + +class LosslessCapsError(ValueError): + """Raised when a transformed string is malformed.""" + + +def _is_ascii_upper(ch: str) -> bool: + return "A" <= ch <= "Z" + + +def _is_ascii_lower(ch: str) -> bool: + return "a" <= ch <= "z" + + +def _is_ascii_alpha(ch: str) -> bool: + return _is_ascii_lower(ch) or _is_ascii_upper(ch) + + +def _validate_distinct_single_chars(*chars: str) -> None: + if any(len(ch) != 1 for ch in chars): + raise ValueError("all control characters must be exactly one character") + if len(set(chars)) != len(chars): + raise ValueError("control characters must be distinct") + + +def encode_lossless_caps_v1(text: str, *, sentinel: str = DEFAULT_SENTINEL) -> str: + """Encode ASCII capitals reversibly using a one-character sentinel.""" + if len(sentinel) != 1: + raise ValueError("sentinel must be exactly one character") + out: list[str] = [] + for ch in text: + if ch == sentinel: + out.append(sentinel) + out.append(sentinel) + elif _is_ascii_upper(ch): + out.append(sentinel) + out.append(ch.lower()) + else: + out.append(ch) + return "".join(out) + + +def decode_lossless_caps_v1(text: str, *, sentinel: str = DEFAULT_SENTINEL) -> str: + """Decode the `lossless_caps_v1` transform back to the original text.""" + if len(sentinel) != 1: + raise ValueError("sentinel must be exactly one character") + out: list[str] = [] + i = 0 + n = len(text) + while i < n: + ch = text[i] + if ch != sentinel: + out.append(ch) + i += 1 + continue + if i + 1 >= n: + raise LosslessCapsError("dangling capitalization sentinel at end of string") + nxt = text[i + 1] + if nxt == sentinel: + out.append(sentinel) + elif _is_ascii_lower(nxt): + out.append(nxt.upper()) + else: + raise LosslessCapsError( + f"invalid sentinel escape sequence {sentinel + nxt!r}; " + "expected doubled sentinel or sentinel + lowercase ASCII letter" + ) + i += 2 + return "".join(out) + + +def encode_lossless_caps_v2( + text: str, + *, + title: str = DEFAULT_V2_TITLE, + allcaps: str = DEFAULT_V2_ALLCAPS, + capnext: str = DEFAULT_V2_CAPNEXT, + esc: str = DEFAULT_V2_ESC, +) -> str: + """Encode ASCII word capitalization with cheap word-level markers. 
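+
+    Example with the default markers (ALLCAPS, TITLE, CAPNEXT respectively):
+
+        "NASA Says iPhone" -> "\\uE002nasa \\uE001says i\\uE003phone"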
+ + Rules over maximal ASCII alphabetic runs: + - lowercase words stay unchanged + - TitleCase words become `title + lowercase(word)` + - ALLCAPS words become `allcaps + lowercase(word)` + - mixed-case words use: + - optional `title` when the first letter is uppercase + - `capnext + lowercase(letter)` for subsequent uppercase letters + - literal control characters are escaped as `esc + literal` + """ + _validate_distinct_single_chars(title, allcaps, capnext, esc) + controls = {title, allcaps, capnext, esc} + out: list[str] = [] + i = 0 + n = len(text) + while i < n: + ch = text[i] + if ch in controls: + out.append(esc) + out.append(ch) + i += 1 + continue + if not _is_ascii_alpha(ch): + out.append(ch) + i += 1 + continue + + j = i + 1 + while j < n and _is_ascii_alpha(text[j]): + j += 1 + word = text[i:j] + lower_word = word.lower() + + if word.islower(): + out.append(word) + elif len(word) >= 2 and word.isupper(): + out.append(allcaps) + out.append(lower_word) + elif _is_ascii_upper(word[0]) and word[1:].islower(): + out.append(title) + out.append(lower_word) + else: + if _is_ascii_upper(word[0]): + out.append(title) + out.append(lower_word[0]) + for orig_ch, lower_ch in zip(word[1:], lower_word[1:], strict=True): + if _is_ascii_upper(orig_ch): + out.append(capnext) + out.append(lower_ch) + i = j + return "".join(out) + + +def decode_lossless_caps_v2( + text: str, + *, + title: str = DEFAULT_V2_TITLE, + allcaps: str = DEFAULT_V2_ALLCAPS, + capnext: str = DEFAULT_V2_CAPNEXT, + esc: str = DEFAULT_V2_ESC, +) -> str: + """Decode the `lossless_caps_v2` transform back to the original text.""" + _validate_distinct_single_chars(title, allcaps, capnext, esc) + out: list[str] = [] + pending_escape = False + pending_word_mode: str | None = None + active_allcaps = False + pending_capnext = False + in_ascii_word = False + + for ch in text: + if pending_escape: + if pending_word_mode is not None and not _is_ascii_alpha(ch): + raise LosslessCapsError("escaped control char cannot satisfy pending word capitalization mode") + out.append(ch) + pending_escape = False + if _is_ascii_alpha(ch): + in_ascii_word = True + else: + in_ascii_word = False + active_allcaps = False + continue + + if ch == esc: + pending_escape = True + continue + if ch == title: + if pending_word_mode is not None or in_ascii_word or pending_capnext: + raise LosslessCapsError("invalid title marker placement") + pending_word_mode = "title" + continue + if ch == allcaps: + if pending_word_mode is not None or in_ascii_word or pending_capnext: + raise LosslessCapsError("invalid allcaps marker placement") + pending_word_mode = "allcaps" + continue + if ch == capnext: + if pending_capnext: + raise LosslessCapsError("duplicate capnext marker") + pending_capnext = True + continue + + if _is_ascii_alpha(ch): + at_word_start = not in_ascii_word + if at_word_start: + if pending_word_mode == "allcaps": + out.append(ch.upper()) + active_allcaps = True + elif pending_word_mode == "title": + out.append(ch.upper()) + elif pending_capnext: + out.append(ch.upper()) + else: + out.append(ch) + pending_word_mode = None + pending_capnext = False + in_ascii_word = True + continue + + if pending_word_mode is not None: + raise LosslessCapsError("word capitalization marker leaked into the middle of a word") + if active_allcaps: + out.append(ch.upper()) + elif pending_capnext: + out.append(ch.upper()) + else: + out.append(ch) + pending_capnext = False + continue + + if pending_word_mode is not None or pending_capnext: + raise LosslessCapsError("capitalization 
marker not followed by an ASCII letter") + out.append(ch) + in_ascii_word = False + active_allcaps = False + + if pending_escape: + raise LosslessCapsError("dangling escape marker at end of string") + if pending_word_mode is not None or pending_capnext: + raise LosslessCapsError("dangling capitalization marker at end of string") + return "".join(out) + + +def encode_lossless_caps_v3( + text: str, + *, + title: str = DEFAULT_V2_TITLE, + allcaps: str = DEFAULT_V2_ALLCAPS, + esc: str = DEFAULT_V2_ESC, +) -> str: + """Encode only common word-level capitalization patterns. + + Rules over maximal ASCII alphabetic runs: + - lowercase words stay unchanged + - TitleCase words become `title + lowercase(word)` + - ALLCAPS words become `allcaps + lowercase(word)` + - all other mixed-case words are left unchanged + - literal control characters are escaped as `esc + literal` + """ + _validate_distinct_single_chars(title, allcaps, esc) + controls = {title, allcaps, esc} + out: list[str] = [] + i = 0 + n = len(text) + while i < n: + ch = text[i] + if ch in controls: + out.append(esc) + out.append(ch) + i += 1 + continue + if not _is_ascii_alpha(ch): + out.append(ch) + i += 1 + continue + + j = i + 1 + while j < n and _is_ascii_alpha(text[j]): + j += 1 + word = text[i:j] + + if word.islower(): + out.append(word) + elif len(word) >= 2 and word.isupper(): + out.append(allcaps) + out.append(word.lower()) + elif _is_ascii_upper(word[0]) and word[1:].islower(): + out.append(title) + out.append(word.lower()) + else: + out.append(word) + i = j + return "".join(out) + + +def decode_lossless_caps_v3( + text: str, + *, + title: str = DEFAULT_V2_TITLE, + allcaps: str = DEFAULT_V2_ALLCAPS, + esc: str = DEFAULT_V2_ESC, +) -> str: + """Decode the `lossless_caps_v3` transform back to the original text.""" + _validate_distinct_single_chars(title, allcaps, esc) + out: list[str] = [] + pending_escape = False + pending_word_mode: str | None = None + active_allcaps = False + in_ascii_word = False + + for ch in text: + if pending_escape: + if pending_word_mode is not None and not _is_ascii_alpha(ch): + raise LosslessCapsError("escaped control char cannot satisfy pending word capitalization mode") + out.append(ch) + pending_escape = False + if _is_ascii_alpha(ch): + in_ascii_word = True + else: + in_ascii_word = False + active_allcaps = False + continue + + if ch == esc: + pending_escape = True + continue + if ch == title: + if pending_word_mode is not None or in_ascii_word: + raise LosslessCapsError("invalid title marker placement") + pending_word_mode = "title" + continue + if ch == allcaps: + if pending_word_mode is not None or in_ascii_word: + raise LosslessCapsError("invalid allcaps marker placement") + pending_word_mode = "allcaps" + continue + + if _is_ascii_alpha(ch): + at_word_start = not in_ascii_word + if at_word_start: + if pending_word_mode == "allcaps": + out.append(ch.upper()) + active_allcaps = True + elif pending_word_mode == "title": + out.append(ch.upper()) + else: + out.append(ch) + pending_word_mode = None + in_ascii_word = True + continue + + if pending_word_mode is not None: + raise LosslessCapsError("word capitalization marker leaked into the middle of a word") + out.append(ch.upper() if active_allcaps else ch) + continue + + if pending_word_mode is not None: + raise LosslessCapsError("capitalization marker not followed by an ASCII letter") + out.append(ch) + in_ascii_word = False + active_allcaps = False + + if pending_escape: + raise LosslessCapsError("dangling escape marker at end of string") + if 
pending_word_mode is not None: + raise LosslessCapsError("dangling capitalization marker at end of string") + return "".join(out) + + +def encode_lossless_caps_v4( + text: str, + *, + allcaps: str = DEFAULT_V2_ALLCAPS, + esc: str = DEFAULT_V2_ESC, +) -> str: + """Encode only ALLCAPS ASCII words, leaving all other case untouched.""" + _validate_distinct_single_chars(allcaps, esc) + controls = {allcaps, esc} + out: list[str] = [] + i = 0 + n = len(text) + while i < n: + ch = text[i] + if ch in controls: + out.append(esc) + out.append(ch) + i += 1 + continue + if not _is_ascii_alpha(ch): + out.append(ch) + i += 1 + continue + j = i + 1 + while j < n and _is_ascii_alpha(text[j]): + j += 1 + word = text[i:j] + if len(word) >= 2 and word.isupper(): + out.append(allcaps) + out.append(word.lower()) + else: + out.append(word) + i = j + return "".join(out) + + +def decode_lossless_caps_v4( + text: str, + *, + allcaps: str = DEFAULT_V2_ALLCAPS, + esc: str = DEFAULT_V2_ESC, +) -> str: + """Decode the `lossless_caps_v4` transform back to the original text.""" + _validate_distinct_single_chars(allcaps, esc) + out: list[str] = [] + pending_escape = False + pending_allcaps = False + in_ascii_word = False + active_allcaps = False + + for ch in text: + if pending_escape: + if pending_allcaps and not _is_ascii_alpha(ch): + raise LosslessCapsError("escaped control char cannot satisfy pending allcaps mode") + out.append(ch) + pending_escape = False + if _is_ascii_alpha(ch): + in_ascii_word = True + else: + in_ascii_word = False + active_allcaps = False + continue + + if ch == esc: + pending_escape = True + continue + if ch == allcaps: + if pending_allcaps or in_ascii_word: + raise LosslessCapsError("invalid allcaps marker placement") + pending_allcaps = True + continue + + if _is_ascii_alpha(ch): + if not in_ascii_word: + active_allcaps = pending_allcaps + pending_allcaps = False + in_ascii_word = True + out.append(ch.upper() if active_allcaps else ch) + continue + + if pending_allcaps: + raise LosslessCapsError("allcaps marker not followed by an ASCII letter") + out.append(ch) + in_ascii_word = False + active_allcaps = False + + if pending_escape: + raise LosslessCapsError("dangling escape marker at end of string") + if pending_allcaps: + raise LosslessCapsError("dangling allcaps marker at end of string") + return "".join(out) + + +def encode_lossless_caps_v5( + text: str, + *, + title: str = DEFAULT_V2_TITLE, + allcaps: str = DEFAULT_V2_ALLCAPS, + esc: str = DEFAULT_V2_ESC, + title_min_len: int = DEFAULT_V5_TITLE_MIN_LEN, +) -> str: + """Encode ALLCAPS words and only sufficiently long TitleCase words.""" + _validate_distinct_single_chars(title, allcaps, esc) + controls = {title, allcaps, esc} + out: list[str] = [] + i = 0 + n = len(text) + while i < n: + ch = text[i] + if ch in controls: + out.append(esc) + out.append(ch) + i += 1 + continue + if not _is_ascii_alpha(ch): + out.append(ch) + i += 1 + continue + j = i + 1 + while j < n and _is_ascii_alpha(text[j]): + j += 1 + word = text[i:j] + if len(word) >= 2 and word.isupper(): + out.append(allcaps) + out.append(word.lower()) + elif len(word) >= title_min_len and _is_ascii_upper(word[0]) and word[1:].islower(): + out.append(title) + out.append(word.lower()) + else: + out.append(word) + i = j + return "".join(out) + + +def decode_lossless_caps_v5( + text: str, + *, + title: str = DEFAULT_V2_TITLE, + allcaps: str = DEFAULT_V2_ALLCAPS, + esc: str = DEFAULT_V2_ESC, +) -> str: + """Decode the `lossless_caps_v5` transform back to the original text.""" + return 
decode_lossless_caps_v3(text, title=title, allcaps=allcaps, esc=esc) + + +def encode_lossless_caps_v6( + text: str, + *, + allcaps: str = DEFAULT_V2_ALLCAPS, + esc: str = DEFAULT_V2_ESC, + allcaps_min_len: int = DEFAULT_V6_ALLCAPS_MIN_LEN, +) -> str: + """Encode only ALLCAPS words with length >= allcaps_min_len.""" + _validate_distinct_single_chars(allcaps, esc) + controls = {allcaps, esc} + out: list[str] = [] + i = 0 + n = len(text) + while i < n: + ch = text[i] + if ch in controls: + out.append(esc) + out.append(ch) + i += 1 + continue + if not _is_ascii_alpha(ch): + out.append(ch) + i += 1 + continue + j = i + 1 + while j < n and _is_ascii_alpha(text[j]): + j += 1 + word = text[i:j] + if len(word) >= allcaps_min_len and word.isupper(): + out.append(allcaps) + out.append(word.lower()) + else: + out.append(word) + i = j + return "".join(out) + + +def decode_lossless_caps_v6( + text: str, + *, + allcaps: str = DEFAULT_V2_ALLCAPS, + esc: str = DEFAULT_V2_ESC, +) -> str: + """Decode the `lossless_caps_v6` transform back to the original text.""" + return decode_lossless_caps_v4(text, allcaps=allcaps, esc=esc) + + +def encode_lossless_caps_v7( + text: str, + *, + allcaps: str = DEFAULT_V2_ALLCAPS, + esc: str = DEFAULT_V2_ESC, + allcaps_min_len: int = DEFAULT_V7_ALLCAPS_MIN_LEN, +) -> str: + """Encode only ALLCAPS words with length >= 4.""" + return encode_lossless_caps_v6( + text, + allcaps=allcaps, + esc=esc, + allcaps_min_len=allcaps_min_len, + ) + + +def decode_lossless_caps_v7( + text: str, + *, + allcaps: str = DEFAULT_V2_ALLCAPS, + esc: str = DEFAULT_V2_ESC, +) -> str: + """Decode the `lossless_caps_v7` transform back to the original text.""" + return decode_lossless_caps_v6(text, allcaps=allcaps, esc=esc) + + +def get_text_transform(name: str | None) -> Callable[[str], str]: + """Return the forward text transform for the given config name.""" + normalized = IDENTITY if name in {None, "", IDENTITY} else str(name) + if normalized == IDENTITY: + return lambda text: text + if normalized == LOSSLESS_CAPS_V1: + return encode_lossless_caps_v1 + if normalized == LOSSLESS_CAPS_V2: + return encode_lossless_caps_v2 + if normalized == LOSSLESS_CAPS_V3: + return encode_lossless_caps_v3 + if normalized == LOSSLESS_CAPS_V4: + return encode_lossless_caps_v4 + if normalized == LOSSLESS_CAPS_V5: + return encode_lossless_caps_v5 + if normalized == LOSSLESS_CAPS_V6: + return encode_lossless_caps_v6 + if normalized == LOSSLESS_CAPS_V7: + return encode_lossless_caps_v7 + if normalized == LOSSLESS_CAPS_CASEOPS_V1: + return encode_lossless_caps_v2 + raise ValueError(f"unsupported text_transform={name!r}") + + +def get_text_inverse_transform(name: str | None) -> Callable[[str], str]: + """Return the inverse transform for the given config name.""" + normalized = IDENTITY if name in {None, "", IDENTITY} else str(name) + if normalized == IDENTITY: + return lambda text: text + if normalized == LOSSLESS_CAPS_V1: + return decode_lossless_caps_v1 + if normalized == LOSSLESS_CAPS_V2: + return decode_lossless_caps_v2 + if normalized == LOSSLESS_CAPS_V3: + return decode_lossless_caps_v3 + if normalized == LOSSLESS_CAPS_V4: + return decode_lossless_caps_v4 + if normalized == LOSSLESS_CAPS_V5: + return decode_lossless_caps_v5 + if normalized == LOSSLESS_CAPS_V6: + return decode_lossless_caps_v6 + if normalized == LOSSLESS_CAPS_V7: + return decode_lossless_caps_v7 + if normalized == LOSSLESS_CAPS_CASEOPS_V1: + return decode_lossless_caps_v2 + raise ValueError(f"unsupported text_transform={name!r}") + + +def 
normalize_text_transform_name(name: str | None) -> str: + """Normalize empty/None transform names to the identity transform.""" + return IDENTITY if name in {None, "", IDENTITY} else str(name) + + +def get_text_transform_control_symbols(name: str | None) -> list[str]: + """Return reserved control symbols used by a transform, if any.""" + normalized = normalize_text_transform_name(name) + if normalized == IDENTITY: + return [] + if normalized == LOSSLESS_CAPS_V1: + return [DEFAULT_SENTINEL] + if normalized == LOSSLESS_CAPS_V2: + return [DEFAULT_V2_TITLE, DEFAULT_V2_ALLCAPS, DEFAULT_V2_CAPNEXT, DEFAULT_V2_ESC] + if normalized == LOSSLESS_CAPS_CASEOPS_V1: + return [DEFAULT_V2_TITLE, DEFAULT_V2_ALLCAPS, DEFAULT_V2_CAPNEXT, DEFAULT_V2_ESC] + if normalized in {LOSSLESS_CAPS_V3, LOSSLESS_CAPS_V5}: + return [DEFAULT_V2_TITLE, DEFAULT_V2_ALLCAPS, DEFAULT_V2_ESC] + if normalized in {LOSSLESS_CAPS_V4, LOSSLESS_CAPS_V6, LOSSLESS_CAPS_V7}: + return [DEFAULT_V2_ALLCAPS, DEFAULT_V2_ESC] + raise ValueError(f"unsupported text_transform={name!r}") + + +def infer_text_transform_from_manifest(tokenizer_path: str | Path) -> str: + """Best-effort lookup of a tokenizer's text transform from a local manifest.""" + tokenizer_path = Path(tokenizer_path).expanduser().resolve() + manifest_candidates = [ + tokenizer_path.parent.parent / "manifest.json", + tokenizer_path.parent / "manifest.json", + ] + for manifest_path in manifest_candidates: + if not manifest_path.is_file(): + continue + try: + payload = json.loads(manifest_path.read_text(encoding="utf-8")) + except (OSError, json.JSONDecodeError): + continue + tokenizers = payload.get("tokenizers") + if not isinstance(tokenizers, list): + continue + for tokenizer_meta in tokenizers: + if not isinstance(tokenizer_meta, dict): + continue + model_path = tokenizer_meta.get("model_path") or tokenizer_meta.get("path") + if not model_path: + continue + candidate = (manifest_path.parent / str(model_path)).resolve() + if candidate == tokenizer_path: + return normalize_text_transform_name(tokenizer_meta.get("text_transform")) + return IDENTITY + + +def surface_piece_original_byte_counts( + surfaces: Iterable[str], + *, + text_transform_name: str | None = None, + sentinel: str = DEFAULT_SENTINEL, +) -> list[int]: + """Return exact original UTF-8 byte counts contributed by each surface piece. + + `surfaces` must be the exact decoded text fragments emitted by SentencePiece + in order, e.g. `piece.surface` from `encode_as_immutable_proto`. 
+ """ + normalized = normalize_text_transform_name(text_transform_name) + if normalized == IDENTITY: + return [len(surface.encode("utf-8")) for surface in surfaces] + if normalized == LOSSLESS_CAPS_V1: + if len(sentinel) != 1: + raise ValueError("sentinel must be exactly one character") + sentinel_bytes = len(sentinel.encode("utf-8")) + pending_sentinel = False + counts: list[int] = [] + for surface in surfaces: + piece_bytes = 0 + for ch in surface: + if pending_sentinel: + if ch == sentinel: + piece_bytes += sentinel_bytes + elif _is_ascii_lower(ch): + piece_bytes += 1 + else: + raise LosslessCapsError( + f"invalid continuation {ch!r} after capitalization sentinel" + ) + pending_sentinel = False + continue + if ch == sentinel: + pending_sentinel = True + else: + piece_bytes += len(ch.encode("utf-8")) + counts.append(piece_bytes) + if pending_sentinel: + raise LosslessCapsError("dangling capitalization sentinel across piece boundary") + return counts + if normalized not in {LOSSLESS_CAPS_V2, LOSSLESS_CAPS_V3, LOSSLESS_CAPS_V4, LOSSLESS_CAPS_V5, LOSSLESS_CAPS_V6, LOSSLESS_CAPS_V7, LOSSLESS_CAPS_CASEOPS_V1}: + raise ValueError(f"unsupported text_transform={text_transform_name!r}") + + title = DEFAULT_V2_TITLE + allcaps = DEFAULT_V2_ALLCAPS + capnext = DEFAULT_V2_CAPNEXT + esc = DEFAULT_V2_ESC + if normalized in {LOSSLESS_CAPS_V2, LOSSLESS_CAPS_CASEOPS_V1}: + _validate_distinct_single_chars(title, allcaps, capnext, esc) + elif normalized in {LOSSLESS_CAPS_V4, LOSSLESS_CAPS_V6, LOSSLESS_CAPS_V7}: + _validate_distinct_single_chars(allcaps, esc) + else: + _validate_distinct_single_chars(title, allcaps, esc) + pending_escape = False + pending_word_mode: str | None = None + active_allcaps = False + pending_capnext = False + in_ascii_word = False + counts: list[int] = [] + for surface in surfaces: + piece_bytes = 0 + for ch in surface: + if pending_escape: + if pending_word_mode is not None and not _is_ascii_alpha(ch): + raise LosslessCapsError("escaped control char cannot satisfy pending word capitalization mode") + piece_bytes += len(ch.encode("utf-8")) + pending_escape = False + if _is_ascii_alpha(ch): + in_ascii_word = True + else: + in_ascii_word = False + active_allcaps = False + continue + if ch == esc: + pending_escape = True + continue + if normalized in {LOSSLESS_CAPS_V2, LOSSLESS_CAPS_V3, LOSSLESS_CAPS_V5, LOSSLESS_CAPS_CASEOPS_V1} and ch == title: + if pending_word_mode is not None or in_ascii_word or pending_capnext: + raise LosslessCapsError("invalid title marker placement") + pending_word_mode = "title" + continue + if ch == allcaps: + if pending_word_mode is not None or in_ascii_word or pending_capnext: + raise LosslessCapsError("invalid allcaps marker placement") + pending_word_mode = "allcaps" + continue + if normalized in {LOSSLESS_CAPS_V2, LOSSLESS_CAPS_CASEOPS_V1} and ch == capnext: + if pending_capnext: + raise LosslessCapsError("duplicate capnext marker") + pending_capnext = True + continue + + if _is_ascii_alpha(ch): + at_word_start = not in_ascii_word + if at_word_start: + piece_bytes += 1 + active_allcaps = pending_word_mode == "allcaps" + pending_word_mode = None + pending_capnext = False + in_ascii_word = True + continue + if pending_word_mode is not None: + raise LosslessCapsError("word capitalization marker leaked into the middle of a word") + piece_bytes += 1 + pending_capnext = False + continue + + if pending_word_mode is not None or pending_capnext: + raise LosslessCapsError("capitalization marker not followed by an ASCII letter") + piece_bytes += 
len(ch.encode("utf-8")) + in_ascii_word = False + active_allcaps = False + counts.append(piece_bytes) + if pending_escape: + raise LosslessCapsError("dangling escape marker across piece boundary") + if pending_word_mode is not None or pending_capnext: + raise LosslessCapsError("dangling capitalization marker across piece boundary") + return counts diff --git a/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/prepare_caseops_data.py b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/prepare_caseops_data.py new file mode 100644 index 0000000000..5c3f13e69c --- /dev/null +++ b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/prepare_caseops_data.py @@ -0,0 +1,177 @@ +"""Prepare CaseOps-tokenized FineWeb shards + per-token byte sidecar. + +CaseOps (``lossless_caps_caseops_v1``) is a bijective, character-level text +transform that introduces four operator tokens in place of explicit +capitalization: TITLE, ALLCAPS, CAPNEXT, ESC. The transform is fully +reversible — no information is lost relative to the untransformed UTF-8 +text, so BPB stays computable on TRUE byte counts. + +Forward pipeline: + 1. Read the canonical FineWeb-10B doc stream (``docs_selected.jsonl`` + produced by ``data/download_hf_docs_and_tokenize.py`` in the root repo). + 2. Apply ``encode_lossless_caps_v2`` (the caseops_v1 alias) to each doc. + 3. Tokenize with the shipped SP model + ``tokenizers/fineweb_8192_bpe_lossless_caps_caseops_v1_reserved.model`` + (reserves TITLE/ALLCAPS/CAPNEXT/ESC + sentinel as user_defined_symbols). + 4. Write uint16 train/val shards (``fineweb_{train,val}_XXXXXX.bin``). + 5. For the VAL stream only, emit per-token byte sidecar shards + (``fineweb_val_bytes_XXXXXX.bin``, uint16 parallel arrays) that record + each token's ORIGINAL pre-transform UTF-8 byte count. BPB is computed + from these canonical bytes so the score is on the untransformed text + (not the transformed representation). + +Output layout — matches what ``train_gpt.py`` expects under +``DATA_DIR=./data`` with ``CASEOPS_ENABLED=1``: + + data/datasets/fineweb10B_sp8192_caseops/datasets/ + tokenizers/fineweb_8192_bpe_lossless_caps_caseops_v1_reserved.model + datasets/fineweb10B_sp8192_lossless_caps_caseops_v1_reserved/ + fineweb_train_000000.bin + fineweb_train_000001.bin + ... + fineweb_val_000000.bin + fineweb_val_bytes_000000.bin + +Usage: + + python3 prepare_caseops_data.py \\ + --docs ./fineweb10B_raw/docs_selected.jsonl \\ + --out ./data/datasets/fineweb10B_sp8192_caseops/datasets \\ + --sp ./tokenizers/fineweb_8192_bpe_lossless_caps_caseops_v1_reserved.model + +Requirements: sentencepiece, numpy. CPU-only. Runs once; reused across seeds. +""" +from __future__ import annotations + +import argparse +import json +import pathlib +import struct +import sys + +import numpy as np +import sentencepiece as spm + +# Local import — lossless_caps.py ships next to this script. 
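+# Prepending the script directory keeps this import working from any working
+# directory, without installing the record as a package.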
+sys.path.insert(0, str(pathlib.Path(__file__).resolve().parent)) +from lossless_caps import ( # noqa: E402 + LOSSLESS_CAPS_CASEOPS_V1, + encode_lossless_caps_v2, + surface_piece_original_byte_counts, +) + + +SHARD_MAGIC = 20240520 +SHARD_VERSION = 1 +SHARD_TOKENS = 10_000_000 # tokens per shard — matches the main pipeline +BOS_ID = 1 # SP model's control token; train_gpt.py:_find_docs requires BOS per doc + + +def _write_shard(out_path: pathlib.Path, arr: np.ndarray) -> None: + """Write a uint16 shard in the standard header-prefixed format.""" + assert arr.dtype == np.uint16 + header = np.zeros(256, dtype=np.int32) + header[0] = SHARD_MAGIC + header[1] = SHARD_VERSION + header[2] = int(arr.size) + with out_path.open("wb") as fh: + fh.write(header.tobytes()) + fh.write(arr.tobytes()) + + +def _iter_docs(docs_path: pathlib.Path): + """Yield doc strings from a jsonl file (one json object per line).""" + with docs_path.open("r", encoding="utf-8") as fh: + for line in fh: + line = line.strip() + if not line: + continue + obj = json.loads(line) + # Support both {"text": ...} and raw strings. + yield obj["text"] if isinstance(obj, dict) else obj + + +def _token_original_byte_counts( + sp: spm.SentencePieceProcessor, + original_text: str, + transformed_text: str, +) -> np.ndarray: + """Per-token canonical (pre-transform) UTF-8 byte counts. + + Delegates to ``surface_piece_original_byte_counts`` in ``lossless_caps.py`` + — the canonical exporter used by the PR #1729 / HF-hosted CaseOps dataset. + Operator pieces (U+E001..U+E004) contribute 0 original bytes; letter pieces + contribute their pre-transform UTF-8 byte count. + """ + proto = sp.encode_as_immutable_proto(transformed_text) + byte_counts = surface_piece_original_byte_counts( + (piece.surface for piece in proto.pieces), + text_transform_name=LOSSLESS_CAPS_CASEOPS_V1, + ) + return np.asarray(list(byte_counts), dtype=np.uint16) + + +def main() -> None: + ap = argparse.ArgumentParser(description=__doc__, formatter_class=argparse.RawDescriptionHelpFormatter) + ap.add_argument("--docs", required=True, type=pathlib.Path, help="Path to docs_selected.jsonl") + ap.add_argument("--out", required=True, type=pathlib.Path, help="Output datasets dir") + ap.add_argument("--sp", required=True, type=pathlib.Path, help="Path to CaseOps SP model") + ap.add_argument("--val-docs", type=int, default=10_000, help="Validation docs count") + args = ap.parse_args() + + sp = spm.SentencePieceProcessor(model_file=str(args.sp)) + print(f"loaded sp: vocab={sp.vocab_size()}", flush=True) + + train_out = args.out / "datasets" / "fineweb10B_sp8192_lossless_caps_caseops_v1_reserved" + train_out.mkdir(parents=True, exist_ok=True) + + val_buf_tokens: list[int] = [] + val_buf_bytes: list[int] = [] + train_buf: list[int] = [] + val_written = 0 + train_written = 0 + n_docs = 0 + + for text in _iter_docs(args.docs): + transformed = encode_lossless_caps_v2(text) + token_ids = [BOS_ID] + sp.encode(transformed, out_type=int) + if n_docs < args.val_docs: + # Validation doc — also compute byte sidecar + byte_counts = _token_original_byte_counts(sp, text, transformed) + val_buf_tokens.extend(token_ids) + val_buf_bytes.append(0) # BOS contributes 0 original bytes + val_buf_bytes.extend(int(b) for b in byte_counts) + if len(val_buf_tokens) >= SHARD_TOKENS: + _write_shard(train_out / f"fineweb_val_{val_written:06d}.bin", + np.array(val_buf_tokens[:SHARD_TOKENS], dtype=np.uint16)) + _write_shard(train_out / f"fineweb_val_bytes_{val_written:06d}.bin", + 
np.array(val_buf_bytes[:SHARD_TOKENS], dtype=np.uint16)) + val_buf_tokens = val_buf_tokens[SHARD_TOKENS:] + val_buf_bytes = val_buf_bytes[SHARD_TOKENS:] + val_written += 1 + else: + train_buf.extend(token_ids) + if len(train_buf) >= SHARD_TOKENS: + _write_shard(train_out / f"fineweb_train_{train_written:06d}.bin", + np.array(train_buf[:SHARD_TOKENS], dtype=np.uint16)) + train_buf = train_buf[SHARD_TOKENS:] + train_written += 1 + n_docs += 1 + if n_docs % 10_000 == 0: + print(f" processed {n_docs} docs train_shards={train_written} val_shards={val_written}", flush=True) + + # Flush tail buffers into final (possibly short) shards. + if val_buf_tokens: + _write_shard(train_out / f"fineweb_val_{val_written:06d}.bin", + np.array(val_buf_tokens, dtype=np.uint16)) + _write_shard(train_out / f"fineweb_val_bytes_{val_written:06d}.bin", + np.array(val_buf_bytes, dtype=np.uint16)) + if train_buf: + _write_shard(train_out / f"fineweb_train_{train_written:06d}.bin", + np.array(train_buf, dtype=np.uint16)) + + print(f"done. docs={n_docs} train_shards={train_written + (1 if train_buf else 0)} val_shards={val_written + (1 if val_buf_tokens else 0)}") + + +if __name__ == "__main__": + main() diff --git a/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/submission.json b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/submission.json new file mode 100644 index 0000000000..162864c629 --- /dev/null +++ b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/submission.json @@ -0,0 +1,37 @@ +{ + "author": "Tim Shen (@TimS-ml) & Billy Li (@lijuncheng16)", + "github_id": "TimS-ml", + "name": "Record: Leaky ReLU Slope + GPTQ Reverse-Cholesky Speedup + PR #1938 (val_bpb = 1.06242)", + "date": "2026-04-29", + "track": "10min_16mb", + "val_bpb": 1.06242, + "val_bpb_std": 0.00013, + "seeds": [1334, 42, 999], + "seed_results": { + "1334": {"val_bpb": 1.06257073, "artifact_bytes": 15947664}, + "42": {"val_bpb": 1.06232455, "artifact_bytes": 15945920}, + "999": {"val_bpb": 1.06236517, "artifact_bytes": 15946532} + }, + "hardware": "8xH100 80GB SXM", + "pytorch_version": "2.9.1+cu128", + "technique_summary": "SP8192/CaseOps + 11L GQA + LeakyReLU(0.3)\u00b2 + Depth Recurrence (L3-5 \u00d72) + Parallel Residuals (L8+) + SmearGate + SparseAttnGate + GPTQ int6 Reverse-Cholesky + LQER asym (top-1) + Int8 Attn Gate + Phased TTT (3ph) + Brotli", + "compliance": { + "train_under_600s": true, + "artifact_under_16mb": true, + "eval_under_600s": true, + "three_seeds": true, + "score_first_ttt": true, + "smeargate_bos_fix": true + }, + "attribution": { + "pr1938": "@lijuncheng16; @TimS-ml (PR #1938)", + "pr1867": "@lijuncheng16; @TimS-ml (PR #1867)", + "pr1851": "@aquariouseworkman (PR #1851)", + "smeargate_bos_audit": "@cocohearts (PR #1797)", + "lqer_asym": "@dexhunter (PR #1797)", + "caseops_tokenizer": "@romeerp (PR #1729, #1736); shards from romeerp/parameter-golf-caseops-v1", + "sparse_attn_gate": "@nprime06 (PR #1787)", + "phased_ttt": "@dexhunter (PR #1736), @abaybektursun (PR #549)", + "gptq_sdclip": "@clarkkev (PR #1394)" + } +} diff --git a/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/tokenizers/fineweb_8192_bpe_lossless_caps_caseops_v1_reserved.model 
b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/tokenizers/fineweb_8192_bpe_lossless_caps_caseops_v1_reserved.model new file mode 100644 index 0000000000..fffc8bb306 Binary files /dev/null and b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/tokenizers/fineweb_8192_bpe_lossless_caps_caseops_v1_reserved.model differ diff --git a/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/tokenizers/fineweb_8192_bpe_lossless_caps_caseops_v1_reserved.vocab b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/tokenizers/fineweb_8192_bpe_lossless_caps_caseops_v1_reserved.vocab new file mode 100644 index 0000000000..9940e493c6 --- /dev/null +++ b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/tokenizers/fineweb_8192_bpe_lossless_caps_caseops_v1_reserved.vocab @@ -0,0 +1,8192 @@ + 0 + 0 + 0 + 0 + 0 + 0 + 0 + 0 +<0x00> 0 +<0x01> 0 +<0x02> 0 +<0x03> 0 +<0x04> 0 +<0x05> 0 +<0x06> 0 +<0x07> 0 +<0x08> 0 +<0x09> 0 +<0x0A> 0 +<0x0B> 0 +<0x0C> 0 +<0x0D> 0 +<0x0E> 0 +<0x0F> 0 +<0x10> 0 +<0x11> 0 +<0x12> 0 +<0x13> 0 +<0x14> 0 +<0x15> 0 +<0x16> 0 +<0x17> 0 +<0x18> 0 +<0x19> 0 +<0x1A> 0 +<0x1B> 0 +<0x1C> 0 +<0x1D> 0 +<0x1E> 0 +<0x1F> 0 +<0x20> 0 +<0x21> 0 +<0x22> 0 +<0x23> 0 +<0x24> 0 +<0x25> 0 +<0x26> 0 +<0x27> 0 +<0x28> 0 +<0x29> 0 +<0x2A> 0 +<0x2B> 0 +<0x2C> 0 +<0x2D> 0 +<0x2E> 0 +<0x2F> 0 +<0x30> 0 +<0x31> 0 +<0x32> 0 +<0x33> 0 +<0x34> 0 +<0x35> 0 +<0x36> 0 +<0x37> 0 +<0x38> 0 +<0x39> 0 +<0x3A> 0 +<0x3B> 0 +<0x3C> 0 +<0x3D> 0 +<0x3E> 0 +<0x3F> 0 +<0x40> 0 +<0x41> 0 +<0x42> 0 +<0x43> 0 +<0x44> 0 +<0x45> 0 +<0x46> 0 +<0x47> 0 +<0x48> 0 +<0x49> 0 +<0x4A> 0 +<0x4B> 0 +<0x4C> 0 +<0x4D> 0 +<0x4E> 0 +<0x4F> 0 +<0x50> 0 +<0x51> 0 +<0x52> 0 +<0x53> 0 +<0x54> 0 +<0x55> 0 +<0x56> 0 +<0x57> 0 +<0x58> 0 +<0x59> 0 +<0x5A> 0 +<0x5B> 0 +<0x5C> 0 +<0x5D> 0 +<0x5E> 0 +<0x5F> 0 +<0x60> 0 +<0x61> 0 +<0x62> 0 +<0x63> 0 +<0x64> 0 +<0x65> 0 +<0x66> 0 +<0x67> 0 +<0x68> 0 +<0x69> 0 +<0x6A> 0 +<0x6B> 0 +<0x6C> 0 +<0x6D> 0 +<0x6E> 0 +<0x6F> 0 +<0x70> 0 +<0x71> 0 +<0x72> 0 +<0x73> 0 +<0x74> 0 +<0x75> 0 +<0x76> 0 +<0x77> 0 +<0x78> 0 +<0x79> 0 +<0x7A> 0 +<0x7B> 0 +<0x7C> 0 +<0x7D> 0 +<0x7E> 0 +<0x7F> 0 +<0x80> 0 +<0x81> 0 +<0x82> 0 +<0x83> 0 +<0x84> 0 +<0x85> 0 +<0x86> 0 +<0x87> 0 +<0x88> 0 +<0x89> 0 +<0x8A> 0 +<0x8B> 0 +<0x8C> 0 +<0x8D> 0 +<0x8E> 0 +<0x8F> 0 +<0x90> 0 +<0x91> 0 +<0x92> 0 +<0x93> 0 +<0x94> 0 +<0x95> 0 +<0x96> 0 +<0x97> 0 +<0x98> 0 +<0x99> 0 +<0x9A> 0 +<0x9B> 0 +<0x9C> 0 +<0x9D> 0 +<0x9E> 0 +<0x9F> 0 +<0xA0> 0 +<0xA1> 0 +<0xA2> 0 +<0xA3> 0 +<0xA4> 0 +<0xA5> 0 +<0xA6> 0 +<0xA7> 0 +<0xA8> 0 +<0xA9> 0 +<0xAA> 0 +<0xAB> 0 +<0xAC> 0 +<0xAD> 0 +<0xAE> 0 +<0xAF> 0 +<0xB0> 0 +<0xB1> 0 +<0xB2> 0 +<0xB3> 0 +<0xB4> 0 +<0xB5> 0 +<0xB6> 0 +<0xB7> 0 +<0xB8> 0 +<0xB9> 0 +<0xBA> 0 +<0xBB> 0 +<0xBC> 0 +<0xBD> 0 +<0xBE> 0 +<0xBF> 0 +<0xC0> 0 +<0xC1> 0 +<0xC2> 0 +<0xC3> 0 +<0xC4> 0 +<0xC5> 0 +<0xC6> 0 +<0xC7> 0 +<0xC8> 0 +<0xC9> 0 +<0xCA> 0 +<0xCB> 0 +<0xCC> 0 +<0xCD> 0 +<0xCE> 0 +<0xCF> 0 +<0xD0> 0 +<0xD1> 0 +<0xD2> 0 +<0xD3> 0 +<0xD4> 0 +<0xD5> 0 +<0xD6> 0 +<0xD7> 0 +<0xD8> 0 +<0xD9> 0 +<0xDA> 0 +<0xDB> 0 +<0xDC> 0 +<0xDD> 0 +<0xDE> 0 +<0xDF> 0 +<0xE0> 0 +<0xE1> 0 +<0xE2> 0 +<0xE3> 0 +<0xE4> 0 +<0xE5> 0 +<0xE6> 0 +<0xE7> 0 +<0xE8> 0 +<0xE9> 0 +<0xEA> 0 +<0xEB> 0 +<0xEC> 0 +<0xED> 0 +<0xEE> 0 +<0xEF> 0 +<0xF0> 0 +<0xF1> 0 +<0xF2> 0 
+<0xF3> 0 +<0xF4> 0 +<0xF5> 0 +<0xF6> 0 +<0xF7> 0 +<0xF8> 0 +<0xF9> 0 +<0xFA> 0 +<0xFB> 0 +<0xFC> 0 +<0xFD> 0 +<0xFE> 0 +<0xFF> 0 +▁t -0 +▁a -1 +in -2 +he -3 +re -4 +on -5 +er -6 +▁the -7 +▁s -8 +▁w -9 +or -10 +at -11 +ou -12 +nd -13 +it -14 +▁c -15 +es -16 +▁f -17 +is -18 +en -19 +ing -20 +▁b -21 +▁p -22 +▁o -23 +an -24 +al -25 +ed -26 +▁to -27 +ar -28 +▁m -29 +▁and -30 +▁in -31 +▁of -32 +le -33 +▁d -34 +as -35 +ic -36 +▁h -37 +om -38 +ion -39 +▁th -40 +il -41 +st -42 +▁l -43 +ro -44 +ent -45 +ve -46 +▁y -47 +▁e -48 +▁re -49 +▁n -50 +▁g -51 +ac -52 +et -53 +▁you -54 +ly -55 +the -56 +id -57 +ay -58 +▁for -59 +▁is -60 +▁on -61 +▁be -62 +am -63 +ow -64 +se -65 +ot -66 +ad -67 +ol -68 +ig -69 +ct -70 +im -71 +ch -72 +▁u -73 +ver -74 +ith -75 +ut -76 +▁st -77 +el -78 +ation -79 +▁with -80 +ir -81 +▁that -82 +th -83 +ur -84 +ce -85 +▁he -86 +▁it -87 +ill -88 +ter -89 +if -90 +▁al -91 +▁an -92 +ul -93 +ke -94 +our -95 +ag -96 +ers -97 +▁pro -98 +▁wh -99 +▁as -100 +▁are -101 +▁we -102 +▁ha -103 +oo -104 +out -105 +un -106 +pp -107 +us -108 +ab -109 +ate -110 +ess -111 +▁at -112 +▁con -113 +▁com -114 +▁or -115 +ra -116 +em -117 +ore -118 +ri -119 +est -120 +igh -121 +rom -122 +▁- -123 +▁ne -124 +op -125 +▁se -126 +▁your -127 +qu -128 +ld -129 +and -130 +ist -131 +▁( -132 +res -133 +ment -134 +▁ex -135 +ant -136 +pe -137 +ity -138 +art -139 +▁v -140 +ive -141 +▁r -142 +all -143 +▁have -144 +ort -145 +ust -146 +▁was -147 +um -148 +▁this -149 +▁from -150 +▁de -151 +oc -152 +▁sh -153 +ies -154 +os -155 +▁su -156 +ain -157 +▁will -158 +▁can -159 +▁ch -160 +ight -161 +▁by -162 +nt -163 +ome -164 +ard -165 +▁not -166 +te -167 +ud -168 +▁le -169 +red -170 +we -171 +▁wor -172 +ost -173 +ie -174 +ge -175 +▁pl -176 +▁ab -177 +ther -178 +iv -179 +king -180 +ide -181 +ast -182 +for -183 +per -184 +gh -185 +.. 
-186 +rou -187 +fe -188 +ial -189 +▁all -190 +pl -191 +ack -192 +ine -193 +▁j -194 +ould -195 +od -196 +ice -197 +ell -198 +▁has -199 +ind -200 +act -201 +ne -202 +ure -203 +cl -204 +one -205 +ear -206 +▁do -207 +so -208 +▁k -209 +are -210 +▁us -211 +▁ad -212 +ake -213 +age -214 +ks -215 +▁me -216 +ip -217 +▁out -218 +▁but -219 +ry -220 +ap -221 +ally -222 +▁up -223 +▁whe -224 +ions -225 +com -226 +to -227 +able -228 +▁our -229 +ail -230 +▁en -231 +▁more -232 +▁comp -233 +very -234 +ite -235 +og -236 +▁my -237 +▁“ -238 +ime -239 +▁so -240 +ich -241 +her -242 +▁cl -243 +▁their -244 +ood -245 +ong -246 +ated -247 +ber -248 +▁sa -249 +pt -250 +ame -251 +iz -252 +▁they -253 +▁one -254 +▁ac -255 +du -256 +ike -257 +▁te -258 +ous -259 +▁about -260 +ak -261 +ans -262 +ase -263 +ace -264 +ass -265 +▁cont -266 +ia -267 +ru -268 +con -269 +▁im -270 +ire -271 +ign -272 +▁fe -273 +▁who -274 +ance -275 +ree -276 +▁off -277 +ach -278 +▁man -279 +bl -280 +reat -281 +now -282 +▁go -283 +▁new -284 +au -285 +▁" -286 +ays -287 +▁his -288 +av -289 +ick -290 +ib -291 +▁year -292 +▁app -293 +ff -294 +▁res -295 +form -296 +▁qu -297 +erv -298 +ove -299 +ary -300 +port -301 +ction -302 +ult -303 +this -304 +ations -305 +▁per -306 +ook -307 +ile -308 +▁also -309 +▁get -310 +day -311 +▁time -312 +▁which -313 +ents -314 +ep -315 +▁like -316 +ount -317 +vel -318 +▁some -319 +ue -320 +▁any -321 +ven -322 +▁other -323 +▁tr -324 +▁sp -325 +cess -326 +ph -327 +▁been -328 +che -329 +▁part -330 +ings -331 +ical -332 +▁over -333 +sh -334 +ark -335 +pro -336 +▁un -337 +▁dis -338 +ress -339 +ence -340 +be -341 +wh -342 +▁them -343 +low -344 +ition -345 +ors -346 +int -347 +▁pe -348 +ild -349 +▁her -350 +▁ar -351 +▁ag -352 +▁when -353 +ord -354 +▁ro -355 +ov -356 +ound -357 +ough -358 +ple -359 +▁just -360 +su -361 +▁said -362 +ish -363 +own -364 +irst -365 +ang -366 +ren -367 +▁need -368 +ck -369 +ose -370 +ob -371 +▁spe -372 +ll -373 +▁pre -374 +▁what -375 +▁there -376 +▁pr -377 +gr -378 +▁if -379 +old -380 +ens -381 +wn -382 +you -383 +▁work -384 +de -385 +▁than -386 +lp -387 +vers -388 +▁would -389 +▁know -390 +ning -391 +oy -392 +ater -393 +fter -394 +▁had -395 +▁am -396 +ade -397 +▁des -398 +▁were -399 +▁sc -400 +rough -401 +▁rec -402 +clud -403 +itt -404 +▁how -405 +▁comm -406 +ory -407 +▁into -408 +ople -409 +... 
> **Tokenizer vocabulary (condensed):** this part of the diff originally embedded
> the raw tokenizer vocabulary file, one `piece<TAB>score` entry per line in the
> SentencePiece-style `.vocab` layout, where the score is the negative merge rank.
> The span shown here covered ranks roughly 411 through 5676 (common English word
> pieces and subword fragments such as `▁help`, `▁business`, `▁information`,
> `▁technology`, and `▁leather`) and is condensed to this note for readability.
-5677 +▁hall -5678 +beat -5679 +▁plate -5680 +perfect -5681 +▁concerned -5682 +▁jack -5683 +▁mini -5684 +▁tack -5685 +living -5686 +▁emotional -5687 +requ -5688 +▁rough -5689 +▁elimin -5690 +oval -5691 +cular -5692 +▁trend -5693 +model -5694 +plate -5695 +ocking -5696 +vernor -5697 +▁grown -5698 +ulations -5699 +▁officers -5700 +▁supports -5701 +adium -5702 +canadian -5703 +smart -5704 +▁grad -5705 +▁recy -5706 +giving -5707 +▁powder -5708 +ucle -5709 +▁newsp -5710 +▁ongoing -5711 +pod -5712 +heat -5713 +iano -5714 +again -5715 +▁adventure -5716 +mbly -5717 +▁sam -5718 +itage -5719 +sembly -5720 +▁browser -5721 +welry -5722 +▁height -5723 +▁designer -5724 +fficient -5725 +▁stopped -5726 +orgeous -5727 +virginia -5728 +pon -5729 +swe -5730 +▁crazy -5731 +despite -5732 +▁premium -5733 +!” -5734 +cab -5735 +▁rear -5736 +jay -5737 +▁phil -5738 +subscri -5739 +personal -5740 +▁hardware -5741 +scot -5742 +oration -5743 +▁organic -5744 +▁bathroom -5745 +▁database -5746 +▁disappoint -5747 +awards -5748 +▁village -5749 +ipl -5750 +abled -5751 +▁mode -5752 +bag -5753 +irty -5754 +▁gro -5755 +▁tro -5756 +▁supplies -5757 +▁functions -5758 +▁effectively -5759 +▁gear -5760 +credit -5761 +▁crowd -5762 +▁calcul -5763 +▁surprise -5764 +broad -5765 +▁knowing -5766 +teen -5767 +▁crime -5768 +continue -5769 +▁creation -5770 +▁tradition -5771 +▁candidates -5772 +▁inspiration -5773 +dun -5774 +draw -5775 +▁tut -5776 +▁beer -5777 +▁licens -5778 +mountain -5779 +▁intellig -5780 +▁happening -5781 +environment -5782 +inal -5783 +arks -5784 +▁gam -5785 +radio -5786 +spanish -5787 +▁distribution -5788 +lan -5789 +▁aw -5790 +▁funny -5791 +▁receiving -5792 +▁electronic -5793 +▁infrastructure -5794 +upd -5795 +▁mixed -5796 +▁script -5797 +▁improvement -5798 +▁accepted -5799 +▁employment -5800 +▁identified -5801 +fish -5802 +▁tag -5803 +▁talks -5804 +▁weekly -5805 +learning -5806 +financial -5807 +▁facing -5808 +▁mixture -5809 +▁providers -5810 +pict -5811 +advis -5812 +clear -5813 +▁belong -5814 +▁treated -5815 +francisco -5816 +published -5817 +▁violence -5818 +plant -5819 +▁harm -5820 +▁serves -5821 +vy -5822 +mally -5823 +polit -5824 +▁advoc -5825 +russian -5826 +phas -5827 +colorado -5828 +shipping -5829 +?" 
-5830 +iated -5831 +▁disp -5832 +▁payments -5833 +▁administr -5834 +▁healthcare -5835 +▁fixed -5836 +▁flour -5837 +▁silver -5838 +clusion -5839 +ilit -5840 +▁sky -5841 +▁sheet -5842 +▁scholar -5843 +▁losing -5844 +▁taught -5845 +▁lessons -5846 +lay -5847 +▁forces -5848 +▁freedom -5849 +▁extended -5850 +▁determined -5851 +▁horse -5852 +iveness -5853 +▁forest -5854 +var -5855 +erves -5856 +senior -5857 +▁worse -5858 +▁stands -5859 +▁trouble -5860 +▁incident -5861 +▁accessible -5862 +▁ped -5863 +▁teas -5864 +▁arts -5865 +▁intr -5866 +▁closer -5867 +crete -5868 +graduate -5869 +cra -5870 +ele -5871 +mov -5872 +brid -5873 +gate -5874 +jersey -5875 +▁decre -5876 +cow -5877 +▁symb -5878 +▁kinds -5879 +▁hoping -5880 +▁successfully -5881 +pen -5882 +▁dol -5883 +amber -5884 +▁imagine -5885 +▁mal -5886 +▁coordin -5887 +▁follows -5888 +ref -5889 +▁voc -5890 +▁yellow -5891 +▁alleged -5892 +▁witness -5893 +▁upper -5894 +georgia -5895 +▁papers -5896 +▁patterns -5897 +▁vacation -5898 +▁enthusi -5899 +▁matters -5900 +webs -5901 +etime -5902 +▁telling -5903 +▁significantly -5904 +rors -5905 +screen -5906 +▁turning -5907 +▁lift -5908 +▁native -5909 +▁therapy -5910 +▁convenient -5911 +▁efficiency -5912 +▁impressive -5913 +cro -5914 +▁bat -5915 +▁trading -5916 +hp -5917 +pool -5918 +▁dram -5919 +▁baking -5920 +brow -5921 +would -5922 +kingdom -5923 +▁stunning -5924 +rim -5925 +▁advis -5926 +▁folks -5927 +▁recipes -5928 +phy -5929 +create -5930 +▁drugs -5931 +▁guitar -5932 +forcement -5933 +▁conducted -5934 +▁lifestyle -5935 +dur -5936 +rev -5937 +▁pil -5938 +▁male -5939 +ansion -5940 +▁recognized -5941 +total -5942 +large -5943 +▁grab -5944 +▁simpl -5945 +finally -5946 +▁clothes -5947 +▁explained -5948 +▁description -5949 +▁expectations -5950 +cop -5951 +psy -5952 +cers -5953 +user -5954 +▁spr -5955 +ireland -5956 +▁crisis -5957 +▁dim -5958 +▁entreprene -5959 +illy -5960 +orrow -5961 +▁bless -5962 +▁luxury -5963 +inations -5964 +wan -5965 +corn -5966 +pick -5967 +software -5968 +▁routine -5969 +▁dangerous -5970 +▁tall -5971 +southern -5972 +▁investors -5973 +ansas -5974 +▁houses -5975 +▁trends -5976 +▁chapter -5977 +▁reduced -5978 +▁procedures -5979 +ava -5980 +▁cock -5981 +▁cable -5982 +▁goods -5983 +▁severe -5984 +▁hits -5985 +▁sequ -5986 +ji -5987 +tal -5988 +▁channel -5989 +▁tonight -5990 +▁recording -5991 +rated -5992 +disney -5993 +▁manner -5994 +▁settings -5995 +professional -5996 +▁photography -5997 +▁collaboration -5998 +gun -5999 +sales -6000 +▁thin -6001 +▁random -6002 +▁context -6003 +▁virtual -6004 +▁conflict -6005 +▁contemporary -6006 +▁config -6007 +▁forced -6008 +iversary -6009 +▁reporting -6010 +▁il -6011 +film -6012 +▁accessories -6013 +▁cups -6014 +▁laugh -6015 +▁challenging -6016 +luc -6017 +cuss -6018 +italy -6019 +▁vend -6020 +▁gifts -6021 +student -6022 +▁council -6023 +▁hosting -6024 +▁graduate -6025 +▁likes -6026 +▁noticed -6027 +▁authent -6028 +▁attended -6029 +▁clinical -6030 +▁regional -6031 +▁constantly -6032 +▁perspective -6033 +athe -6034 +avenue -6035 +▁printed -6036 +folio -6037 +▁adds -6038 +think -6039 +▁tiny -6040 +baby -6041 +nger -6042 +▁depth -6043 +▁excess -6044 +mix -6045 +royal -6046 +▁govern -6047 +must -6048 +forest -6049 +▁renew -6050 +▁clothing -6051 +▁foundation -6052 +anne -6053 +kids -6054 +quir -6055 +rition -6056 +further -6057 +writing -6058 +▁involve -6059 +▁domestic -6060 +las -6061 +▁:) -6062 +lete -6063 +utch -6064 +express -6065 +▁flavor -6066 +▁retire -6067 +▁univers -6068 +zil -6069 +amps -6070 +iami -6071 +▁hur -6072 +▁stone -6073 +ultural -6074 
+▁prompt -6075 +▁innovation -6076 +▁diverse -6077 +▁searching -6078 +▁bags -6079 +ossible -6080 +jam -6081 +gard -6082 +▁shots -6083 +material -6084 +▁circumst -6085 +asts -6086 +▁glad -6087 +▁guard -6088 +▁walls -6089 +▁massive -6090 +▁outstanding -6091 +▁sustainable -6092 +▁tested -6093 +▁positions -6094 +▁replacement -6095 +lon -6096 +such -6097 +▁wet -6098 +garden -6099 +zealand -6100 +five -6101 +treat -6102 +▁taxes -6103 +▁carried -6104 +rub -6105 +ilst -6106 +▁rose -6107 +▁speaking -6108 +gro -6109 +amic -6110 +enger -6111 +▁candidate -6112 +sle -6113 +▁tou -6114 +miami -6115 +▁arrested -6116 +▁millions -6117 +▁construct -6118 +▁resolution -6119 +arab -6120 +enced -6121 +▁uplo -6122 +▁examples -6123 +dim -6124 +unfortunately -6125 +sav -6126 +ugg -6127 +▁ap -6128 +eric -6129 +▁jun -6130 +▁wash -6131 +update -6132 +respond -6133 +remember -6134 +▁medicine -6135 +jr -6136 +▁garage -6137 +▁figures -6138 +▁carefully -6139 +▁surprised -6140 +▁integrated -6141 +nee -6142 +▁abuse -6143 +▁smile -6144 +▁assets -6145 +▁programme -6146 +stal -6147 +viet -6148 +ament -6149 +▁deploy -6150 +▁unable -6151 +▁procedure -6152 +cuit -6153 +grow -6154 +▁reward -6155 +▁versions -6156 +▁interests -6157 +ampion -6158 +▁cycle -6159 +▁flood -6160 +▁nearby -6161 +▁pounds -6162 +▁failure -6163 +▁festival -6164 +▁committee -6165 +▁authorities -6166 +irts -6167 +asian -6168 +location -6169 +▁command -6170 +▁availability -6171 +dol -6172 +onym -6173 +raid -6174 +comfort -6175 +sd -6176 +▁? -6177 +sit -6178 +▁tap -6179 +aints -6180 +short -6181 +▁grew -6182 +▁muse -6183 +systems -6184 +▁emphas -6185 +▁residential -6186 +pg -6187 +clin -6188 +oral -6189 +iples -6190 +▁iron -6191 +▁prot -6192 +▁criminal -6193 +▁eleg -6194 +russia -6195 +▁bottle -6196 +▁herself -6197 +▁entirely -6198 +▁seriously -6199 +▁lad -6200 +choose -6201 +taylor -6202 +feed -6203 +talk -6204 +▁fitness -6205 +▁offices -6206 +recent -6207 +▁gorgeous -6208 +▁anx -6209 +▁pink -6210 +▁tomat -6211 +▁unlike -6212 +▁household -6213 +syl -6214 +iana -6215 +▁ordered -6216 +▁scientific -6217 +everyone -6218 +▁creates -6219 +▁dropped -6220 +▁mountain -6221 +dden -6222 +▁bab -6223 +▁ham -6224 +▁tech -6225 +common -6226 +▁gre -6227 +▁factor -6228 +▁garlic -6229 +▁eligible -6230 +reek -6231 +brook -6232 +eries -6233 +certain -6234 +process -6235 +▁distinct -6236 +file -6237 +▁acid -6238 +official -6239 +uro -6240 +wik -6241 +bury -6242 +edge -6243 +▁west -6244 +▁layer -6245 +▁occup -6246 +▁concern -6247 +yet -6248 +▁weak -6249 +photos -6250 +▁awards -6251 +▁detect -6252 +▁trained -6253 +▁thorough -6254 +html -6255 +▁shower -6256 +▁atmosphere -6257 +▁transportation -6258 +fly -6259 +zes -6260 +▁cm -6261 +wich -6262 +proof -6263 +▁risks -6264 +▁spaces -6265 +▁lighting -6266 +eve -6267 +▁eggs -6268 +▁shift -6269 +▁sick -6270 +▁guidance -6271 +▁historical -6272 +adam -6273 +▁quote -6274 +▁wealth -6275 +▁improving -6276 +▁prop -6277 +▁equal -6278 +charles -6279 +justice -6280 +▁qualified -6281 +▁transition -6282 +vo -6283 +edia -6284 +anton -6285 +▁) -6286 +phones -6287 +rip -6288 +rug -6289 +insp -6290 +▁ease -6291 +▁liquid -6292 +▁rating -6293 +▁capture -6294 +professor -6295 +kent -6296 +▁san -6297 +cloud -6298 +▁none -6299 +▁citizens -6300 +▁identity -6301 +▁contribute -6302 +cue -6303 +exp -6304 +safe -6305 +comment -6306 +▁invited -6307 +▁certified -6308 +emic -6309 +chall -6310 +estate -6311 +▁negot -6312 +iti -6313 +race -6314 +▁skill -6315 +▁hospit -6316 +▁faculty -6317 +tam -6318 +porter -6319 +return -6320 +presents -6321 +engineering -6322 
+▁originally -6323 +hl -6324 +llc -6325 +mrs -6326 +▁believes -6327 +dead -6328 +stan -6329 +inson -6330 +▁soul -6331 +ificate -6332 +▁tournament -6333 +tel -6334 +dale -6335 +▁filed -6336 +▁tasks -6337 +▁joining -6338 +▁singles -6339 +’. -6340 +sured -6341 +▁legs -6342 +▁judge -6343 +▁fishing -6344 +▁developers -6345 +erts -6346 +▁proven -6347 +experience -6348 +vas -6349 +pers -6350 +▁wra -6351 +▁blend -6352 +▁solar -6353 +▁yards -6354 +▁sudden -6355 +▁termin -6356 +▁listing -6357 +athan -6358 +jured -6359 +vs -6360 +eria -6361 +sung -6362 +▁rank -6363 +axy -6364 +adow -6365 +queen -6366 +▁empt -6367 +▁entered -6368 +▁mm -6369 +onto -6370 +▁susp -6371 +▁tear -6372 +player -6373 +tell -6374 +▁tip -6375 +iously -6376 +▁juice -6377 +▁worst -6378 +!" -6379 +pn -6380 +cou -6381 +▁sole -6382 +▁lucky -6383 +release -6384 +▁adapt -6385 +▁broadcast -6386 +▁increases -6387 +bol -6388 +▁|| -6389 +bass -6390 +brad -6391 +▁mut -6392 +silver -6393 +complete -6394 +▁founded -6395 +▁calendar -6396 +▁networks -6397 +izza -6398 +agues -6399 +brazil -6400 +senate -6401 +studies -6402 +▁interface -6403 +inar -6404 +gallery -6405 +ariz -6406 +send -6407 +▁frag -6408 +igration -6409 +ographic -6410 +▁interpre -6411 +autom -6412 +▁faces -6413 +▁cooper -6414 +▁engagement -6415 +motor -6416 +▁spots -6417 +▁conver -6418 +▁protein -6419 +▁younger -6420 +aze -6421 +alty -6422 +▁sac -6423 +▁firms -6424 +▁pitch -6425 +▁cotton -6426 +omy -6427 +▁rac -6428 +▁hire -6429 +icial -6430 +▁arms -6431 +▁root -6432 +apping -6433 +▁organized -6434 +▁submitted -6435 +aron -6436 +ooks -6437 +camer -6438 +title -6439 +course -6440 +▁chose -6441 +▁resol -6442 +illiant -6443 +reading -6444 +▁refund -6445 +▁counsel -6446 +mers -6447 +▁fle -6448 +ership -6449 +policy -6450 +▁sample -6451 +▁objects -6452 +rot -6453 +▁carbon -6454 +commerce -6455 +▁streets -6456 +▁fighting -6457 +republican -6458 +▁capabilities -6459 +ikes -6460 +sweet -6461 +▁zone -6462 +▁aside -6463 +▁appeal -6464 +▁dental -6465 +aturally -6466 +▁awarded -6467 +hur -6468 +uling -6469 +▁flag -6470 +▁reputation -6471 +▁implementation -6472 +hire -6473 +▁forever -6474 +▁letters -6475 +▁posting -6476 +▁concert -6477 +▁membership -6478 +hous -6479 +▁cos -6480 +▁vul -6481 +cycle -6482 +▁vital -6483 +hey -6484 +iran -6485 +▁moves -6486 +seattle -6487 +▁instance -6488 +▁volunteers -6489 +ku -6490 +mess -6491 +sound -6492 +▁hole -6493 +▁seed -6494 +consult -6495 +ensation -6496 +together -6497 +▁consequ -6498 +▁drawing -6499 +▁exhibition -6500 +▁institutions -6501 +sche -6502 +uine -6503 +▁hasn -6504 +▁exposure -6505 +▁strategic -6506 +▁monitoring -6507 +uster -6508 +▁bonus -6509 +catholic -6510 +following -6511 +▁downtown -6512 +▁enjoying -6513 +▁situations -6514 +ala -6515 +bible -6516 +thetic -6517 +▁spoke -6518 +▁hosted -6519 +▁output -6520 +▁listening -6521 +▁auto -6522 +without -6523 +▁tracks -6524 +▁airport -6525 +jord -6526 +ennis -6527 +forum -6528 +northern -6529 +▁assessment -6530 +cool -6531 +▁ult -6532 +ateful -6533 +▁sending -6534 +▁choosing -6535 +htt -6536 +nda -6537 +yers -6538 +▁coc -6539 +▁pulled -6540 +customer -6541 +▁privacy -6542 +▁guidelines -6543 +gary -6544 +▁obl -6545 +oured -6546 +annels -6547 +▁hopes -6548 +▁ultimate -6549 +oak -6550 +ults -6551 +▁attached -6552 +aqu -6553 +pin -6554 +tail -6555 +▁sight -6556 +▁frequently -6557 +lif -6558 +say -6559 +izer -6560 +final -6561 +▁flexible -6562 +▁platforms -6563 +▁suggested -6564 +usb -6565 +birth -6566 +▁linked -6567 +▁motion -6568 +▁pleasure -6569 +had -6570 +bang -6571 +▁vibr -6572 +▁suffe -6573 
+▁causing -6574 +▁workshop -6575 +ucky -6576 +wire -6577 +▁hat -6578 +▁performing -6579 +▁regulations -6580 +via -6581 +sale -6582 +dream -6583 +▁delay -6584 +atlanta -6585 +therapy -6586 +▁bodies -6587 +▁closely -6588 +bound -6589 +square -6590 +▁phase -6591 +edition -6592 +▁engage -6593 +▁circumstances -6594 +studio -6595 +things -6596 +▁input -6597 +capital -6598 +▁destroy -6599 +hollywood -6600 +▁destination -6601 +fold -6602 +▁inch -6603 +▁warn -6604 +version -6605 +▁matches -6606 +▁portion -6607 +▁visited -6608 +cit -6609 +sole -6610 +chand -6611 +▁dreams -6612 +▁reader -6613 +▁operate -6614 +▁informed -6615 +dark -6616 +inde -6617 +▁bench -6618 +▁trick -6619 +▁terror -6620 +including -6621 +oz -6622 +idi -6623 +liam -6624 +ially -6625 +▁crew -6626 +makers -6627 +▁urban -6628 +▁injured -6629 +▁directed -6630 +▁relatively -6631 +allow -6632 +piece -6633 +nation -6634 +server -6635 +spirit -6636 +▁meets -6637 +▁massage -6638 +▁copyright -6639 +▁spiritual -6640 +comb -6641 +prime -6642 +quick -6643 +▁rice -6644 +▁liked -6645 +houston -6646 +▁loving -6647 +oft -6648 +pak -6649 +▁defend -6650 +▁saving -6651 +fat -6652 +bird -6653 +umps -6654 +▁river -6655 +isation -6656 +▁manual -6657 +▁promise -6658 +insurance -6659 +▁landscape -6660 +.’ -6661 +esy -6662 +burgh -6663 +perhaps -6664 +▁anticip -6665 +▁historic -6666 +.' -6667 +rh -6668 +neys -6669 +osop -6670 +▁bird -6671 +▁lege -6672 +▁upgrade -6673 +▁memories -6674 +▁encouraged -6675 +▁treatments -6676 +cru -6677 +doors -6678 +oster -6679 +joseph -6680 +winter -6681 +▁filter -6682 +past -6683 +tics -6684 +ounce -6685 +▁jewelry -6686 +▁holidays -6687 +kevin -6688 +▁hurt -6689 +friends -6690 +▁critic -6691 +auto -6692 +annual -6693 +▁finds -6694 +▁planet -6695 +▁technique -6696 +▁manufacturer -6697 +ien -6698 +brian -6699 +▁butt -6700 +▁tank -6701 +uclear -6702 +medicine -6703 +▁gallery -6704 +him -6705 +obby -6706 +▁tum -6707 +charge -6708 +▁boxes -6709 +holders -6710 +▁scheme -6711 +▁authors -6712 +▁capable -6713 +championship -6714 +▁publication -6715 +▁km -6716 +▁domin -6717 +▁tooth -6718 +era -6719 +six -6720 +army -6721 +fred -6722 +sport -6723 +▁soil -6724 +▁grass -6725 +▁managing -6726 +dall -6727 +prim -6728 +preme -6729 +effect -6730 +▁talked -6731 +▁justice -6732 +▁typical -6733 +▁victims -6734 +collection -6735 +performance -6736 +hay -6737 +pir -6738 +bath -6739 +path -6740 +tony -6741 +asted -6742 +▁attacks -6743 +secretary -6744 +ani -6745 +lers -6746 +▁flash -6747 +▁smell -6748 +iii -6749 +spot -6750 +▁rail -6751 +minnes -6752 +faction -6753 +▁alongside -6754 +▁resort -6755 +aid -6756 +trad -6757 +▁subs -6758 +▁spark -6759 +▁rental -6760 +products -6761 +▁volunteer -6762 +pi -6763 +money -6764 +▁savings -6765 +▁suggestions -6766 +▁($ -6767 +ella -6768 +▁magic -6769 +▁split -6770 +samsung -6771 +▁arrive -6772 +▁debate -6773 +▁wireless -6774 +▁lawyer -6775 +▁riding -6776 +▁drinking -6777 +amel -6778 +below -6779 +▁hate -6780 +▁proof -6781 +▁proceed -6782 +ingly -6783 +details -6784 +▁blocks -6785 +▁discipl -6786 +izz -6787 +▁cin -6788 +▁stake -6789 +▁relief -6790 +dvd -6791 +▁fairly -6792 +▁seasons -6793 +igs -6794 +▁row -6795 +▁depress -6796 +▁consistent -6797 +▁incredibly -6798 +cha -6799 +chi -6800 +rible -6801 +▁stir -6802 +classic -6803 +▁domain -6804 +▁passing -6805 +▁collected -6806 +▁continuing -6807 +ule -6808 +▁mas -6809 +▁cuts -6810 +▁gent -6811 +rovers -6812 +private -6813 +minnesota -6814 +▁estimated -6815 +▁initiative -6816 +bru -6817 +urb -6818 +feel -6819 +▁fas -6820 +ingers -6821 +▁prove -6822 +▁anyway -6823 
+▁pocket -6824 +▁password -6825 +▁apparently -6826 +dor -6827 +hops -6828 +agency -6829 +▁sharp -6830 +▁shops -6831 +channel -6832 +▁phones -6833 +▁walked -6834 +▁priority -6835 +▁celebration -6836 +▁tape -6837 +columb -6838 +histor -6839 +▁newly -6840 +▁shock -6841 +▁chances -6842 +bn -6843 +davis -6844 +golden -6845 +▁reform -6846 +▁approval -6847 +▁equipped -6848 +▁machines -6849 +▁immediate -6850 +▁solve -6851 +ainless -6852 +▁illness -6853 +meanwhile -6854 +▁external -6855 +▁discussed -6856 +nfl -6857 +linux -6858 +▁adopt -6859 +▁storm -6860 +eastern -6861 +▁border -6862 +magazine -6863 +▁somewhere -6864 +▁represents -6865 +metro -6866 +adelph -6867 +▁crack -6868 +▁forth -6869 +▁mainly -6870 +▁writers -6871 +▁believed -6872 +▁normally -6873 +golf -6874 +iron -6875 +▁exh -6876 +cancer -6877 +▁belief -6878 +▁classroom -6879 +▁permission -6880 +etic -6881 +bring -6882 +urning -6883 +▁casino -6884 +▁murder -6885 +website -6886 +▁muscle -6887 +▁hook -6888 +▁frustr -6889 +▁vulner -6890 +▁legislation -6891 +ext -6892 +years -6893 +▁hell -6894 +missions -6895 +therefore -6896 +song -6897 +steel -6898 +▁decade -6899 +description -6900 +nie -6901 +▁scr -6902 +▁colle -6903 +▁instrument -6904 +editor -6905 +▁prize -6906 +▁venue -6907 +winning -6908 +▁infect -6909 +memorial -6910 +▁covering -6911 +▁realized -6912 +)| -6913 +ema -6914 +▁stim -6915 +▁recover -6916 +▁clicking -6917 +▁swimming -6918 +▁industries -6919 +irish -6920 +stery -6921 +▁banks -6922 +counter -6923 +▁casual -6924 +partners -6925 +▁chopped -6926 +▁finance -6927 +itionally -6928 +▁vegetables -6929 +▁differences -6930 +dra -6931 +icit -6932 +ogue -6933 +among -6934 +apped -6935 +consum -6936 +valent -6937 +▁feelings -6938 +▁possibility -6939 +mah -6940 +shot -6941 +▁tie -6942 +▁east -6943 +jewish -6944 +▁meals -6945 +finance -6946 +▁ending -6947 +▁dynamic -6948 +▁breaking -6949 +▁categories -6950 +tar -6951 +vac -6952 +bits -6953 +hero -6954 +stor -6955 +▁vacc -6956 +▁facts -6957 +▁shell -6958 +content -6959 +▁laptop -6960 +▁permanent -6961 +administration -6962 +▁tab -6963 +cript -6964 +double -6965 +▁theory -6966 +adelphia -6967 +▁shouldn -6968 +▁bes -6969 +ashes -6970 +outer -6971 +▁vari -6972 +arizona -6973 +written -6974 +▁division -6975 +▁handling -6976 +cort -6977 +▁dust -6978 +▁agents -6979 +▁replaced -6980 +▁requests -6981 +si -6982 +▁fra -6983 +▁raw -6984 +▁ancient -6985 +buff -6986 +dave -6987 +spain -6988 +▁bear -6989 +▁enables -6990 +▁facilit -6991 +▁portfolio -6992 +ras -6993 +vard -6994 +▁upgr -6995 +▁broke -6996 +officer -6997 +▁colours -6998 +▁filling -6999 +philadelphia -7000 +▁mile -7001 +▁mood -7002 +▁index -7003 +ione -7004 +ridge -7005 +▁elev -7006 +▁breath -7007 +▁formed -7008 +▁anniversary -7009 +▁dent -7010 +compet -7011 +stephen -7012 +▁drinks -7013 +▁headed -7014 +everything -7015 +▁religious -7016 +▁attractive -7017 +fix -7018 +whel -7019 +asant -7020 +▁lets -7021 +▁magn -7022 +▁fifth -7023 +collect -7024 +▁buyers -7025 +▁slowly -7026 +▁printing -7027 +▁brilliant -7028 +▁fo -7029 +vice -7030 +▁sec -7031 +jason -7032 +arters -7033 +▁giant -7034 +▁enroll -7035 +▁titles -7036 +▁charity -7037 +▁suffered -7038 +ulpt -7039 +shirt -7040 +▁yard -7041 +▁bunch -7042 +▁recall -7043 +▁surprising -7044 +▁architecture -7045 +fab -7046 +▁bare -7047 +ensity -7048 +▁mortgage -7049 +gypt -7050 +▁adop -7051 +▁coat -7052 +miller -7053 +▁hidden -7054 +drop -7055 +▁god -7056 +woman -7057 +▁fest -7058 +single -7059 +▁recognize -7060 +▁traveling -7061 +▁ast -7062 +▁birds -7063 +▁describe -7064 +chel -7065 +block -7066 +vania 
-7067 +▁noise -7068 +▁aspect -7069 +creative -7070 +delivery -7071 +subscribe -7072 +▁accomplish -7073 +fav -7074 +▁adm -7075 +▁pra -7076 +prior -7077 +galaxy -7078 +register -7079 +▁genuine -7080 +solutions -7081 +▁generate -7082 +▁personally -7083 +▁performances -7084 +agu -7085 +lad -7086 +tree -7087 +▁aims -7088 +closed -7089 +reviews -7090 +▁doctors -7091 +▁ec -7092 +alse -7093 +inton -7094 +dallas -7095 +▁roles -7096 +▁packages -7097 +half -7098 +▁pin -7099 +igate -7100 +trail -7101 +around -7102 +▁empty -7103 +▁compat -7104 +▁defined -7105 +construction -7106 +▁recommendations -7107 +%. -7108 +owa -7109 +roman -7110 +staff -7111 +trade -7112 +▁loans -7113 +amy -7114 +▁acts -7115 +toronto -7116 +transport -7117 +aren -7118 +inos -7119 +swed -7120 +▁scan -7121 +▁moder -7122 +represent -7123 +production -7124 +dj -7125 +dry -7126 +jenn -7127 +mini -7128 +diego -7129 +▁graphics -7130 +hz -7131 +▁bars -7132 +ington -7133 +▁habit -7134 +patrick -7135 +computer -7136 +▁guaranteed -7137 +sec -7138 +▁evol -7139 +▁gard -7140 +▁orange -7141 +▁conclud -7142 +communications -7143 +▁bid -7144 +abeth -7145 +apore -7146 +otted -7147 +poons -7148 +wilson -7149 +▁absor -7150 +▁scenes -7151 +▁assistant -7152 +cart -7153 +fresh -7154 +▁blank -7155 +▁recre -7156 +▁acknow -7157 +▁marked -7158 +▁recognition -7159 +ulf -7160 +..... -7161 +egypt -7162 +orange -7163 +▁heads -7164 +▁inner -7165 +columbia -7166 +▁falling -7167 +▁interactive -7168 +suc -7169 +▁et -7170 +moon -7171 +grant -7172 +▁gall -7173 +▁dishes -7174 +▁presents -7175 +▁producing -7176 +▁specialist -7177 +▁communications -7178 +ati -7179 +lane -7180 +aware -7181 +excell -7182 +▁nights -7183 +▁baseball -7184 +▁channels -7185 +anny -7186 +ista -7187 +wart -7188 +▁trig -7189 +writer -7190 +explore -7191 +▁singer -7192 +▁crucial -7193 +▁explains -7194 +▁expressed -7195 +▁rum -7196 +▁substant -7197 +pton -7198 +▁fits -7199 +ronics -7200 +original -7201 +info -7202 +categ -7203 +shore -7204 +▁ensu -7205 +▁speaker -7206 +▁ceremony -7207 +▁newspaper -7208 +▁pharm -7209 +▁cookies -7210 +bott -7211 +▁gym -7212 +▁rig -7213 +▁formal -7214 +▁founder -7215 +commercial -7216 +▁authority -7217 +kee -7218 +url -7219 +hart -7220 +oker -7221 +▁sor -7222 +modern -7223 +article -7224 +▁compare -7225 +▁proposal -7226 +▁programming -7227 +etry -7228 +tense -7229 +always -7230 +events -7231 +▁pride -7232 +issions -7233 +▁claimed -7234 +gor -7235 +▁ads -7236 +▁alive -7237 +▁stuck -7238 +▁remark -7239 +▁carrying -7240 +▁expansion -7241 +alo -7242 +lady -7243 +marc -7244 +coach -7245 +early -7246 +sorry -7247 +▁vintage -7248 +▁achieved -7249 +▁speakers -7250 +▁whenever -7251 +▁enforcement -7252 +ran -7253 +▁mand -7254 +▁agric -7255 +▁consists -7256 +eliz -7257 +shel -7258 +inois -7259 +▁enem -7260 +abetes -7261 +castle -7262 +▁displ -7263 +closure -7264 +▁dollar -7265 +electric -7266 +▁reducing -7267 +ifies -7268 +sters -7269 +▁vast -7270 +celebr -7271 +lessly -7272 +muslim -7273 +airport -7274 +fashion -7275 +▁centers -7276 +▁percentage -7277 +lah -7278 +yan -7279 +▁conv -7280 +▁saved -7281 +▁limits -7282 +▁engaged -7283 +coll -7284 +wine -7285 +▁fru -7286 +stein -7287 +▁boss -7288 +▁embr -7289 +▁clubs -7290 +manchester -7291 +▁connections -7292 +▁preparation -7293 +nik -7294 +imize -7295 +hallow -7296 +▁seats -7297 +▁hotels -7298 +▁combine -7299 +▁dealing -7300 +▁desired -7301 +▁grateful -7302 +additional -7303 +▁consultation -7304 +▁manufacturers -7305 +ilty -7306 +▁rein -7307 +▁vice -7308 +itable -7309 +▁pricing -7310 +▁designers -7311 +▁extension -7312 
+▁impossible -7313 +eman -7314 +gene -7315 +pain -7316 +along -7317 +acious -7318 +▁heating -7319 +▁watched -7320 +▁resulting -7321 +▁improvements -7322 +tro -7323 +fant -7324 +▁attending -7325 +▁basically -7326 +mate -7327 +civil -7328 +▁compr -7329 +▁gender -7330 +▁insight -7331 +▁compliance -7332 +▁purchasing -7333 +gt -7334 +wid -7335 +▁ly -7336 +anger -7337 +youth -7338 +osophy -7339 +regional -7340 +assistant -7341 +singapore -7342 +▁participating -7343 +poly -7344 +strong -7345 +▁crash -7346 +▁carpet -7347 +▁narrow -7348 +▁packed -7349 +illinois -7350 +sylvania -7351 +▁winners -7352 +▁shoulder -7353 +▁enterprise -7354 +▁introduction -7355 +rab -7356 +pitt -7357 +zone -7358 +stick -7359 +engers -7360 +▁liber -7361 +▁marks -7362 +manufact -7363 +▁concrete -7364 +▁contrast -7365 +▁sensitive -7366 +’, -7367 +aka -7368 +nin -7369 +obe -7370 +gift -7371 +▁plot -7372 +brother -7373 +pending -7374 +▁acting -7375 +▁lesson -7376 +▁default -7377 +▁exceptional -7378 +lam -7379 +aska -7380 +hind -7381 +ingu -7382 +▁clar -7383 +▁deck -7384 +▁lady -7385 +▁strike -7386 +planning -7387 +questions -7388 +sir -7389 +impro -7390 +▁consc -7391 +designed -7392 +ications -7393 +too -7394 +harry -7395 +▁assum -7396 +▁aggress -7397 +▁suffering -7398 +mun -7399 +ulum -7400 +jordan -7401 +▁gaming -7402 +▁everywhere -7403 +alt -7404 +moo -7405 +oir -7406 +zing -7407 +▁dad -7408 +orous -7409 +▁opens -7410 +▁acquis -7411 +industry -7412 +▁booking -7413 +▁constant -7414 +▁consumption -7415 +fu -7416 +mend -7417 +creek -7418 +girls -7419 +▁ocean -7420 +▁whilst -7421 +▁succeed -7422 +resources -7423 +bush -7424 +near -7425 +▁legend -7426 +▁healing -7427 +▁opinions -7428 +ila -7429 +alle -7430 +apol -7431 +aund -7432 +built -7433 +close -7434 +study -7435 +▁bills -7436 +▁criteria -7437 +▁overwhel -7438 +▁obviously -7439 +usd -7440 +onna -7441 +▁hel -7442 +arian -7443 +index -7444 +korea -7445 +▁spray -7446 +updated -7447 +▁humans -7448 +▁rising -7449 +▁damaged -7450 +▁heading -7451 +▁functional -7452 +▁resistance -7453 +▁lie -7454 +henry -7455 +▁unus -7456 +length -7457 +theast -7458 +▁actor -7459 +ologist -7460 +features -7461 +▁preferred -7462 +▁networking -7463 +sha -7464 +ghan -7465 +▁mob -7466 +ockey -7467 +▁mold -7468 +terior -7469 +anthony -7470 +itution -7471 +▁driven -7472 +▁gentle -7473 +▁instant -7474 +▁assault -7475 +▁weapons -7476 +▁experiment -7477 +▁announcement -7478 +qual -7479 +mates -7480 +▁fundra -7481 +▁indeed -7482 +▁literally -7483 +pennsylvania -7484 +)|| -7485 +▁ens -7486 +dress -7487 +▁lect -7488 +▁mere -7489 +▁pill -7490 +univers -7491 +▁attribut -7492 +▁interact -7493 +▁scientists -7494 +jess -7495 +▁bridge -7496 +▁emails -7497 +iders -7498 +▁managers -7499 +▁retirement -7500 +nat -7501 +▁pic -7502 +analy -7503 +kelly -7504 +▁trim -7505 +kitchen -7506 +▁returning -7507 +corporation -7508 +reach -7509 +▁loose -7510 +▁seeds -7511 +▁anymore -7512 +▁checking -7513 +▁entering -7514 +▁displayed -7515 +▁personality -7516 +cad -7517 +▁tor -7518 +▁beef -7519 +▁wins -7520 +matthe -7521 +novation -7522 +▁podcast -7523 +▁ensuring -7524 +▁suggests -7525 +▁talented -7526 +api -7527 +bab -7528 +iac -7529 +ints -7530 +mann -7531 +onut -7532 +stars -7533 +users -7534 +▁solo -7535 +▁intense -7536 +▁admitted -7537 +▁component -7538 +eye -7539 +▁plain -7540 +▁deposit -7541 +currently -7542 +▁preparing -7543 +▁basketball -7544 +▁interviews -7545 +boys -7546 +cand -7547 +dney -7548 +▁gay -7549 +▁tub -7550 +icity -7551 +intel -7552 +langu -7553 +sarah -7554 +write -7555 +better -7556 +▁franch -7557 +major -7558 
+integr -7559 +▁yours -7560 +onymous -7561 +▁string -7562 +▁farmers -7563 +▁association -7564 +itz -7565 +sts -7566 +holy -7567 +liber -7568 +cooper -7569 +images -7570 +▁forgot -7571 +▁removal -7572 +▁visible -7573 +▁introduce -7574 +▁packaging -7575 +▁officially -7576 +hem -7577 +alan -7578 +anti -7579 +iraq -7580 +hills -7581 +vised -7582 +▁boot -7583 +▁trips -7584 +▁accused -7585 +▁matching -7586 +▁politics -7587 +▁wondering -7588 +mw -7589 +hub -7590 +iowa -7591 +activ -7592 +▁rural -7593 +jection -7594 +▁acceler -7595 +amm -7596 +una -7597 +▁hi -7598 +athon -7599 +ilton -7600 +▁chart -7601 +▁lemon -7602 +▁lists -7603 +▁olive -7604 +▁salad -7605 +▁invite -7606 +▁closing -7607 +▁graphic -7608 +▁forecast -7609 +protection -7610 +▁creativity -7611 +▁reasonable -7612 +additionally -7613 +etch -7614 +focus -7615 +former -7616 +▁tired -7617 +▁stable -7618 +▁diseases -7619 +▁lifetime -7620 +▁increasingly -7621 +fan -7622 +▁batt -7623 +▁pace -7624 +aining -7625 +bourne -7626 +▁blind -7627 +ression -7628 +comments -7629 +▁staying -7630 +▁findings -7631 +▁licensed -7632 +▁professor -7633 +▁puts -7634 +acular -7635 +▁focuses -7636 +halloween -7637 +ventional -7638 +value -7639 +taking -7640 +▁recip -7641 +success -7642 +theatre -7643 +▁trailer -7644 +▁viewing -7645 +▁stations -7646 +▁correspond -7647 +eration -7648 +▁afraid -7649 +▁teaspoon -7650 +▁primarily -7651 +▁potentially -7652 +cn -7653 +kh -7654 +hemat -7655 +stage -7656 +▁brush -7657 +▁nomin -7658 +restaur -7659 +▁suspect -7660 +▁tracking -7661 +▁developer -7662 +▁temperatures -7663 +oen -7664 +teac -7665 +defin -7666 +eness -7667 +▁widely -7668 +beck -7669 +ifer -7670 +snow -7671 +onder -7672 +utter -7673 +import -7674 +address -7675 +pection -7676 +▁illegal -7677 +compan -7678 +▁chips -7679 +▁attach -7680 +▁naturally -7681 +anes -7682 +arth -7683 +▁sail -7684 +simply -7685 +▁honey -7686 +liament -7687 +▁racing -7688 +▁precise -7689 +elizabeth -7690 +jen -7691 +ador -7692 +imag -7693 +▁ves -7694 +▁faced -7695 +▁stamp -7696 +property -7697 +blu -7698 +bern -7699 +andre -7700 +icide -7701 +maker -7702 +friger -7703 +▁intent -7704 +▁visits -7705 +▁texture -7706 +▁stronger -7707 +sac -7708 +oked -7709 +skin -7710 +ented -7711 +▁clock -7712 +▁drama -7713 +independ -7714 +▁greatly -7715 +▁expenses -7716 +▁releases -7717 +employment -7718 +▁reduction -7719 +uten -7720 +psych -7721 +▁yoga -7722 +▁ultim -7723 +supreme -7724 +▁harder -7725 +'. 
-7726 +sq -7727 +oses -7728 +rober -7729 +scale -7730 +steven -7731 +britain -7732 +▁flying -7733 +▁suppor -7734 +tennes -7735 +▁bacter -7736 +▁behalf -7737 +▁conserv -7738 +▁strange -7739 +▁obtained -7740 +▁computers -7741 +zzle -7742 +ureau -7743 +▁plug -7744 +▁unex -7745 +lights -7746 +▁blogs -7747 +pakistan -7748 +▁checked -7749 +▁warning -7750 +rt -7751 +glas -7752 +▁fault -7753 +▁yield -7754 +nob -7755 +kansas -7756 +▁weren -7757 +selling -7758 +▁gained -7759 +▁writes -7760 +advanced -7761 +▁careful -7762 +▁warranty -7763 +▁workshops -7764 +▁delivering -7765 +kong -7766 +tips -7767 +eland -7768 +saint -7769 +▁museum -7770 +▁colleagues -7771 +▁highlights -7772 +gab -7773 +rele -7774 +ashed -7775 +owner -7776 +rency -7777 +oregon -7778 +owners -7779 +rating -7780 +▁param -7781 +nothing -7782 +▁principles -7783 +▁nose -7784 +iliate -7785 +▁desktop -7786 +▁superior -7787 +▁employers -7788 +uct -7789 +unit -7790 +vals -7791 +▁vine -7792 +simple -7793 +andemic -7794 +machine -7795 +▁gotten -7796 +sciences -7797 +▁elegant -7798 +▁raising -7799 +application -7800 +bull -7801 +laure -7802 +▁duty -7803 +▁semi -7804 +▁entit -7805 +village -7806 +▁controls -7807 +▁communicate -7808 +fm -7809 +hello -7810 +▁shapes -7811 +▁struggling -7812 +yl -7813 +hav -7814 +austin -7815 +▁minimal -7816 +▁electrical -7817 +vc -7818 +▁lip -7819 +rants -7820 +▁inventory -7821 +▁promoting -7822 +▁approaches -7823 +barn -7824 +hong -7825 +vegas -7826 +▁sour -7827 +▁wave -7828 +▁strict -7829 +▁hanging -7830 +▁shipped -7831 +▁sponsor -7832 +▁photographs -7833 +aha -7834 +tery -7835 +▁twist -7836 +▁wooden -7837 +▁cameras -7838 +deal -7839 +math -7840 +▁ther -7841 +odge -7842 +sels -7843 +▁cher -7844 +▁laid -7845 +▁align -7846 +▁scores -7847 +victoria -7848 +▁controvers -7849 +▁ultimately -7850 +▁till -7851 +▁divid -7852 +▁anxiety -7853 +▁involves -7854 +▁occurred -7855 +▁signature -7856 +▁participation -7857 +bish -7858 +▁emb -7859 +frame -7860 +princ -7861 +stract -7862 +getting -7863 +▁resume -7864 +▁struck -7865 +▁thinks -7866 +▁cancell -7867 +▁involving -7868 +▁ -7869 +e -7870 +t -7871 +a -7872 +o -7873 +i -7874 +n -7875 +s -7876 +r -7877 +h -7878 +l -7879 +d -7880 +c -7881 +u -7882 +m -7883 +p -7884 +f -7885 +g -7886 +y -7887 +w -7888 +b -7889 +. -7890 +v -7891 +, -7892 +k -7893 +- -7894 +0 -7895 +1 -7896 +x -7897 +2 -7898 +j -7899 +’ -7900 +' -7901 +: -7902 +z -7903 +q -7904 +) -7905 +3 -7906 +( -7907 +5 -7908 +4 -7909 +" -7910 +9 -7911 +6 -7912 +8 -7913 +! -7914 +| -7915 +7 -7916 +/ -7917 +? 
-7918 +“ -7919 +” -7920 +– -7921 +; -7922 +& -7923 +$ -7924 +% -7925 +— -7926 +* -7927 diff --git a/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/train_gpt.py b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/train_gpt.py new file mode 100644 index 0000000000..6ea42d03e0 --- /dev/null +++ b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/train_gpt.py @@ -0,0 +1,3143 @@ +import base64, collections, copy, fcntl, glob, io, lzma, math, os +from pathlib import Path +import random, re, subprocess, time, uuid, numpy as np, sentencepiece as spm, torch, torch.distributed as dist, torch.nn.functional as F +from torch import Tensor, nn +from flash_attn_interface import ( + flash_attn_func as flash_attn_3_func, + flash_attn_varlen_func, +) +from concurrent.futures import ThreadPoolExecutor +import triton +import triton.language as tl +from triton.tools.tensor_descriptor import TensorDescriptor +_FUSED_CE_LIBRARY = "pgsubmission1draft7fusedce" +_FUSED_CE_BLOCK_SIZE = 1024 +_FUSED_CE_NUM_WARPS = 4 +@triton.jit +def _softcapped_ce_fwd_kernel( + logits_ptr, losses_ptr, lse_ptr, targets_ptr, + stride_logits_n, stride_logits_v, + n_rows, n_cols, softcap, + block_size: tl.constexpr, +): + row_idx = tl.program_id(0).to(tl.int64) + logits_row_ptr = logits_ptr + row_idx * stride_logits_n + max_val = -float("inf") + sum_exp = 0.0 + A = 2.0 * softcap + inv_C = 2.0 / softcap + for off in range(0, n_cols, block_size): + cols = off + tl.arange(0, block_size) + mask = cols < n_cols + val = tl.load( + logits_row_ptr + cols * stride_logits_v, + mask=mask, other=-float("inf"), + ).to(tl.float32) + z = A * tl.sigmoid(val * inv_C) + z = tl.where(mask, z, -float("inf")) + curr_max = tl.max(z, axis=0) + new_max = tl.maximum(max_val, curr_max) + sum_exp = sum_exp * tl.exp(max_val - new_max) + tl.sum(tl.exp(z - new_max), axis=0) + max_val = new_max + lse = max_val + tl.log(sum_exp) + tl.store(lse_ptr + row_idx, lse) + target = tl.load(targets_ptr + row_idx).to(tl.int32) + target_val = tl.load(logits_row_ptr + target * stride_logits_v).to(tl.float32) + target_z = A * tl.sigmoid(target_val * inv_C) + tl.store(losses_ptr + row_idx, lse - target_z) +@triton.jit +def _softcapped_ce_bwd_kernel( + grad_logits_ptr, grad_losses_ptr, lse_ptr, logits_ptr, targets_ptr, + stride_logits_n, stride_logits_v, + stride_grad_n, stride_grad_v, + n_rows, n_cols, softcap, + block_size: tl.constexpr, +): + row_idx = tl.program_id(0).to(tl.int64) + logits_row_ptr = logits_ptr + row_idx * stride_logits_n + grad_row_ptr = grad_logits_ptr + row_idx * stride_grad_n + lse = tl.load(lse_ptr + row_idx) + grad_loss = tl.load(grad_losses_ptr + row_idx).to(tl.float32) + target = tl.load(targets_ptr + row_idx).to(tl.int32) + A = 2.0 * softcap + inv_C = 2.0 / softcap + dz_dx_scale = A * inv_C + for off in range(0, n_cols, block_size): + cols = off + tl.arange(0, block_size) + mask = cols < n_cols + val = tl.load( + logits_row_ptr + cols * stride_logits_v, + mask=mask, other=0.0, + ).to(tl.float32) + sigmoid_u = tl.sigmoid(val * inv_C) + z = A * sigmoid_u + probs = tl.exp(z - lse) + grad_z = grad_loss * (probs - tl.where(cols == target, 1.0, 0.0)) + grad_x = grad_z * (dz_dx_scale * sigmoid_u * (1.0 - sigmoid_u)) + tl.store(grad_row_ptr + cols * stride_grad_v, grad_x, mask=mask) +def _validate_softcapped_ce_inputs( + logits: Tensor, targets: Tensor, softcap: float, +) -> 
tuple[Tensor, Tensor]: + if logits.ndim != 2: + raise ValueError(f"Expected logits.ndim=2, got {logits.ndim}") + if targets.ndim != 1: + raise ValueError(f"Expected targets.ndim=1, got {targets.ndim}") + if logits.shape[0] != targets.shape[0]: + raise ValueError( + f"Expected matching rows, got logits={tuple(logits.shape)} targets={tuple(targets.shape)}" + ) + if not logits.is_cuda or not targets.is_cuda: + raise ValueError("softcapped_cross_entropy requires CUDA tensors") + if softcap <= 0.0: + raise ValueError(f"softcap must be positive, got {softcap}") + if logits.dtype not in (torch.float16, torch.bfloat16, torch.float32): + raise ValueError(f"Unsupported logits dtype: {logits.dtype}") + logits = logits.contiguous() + targets = targets.contiguous() + if targets.dtype != torch.int64: + targets = targets.to(dtype=torch.int64) + return logits, targets +@torch.library.custom_op(f"{_FUSED_CE_LIBRARY}::softcapped_ce", mutates_args=()) +def softcapped_ce_op(logits: Tensor, targets: Tensor, softcap: float) -> tuple[Tensor, Tensor]: + logits, targets = _validate_softcapped_ce_inputs(logits, targets, float(softcap)) + n_rows, n_cols = logits.shape + losses = torch.empty((n_rows,), device=logits.device, dtype=torch.float32) + lse = torch.empty((n_rows,), device=logits.device, dtype=torch.float32) + _softcapped_ce_fwd_kernel[(n_rows,)]( + logits, losses, lse, targets, + logits.stride(0), logits.stride(1), + n_rows, n_cols, float(softcap), + block_size=_FUSED_CE_BLOCK_SIZE, num_warps=_FUSED_CE_NUM_WARPS, + ) + return losses, lse +@softcapped_ce_op.register_fake +def _(logits: Tensor, targets: Tensor, softcap: float): + if logits.ndim != 2 or targets.ndim != 1: + raise ValueError("softcapped_ce fake impl expects 2D logits and 1D targets") + if logits.shape[0] != targets.shape[0]: + raise ValueError( + f"Expected matching rows, got logits={tuple(logits.shape)} targets={tuple(targets.shape)}" + ) + n_rows = logits.shape[0] + return ( + logits.new_empty((n_rows,), dtype=torch.float32), + logits.new_empty((n_rows,), dtype=torch.float32), + ) +@torch.library.custom_op(f"{_FUSED_CE_LIBRARY}::softcapped_ce_backward", mutates_args=()) +def softcapped_ce_backward_op( + logits: Tensor, targets: Tensor, lse: Tensor, grad_losses: Tensor, softcap: float, +) -> Tensor: + logits, targets = _validate_softcapped_ce_inputs(logits, targets, float(softcap)) + lse = lse.contiguous() + grad_losses = grad_losses.contiguous().to(dtype=torch.float32) + if lse.ndim != 1 or grad_losses.ndim != 1: + raise ValueError("Expected 1D lse and grad_losses") + if lse.shape[0] != logits.shape[0] or grad_losses.shape[0] != logits.shape[0]: + raise ValueError( + f"Expected row-aligned lse/grad_losses, got logits={tuple(logits.shape)} " + f"lse={tuple(lse.shape)} grad_losses={tuple(grad_losses.shape)}" + ) + grad_logits = torch.empty_like(logits) + n_rows, n_cols = logits.shape + _softcapped_ce_bwd_kernel[(n_rows,)]( + grad_logits, grad_losses, lse, logits, targets, + logits.stride(0), logits.stride(1), + grad_logits.stride(0), grad_logits.stride(1), + n_rows, n_cols, float(softcap), + block_size=_FUSED_CE_BLOCK_SIZE, num_warps=_FUSED_CE_NUM_WARPS, + ) + return grad_logits +@softcapped_ce_backward_op.register_fake +def _(logits: Tensor, targets: Tensor, lse: Tensor, grad_losses: Tensor, softcap: float): + if logits.ndim != 2 or targets.ndim != 1 or lse.ndim != 1 or grad_losses.ndim != 1: + raise ValueError("softcapped_ce_backward fake impl expects 2D logits and 1D row tensors") + if ( + logits.shape[0] != targets.shape[0] + or 
logits.shape[0] != lse.shape[0] + or logits.shape[0] != grad_losses.shape[0] + ): + raise ValueError("softcapped_ce_backward fake impl expects row-aligned tensors") + return logits.new_empty(logits.shape) +def _softcapped_ce_setup_context( + ctx: torch.autograd.function.FunctionCtx, inputs, output, +) -> None: + logits, targets, softcap = inputs + _losses, lse = output + ctx.save_for_backward(logits, targets, lse) + ctx.softcap = float(softcap) +def _softcapped_ce_backward( + ctx: torch.autograd.function.FunctionCtx, grad_losses: Tensor, grad_lse: "Tensor | None", +): + del grad_lse + logits, targets, lse = ctx.saved_tensors + grad_logits = torch.ops.pgsubmission1draft7fusedce.softcapped_ce_backward( + logits, targets, lse, grad_losses, ctx.softcap + ) + return grad_logits, None, None +softcapped_ce_op.register_autograd( + _softcapped_ce_backward, setup_context=_softcapped_ce_setup_context, +) +def softcapped_cross_entropy( + logits: Tensor, targets: Tensor, softcap: float, reduction: str = "mean", +) -> Tensor: + losses, _lse = torch.ops.pgsubmission1draft7fusedce.softcapped_ce( + logits, targets, float(softcap) + ) + if reduction == "none": + return losses + if reduction == "sum": + return losses.sum() + if reduction == "mean": + return losses.mean() + raise ValueError(f"Unsupported reduction={reduction!r}") +class Hyperparameters: + data_dir = os.environ.get("DATA_DIR", "./data/") + seed = int(os.environ.get("SEED", 42)) + run_id = os.environ.get("RUN_ID", str(uuid.uuid4())) + iterations = int(os.environ.get("ITERATIONS", 20000)) + warmdown_frac = float(os.environ.get("WARMDOWN_FRAC", 0.75)) + warmup_steps = int(os.environ.get("WARMUP_STEPS", 20)) + train_batch_tokens = int(os.environ.get("TRAIN_BATCH_TOKENS", 786432)) + fused_ce_enabled = bool(int(os.environ.get("FUSED_CE_ENABLED", "1"))) + train_seq_len = int(os.environ.get("TRAIN_SEQ_LEN", 2048)) + train_log_every = int(os.environ.get("TRAIN_LOG_EVERY", 500)) + max_wallclock_seconds = float(os.environ.get("MAX_WALLCLOCK_SECONDS", 6e2)) + val_batch_tokens = int(os.environ.get("VAL_BATCH_TOKENS", 524288)) + eval_seq_len = int(os.environ.get("EVAL_SEQ_LEN", 2048)) + val_loss_every = int(os.environ.get("VAL_LOSS_EVERY", 4000)) + vocab_size = int(os.environ.get("VOCAB_SIZE", 8192)) + num_layers = int(os.environ.get("NUM_LAYERS", 11)) + xsa_last_n = int(os.environ.get("XSA_LAST_N", 11)) + model_dim = int(os.environ.get("MODEL_DIM", 512)) + num_kv_heads = int(os.environ.get("NUM_KV_HEADS", 4)) + num_heads = int(os.environ.get("NUM_HEADS", 8)) + mlp_mult = float(os.environ.get("MLP_MULT", 4.0)) + skip_gates_enabled = bool(int(os.environ.get("SKIP_GATES_ENABLED", "1"))) + logit_softcap = float(os.environ.get("LOGIT_SOFTCAP", 3e1)) + rope_base = float(os.environ.get("ROPE_BASE", 1e4)) + rope_dims = int(os.environ.get("ROPE_DIMS", 16)) + rope_train_seq_len = int(os.environ.get("ROPE_TRAIN_SEQ_LEN", 2048)) + rope_yarn = bool(int(os.environ.get("ROPE_YARN", "0"))) + ln_scale = bool(int(os.environ.get("LN_SCALE", "1"))) + qk_gain_init = float(os.environ.get("QK_GAIN_INIT", 5.0)) + num_loops = int(os.environ.get("NUM_LOOPS", 2)) + loop_start = int(os.environ.get("LOOP_START", 3)) + loop_end = int(os.environ.get("LOOP_END", 5)) + enable_looping_at = float(os.environ.get("ENABLE_LOOPING_AT", 0.35)) + parallel_start_layer = int(os.environ.get("PARALLEL_START_LAYER", 8)) + parallel_final_lane = os.environ.get("PARALLEL_FINAL_LANE", "mean") + min_lr = float(os.environ.get("MIN_LR", 0.1)) + tied_embed_lr = float(os.environ.get("TIED_EMBED_LR", 0.03)) + 
tied_embed_init_std = float(os.environ.get("TIED_EMBED_INIT_STD", 0.005)) + matrix_lr = float(os.environ.get("MATRIX_LR", 0.026)) + scalar_lr = float(os.environ.get("SCALAR_LR", 0.02)) + muon_momentum = float(os.environ.get("MUON_MOMENTUM", 0.97)) + muon_backend_steps = int(os.environ.get("MUON_BACKEND_STEPS", 5)) + muon_momentum_warmup_start = float( + os.environ.get("MUON_MOMENTUM_WARMUP_START", 0.92) + ) + muon_momentum_warmup_steps = int(os.environ.get("MUON_MOMENTUM_WARMUP_STEPS", 1500)) + muon_row_normalize = bool(int(os.environ.get("MUON_ROW_NORMALIZE", "1"))) + beta1 = float(os.environ.get("BETA1", 0.9)) + beta2 = float(os.environ.get("BETA2", 0.95)) + adam_eps = float(os.environ.get("ADAM_EPS", 1e-08)) + grad_clip_norm = float(os.environ.get("GRAD_CLIP_NORM", 0.3)) + eval_stride = int(os.environ.get("EVAL_STRIDE", 64)) + adam_wd = float(os.environ.get("ADAM_WD", 0.02)) + muon_wd = float(os.environ.get("MUON_WD", 0.095)) + embed_wd = float(os.environ.get("EMBED_WD", 0.085)) + ema_decay = float(os.environ.get("EMA_DECAY", 0.9965)) + ttt_enabled = bool(int(os.environ.get("TTT_ENABLED", "1"))) + ttt_lora_rank = int(os.environ.get("TTT_LORA_RANK", 96)) + ttt_lora_lr = float(os.environ.get("TTT_LORA_LR", 0.0001)) + ttt_chunk_size = int(os.environ.get("TTT_CHUNK_SIZE", 48)) + ttt_eval_seq_len = int(os.environ.get("TTT_EVAL_SEQ_LEN", 2048)) + ttt_batch_size = int(os.environ.get("TTT_BATCH_SIZE", 16)) + ttt_grad_steps = int(os.environ.get("TTT_GRAD_STEPS", 1)) + ttt_weight_decay = float(os.environ.get("TTT_WEIGHT_DECAY", 1.0)) + ttt_beta1 = float(os.environ.get("TTT_BETA1", 0)) + ttt_beta2 = float(os.environ.get("TTT_BETA2", 0.999)) + ttt_k_lora = bool(int(os.environ.get("TTT_K_LORA", "1"))) + ttt_mlp_lora = bool(int(os.environ.get("TTT_MLP_LORA", "1"))) + ttt_o_lora = bool(int(os.environ.get("TTT_O_LORA", "1"))) + ttt_optimizer = os.environ.get("TTT_OPTIMIZER", "adam") + ttt_eval_batches = os.environ.get("TTT_EVAL_BATCHES", "") + val_doc_fraction = float(os.environ.get("VAL_DOC_FRACTION", 1.0)) + compressor = os.environ.get("COMPRESSOR", "brotli") + gptq_calibration_batches = int(os.environ.get("GPTQ_CALIBRATION_BATCHES", 16)) + gptq_reserve_seconds = float(os.environ.get("GPTQ_RESERVE_SECONDS", 16.0)) + phased_ttt_prefix_docs = int(os.environ.get("PHASED_TTT_PREFIX_DOCS", 2000)) + phased_ttt_num_phases = int(os.environ.get("PHASED_TTT_NUM_PHASES", 3)) + global_ttt_lr = float(os.environ.get("GLOBAL_TTT_LR", 0.001)) + global_ttt_momentum = float(os.environ.get("GLOBAL_TTT_MOMENTUM", 0.9)) + global_ttt_epochs = int(os.environ.get("GLOBAL_TTT_EPOCHS", 1)) + global_ttt_chunk_tokens = int(os.environ.get("GLOBAL_TTT_CHUNK_TOKENS", 32768)) + global_ttt_batch_seqs = int(os.environ.get("GLOBAL_TTT_BATCH_SEQS", 32)) + global_ttt_warmup_start_lr = float(os.environ.get("GLOBAL_TTT_WARMUP_START_LR", 0.0)) + global_ttt_warmup_chunks = int(os.environ.get("GLOBAL_TTT_WARMUP_CHUNKS", 0)) + global_ttt_grad_clip = float(os.environ.get("GLOBAL_TTT_GRAD_CLIP", 1.0)) + global_ttt_respect_doc_boundaries = bool(int(os.environ.get("GLOBAL_TTT_RESPECT_DOC_BOUNDARIES", "1"))) + matrix_bits = int(os.environ.get("MATRIX_BITS", 6)) + embed_bits = int(os.environ.get("EMBED_BITS", 7)) + matrix_clip_sigmas = float(os.environ.get("MATRIX_CLIP_SIGMAS", 12.85)) + embed_clip_sigmas = float(os.environ.get("EMBED_CLIP_SIGMAS", 15.0)) + mlp_clip_sigmas = float(os.environ.get("MLP_CLIP_SIGMAS", 12.0)) + attn_clip_sigmas = float(os.environ.get("ATTN_CLIP_SIGMAS", 13.0)) + attn_out_gate_enabled = 
bool(int(os.environ.get("ATTN_OUT_GATE_ENABLED", "0"))) + attn_out_gate_src = os.environ.get("ATTN_OUT_GATE_SRC", "proj") + smear_gate_enabled = bool(int(os.environ.get("SMEAR_GATE_ENABLED", "1"))) + gate_window = int(os.environ.get("GATE_WINDOW", 12)) + gated_attn_enabled = bool(int(os.environ.get("GATED_ATTN_ENABLED", "0"))) + gated_attn_init_std = float(os.environ.get("GATED_ATTN_INIT_STD", 0.01)) + gated_attn_quant_gate = bool(int(os.environ.get("GATED_ATTN_QUANT_GATE", "1"))) + sparse_attn_gate_enabled = bool(int(os.environ.get("SPARSE_ATTN_GATE_ENABLED", "1"))) + sparse_attn_gate_init_std = float(os.environ.get("SPARSE_ATTN_GATE_INIT_STD", 0.0)) + sparse_attn_gate_scale = float(os.environ.get("SPARSE_ATTN_GATE_SCALE", 1.0)) + lqer_enabled = bool(int(os.environ.get("LQER_ENABLED", "1"))) + lqer_rank = int(os.environ.get("LQER_RANK", 4)) + lqer_top_k = int(os.environ.get("LQER_TOP_K", 1)) + lqer_factor_bits = int(os.environ.get("LQER_FACTOR_BITS", 4)) + lqer_asym_enabled = bool(int(os.environ.get("LQER_ASYM_ENABLED", "1"))) + lqer_asym_group = int(os.environ.get("LQER_ASYM_GROUP", "64")) + distributed = "RANK" in os.environ and "WORLD_SIZE" in os.environ + rank = int(os.environ.get("RANK", "0")) + world_size = int(os.environ.get("WORLD_SIZE", "1")) + local_rank = int(os.environ.get("LOCAL_RANK", "0")) + is_main_process = rank == 0 + grad_accum_steps = 8 // world_size + caseops_enabled = bool(int(os.environ.get("CASEOPS_ENABLED", "1"))) + _default_caseops_data = os.path.join( + data_dir, + "datasets", + "fineweb10B_sp8192_caseops", + "datasets", + "datasets", + "fineweb10B_sp8192_lossless_caps_caseops_v1_reserved", + ) + _default_caseops_tok = os.path.join( + data_dir, + "datasets", + "fineweb10B_sp8192_caseops", + "datasets", + "tokenizers", + "fineweb_8192_bpe_lossless_caps_caseops_v1_reserved.model", + ) + if caseops_enabled: + datasets_dir = os.environ.get("DATA_PATH", _default_caseops_data) + tokenizer_path = os.environ.get("TOKENIZER_PATH", _default_caseops_tok) + else: + datasets_dir = os.environ.get( + "DATA_PATH", + os.path.join(data_dir, "datasets", f"fineweb10B_sp{vocab_size}"), + ) + tokenizer_path = os.environ.get( + "TOKENIZER_PATH", + os.path.join(data_dir, "tokenizers", f"fineweb_{vocab_size}_bpe.model"), + ) + train_files = os.path.join(datasets_dir, "fineweb_train_*.bin") + val_files = os.path.join(datasets_dir, "fineweb_val_*.bin") + val_bytes_files = os.path.join(datasets_dir, "fineweb_val_bytes_*.bin") + artifact_dir = os.environ.get("ARTIFACT_DIR", "") + logfile = ( + os.path.join(artifact_dir, f"{run_id}.txt") + if artifact_dir + else f"logs/{run_id}.txt" + ) + model_path = os.environ.get( + "MODEL_PATH", + os.path.join(artifact_dir, "final_model.pt") + if artifact_dir + else f"models/{run_id}.pt", + ) + load_model_path = os.environ.get("LOAD_MODEL_PATH", model_path) + quantized_model_path = os.environ.get( + "QUANTIZED_MODEL_PATH", + os.path.join(artifact_dir, "final_model.int6.ptz") + if artifact_dir + else f"models/{run_id}.int{matrix_bits}.ptz", + ) +_logger_hparams = None +def set_logging_hparams(h): + global _logger_hparams + _logger_hparams = h +def log(msg, console=True): + if _logger_hparams is None: + print(msg) + return + if _logger_hparams.is_main_process: + if console: + print(msg) + if _logger_hparams.logfile is not None: + with open(_logger_hparams.logfile, "a", encoding="utf-8") as f: + print(msg, file=f) +class ValidationData: + def __init__(self, h, device): + self.sp = spm.SentencePieceProcessor(model_file=h.tokenizer_path) + if 
int(self.sp.vocab_size()) != h.vocab_size: + raise ValueError( + f"VOCAB_SIZE={h.vocab_size} does not match tokenizer vocab_size={int(self.sp.vocab_size())}" + ) + self.val_tokens = load_validation_tokens(h.val_files, h.eval_seq_len) + ( + self.base_bytes_lut, + self.has_leading_space_lut, + self.is_boundary_token_lut, + ) = build_sentencepiece_luts(self.sp, h.vocab_size, device) + self.caseops_enabled = bool(getattr(h, "caseops_enabled", False)) + self.val_bytes = None + if self.caseops_enabled: + self.val_bytes = load_validation_byte_sidecar( + h.val_bytes_files, h.eval_seq_len, self.val_tokens.numel() + ) +def build_sentencepiece_luts(sp, vocab_size, device): + sp_vocab_size = int(sp.vocab_size()) + assert ( + sp.piece_to_id("▁") != sp.unk_id() + ), "Tokenizer must have '▁' (space) as its own token for correct BPB byte counting" + table_size = max(sp_vocab_size, vocab_size) + base_bytes_np = np.zeros((table_size,), dtype=np.int16) + has_leading_space_np = np.zeros((table_size,), dtype=np.bool_) + is_boundary_token_np = np.ones((table_size,), dtype=np.bool_) + for token_id in range(sp_vocab_size): + if sp.is_control(token_id) or sp.is_unknown(token_id) or sp.is_unused(token_id): + continue + is_boundary_token_np[token_id] = False + if sp.is_byte(token_id): + base_bytes_np[token_id] = 1 + continue + piece = sp.id_to_piece(token_id) + if piece.startswith("▁"): + has_leading_space_np[token_id] = True + piece = piece[1:] + base_bytes_np[token_id] = len(piece.encode("utf-8")) + return ( + torch.tensor(base_bytes_np, dtype=torch.int16, device=device), + torch.tensor(has_leading_space_np, dtype=torch.bool, device=device), + torch.tensor(is_boundary_token_np, dtype=torch.bool, device=device), + ) +def load_validation_tokens(pattern, seq_len): + files = [ + Path(p) + for p in sorted(glob.glob(pattern)) + if "_bytes_" not in Path(p).name + ] + if not files: + raise FileNotFoundError(f"No files found for pattern: {pattern}") + tokens = torch.cat([load_data_shard(file) for file in files]).contiguous() + usable = (tokens.numel() - 1) // seq_len * seq_len + if usable <= 0: + raise ValueError(f"Validation split is too short for TRAIN_SEQ_LEN={seq_len}") + return tokens[: usable + 1] +def load_validation_byte_sidecar(pattern, seq_len, expected_len): + files = [Path(p) for p in sorted(glob.glob(pattern))] + if not files: + raise FileNotFoundError(f"No byte sidecar files for pattern: {pattern}") + shards = [load_data_shard(file) for file in files] + bytes_full = torch.cat(shards).contiguous() + if bytes_full.numel() < expected_len: + raise ValueError( + f"Byte sidecar too short: {bytes_full.numel()} < val_tokens {expected_len}" + ) + return bytes_full[:expected_len].to(torch.int32) +def load_data_shard(file): + header_bytes = 256 * np.dtype(" 0: + pos = start + while pos < end: + seg_starts.append(pos) + pos += max_doc_len + else: + seg_starts.append(start) + boundaries = seg_starts + [total_len] + padded_len = get_next_multiple_of_n(len(boundaries), bucket_size) + cu = torch.full((padded_len,), total_len, dtype=torch.int32, device=device) + cu[: len(boundaries)] = torch.tensor(boundaries, dtype=torch.int32, device=device) + seg_ends = seg_starts[1:] + [total_len] + max_seqlen = max(end - start for start, end in zip(seg_starts, seg_ends)) + return cu, max_seqlen +class DocumentPackingLoader: + _shard_pool = ThreadPoolExecutor(1) + def __init__(self, h, device, cu_bucket_size=64): + self.rank = h.rank + self.world_size = h.world_size + self.device = device + self.cu_bucket_size = cu_bucket_size + 
self.max_seq_len = h.train_seq_len + all_files = [Path(p) for p in sorted(glob.glob(h.train_files))] + if not all_files: + raise FileNotFoundError(f"No files found for pattern: {h.train_files}") + self.files = all_files + self.file_iter = iter(self.files) + self._init_shard(load_data_shard(next(self.file_iter))) + self._next_shard = self._submit_next_shard() + self._batch_pool = ThreadPoolExecutor(1) + self._next_batch = None + def _init_shard(self, tokens): + global BOS_ID + self.tokens = tokens + self.shard_size = tokens.numel() + if BOS_ID is None: + BOS_ID = 1 + self.bos_idx = ( + (tokens == BOS_ID).nonzero(as_tuple=True)[0].to(torch.int64).cpu().numpy() + ) + self.cursor = int(self.bos_idx[0]) + def _submit_next_shard(self): + try: + path = next(self.file_iter) + return self._shard_pool.submit(load_data_shard, path) + except StopIteration: + return None + def _advance_shard(self): + if self._next_shard is None: + self.file_iter = iter(self.files) + self._next_shard = self._shard_pool.submit( + load_data_shard, next(self.file_iter) + ) + self._init_shard(self._next_shard.result()) + self._next_shard = self._submit_next_shard() + def _local_doc_starts(self, local_start, total_len): + lo = np.searchsorted(self.bos_idx, local_start, side="left") + hi = np.searchsorted(self.bos_idx, local_start + total_len, side="left") + return (self.bos_idx[lo:hi] - local_start).tolist() + def _prepare_batch(self, num_tokens_local, max_seq_len): + per_rank_span = num_tokens_local + 1 + global_span = per_rank_span * self.world_size + while self.cursor + global_span > self.shard_size: + self._advance_shard() + local_start = self.cursor + self.rank * per_rank_span + buf = self.tokens[local_start : local_start + per_rank_span] + inputs = buf[:-1].to(dtype=torch.int64).pin_memory() + targets = buf[1:].to(dtype=torch.int64).pin_memory() + starts = self._local_doc_starts(local_start, inputs.numel()) + cu_seqlens, max_seqlen = _build_cu_seqlens( + starts, inputs.numel(), inputs.device, max_seq_len, self.cu_bucket_size + ) + cu_seqlens = cu_seqlens.pin_memory() + self.cursor += global_span + return inputs, targets, cu_seqlens, max_seqlen + def next_batch(self, global_tokens, grad_accum_steps): + num_tokens_local = global_tokens // (self.world_size * grad_accum_steps) + if self._next_batch is not None: + inputs, targets, cu_seqlens, max_seqlen = self._next_batch.result() + else: + inputs, targets, cu_seqlens, max_seqlen = self._prepare_batch( + num_tokens_local, self.max_seq_len + ) + self._next_batch = self._batch_pool.submit( + self._prepare_batch, num_tokens_local, self.max_seq_len + ) + return ( + inputs[None].to(self.device, non_blocking=True), + targets[None].to(self.device, non_blocking=True), + cu_seqlens.to(self.device, non_blocking=True), + max_seqlen, + ) +class ShuffledSequenceLoader: + def __init__(self, h, device): + self.world_size = h.world_size + self.seq_len = h.train_seq_len + self.device = device + all_files = [Path(p) for p in sorted(glob.glob(h.train_files))] + if not all_files: + raise FileNotFoundError(f"No files found for pattern: {h.train_files}") + self.files = all_files[h.rank :: h.world_size] + self.rng = np.random.Generator(np.random.PCG64(h.rank)) + self.num_tokens = [_read_num_tokens(f) for f in self.files] + self.start_inds = [[] for _ in self.files] + for si in range(len(self.files)): + self._reset_shard(si) + def _reset_shard(self, si): + max_phase = min( + self.seq_len - 1, max(0, self.num_tokens[si] - self.seq_len - 1) + ) + phase = int(self.rng.integers(max_phase + 1)) if 
max_phase > 0 else 0 + num_sequences = (self.num_tokens[si] - 1 - phase) // self.seq_len + sequence_order = self.rng.permutation(num_sequences) + self.start_inds[si] = (phase + sequence_order * self.seq_len).tolist() + def next_batch(self, global_tokens, grad_accum_steps): + device_tokens = global_tokens // (self.world_size * grad_accum_steps) + device_batch_size = device_tokens // self.seq_len + remaining = np.array([len(s) for s in self.start_inds], dtype=np.float64) + x = torch.empty((device_batch_size, self.seq_len), dtype=torch.int64) + y = torch.empty((device_batch_size, self.seq_len), dtype=torch.int64) + for bi in range(device_batch_size): + total = remaining.sum() + if total <= 0: + for si in range(len(self.files)): + self._reset_shard(si) + remaining = np.array( + [len(s) for s in self.start_inds], dtype=np.float64 + ) + total = remaining.sum() + probs = remaining / total + si = int(self.rng.choice(len(self.files), p=probs)) + start_ind = self.start_inds[si].pop() + remaining[si] -= 1 + mm = _get_shard_memmap(self.files[si]) + window = torch.as_tensor( + np.array(mm[start_ind : start_ind + self.seq_len + 1], dtype=np.int64) + ) + x[bi] = window[:-1] + y[bi] = window[1:] + return x.to(self.device, non_blocking=True), y.to( + self.device, non_blocking=True + ) +class RMSNorm(nn.Module): + def __init__(self, eps=None): + super().__init__() + self.eps = eps + def forward(self, x): + return F.rms_norm(x, (x.size(-1),), eps=self.eps) +class CastedLinear(nn.Linear): + def forward(self, x): + w = self.weight.to(x.dtype) + bias = self.bias.to(x.dtype) if self.bias is not None else None + return F.linear(x, w, bias) +@triton.jit +def linear_leaky_relu_square_kernel( + a_desc, + b_desc, + c_desc, + aux_desc, + M, + N, + K, + BLOCK_SIZE_M: tl.constexpr, + BLOCK_SIZE_N: tl.constexpr, + BLOCK_SIZE_K: tl.constexpr, + NUM_SMS: tl.constexpr, + FORWARD: tl.constexpr, +): + dtype = tl.bfloat16 + start_pid = tl.program_id(axis=0) + num_pid_m = tl.cdiv(M, BLOCK_SIZE_M) + num_pid_n = tl.cdiv(N, BLOCK_SIZE_N) + k_tiles = tl.cdiv(K, BLOCK_SIZE_K) + num_tiles = num_pid_m * num_pid_n + tile_id_c = start_pid - NUM_SMS + for tile_id in tl.range(start_pid, num_tiles, NUM_SMS, flatten=True): + pid_m = tile_id // num_pid_n + pid_n = tile_id % num_pid_n + offs_am = pid_m * BLOCK_SIZE_M + offs_bn = pid_n * BLOCK_SIZE_N + accumulator = tl.zeros((BLOCK_SIZE_M, BLOCK_SIZE_N), dtype=tl.float32) + for ki in range(k_tiles): + offs_k = ki * BLOCK_SIZE_K + a = a_desc.load([offs_am, offs_k]) + b = b_desc.load([offs_bn, offs_k]) + accumulator = tl.dot(a, b.T, accumulator) + tile_id_c += NUM_SMS + offs_am_c = offs_am + offs_bn_c = offs_bn + acc = tl.reshape(accumulator, (BLOCK_SIZE_M, 2, BLOCK_SIZE_N // 2)) + acc = tl.permute(acc, (0, 2, 1)) + acc0, acc1 = tl.split(acc) + c0 = acc0.to(dtype) + c1 = acc1.to(dtype) + if not FORWARD: + pre0 = aux_desc.load([offs_am_c, offs_bn_c]) + pre1 = aux_desc.load([offs_am_c, offs_bn_c + BLOCK_SIZE_N // 2]) + c0 = c0 * tl.where(pre0 > 0, 2.0 * pre0, 0.18 * pre0) + c1 = c1 * tl.where(pre1 > 0, 2.0 * pre1, 0.18 * pre1) + c_desc.store([offs_am_c, offs_bn_c], c0) + c_desc.store([offs_am_c, offs_bn_c + BLOCK_SIZE_N // 2], c1) + if FORWARD: + aux0 = tl.where(c0 > 0, c0, 0.3 * c0) + aux1 = tl.where(c1 > 0, c1, 0.3 * c1) + aux_desc.store([offs_am_c, offs_bn_c], aux0 * aux0) + aux_desc.store([offs_am_c, offs_bn_c + BLOCK_SIZE_N // 2], aux1 * aux1) +def linear_leaky_relu_square(a, b, aux=None): + M, K = a.shape + N, K2 = b.shape + assert K == K2 + c = torch.empty((M, N), device=a.device, 
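+ # c receives the raw matmul output; in the forward pass aux is filled with leaky_relu(c, 0.3)**2, + # and the backward kernel scales the incoming grad by the derivative: 2*c for c > 0, else 0.18*c (= 2*0.3**2 * c). +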
dtype=a.dtype) + forward = aux is None + if aux is None: + aux = torch.empty((M, N), device=a.device, dtype=a.dtype) + num_sms = torch.cuda.get_device_properties(a.device).multi_processor_count + BLOCK_SIZE_M, BLOCK_SIZE_N, BLOCK_SIZE_K = 128, 256, 64 + num_stages = 4 if forward else 3 + a_desc = TensorDescriptor.from_tensor(a, [BLOCK_SIZE_M, BLOCK_SIZE_K]) + b_desc = TensorDescriptor.from_tensor(b, [BLOCK_SIZE_N, BLOCK_SIZE_K]) + c_desc = TensorDescriptor.from_tensor(c, [BLOCK_SIZE_M, BLOCK_SIZE_N // 2]) + aux_desc = TensorDescriptor.from_tensor(aux, [BLOCK_SIZE_M, BLOCK_SIZE_N // 2]) + grid = lambda _meta: ( + min(num_sms, triton.cdiv(M, BLOCK_SIZE_M) * triton.cdiv(N, BLOCK_SIZE_N)), + ) + linear_leaky_relu_square_kernel[grid]( + a_desc, + b_desc, + c_desc, + aux_desc, + M, + N, + K, + BLOCK_SIZE_M=BLOCK_SIZE_M, + BLOCK_SIZE_N=BLOCK_SIZE_N, + BLOCK_SIZE_K=BLOCK_SIZE_K, + NUM_SMS=num_sms, + FORWARD=forward, + num_stages=num_stages, + num_warps=8, + ) + if forward: + return c, aux + return c +class FusedLinearLeakyReLUSquareFunction(torch.autograd.Function): + @staticmethod + def forward(ctx, x, w1, w2): + x_flat = x.reshape(-1, x.shape[-1]) + pre, post = linear_leaky_relu_square(x_flat, w1) + out = F.linear(post, w2) + ctx.save_for_backward(x, w1, w2, pre, post) + return out.view(*x.shape[:-1], out.shape[-1]) + @staticmethod + def backward(ctx, grad_output): + x, w1, w2, pre, post = ctx.saved_tensors + x_flat = x.reshape(-1, x.shape[-1]) + grad_output_flat = grad_output.reshape(-1, grad_output.shape[-1]) + dw2 = grad_output_flat.T @ post + dpre = linear_leaky_relu_square(grad_output_flat, w2.T.contiguous(), aux=pre) + dw1 = dpre.T @ x_flat + dx = dpre @ w1 + return dx.view_as(x), dw1, dw2 +FusedLeakyReLUSquareMLP = FusedLinearLeakyReLUSquareFunction.apply +class Rotary(nn.Module): + def __init__(self, dim, base=1e4, train_seq_len=1024, rope_dims=0, yarn=True): + super().__init__() + self.dim = dim + self.base = base + self.train_seq_len = train_seq_len + self.yarn = yarn + self.rope_dims = rope_dims if rope_dims > 0 else dim + inv_freq = 1.0 / base ** ( + torch.arange(0, self.rope_dims, 2, dtype=torch.float32) / self.rope_dims + ) + self.register_buffer("inv_freq", inv_freq, persistent=False) + self._seq_len_cached = 0 + self._cos_cached = None + self._sin_cached = None + def forward(self, seq_len, device, dtype): + if ( + self._cos_cached is None + or self._sin_cached is None + or self._seq_len_cached < seq_len + or self._cos_cached.device != device + ): + rd = self.rope_dims + if self.yarn and seq_len > self.train_seq_len: + scale = seq_len / self.train_seq_len + new_base = self.base * scale ** (rd / (rd - 2)) + inv_freq = 1.0 / new_base ** ( + torch.arange(0, rd, 2, dtype=torch.float32, device=device) / rd + ) + else: + inv_freq = self.inv_freq.float().to(device) + t = torch.arange(seq_len, device=device, dtype=torch.float32) + freqs = torch.outer(t, inv_freq) + self._cos_cached = freqs.cos()[None, :, None, :] + self._sin_cached = freqs.sin()[None, :, None, :] + self._seq_len_cached = seq_len + return self._cos_cached[:, :seq_len].to(dtype=dtype), self._sin_cached[:, :seq_len].to(dtype=dtype) +def apply_rotary_emb(x, cos, sin, rope_dims=0): + if rope_dims > 0 and rope_dims < x.size(-1): + x_rope, x_pass = x[..., :rope_dims], x[..., rope_dims:] + half = rope_dims // 2 + x1, x2 = x_rope[..., :half], x_rope[..., half:] + x_rope = torch.cat((x1 * cos + x2 * sin, x1 * -sin + x2 * cos), dim=-1) + return torch.cat((x_rope, x_pass), dim=-1) + half = x.size(-1) // 2 + x1, x2 = x[..., :half], 
x[..., half:] + return torch.cat((x1 * cos + x2 * sin, x1 * -sin + x2 * cos), dim=-1) +class CausalSelfAttention(nn.Module): + def __init__( + self, dim, num_heads, num_kv_heads, rope_base, qk_gain_init, train_seq_len, yarn=True, + attn_out_gate=False, attn_out_gate_src="proj", gate_window=12, + gated_attn=False, gated_attn_init_std=0.01, + sparse_attn_gate=False, sparse_attn_gate_init_std=0.0, sparse_attn_gate_scale=1.0, + ): + super().__init__() + if dim % num_heads != 0: + raise ValueError("model_dim must be divisible by num_heads") + if num_heads % num_kv_heads != 0: + raise ValueError("num_heads must be divisible by num_kv_heads") + if int(attn_out_gate) + int(gated_attn) + int(sparse_attn_gate) > 1: + raise ValueError( + "attn_out_gate, gated_attn, and sparse_attn_gate are mutually exclusive" + ) + self.num_heads = num_heads + self.num_kv_heads = num_kv_heads + self.head_dim = dim // num_heads + if self.head_dim % 2 != 0: + raise ValueError("head_dim must be even for RoPE") + self.q_gain = nn.Parameter( + torch.full((num_heads,), qk_gain_init, dtype=torch.float32) + ) + self.rope_dims = 0 + self.rotary = Rotary(self.head_dim, base=rope_base, train_seq_len=train_seq_len, yarn=yarn) + self.use_xsa = False + self.attn_out_gate = attn_out_gate + self.attn_out_gate_src = attn_out_gate_src + self.gate_window = gate_window + if attn_out_gate: + self.attn_gate_proj = CastedLinear(gate_window, num_heads, bias=False) + self.attn_gate_proj._zero_init = True + self.gated_attn = gated_attn + if gated_attn: + W = torch.empty(num_heads, dim, dtype=torch.float32) + nn.init.normal_(W, mean=0.0, std=gated_attn_init_std) + self.attn_gate_w = nn.Parameter(W) + self.sparse_attn_gate = sparse_attn_gate + self.sparse_attn_gate_scale = sparse_attn_gate_scale + if sparse_attn_gate: + W = torch.empty(num_heads, gate_window, dtype=torch.float32) + if sparse_attn_gate_init_std > 0: + nn.init.normal_(W, mean=0.0, std=sparse_attn_gate_init_std) + else: + nn.init.zeros_(W) + self.attn_gate_w = nn.Parameter(W) + def _xsa_efficient(self, y, v): + B, T, H, D = y.shape + Hkv = v.size(-2) + group = H // Hkv + y_g = y.reshape(B, T, Hkv, group, D) + vn = F.normalize(v, dim=-1).unsqueeze(-2) + proj = (y_g * vn).sum(dim=-1, keepdim=True) * vn + return (y_g - proj).reshape(B, T, H, D) + def forward(self, x, q_w, k_w, v_w, out_w, cu_seqlens=None, max_seqlen=0): + bsz, seqlen, dim = x.shape + q_raw = F.linear(x, q_w.to(x.dtype)) + q = q_raw.reshape(bsz, seqlen, self.num_heads, self.head_dim) + k = F.linear(x, k_w.to(x.dtype)).reshape(bsz, seqlen, self.num_kv_heads, self.head_dim) + v = F.linear(x, v_w.to(x.dtype)).reshape(bsz, seqlen, self.num_kv_heads, self.head_dim) + q = F.rms_norm(q, (q.size(-1),)) + k = F.rms_norm(k, (k.size(-1),)) + cos, sin = self.rotary(seqlen, x.device, q.dtype) + q = apply_rotary_emb(q, cos, sin, self.rope_dims) + k = apply_rotary_emb(k, cos, sin, self.rope_dims) + q = q * self.q_gain.to(dtype=q.dtype)[None, None, :, None] + if cu_seqlens is not None: + y = flash_attn_varlen_func( + q[0], + k[0], + v[0], + cu_seqlens_q=cu_seqlens, + cu_seqlens_k=cu_seqlens, + max_seqlen_q=max_seqlen, + max_seqlen_k=max_seqlen, + causal=True, + window_size=(-1, -1), + )[None] + else: + y = flash_attn_3_func(q, k, v, causal=True) + if self.use_xsa: + y = self._xsa_efficient(y, v) + if self.attn_out_gate: + gate_src = q_raw if self.attn_out_gate_src == "q" else x + gate_in = gate_src[..., : self.gate_window].contiguous() + g = 2.0 * torch.sigmoid(self.attn_gate_proj(gate_in)) + y = y * g[..., None] + if 
self.gated_attn: + x_c = x.contiguous() + g = torch.sigmoid(F.linear(x_c, self.attn_gate_w.to(x.dtype))) + y = y * g[..., None] + if self.sparse_attn_gate: + gate_in = x[..., : self.gate_window].contiguous() + g = torch.sigmoid( + self.sparse_attn_gate_scale + * F.linear(gate_in, self.attn_gate_w.to(x.dtype)) + ) + y = y * g[..., None] + y = y.reshape(bsz, seqlen, dim) + self._last_proj_input = y.detach() if getattr(self, "_calib", False) else None + return F.linear(y, out_w.to(x.dtype)) +class MLP(nn.Module): + def __init__(self, dim, mlp_mult): + super().__init__() + self.use_fused = True + def forward(self, x, up_w, down_w): + if self.training and self.use_fused: + return FusedLeakyReLUSquareMLP(x, up_w.to(x.dtype), down_w.to(x.dtype)) + hidden = F.leaky_relu(F.linear(x, up_w.to(x.dtype)), negative_slope=0.3).square() + self._last_down_input = hidden.detach() if getattr(self, "_calib", False) else None + return F.linear(hidden, down_w.to(x.dtype)) +class Block(nn.Module): + def __init__( + self, + dim, + num_heads, + num_kv_heads, + mlp_mult, + rope_base, + qk_gain_init, + train_seq_len, + layer_idx=0, + ln_scale=False, + yarn=True, + attn_out_gate=False, + attn_out_gate_src="proj", + gate_window=12, + gated_attn=False, + gated_attn_init_std=0.01, + sparse_attn_gate=False, + sparse_attn_gate_init_std=0.0, + sparse_attn_gate_scale=1.0, + ): + super().__init__() + self.attn_norm = RMSNorm() + self.mlp_norm = RMSNorm() + self.attn = CausalSelfAttention( + dim, num_heads, num_kv_heads, rope_base, qk_gain_init, train_seq_len, yarn=yarn, + attn_out_gate=attn_out_gate, attn_out_gate_src=attn_out_gate_src, gate_window=gate_window, + gated_attn=gated_attn, gated_attn_init_std=gated_attn_init_std, + sparse_attn_gate=sparse_attn_gate, + sparse_attn_gate_init_std=sparse_attn_gate_init_std, + sparse_attn_gate_scale=sparse_attn_gate_scale, + ) + self.mlp = MLP(dim, mlp_mult) + self.attn_scale = nn.Parameter(torch.ones(dim, dtype=torch.float32)) + self.mlp_scale = nn.Parameter(torch.ones(dim, dtype=torch.float32)) + self.resid_mix = nn.Parameter( + torch.stack((torch.ones(dim), torch.zeros(dim))).float() + ) + self.ln_scale_factor = 1.0 / math.sqrt(layer_idx + 1) if ln_scale else 1.0 + def forward(self, x, x0, q_w, k_w, v_w, out_w, up_w, down_w, cu_seqlens=None, max_seqlen=0): + mix = self.resid_mix.to(dtype=x.dtype) + x_in = mix[0][None, None, :] * x + mix[1][None, None, :] * x0 + attn_out = self.attn( + self.attn_norm(x_in) * self.ln_scale_factor, + q_w, k_w, v_w, out_w, + cu_seqlens=cu_seqlens, + max_seqlen=max_seqlen, + ) + x_out = x_in + self.attn_scale.to(dtype=x_in.dtype)[None, None, :] * attn_out + x_out = x_out + self.mlp_scale.to(dtype=x_out.dtype)[ + None, None, : + ] * self.mlp(self.mlp_norm(x_out) * self.ln_scale_factor, up_w, down_w) + return x_out +class GPT(nn.Module): + def __init__(self, h): + super().__init__() + if h.logit_softcap <= 0.0: + raise ValueError(f"logit_softcap must be positive, got {h.logit_softcap}") + self.tied_embed_init_std = h.tied_embed_init_std + self.logit_softcap = h.logit_softcap + self.fused_ce_enabled = bool(h.fused_ce_enabled) + self.tok_emb = nn.Embedding(h.vocab_size, h.model_dim) + self.num_layers = h.num_layers + head_dim = h.model_dim // h.num_heads + kv_dim = h.num_kv_heads * head_dim + hidden_dim = int(h.mlp_mult * h.model_dim) + self.qo_bank = nn.Parameter(torch.empty(2 * h.num_layers, h.model_dim, h.model_dim)) + self.kv_bank = nn.Parameter(torch.empty(2 * h.num_layers, kv_dim, h.model_dim)) + self.mlp_up_bank = 
nn.Parameter(torch.empty(h.num_layers, hidden_dim, h.model_dim)) + self.mlp_down_bank = nn.Parameter(torch.empty(h.num_layers, h.model_dim, hidden_dim)) + self.num_encoder_layers = h.num_layers // 2 + self.num_decoder_layers = h.num_layers - self.num_encoder_layers + self.blocks = nn.ModuleList( + [ + Block( + h.model_dim, + h.num_heads, + h.num_kv_heads, + h.mlp_mult, + h.rope_base, + h.qk_gain_init, + h.train_seq_len, + layer_idx=i, + ln_scale=h.ln_scale, + yarn=h.rope_yarn, + attn_out_gate=h.attn_out_gate_enabled, + attn_out_gate_src=h.attn_out_gate_src, + gate_window=h.gate_window, + gated_attn=h.gated_attn_enabled, + gated_attn_init_std=h.gated_attn_init_std, + sparse_attn_gate=h.sparse_attn_gate_enabled, + sparse_attn_gate_init_std=h.sparse_attn_gate_init_std, + sparse_attn_gate_scale=h.sparse_attn_gate_scale, + ) + for i in range(h.num_layers) + ] + ) + if h.rope_dims > 0: + head_dim = h.model_dim // h.num_heads + for block in self.blocks: + block.attn.rope_dims = h.rope_dims + block.attn.rotary = Rotary( + head_dim, + base=h.rope_base, + train_seq_len=h.train_seq_len, + rope_dims=h.rope_dims, + yarn=h.rope_yarn, + ) + self.final_norm = RMSNorm() + if h.xsa_last_n > 0: + for i in range(max(0, h.num_layers - h.xsa_last_n), h.num_layers): + self.blocks[i].attn.use_xsa = True + self.looping_active = False + if h.num_loops > 0: + loop_seg = list(range(h.loop_start, h.loop_end + 1)) + all_indices = list(range(h.loop_start)) + for _ in range(h.num_loops + 1): + all_indices.extend(loop_seg) + all_indices.extend(range(h.loop_end + 1, h.num_layers)) + num_enc = len(all_indices) // 2 + self.encoder_indices = all_indices[:num_enc] + self.decoder_indices = all_indices[num_enc:] + else: + self.encoder_indices = list(range(self.num_encoder_layers)) + self.decoder_indices = list(range(self.num_encoder_layers, h.num_layers)) + self.num_skip_weights = min( + len(self.encoder_indices), len(self.decoder_indices) + ) + self.skip_weights = nn.Parameter( + torch.ones(self.num_skip_weights, h.model_dim, dtype=torch.float32) + ) + self.skip_gates = ( + nn.Parameter( + torch.zeros(self.num_skip_weights, h.model_dim, dtype=torch.float32) + ) + if h.skip_gates_enabled + else None + ) + self.parallel_start_layer = h.parallel_start_layer + self.parallel_final_lane = h.parallel_final_lane.lower() + self.parallel_post_lambdas = nn.Parameter( + torch.ones(h.num_layers, 2, 2, dtype=torch.float32) + ) + self.parallel_resid_lambdas = nn.Parameter( + torch.full((h.num_layers, 2), 1.1, dtype=torch.float32) + ) + self.smear_gate_enabled = h.smear_gate_enabled + if self.smear_gate_enabled: + self.smear_window = h.gate_window + self.smear_gate = CastedLinear(self.smear_window, 1, bias=False) + self.smear_gate._zero_init = True + self.smear_lambda = nn.Parameter(torch.zeros(1, dtype=torch.float32)) + self._init_weights() + def _init_weights(self): + nn.init.normal_(self.tok_emb.weight, mean=0.0, std=self.tied_embed_init_std) + n = self.num_layers + proj_scale = 1.0 / math.sqrt(2 * n) + for i in range(n): + nn.init.orthogonal_(self.qo_bank.data[i], gain=1.0) + nn.init.zeros_(self.qo_bank.data[n + i]) + self.qo_bank.data[n + i].mul_(proj_scale) + nn.init.orthogonal_(self.kv_bank.data[i], gain=1.0) + nn.init.orthogonal_(self.kv_bank.data[n + i], gain=1.0) + for i in range(n): + nn.init.orthogonal_(self.mlp_up_bank.data[i], gain=1.0) + nn.init.zeros_(self.mlp_down_bank.data[i]) + self.mlp_down_bank.data[i].mul_(proj_scale) + for name, module in self.named_modules(): + if isinstance(module, nn.Linear): + if getattr(module, 
"_zero_init", False): + nn.init.zeros_(module.weight) + elif ( + module.weight.ndim == 2 + and module.weight.shape[0] >= 64 + and module.weight.shape[1] >= 64 + ): + nn.init.orthogonal_(module.weight, gain=1.0) + def _bank_weights(self, i): + n = self.num_layers + return ( + self.qo_bank[i], + self.kv_bank[i], + self.kv_bank[n + i], + self.qo_bank[n + i], + self.mlp_up_bank[i], + self.mlp_down_bank[i], + ) + def _parallel_block( + self, block_idx, lane0, lane1, x0, + q_w, k_w, v_w, out_w, up_w, down_w, + cu_seqlens=None, max_seqlen=0, + ): + block = self.blocks[block_idx] + mix = block.resid_mix.to(dtype=lane0.dtype) + attn_read = mix[0][None, None, :] * lane0 + mix[1][None, None, :] * x0 + attn_out = block.attn( + block.attn_norm(attn_read) * block.ln_scale_factor, + q_w, k_w, v_w, out_w, + cu_seqlens=cu_seqlens, max_seqlen=max_seqlen, + ) + attn_out = block.attn_scale.to(dtype=attn_out.dtype)[None, None, :] * attn_out + mlp_read = lane1 + mlp_out = block.mlp_scale.to(dtype=lane1.dtype)[None, None, :] * block.mlp( + block.mlp_norm(mlp_read) * block.ln_scale_factor, up_w, down_w + ) + attn_resid = self.parallel_resid_lambdas[block_idx, 0].to(dtype=lane0.dtype) + attn_post = self.parallel_post_lambdas[block_idx, 0].to(dtype=lane0.dtype) + mlp_resid = self.parallel_resid_lambdas[block_idx, 1].to(dtype=lane0.dtype) + mlp_post = self.parallel_post_lambdas[block_idx, 1].to(dtype=lane0.dtype) + lane0 = attn_resid * lane0 + attn_post[0] * attn_out + mlp_post[0] * mlp_out + lane1 = mlp_resid * lane1 + attn_post[1] * attn_out + mlp_post[1] * mlp_out + return lane0, lane1 + def _final_parallel_hidden(self, lane0, lane1): + if self.parallel_final_lane == "mlp": + return lane1 + if self.parallel_final_lane == "attn": + return lane0 + return 0.5 * (lane0 + lane1) + def _forward_hidden(self, input_ids, cu_seqlens=None, max_seqlen=0): + x = self.tok_emb(input_ids) + if self.smear_gate_enabled: + sl = self.smear_lambda.to(dtype=x.dtype) + gate_in = x[:, 1:, : self.smear_window].contiguous() + g = sl * torch.sigmoid(self.smear_gate(gate_in)) + bos_mask = (input_ids[:, 1:] == 1).unsqueeze(-1) + g = g.masked_fill(bos_mask, 0.0) + x = torch.cat([x[:, :1], x[:, 1:] + g * x[:, :-1]], dim=1) + x = F.rms_norm(x, (x.size(-1),)) + x0 = x + skips = [] + enc_iter = ( + self.encoder_indices + if self.looping_active + else range(self.num_encoder_layers) + ) + dec_iter = ( + self.decoder_indices + if self.looping_active + else range( + self.num_encoder_layers, + self.num_encoder_layers + self.num_decoder_layers, + ) + ) + for i in enc_iter: + q_w, k_w, v_w, out_w, up_w, down_w = self._bank_weights(i) + x = self.blocks[i](x, x0, q_w, k_w, v_w, out_w, up_w, down_w, cu_seqlens=cu_seqlens, max_seqlen=max_seqlen) + skips.append(x) + psl = self.parallel_start_layer + lane0 = None + lane1 = None + for skip_idx, i in enumerate(dec_iter): + q_w, k_w, v_w, out_w, up_w, down_w = self._bank_weights(i) + if i >= psl and psl > 0: + if lane0 is None: + lane0 = x + lane1 = x + if skip_idx < self.num_skip_weights and skips: + skip = skips.pop() + w = self.skip_weights[skip_idx].to(dtype=lane0.dtype)[None, None, :] + if self.skip_gates is not None: + g = torch.sigmoid(self.skip_gates[skip_idx].to(dtype=lane0.dtype))[None, None, :] + lane0 = torch.lerp(w * skip, lane0, g) + else: + lane0 = lane0 + w * skip + lane0, lane1 = self._parallel_block( + i, lane0, lane1, x0, q_w, k_w, v_w, out_w, up_w, down_w, + cu_seqlens=cu_seqlens, max_seqlen=max_seqlen, + ) + else: + if skip_idx < self.num_skip_weights and skips: + scaled_skip = ( + 
self.skip_weights[skip_idx].to(dtype=x.dtype)[None, None, :] + * skips.pop() + ) + if self.skip_gates is not None: + g = torch.sigmoid(self.skip_gates[skip_idx].to(dtype=x.dtype))[None, None, :] + x = torch.lerp(scaled_skip, x, g) + else: + x = x + scaled_skip + x = self.blocks[i](x, x0, q_w, k_w, v_w, out_w, up_w, down_w, cu_seqlens=cu_seqlens, max_seqlen=max_seqlen) + if lane0 is not None: + x = self._final_parallel_hidden(lane0, lane1) + x = self.final_norm(x) + return x + def _project_logits(self, hidden): + return F.linear(hidden, self.tok_emb.weight) + def forward_logits(self, input_ids, cu_seqlens=None, max_seqlen=0): + hidden = self._forward_hidden(input_ids, cu_seqlens=cu_seqlens, max_seqlen=max_seqlen) + logits_proj = self._project_logits(hidden) + return self.logit_softcap * torch.tanh(logits_proj / self.logit_softcap) + def forward(self, input_ids, target_ids, cu_seqlens=None, max_seqlen=0): + hidden = self._forward_hidden(input_ids, cu_seqlens=cu_seqlens, max_seqlen=max_seqlen) + logits_proj = self._project_logits(hidden) + flat_targets = target_ids.reshape(-1) + if self.fused_ce_enabled: + return softcapped_cross_entropy( + logits_proj.reshape(-1, logits_proj.size(-1)), + flat_targets, + self.logit_softcap, + reduction="mean", + ) + logits = self.logit_softcap * torch.tanh(logits_proj / self.logit_softcap) + return F.cross_entropy( + logits.reshape(-1, logits.size(-1)).float(), + flat_targets, + reduction="mean", + ) + def forward_ttt(self, input_ids, target_ids, lora): + x = self.tok_emb(input_ids) + if self.smear_gate_enabled: + sl = self.smear_lambda.to(dtype=x.dtype) + gate_in = x[:, 1:, : self.smear_window].contiguous() + g = sl * torch.sigmoid(self.smear_gate(gate_in)) + bos_mask = (input_ids[:, 1:] == 1).unsqueeze(-1) + g = g.masked_fill(bos_mask, 0.0) + x = torch.cat([x[:, :1], x[:, 1:] + g * x[:, :-1]], dim=1) + x = F.rms_norm(x, (x.size(-1),)) + x0 = x + skips = [] + enc_iter = ( + self.encoder_indices + if self.looping_active + else list(range(self.num_encoder_layers)) + ) + dec_iter = ( + self.decoder_indices + if self.looping_active + else list( + range( + self.num_encoder_layers, + self.num_encoder_layers + self.num_decoder_layers, + ) + ) + ) + slot = 0 + for i in enc_iter: + q_w, k_w, v_w, out_w, up_w, down_w = self._bank_weights(i) + x = self._block_with_lora(self.blocks[i], x, x0, lora, slot, q_w, k_w, v_w, out_w, up_w, down_w) + slot += 1 + skips.append(x) + psl = self.parallel_start_layer + lane0 = None + lane1 = None + for skip_idx, i in enumerate(dec_iter): + q_w, k_w, v_w, out_w, up_w, down_w = self._bank_weights(i) + if i >= psl and psl > 0: + if lane0 is None: + lane0 = x + lane1 = x + if skip_idx < self.num_skip_weights and skips: + skip = skips.pop() + w = self.skip_weights[skip_idx].to(dtype=lane0.dtype)[None, None, :] + if self.skip_gates is not None: + g = torch.sigmoid(self.skip_gates[skip_idx].to(dtype=lane0.dtype))[None, None, :] + lane0 = torch.lerp(w * skip, lane0, g) + else: + lane0 = lane0 + w * skip + lane0, lane1 = self._parallel_block_with_lora( + i, lane0, lane1, x0, lora, slot, + q_w, k_w, v_w, out_w, up_w, down_w, + ) + else: + if skip_idx < self.num_skip_weights and skips: + scaled_skip = ( + self.skip_weights[skip_idx].to(dtype=x.dtype)[None, None, :] + * skips.pop() + ) + if self.skip_gates is not None: + g = torch.sigmoid(self.skip_gates[skip_idx].to(dtype=x.dtype))[None, None, :] + x = torch.lerp(scaled_skip, x, g) + else: + x = x + scaled_skip + x = self._block_with_lora(self.blocks[i], x, x0, lora, slot, q_w, k_w, v_w, out_w, 
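+ # sequential (non-parallel) decoder step during TTT: banked block weights plus the + # batched per-sequence LoRA adapters registered for this slot. +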
up_w, down_w) + slot += 1 + if lane0 is not None: + x = self._final_parallel_hidden(lane0, lane1) + x = self.final_norm(x) + logits = F.linear(x, self.tok_emb.weight) + lora.lm_head_lora(x) + logits = self.logit_softcap * torch.tanh(logits / self.logit_softcap) + bsz, sl, V = logits.shape + return F.cross_entropy( + logits.float().reshape(-1, V), target_ids.reshape(-1), reduction="none" + ).reshape(bsz, sl) + def _block_with_lora(self, block, x, x0, lora, slot, q_w, k_w, v_w, out_w, up_w, down_w): + mix = block.resid_mix.to(dtype=x.dtype) + x_in = mix[0][None, None, :] * x + mix[1][None, None, :] * x0 + n = block.attn_norm(x_in) * block.ln_scale_factor + attn = block.attn + bsz, seqlen, dim = n.shape + q_raw = F.linear(n, q_w.to(n.dtype)) + lora.q_loras[slot](n) + q = q_raw.reshape(bsz, seqlen, attn.num_heads, attn.head_dim) + k = F.linear(n, k_w.to(n.dtype)) + if lora.k_loras is not None: + k = k + lora.k_loras[slot](n) + k = k.reshape(bsz, seqlen, attn.num_kv_heads, attn.head_dim) + v = (F.linear(n, v_w.to(n.dtype)) + lora.v_loras[slot](n)).reshape( + bsz, seqlen, attn.num_kv_heads, attn.head_dim + ) + q = F.rms_norm(q, (q.size(-1),)) + k = F.rms_norm(k, (k.size(-1),)) + cos, sin = attn.rotary(seqlen, n.device, q.dtype) + q = apply_rotary_emb(q, cos, sin, attn.rope_dims) + k = apply_rotary_emb(k, cos, sin, attn.rope_dims) + q = q * attn.q_gain.to(dtype=q.dtype)[None, None, :, None] + y = flash_attn_3_func(q, k, v, causal=True) + if attn.use_xsa: + y = attn._xsa_efficient(y, v) + if attn.attn_out_gate: + gate_src = q_raw if attn.attn_out_gate_src == "q" else n + gate_in = gate_src[..., : attn.gate_window].contiguous() + g = 2.0 * torch.sigmoid(attn.attn_gate_proj(gate_in)) + y = y * g[..., None] + if attn.gated_attn: + n_c = n.contiguous() + g = torch.sigmoid(F.linear(n_c, attn.attn_gate_w.to(n.dtype))) + y = y * g[..., None] + if attn.sparse_attn_gate: + gate_in = n[..., : attn.gate_window].contiguous() + g = torch.sigmoid( + attn.sparse_attn_gate_scale + * F.linear(gate_in, attn.attn_gate_w.to(n.dtype)) + ) + y = y * g[..., None] + y = y.reshape(bsz, seqlen, dim) + attn_out = F.linear(y, out_w.to(n.dtype)) + if lora.o_loras is not None: + attn_out = attn_out + lora.o_loras[slot](n) + x_out = x_in + block.attn_scale.to(dtype=x_in.dtype)[None, None, :] * attn_out + mlp_n = block.mlp_norm(x_out) * block.ln_scale_factor + mlp_out = block.mlp(mlp_n, up_w, down_w) + if lora.mlp_loras is not None: + mlp_out = mlp_out + lora.mlp_loras[slot](mlp_n) + x_out = x_out + block.mlp_scale.to(dtype=x_out.dtype)[None, None, :] * mlp_out + return x_out + def _parallel_block_with_lora( + self, block_idx, lane0, lane1, x0, lora, slot, + q_w, k_w, v_w, out_w, up_w, down_w, + ): + block = self.blocks[block_idx] + mix = block.resid_mix.to(dtype=lane0.dtype) + attn_read = mix[0][None, None, :] * lane0 + mix[1][None, None, :] * x0 + n = block.attn_norm(attn_read) * block.ln_scale_factor + attn = block.attn + bsz, seqlen, dim = n.shape + q_raw = F.linear(n, q_w.to(n.dtype)) + lora.q_loras[slot](n) + q = q_raw.reshape(bsz, seqlen, attn.num_heads, attn.head_dim) + k = F.linear(n, k_w.to(n.dtype)) + if lora.k_loras is not None: + k = k + lora.k_loras[slot](n) + k = k.reshape(bsz, seqlen, attn.num_kv_heads, attn.head_dim) + v = (F.linear(n, v_w.to(n.dtype)) + lora.v_loras[slot](n)).reshape( + bsz, seqlen, attn.num_kv_heads, attn.head_dim + ) + q = F.rms_norm(q, (q.size(-1),)) + k = F.rms_norm(k, (k.size(-1),)) + cos, sin = attn.rotary(seqlen, n.device, q.dtype) + q = apply_rotary_emb(q, cos, sin, attn.rope_dims) + k 
= apply_rotary_emb(k, cos, sin, attn.rope_dims) + q = q * attn.q_gain.to(dtype=q.dtype)[None, None, :, None] + y = flash_attn_3_func(q, k, v, causal=True) + if attn.use_xsa: + y = attn._xsa_efficient(y, v) + if attn.attn_out_gate: + gate_src = q_raw if attn.attn_out_gate_src == "q" else n + gate_in = gate_src[..., : attn.gate_window].contiguous() + g = 2.0 * torch.sigmoid(attn.attn_gate_proj(gate_in)) + y = y * g[..., None] + if attn.gated_attn: + n_c = n.contiguous() + g = torch.sigmoid(F.linear(n_c, attn.attn_gate_w.to(n.dtype))) + y = y * g[..., None] + if attn.sparse_attn_gate: + gate_in = n[..., : attn.gate_window].contiguous() + g = torch.sigmoid( + attn.sparse_attn_gate_scale + * F.linear(gate_in, attn.attn_gate_w.to(n.dtype)) + ) + y = y * g[..., None] + y = y.reshape(bsz, seqlen, dim) + attn_out = F.linear(y, out_w.to(n.dtype)) + if lora.o_loras is not None: + attn_out = attn_out + lora.o_loras[slot](n) + attn_out = block.attn_scale.to(dtype=attn_out.dtype)[None, None, :] * attn_out + mlp_read = lane1 + mlp_n = block.mlp_norm(mlp_read) * block.ln_scale_factor + mlp_out = block.mlp(mlp_n, up_w, down_w) + if lora.mlp_loras is not None: + mlp_out = mlp_out + lora.mlp_loras[slot](mlp_n) + mlp_out = block.mlp_scale.to(dtype=lane1.dtype)[None, None, :] * mlp_out + attn_resid = self.parallel_resid_lambdas[block_idx, 0].to(dtype=lane0.dtype) + attn_post = self.parallel_post_lambdas[block_idx, 0].to(dtype=lane0.dtype) + mlp_resid = self.parallel_resid_lambdas[block_idx, 1].to(dtype=lane0.dtype) + mlp_post = self.parallel_post_lambdas[block_idx, 1].to(dtype=lane0.dtype) + lane0 = attn_resid * lane0 + attn_post[0] * attn_out + mlp_post[0] * mlp_out + lane1 = mlp_resid * lane1 + attn_post[1] * attn_out + mlp_post[1] * mlp_out + return lane0, lane1 +class BatchedLinearLoRA(nn.Module): + _ALPHA = float(os.environ.get("TTT_LORA_ALPHA", "144")) + _WARM_START_A = bool(int(os.environ.get("TTT_WARM_START_A", "1"))) + def __init__(self, bsz, in_features, out_features, rank): + super().__init__() + self._bound = 1.0 / math.sqrt(in_features) + self._scale = self._ALPHA / rank + self.A = nn.Parameter( + torch.empty(bsz, rank, in_features).uniform_(-self._bound, self._bound) + ) + self.B = nn.Parameter(torch.zeros(bsz, out_features, rank)) + def reset(self): + with torch.no_grad(): + if not self._WARM_START_A: + self.A.uniform_(-self._bound, self._bound) + self.B.zero_() + def forward(self, x): + return ((x @ self.A.transpose(1, 2)) @ self.B.transpose(1, 2)) * self._scale +class BatchedTTTLoRA(nn.Module): + def __init__(self, bsz, model, rank, k_lora=True, mlp_lora=True, o_lora=True): + super().__init__() + self.bsz = bsz + dim = model.qo_bank.shape[-1] + vocab = model.tok_emb.num_embeddings + if getattr(model, "looping_active", False): + num_slots = len(model.encoder_indices) + len(model.decoder_indices) + else: + num_slots = len(model.blocks) + kv_dim = model.blocks[0].attn.num_kv_heads * ( + dim // model.blocks[0].attn.num_heads + ) + embed_dim = model.tok_emb.embedding_dim + self.lm_head_lora = BatchedLinearLoRA(bsz, embed_dim, vocab, rank) + self.q_loras = nn.ModuleList( + [BatchedLinearLoRA(bsz, dim, dim, rank) for _ in range(num_slots)] + ) + self.v_loras = nn.ModuleList( + [BatchedLinearLoRA(bsz, dim, kv_dim, rank) for _ in range(num_slots)] + ) + self.k_loras = ( + nn.ModuleList( + [BatchedLinearLoRA(bsz, dim, kv_dim, rank) for _ in range(num_slots)] + ) + if k_lora + else None + ) + self.mlp_loras = ( + nn.ModuleList( + [BatchedLinearLoRA(bsz, dim, dim, rank) for _ in range(num_slots)] + ) + if 
mlp_lora + else None + ) + self.o_loras = ( + nn.ModuleList( + [BatchedLinearLoRA(bsz, dim, dim, rank) for _ in range(num_slots)] + ) + if o_lora + else None + ) + def reset(self): + with torch.no_grad(): + self.lm_head_lora.reset() + for loras in [self.q_loras, self.v_loras, self.k_loras, + self.mlp_loras, self.o_loras]: + if loras is not None: + for lora in loras: + lora.reset() +_PE_COEFFS = ( + (8.156554524902461, -22.48329292557795, 15.878769915207462), + (4.042929935166739, -2.808917465908714, 0.5000178451051316), + (3.8916678022926607, -2.772484153217685, 0.5060648178503393), + (3.285753657755655, -2.3681294933425376, 0.46449024233003106), + (2.3465413258596377, -1.7097828382687081, 0.42323551169305323), +) +@torch.compile +def zeropower_via_newtonschulz5(G, steps=10, eps=1e-07): + was_2d = G.ndim == 2 + if was_2d: + G = G.unsqueeze(0) + X = G.bfloat16() + transposed = X.size(-2) > X.size(-1) + if transposed: + X = X.mT + X = X / (X.norm(dim=(-2, -1), keepdim=True) + eps) + coeffs = _PE_COEFFS[:steps] if steps <= len(_PE_COEFFS) else _PE_COEFFS + for a, b, c in coeffs: + A = X @ X.mT + B = b * A + c * (A @ A) + X = a * X + B @ X + if transposed: + X = X.mT + if was_2d: + X = X.squeeze(0) + return X +class Muon(torch.optim.Optimizer): + def __init__( + self, + params, + lr, + momentum, + backend_steps, + nesterov=True, + weight_decay=0.0, + row_normalize=False, + ): + super().__init__( + params, + dict( + lr=lr, + momentum=momentum, + backend_steps=backend_steps, + nesterov=nesterov, + weight_decay=weight_decay, + row_normalize=row_normalize, + ), + ) + self._built = False + def _build(self): + self._distributed = dist.is_available() and dist.is_initialized() + self._world_size = dist.get_world_size() if self._distributed else 1 + self._rank = dist.get_rank() if self._distributed else 0 + ws = self._world_size + self._bank_meta = [] + for group in self.param_groups: + for p in group["params"]: + B = p.shape[0] + padded_B = ((B + ws - 1) // ws) * ws + shard_B = padded_B // ws + tail = p.shape[1:] + dev = p.device + self._bank_meta.append({ + "p": p, + "B": B, + "padded_grad": torch.zeros(padded_B, *tail, device=dev, dtype=torch.bfloat16), + "shard": torch.zeros(shard_B, *tail, device=dev, dtype=torch.bfloat16), + "shard_mom": torch.zeros(shard_B, *tail, device=dev, dtype=torch.bfloat16), + "full_update": torch.zeros(padded_B, *tail, device=dev, dtype=torch.bfloat16), + "scale": max(1, p.shape[-2] / p.shape[-1]) ** 0.5, + }) + self._bank_meta.sort(key=lambda m: -m["p"].numel()) + self._built = True + def launch_reduce_scatters(self): + if not self._built: + self._build() + if not self._distributed: + return + self._rs_futures = [] + for m in self._bank_meta: + p = m["p"] + if p.grad is None: + self._rs_futures.append(None) + continue + pg = m["padded_grad"] + pg[: m["B"]].copy_(p.grad.bfloat16()) + if pg.shape[0] > m["B"]: + pg[m["B"] :].zero_() + fut = dist.reduce_scatter_tensor( + m["shard"], pg, op=dist.ReduceOp.AVG, async_op=True + ) + self._rs_futures.append(fut) + @torch.no_grad() + def step(self, closure=None): + loss = None + if closure is not None: + with torch.enable_grad(): + loss = closure() + if not self._built: + self._build() + for group in self.param_groups: + lr = group["lr"] + momentum = group["momentum"] + backend_steps = group["backend_steps"] + nesterov = group["nesterov"] + wd = group.get("weight_decay", 0.0) + row_normalize = group.get("row_normalize", False) + prev_ag_handle = None + prev_m = None + sharded = self._distributed and hasattr(self, "_rs_futures") + 
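# Each bank's gradient has been reduce-scattered across ranks; every rank orthogonalizes its + # shard with Newton-Schulz, then the full update is all-gathered asynchronously and applied + # one loop iteration later so the collective overlaps the next shard's compute. +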
for idx, m in enumerate(self._bank_meta): + p = m["p"] + if p.grad is None: + continue + if prev_ag_handle is not None: + prev_ag_handle.wait() + pp = prev_m["p"] + upd = prev_m["full_update"][: prev_m["B"]] + if wd > 0.0: + pp.data.mul_(1.0 - lr * wd) + pp.add_(upd.to(dtype=pp.dtype), alpha=-lr * prev_m["scale"]) + if sharded and self._rs_futures[idx] is not None: + self._rs_futures[idx].wait() + g = m["shard"] + buf = m["shard_mom"] + else: + g = p.grad.bfloat16() + state = self.state[p] + if "momentum_buffer" not in state: + state["momentum_buffer"] = torch.zeros_like(g) + buf = state["momentum_buffer"] + buf.mul_(momentum).add_(g) + if nesterov: + update = g.add(buf, alpha=momentum) + else: + update = buf + if row_normalize: + rn = update.float().norm(dim=-1, keepdim=True).clamp_min(1e-07) + update = update / rn.to(update.dtype) + update = zeropower_via_newtonschulz5(update, steps=backend_steps) + if sharded: + prev_ag_handle = dist.all_gather_into_tensor( + m["full_update"], update, async_op=True + ) + prev_m = m + else: + if wd > 0.0: + p.data.mul_(1.0 - lr * wd) + p.add_(update.to(dtype=p.dtype), alpha=-lr * m["scale"]) + if prev_ag_handle is not None: + prev_ag_handle.wait() + pp = prev_m["p"] + upd = prev_m["full_update"][: prev_m["B"]] + if wd > 0.0: + pp.data.mul_(1.0 - lr * wd) + pp.add_(upd.to(dtype=pp.dtype), alpha=-lr * prev_m["scale"]) + if hasattr(self, "_rs_futures"): + del self._rs_futures + return loss +CONTROL_TENSOR_NAME_PATTERNS = tuple( + pattern + for pattern in os.environ.get( + "CONTROL_TENSOR_NAME_PATTERNS", + "attn_scale,attn_scales,mlp_scale,mlp_scales,resid_mix,resid_mixes,q_gain,skip_weight,skip_weights,skip_gates,parallel_post_lambdas,parallel_resid_lambdas,attn_gate_proj,attn_gate_w,smear_gate,smear_lambda", + ).split(",") + if pattern +) +PACKED_REPLICATED_GRAD_MAX_NUMEL = 1 << 15 +class Optimizers: + def __init__(self, h, base_model): + matrix_params = [ + base_model.qo_bank, + base_model.kv_bank, + base_model.mlp_up_bank, + base_model.mlp_down_bank, + ] + block_named_params = list(base_model.blocks.named_parameters()) + scalar_params = [ + p + for (name, p) in block_named_params + if p.ndim < 2 + or any(pattern in name for pattern in CONTROL_TENSOR_NAME_PATTERNS) + ] + if base_model.skip_weights.numel() > 0: + scalar_params.append(base_model.skip_weights) + if base_model.skip_gates is not None and base_model.skip_gates.numel() > 0: + scalar_params.append(base_model.skip_gates) + if base_model.parallel_post_lambdas is not None: + scalar_params.append(base_model.parallel_post_lambdas) + if base_model.parallel_resid_lambdas is not None: + scalar_params.append(base_model.parallel_resid_lambdas) + if getattr(base_model, "smear_gate_enabled", False): + scalar_params.append(base_model.smear_gate.weight) + scalar_params.append(base_model.smear_lambda) + token_lr = h.tied_embed_lr + tok_params = [ + {"params": [base_model.tok_emb.weight], "lr": token_lr, "base_lr": token_lr} + ] + self.optimizer_tok = torch.optim.AdamW( + tok_params, + betas=(h.beta1, h.beta2), + eps=h.adam_eps, + weight_decay=h.embed_wd, + fused=True, + ) + self.optimizer_muon = Muon( + matrix_params, + lr=h.matrix_lr, + momentum=h.muon_momentum, + backend_steps=h.muon_backend_steps, + weight_decay=h.muon_wd, + row_normalize=h.muon_row_normalize, + ) + for group in self.optimizer_muon.param_groups: + group["base_lr"] = h.matrix_lr + self.optimizer_scalar = torch.optim.AdamW( + [{"params": scalar_params, "lr": h.scalar_lr, "base_lr": h.scalar_lr}], + betas=(h.beta1, h.beta2), + eps=h.adam_eps, + 
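# this AdamW group holds the scalar and control tensors (scales, gains, gates, mixes, lambdas); + # the 2-D weight banks are updated by Muon above and tok_emb by its own AdamW group. +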
weight_decay=h.adam_wd, + fused=True, + ) + self.optimizers = [ + self.optimizer_tok, + self.optimizer_muon, + self.optimizer_scalar, + ] + self.replicated_params = list(tok_params[0]["params"]) + self.replicated_params.extend(scalar_params) + self.replicated_large_params = [] + self.replicated_packed_params = [] + for p in self.replicated_params: + if p.numel() <= PACKED_REPLICATED_GRAD_MAX_NUMEL: + self.replicated_packed_params.append(p) + else: + self.replicated_large_params.append(p) + def __iter__(self): + return iter(self.optimizers) + def zero_grad_all(self): + for opt in self.optimizers: + opt.zero_grad(set_to_none=True) + def _all_reduce_packed_grads(self): + grads_by_key = collections.defaultdict(list) + for p in self.replicated_packed_params: + if p.grad is not None: + grads_by_key[(p.grad.device, p.grad.dtype)].append(p.grad) + for grads in grads_by_key.values(): + flat = torch.empty( + sum(g.numel() for g in grads), + device=grads[0].device, + dtype=grads[0].dtype, + ) + offset = 0 + for g in grads: + n = g.numel() + flat[offset : offset + n].copy_(g.contiguous().view(-1)) + offset += n + dist.all_reduce(flat, op=dist.ReduceOp.AVG) + offset = 0 + for g in grads: + n = g.numel() + g.copy_(flat[offset : offset + n].view_as(g)) + offset += n + def step(self, distributed=False): + self.optimizer_muon.launch_reduce_scatters() + if distributed: + reduce_handles = [ + dist.all_reduce(p.grad, op=dist.ReduceOp.AVG, async_op=True) + for p in self.replicated_large_params + if p.grad is not None + ] + self._all_reduce_packed_grads() + for handle in reduce_handles: + handle.wait() + self.optimizer_tok.step() + self.optimizer_scalar.step() + self.optimizer_muon.step() + self.zero_grad_all() +def restore_fp32_params(model): + for module in model.modules(): + if isinstance(module, CastedLinear): + module.float() + for name, param in model.named_parameters(): + if ( + param.ndim < 2 + or any(pattern in name for pattern in CONTROL_TENSOR_NAME_PATTERNS) + ) and param.dtype != torch.float32: + param.data = param.data.float() + if hasattr(model, "qo_bank") and model.qo_bank is not None: + model.qo_bank.data = model.qo_bank.data.float() + model.kv_bank.data = model.kv_bank.data.float() + model.mlp_up_bank.data = model.mlp_up_bank.data.float() + model.mlp_down_bank.data = model.mlp_down_bank.data.float() +def collect_hessians(model, train_loader, h, device, n_calibration_batches=64): + hessians = {} + hooks = [] + for i, block in enumerate(model.blocks): + block.attn._calib = True + block.mlp._calib = True + block.mlp.use_fused = False + def make_attn_hook(layer_idx): + def hook_fn(module, inp, out): + x = inp[0].detach().float() + if x.ndim == 3: + x = x.reshape(-1, x.shape[-1]) + for suffix in ["c_q", "c_k", "c_v"]: + name = f"blocks.{layer_idx}.attn.{suffix}.weight" + if name not in hessians: + hessians[name] = torch.zeros( + x.shape[1], x.shape[1], dtype=torch.float32, device=device + ) + hessians[name].addmm_(x.T, x) + y = module._last_proj_input + if y is not None: + y = y.float() + if y.ndim == 3: + y = y.reshape(-1, y.shape[-1]) + name = f"blocks.{layer_idx}.attn.proj.weight" + if name not in hessians: + hessians[name] = torch.zeros( + y.shape[1], y.shape[1], dtype=torch.float32, device=device + ) + hessians[name].addmm_(y.T, y) + return hook_fn + def make_mlp_hook(layer_idx): + def hook_fn(module, inp, out): + x = inp[0].detach().float() + if x.ndim == 3: + x = x.reshape(-1, x.shape[-1]) + name = f"blocks.{layer_idx}.mlp.fc.weight" + if name not in hessians: + hessians[name] = torch.zeros( 
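+ # GPTQ calibration: this buffer accumulates the Hessian proxy H ~= sum(x^T x) of the layer's + # inputs via addmm_ below; it is averaged over calibration batches once the hooks are removed.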
+ x.shape[1], x.shape[1], dtype=torch.float32, device=device + ) + hessians[name].addmm_(x.T, x) + h_act = module._last_down_input + if h_act is not None: + h_act = h_act.float() + if h_act.ndim == 3: + h_act = h_act.reshape(-1, h_act.shape[-1]) + name = f"blocks.{layer_idx}.mlp.proj.weight" + if name not in hessians: + hessians[name] = torch.zeros( + h_act.shape[1], h_act.shape[1], dtype=torch.float32, device=device + ) + hessians[name].addmm_(h_act.T, h_act) + return hook_fn + for i, block in enumerate(model.blocks): + hooks.append(block.attn.register_forward_hook(make_attn_hook(i))) + hooks.append(block.mlp.register_forward_hook(make_mlp_hook(i))) + def make_output_hook(name): + def hook_fn(module, inp, out): + x = out.detach().float() + if x.ndim == 3: + x = x.reshape(-1, x.shape[-1]) + if name not in hessians: + hessians[name] = torch.zeros( + x.shape[1], x.shape[1], dtype=torch.float32, device=device + ) + hessians[name].addmm_(x.T, x) + return hook_fn + hooks.append( + model.final_norm.register_forward_hook(make_output_hook("tok_emb.weight")) + ) + model.eval() + with torch.no_grad(): + for _ in range(n_calibration_batches): + x, _ = train_loader.next_batch(h.train_batch_tokens, h.grad_accum_steps) + model.forward_logits(x) + for hook in hooks: + hook.remove() + for i, block in enumerate(model.blocks): + block.attn._calib = False + block.mlp._calib = False + block.mlp.use_fused = True + for name in hessians: + hessians[name] = hessians[name].cpu() / n_calibration_batches + return hessians +def gptq_quantize_weight(w, H, clip_sigmas=3.0, clip_range=63, block_size=128): + W_orig = w.float().clone() + rows, cols = W_orig.shape + H = H.float().clone() + dead = torch.diag(H) == 0 + H[dead, dead] = 1 + damp = 0.01 * H.diag().mean() + H.diagonal().add_(damp) + perm = torch.argsort(H.diag(), descending=True) + invperm = torch.argsort(perm) + W_perm = W_orig[:, perm].clone() + W_perm[:, dead[perm]] = 0 + H = H[perm][:, perm] + H_flip = torch.flip(H, dims=(0, 1)) + L_flip = torch.linalg.cholesky(H_flip) + U = torch.flip(L_flip, dims=(0, 1)) + eye = torch.eye(cols, dtype=H.dtype, device=H.device) + Hinv = torch.linalg.solve_triangular(U, eye, upper=True) + row_std = W_orig.std(dim=1) + s = (clip_sigmas * row_std / clip_range).clamp_min(1e-10).to(torch.float16) + sf = s.float() + Q = torch.zeros(rows, cols, dtype=torch.int8) + W_work = W_perm.clone() + for i1 in range(0, cols, block_size): + i2 = min(i1 + block_size, cols) + W_block = W_work[:, i1:i2].clone() + Hinv_block = Hinv[i1:i2, i1:i2] + Err = torch.zeros(rows, i2 - i1) + for j in range(i2 - i1): + w_col = W_block[:, j] + d = Hinv_block[j, j] + q_col = torch.clamp(torch.round(w_col / sf), -clip_range, clip_range) + Q[:, i1 + j] = q_col.to(torch.int8) + err = (w_col - q_col.float() * sf) / d + Err[:, j] = err + W_block[:, j:] -= err.unsqueeze(1) * Hinv_block[j, j:].unsqueeze(0) + if i2 < cols: + W_work[:, i2:] -= Err @ Hinv[i1:i2, i2:] + return Q[:, invperm], s +def _quantize_gate_int8_row(w): + W = w.float().contiguous() + row_max = W.abs().amax(dim=1).clamp_min(1e-10) + s = (row_max / 127.0).to(torch.float16) + sf = s.float().view(-1, 1) + q = torch.clamp(torch.round(W / sf), -127, 127).to(torch.int8) + return q, s +def _lqer_pack(A, B, bits): + rng = 2 ** (bits - 1) - 1 + sA = (A.abs().amax(dim=1).clamp_min(1e-10) / rng).to(torch.float16) + sB = (B.abs().amax(dim=1).clamp_min(1e-10) / rng).to(torch.float16) + qA = torch.clamp(torch.round(A / sA.float().view(-1, 1)), -rng, rng).to(torch.int8) + qB = torch.clamp(torch.round(B / 
sB.float().view(-1, 1)), -rng, rng).to(torch.int8) + return qA, sA, qB, sB +def _lqer_pack_asym(A, B, g=64): + sA = (A.abs().amax().clamp_min(1e-10) / 1.5).to(torch.float16) + qA = torch.clamp(torch.round(A / sA.float()), -2, 1).to(torch.int8) + Bf = B.reshape(-1, g) + Bmax = Bf.abs().amax(dim=-1, keepdim=True).clamp_min(1e-10) + sB = (Bmax / 7.5).to(torch.float16).reshape(-1) + qB = torch.clamp(torch.round(Bf / sB.float().reshape(-1, 1)), -8, 7).to( + torch.int8 + ).reshape(B.shape) + return qA, sA, qB, sB +def gptq_mixed_quantize(state_dict, hessians, h): + result = {} + meta = {} + quant_gate = bool(getattr(h, "gated_attn_quant_gate", False)) + lqer_on = bool(getattr(h, "lqer_enabled", False)) + lqer_cands = {} + for (name, tensor) in state_dict.items(): + t = tensor.detach().cpu().contiguous() + if ( + quant_gate + and t.is_floating_point() + and t.ndim == 2 + and name.endswith(".attn_gate_w") + and 32 <= t.numel() <= 8192 + ): + gq, gs = _quantize_gate_int8_row(t) + result[name + ".gq"] = gq + result[name + ".gs"] = gs + meta[name] = "gate_int8_row" + continue + if not t.is_floating_point() or t.numel() <= 65536: + result[name] = t.to(torch.float16) if t.is_floating_point() else t + meta[name] = "passthrough (float16)" + continue + if "tok_emb" in name: + cs = h.embed_clip_sigmas + elif ".mlp." in name: + cs = h.mlp_clip_sigmas + elif ".attn." in name: + cs = h.attn_clip_sigmas + else: + cs = h.matrix_clip_sigmas + bits = h.embed_bits if "tok_emb" in name else h.matrix_bits + clip_range = 2 ** (bits - 1) - 1 + ret = gptq_quantize_weight( + t, hessians[name], clip_sigmas=cs, clip_range=clip_range + ) + q, s = ret + result[name + ".q"] = q + result[name + ".scale"] = s + meta[name] = f"gptq (int{bits})" + if lqer_on: + W_q = q.float() * s.float().view(-1, 1) + E = t.float() - W_q + lqer_cands[name] = (E, float(E.norm())) + if lqer_on and lqer_cands: + top = sorted(lqer_cands.items(), key=lambda kv: -kv[1][1])[: h.lqer_top_k] + asym_on = bool(getattr(h, "lqer_asym_enabled", False)) + asym_g = int(getattr(h, "lqer_asym_group", 64)) + for (name, (E, _)) in top: + U, S, Vh = torch.linalg.svd(E, full_matrices=False) + r = min(h.lqer_rank, S.numel()) + A = (U[:, :r] * S[:r]).contiguous() + B = Vh[:r, :].contiguous() + if asym_on and B.numel() % asym_g == 0: + qA, sA, qB, sB = _lqer_pack_asym(A, B, asym_g) + result[name + ".lqA_a"] = qA + result[name + ".lqAs_a"] = sA + result[name + ".lqB_a"] = qB + result[name + ".lqBs_a"] = sB + meta[name] = meta[name] + "+lqer_asym" + else: + qA, sA, qB, sB = _lqer_pack(A, B, h.lqer_factor_bits) + result[name + ".lqA"] = qA + result[name + ".lqAs"] = sA + result[name + ".lqB"] = qB + result[name + ".lqBs"] = sB + meta[name] = meta[name] + "+lqer" + categories = collections.defaultdict(set) + for (name, cat) in meta.items(): + short = re.sub("\\.\\d+$", "", re.sub("blocks\\.\\d+", "blocks", name)) + categories[cat].add(short) + log("Quantized weights:") + for cat in sorted(categories): + log(f" {cat}: {', '.join(sorted(categories[cat]))}") + return result, meta +def dequantize_mixed(result, meta, template_sd): + out = {} + for (name, orig) in template_sd.items(): + info = meta.get(name) + if info is None: + continue + orig_dtype = orig.dtype + if "passthrough" in info: + t = result[name] + if t.dtype == torch.float16 and orig_dtype in ( + torch.float32, + torch.bfloat16, + ): + t = t.to(orig_dtype) + out[name] = t + continue + if info == "gate_int8_row": + gq = result[name + ".gq"] + gs = result[name + ".gs"] + out[name] = (gq.float() * gs.float().view(-1, 
1)).to(orig_dtype) + continue + q, s = result[name + ".q"], result[name + ".scale"] + if s.ndim > 0: + W = q.float() * s.float().view(q.shape[0], *[1] * (q.ndim - 1)) + else: + W = q.float() * float(s.item()) + if "lqer_asym" in info: + qA_t = result[name + ".lqA_a"] + sA_t = result[name + ".lqAs_a"] + qB_t = result[name + ".lqB_a"] + sB_t = result[name + ".lqBs_a"] + qA = qA_t.float() * float(sA_t) + g_sz = qB_t.numel() // sB_t.numel() + qB = (qB_t.reshape(-1, g_sz).float() * sB_t.float().view(-1, 1)).reshape( + qB_t.shape + ) + W = W + qA @ qB + elif "lqer" in info: + qA = result[name + ".lqA"].float() * result[name + ".lqAs"].float().view(-1, 1) + qB = result[name + ".lqB"].float() * result[name + ".lqBs"].float().view(-1, 1) + W = W + qA @ qB + out[name] = W.to(orig_dtype) + return out +_BSHF_MAGIC = b"BSHF" +def _byte_shuffle(data, stride=2): + if stride <= 1 or len(data) < stride: + return data + src = np.frombuffer(data, dtype=np.uint8) + n = len(src) + out = np.empty(n, dtype=np.uint8) + dest_off = 0 + for pos in range(stride): + chunk = src[pos::stride] + out[dest_off : dest_off + len(chunk)] = chunk + dest_off += len(chunk) + return _BSHF_MAGIC + bytes([stride]) + out.tobytes() +def _byte_unshuffle(data): + if len(data) < 5 or data[:4] != _BSHF_MAGIC: + return data + stride = data[4] + if stride < 2: + return data[5:] + payload = np.frombuffer(data, dtype=np.uint8, offset=5) + n = len(payload) + out = np.empty(n, dtype=np.uint8) + src_off = 0 + for pos in range(stride): + chunk_len = n // stride + (1 if pos < n % stride else 0) + out[pos::stride][:chunk_len] = payload[src_off : src_off + chunk_len] + src_off += chunk_len + return out.tobytes() +def _compress(data, compressor): + data = _byte_shuffle(data) + if compressor == "lzma": + return lzma.compress(data, preset=6) + elif compressor == "brotli": + import brotli + return brotli.compress(data, quality=11) + raise ValueError(f"Unknown compressor: {compressor!r}") +def _decompress(data, compressor): + if compressor == "lzma": + raw = lzma.decompress(data) + elif compressor == "brotli": + import brotli + raw = brotli.decompress(data) + else: + raise ValueError(f"Unknown compressor: {compressor!r}") + raw = _byte_unshuffle(raw) + return raw +def _unbank_state_dict(state_dict, num_layers): + sd = {} + n = num_layers + for k, v in state_dict.items(): + t = v.detach().cpu() if v is not None else None + if k == "qo_bank": + for i in range(n): + sd[f"blocks.{i}.attn.c_q.weight"] = t[i] + sd[f"blocks.{i}.attn.proj.weight"] = t[n + i] + elif k == "kv_bank": + for i in range(n): + sd[f"blocks.{i}.attn.c_k.weight"] = t[i] + sd[f"blocks.{i}.attn.c_v.weight"] = t[n + i] + elif k == "mlp_up_bank": + for i in range(n): + sd[f"blocks.{i}.mlp.fc.weight"] = t[i] + elif k == "mlp_down_bank": + for i in range(n): + sd[f"blocks.{i}.mlp.proj.weight"] = t[i] + else: + if t is not None: + sd[k] = t + return sd +def _rebank_state_dict(flat_sd, num_layers, model_dim, kv_dim, hidden_dim): + sd = {} + n = num_layers + sd["qo_bank"] = torch.zeros(2 * n, model_dim, model_dim) + sd["kv_bank"] = torch.zeros(2 * n, kv_dim, model_dim) + for i in range(n): + sd["qo_bank"][i] = flat_sd[f"blocks.{i}.attn.c_q.weight"] + sd["qo_bank"][n + i] = flat_sd[f"blocks.{i}.attn.proj.weight"] + sd["kv_bank"][i] = flat_sd[f"blocks.{i}.attn.c_k.weight"] + sd["kv_bank"][n + i] = flat_sd[f"blocks.{i}.attn.c_v.weight"] + sd["mlp_up_bank"] = torch.zeros(n, hidden_dim, model_dim) + sd["mlp_down_bank"] = torch.zeros(n, model_dim, hidden_dim) + for i in range(n): + sd["mlp_up_bank"][i] 
= flat_sd[f"blocks.{i}.mlp.fc.weight"] + sd["mlp_down_bank"][i] = flat_sd[f"blocks.{i}.mlp.proj.weight"] + for k, v in flat_sd.items(): + if not ( + k.startswith("blocks.") + and any( + p in k + for p in [ + ".attn.c_q.", ".attn.c_k.", ".attn.c_v.", + ".attn.proj.", ".mlp.fc.", ".mlp.proj.", + ] + ) + ): + sd[k] = v + return sd +def _compressed_code_size(code): + code_raw = code.encode("utf-8") + minified = subprocess.run( + ["pyminify", "--no-rename-locals", "--no-hoist-literals", "--remove-literal-statements", "-"], + input=code_raw, capture_output=True, check=True, + ).stdout + compressed = lzma.compress(minified) + encoded = base64.b85encode(compressed) + wrapper = b'import lzma as L,base64 as B\nexec(L.decompress(B.b85decode("' + encoded + b'")))\n' + return len(code_raw), len(wrapper) +def serialize(h, base_model, code): + try: + code_bytes_uncompressed, code_bytes = _compressed_code_size(code) + except (FileNotFoundError, subprocess.CalledProcessError) as e: + code_bytes_uncompressed, code_bytes = len(code.encode("utf-8")), 0 + log(f"pyminify unavailable ({type(e).__name__}); skipping compressed-code-size measurement") + if h.is_main_process: + torch.save(base_model.state_dict(), h.model_path) + model_bytes = os.path.getsize(h.model_path) + log(f"Serialized model: {model_bytes} bytes") + log(f"Code size (uncompressed): {code_bytes_uncompressed} bytes") + log(f"Code size (compressed): {code_bytes} bytes") + sd_cpu = _unbank_state_dict(base_model.state_dict(), h.num_layers) + device = torch.device("cuda", h.local_rank) + t0 = time.perf_counter() + calib_loader = ShuffledSequenceLoader(h, device) + log("GPTQ:collecting Hessians from calibration data...") + hessians = collect_hessians( + base_model, + calib_loader, + h, + device, + n_calibration_batches=h.gptq_calibration_batches, + ) + log(f"GPTQ:collected {len(hessians)} Hessians in {time.perf_counter()-t0:.1f}s") + t_quant = time.perf_counter() + quant_result, quant_meta = gptq_mixed_quantize(sd_cpu, hessians, h) + log(f"GPTQ:quantized weights in {time.perf_counter()-t_quant:.2f}s") + quant_buf = io.BytesIO() + torch.save({"w": quant_result, "m": quant_meta}, quant_buf) + quant_raw = quant_buf.getvalue() + quant_blob = _compress(quant_raw, h.compressor) + quant_file_bytes = len(quant_blob) + bytes_total = quant_file_bytes + code_bytes + if h.is_main_process: + with open(h.quantized_model_path, "wb") as f: + f.write(quant_blob) + log(f"Serialized model quantized+{h.compressor}: {quant_file_bytes} bytes") + log(f"Total submission size quantized+{h.compressor}: {bytes_total} bytes") + return bytes_total, quant_file_bytes +def deserialize(h, device): + eval_model = GPT(h).to(device).bfloat16() + restore_fp32_params(eval_model) + flat_template = _unbank_state_dict(eval_model.state_dict(), h.num_layers) + with open(h.quantized_model_path, "rb") as f: + quant_blob_disk = f.read() + quant_state = torch.load( + io.BytesIO(_decompress(quant_blob_disk, h.compressor)), map_location="cpu" + ) + deq_flat = dequantize_mixed(quant_state["w"], quant_state["m"], flat_template) + head_dim = h.model_dim // h.num_heads + kv_dim = h.num_kv_heads * head_dim + hidden_dim = int(h.mlp_mult * h.model_dim) + deq_state = _rebank_state_dict(deq_flat, h.num_layers, h.model_dim, kv_dim, hidden_dim) + eval_model.load_state_dict(deq_state, strict=True) + return eval_model +def _loss_bpb(loss_sum, token_count, byte_count): + val_loss = (loss_sum / token_count).item() + val_bpb = val_loss / math.log(2.0) * (token_count.item() / byte_count.item()) + return val_loss, 
val_bpb +def eval_val(h, device, val_data, model, forward_logits_fn=None): + seq_len = h.eval_seq_len + local_batch_tokens = h.val_batch_tokens // (h.world_size * h.grad_accum_steps) + if local_batch_tokens < seq_len: + raise ValueError( + f"VAL_BATCH_SIZE must provide at least one sequence per rank; got VAL_BATCH_SIZE={h.val_batch_tokens}, WORLD_SIZE={h.world_size}, GRAD_ACCUM_STEPS={h.grad_accum_steps}, seq_len={seq_len}" + ) + local_batch_seqs = local_batch_tokens // seq_len + total_seqs = (val_data.val_tokens.numel() - 1) // seq_len + seq_start = total_seqs * h.rank // h.world_size + seq_end = total_seqs * (h.rank + 1) // h.world_size + seq_end = seq_start + ((seq_end - seq_start) // local_batch_seqs) * local_batch_seqs + val_loss_sum = torch.zeros((), device=device, dtype=torch.float64) + val_token_count = torch.zeros((), device=device, dtype=torch.float64) + val_byte_count = torch.zeros((), device=device, dtype=torch.float64) + run_forward_logits = ( + (model.module.forward_logits if hasattr(model, "module") else model.forward_logits) + if forward_logits_fn is None + else forward_logits_fn + ) + model.eval() + global BOS_ID + if BOS_ID is None: + BOS_ID = 1 + with torch.no_grad(): + for batch_seq_start in range(seq_start, seq_end, local_batch_seqs): + batch_seq_end = min(batch_seq_start + local_batch_seqs, seq_end) + raw_start = batch_seq_start * seq_len + raw_end = batch_seq_end * seq_len + 1 + local = val_data.val_tokens[raw_start:raw_end].to( + device=device, dtype=torch.int64, non_blocking=True + ) + x = local[:-1] + y = local[1:] + bos_pos = (x == BOS_ID).nonzero(as_tuple=True)[0].tolist() + cu_seqlens, max_seqlen = _build_cu_seqlens( + bos_pos, x.numel(), x.device, h.eval_seq_len, 64 + ) + with torch.autocast(device_type="cuda", dtype=torch.bfloat16, enabled=True): + logits = run_forward_logits( + x[None], cu_seqlens=cu_seqlens, max_seqlen=max_seqlen + ).detach() + per_token_loss = F.cross_entropy( + logits.reshape(-1, logits.size(-1)).float(), + y.reshape(-1), + reduction="none", + ) + val_loss_sum += per_token_loss.to(torch.float64).sum() + val_token_count += float(y.numel()) + prev_ids = x + tgt_ids = y + if val_data.caseops_enabled and val_data.val_bytes is not None: + sidecar_slice = val_data.val_bytes[raw_start + 1 : raw_end].to( + device=device, dtype=torch.int32, non_blocking=True + ) + val_byte_count += sidecar_slice.to(torch.float64).sum() + else: + token_bytes = val_data.base_bytes_lut[tgt_ids].to(dtype=torch.int16) + token_bytes += ( + val_data.has_leading_space_lut[tgt_ids] + & ~val_data.is_boundary_token_lut[prev_ids] + ).to(dtype=torch.int16) + val_byte_count += token_bytes.to(torch.float64).sum() + if dist.is_available() and dist.is_initialized(): + dist.all_reduce(val_loss_sum, op=dist.ReduceOp.SUM) + dist.all_reduce(val_token_count, op=dist.ReduceOp.SUM) + dist.all_reduce(val_byte_count, op=dist.ReduceOp.SUM) + model.train() + return _loss_bpb(val_loss_sum, val_token_count, val_byte_count) +def _find_docs(all_tokens): + bos_positions = (all_tokens == BOS_ID).nonzero(as_tuple=True)[0].numpy() + docs = [] + for i in range(len(bos_positions)): + start = int(bos_positions[i]) + end = ( + int(bos_positions[i + 1]) + if i + 1 < len(bos_positions) + else all_tokens.numel() + ) + if i + 1 < len(bos_positions): + end += 1 + assert end - start >= 2 + docs.append((start, end - start)) + return docs +def _build_ttt_global_batches(doc_entries, h, ascending=False): + batch_size = h.ttt_batch_size + global_doc_entries = sorted(doc_entries, key=lambda x: x[1][1]) + 
global_batches = [ + global_doc_entries[i : i + batch_size] + for i in range(0, len(global_doc_entries), batch_size) + ] + indexed = list(enumerate(global_batches)) + if not ascending: + indexed.sort(key=lambda ib: -max(dl for _, (_, dl) in ib[1])) + return indexed +def _init_batch_counter(path): + with open(path, "wb") as f: + f.write((0).to_bytes(4, "little")) +def _claim_next_batch(counter_path, queue_len): + try: + with open(counter_path, "r+b") as f: + fcntl.flock(f, fcntl.LOCK_EX) + idx = int.from_bytes(f.read(4), "little") + f.seek(0) + f.write((idx + 1).to_bytes(4, "little")) + f.flush() + except FileNotFoundError: + return queue_len + return idx +def _compute_chunk_window(ci, pred_len, num_chunks, chunk_size, eval_seq_len): + chunk_end = pred_len if ci == num_chunks - 1 else (ci + 1) * chunk_size + win_start = max(0, chunk_end - eval_seq_len) + win_len = chunk_end - win_start + chunk_start = ci * chunk_size + chunk_offset = chunk_start - win_start + chunk_len = chunk_end - chunk_start + return win_start, win_len, chunk_offset, chunk_len +def _accumulate_bpb( + ptl, + x, + y, + chunk_offsets, + chunk_lens, + pos_idx, + base_bytes_lut, + has_leading_space_lut, + is_boundary_token_lut, + loss_sum, + byte_sum, + token_count, + y_bytes=None, +): + pos = pos_idx[: x.size(1)].unsqueeze(0) + mask = ( + (chunk_lens.unsqueeze(1) > 0) + & (pos >= chunk_offsets.unsqueeze(1)) + & (pos < (chunk_offsets + chunk_lens).unsqueeze(1)) + ) + mask_f64 = mask.to(torch.float64) + if y_bytes is not None: + tok_bytes = y_bytes.to(torch.float64) + else: + tok_bytes = base_bytes_lut[y].to(torch.float64) + tok_bytes += (has_leading_space_lut[y] & ~is_boundary_token_lut[x]).to( + torch.float64 + ) + loss_sum += (ptl.to(torch.float64) * mask_f64).sum() + byte_sum += (tok_bytes * mask_f64).sum() + token_count += chunk_lens.to(torch.float64).sum() +def _add_to_counter(path, delta): + try: + with open(path, "r+b") as f: + fcntl.flock(f, fcntl.LOCK_EX) + cur = int.from_bytes(f.read(8), "little", signed=True) + cur += int(delta) + f.seek(0) + f.write(int(cur).to_bytes(8, "little", signed=True)) + f.flush() + return cur + except FileNotFoundError: + return int(delta) +def _init_int64_counter(path): + with open(path, "wb") as f: + f.write((0).to_bytes(8, "little", signed=True)) +def _select_ttt_doc_entries(docs, h): + doc_entries = list(enumerate(docs)) + if h.val_doc_fraction < 1.0: + sample_n = max(1, int(round(len(docs) * h.val_doc_fraction))) + sampled_indices = sorted( + random.Random(h.seed).sample(range(len(docs)), sample_n) + ) + return [(i, docs[i]) for i in sampled_indices] + return doc_entries +def train_val_ttt_global_sgd_distributed(h, device, val_data, base_model, val_tokens, batch_seqs=None): + global BOS_ID + if BOS_ID is None: + BOS_ID = 1 + base_model.eval() + seq_len = h.eval_seq_len + total_tokens = val_tokens.numel() - 1 + ttt_chunk = h.global_ttt_chunk_tokens + batch_seqs = h.global_ttt_batch_seqs if batch_seqs is None else batch_seqs + num_chunks = (total_tokens + ttt_chunk - 1) // ttt_chunk + ttt_params = [p for p in base_model.parameters()] + for p in ttt_params: + p.requires_grad_(True) + optimizer = torch.optim.SGD( + ttt_params, lr=h.global_ttt_lr, momentum=h.global_ttt_momentum + ) + t_start = time.perf_counter() + for ci in range(num_chunks): + chunk_start = ci * ttt_chunk + chunk_end = min((ci + 1) * ttt_chunk, total_tokens) + is_last_chunk = ci == num_chunks - 1 + if is_last_chunk or h.global_ttt_epochs <= 0: + continue + base_model.train() + chunk_seqs = (chunk_end - chunk_start) // 
seq_len + if chunk_seqs <= 0: + continue + warmup_chunks = max(0, min(h.global_ttt_warmup_chunks, num_chunks - 1)) + if warmup_chunks > 0 and ci < warmup_chunks: + warmup_denom = max(warmup_chunks - 1, 1) + warmup_t = ci / warmup_denom + lr_now = ( + h.global_ttt_warmup_start_lr + + (h.global_ttt_lr - h.global_ttt_warmup_start_lr) * warmup_t + ) + else: + decay_steps = max(num_chunks - 1 - warmup_chunks, 1) + decay_ci = max(ci - warmup_chunks, 0) + lr_now = h.global_ttt_lr * 0.5 * ( + 1.0 + math.cos(math.pi * decay_ci / decay_steps) + ) + for pg in optimizer.param_groups: + pg["lr"] = lr_now + my_seq_s = chunk_seqs * h.rank // h.world_size + my_seq_e = chunk_seqs * (h.rank + 1) // h.world_size + my_chunk_seqs = my_seq_e - my_seq_s + for _ in range(h.global_ttt_epochs): + for bs in range(0, my_chunk_seqs, batch_seqs): + be = min(bs + batch_seqs, my_chunk_seqs) + actual_bs = my_seq_s + bs + start_tok = chunk_start + actual_bs * seq_len + end_tok = chunk_start + (my_seq_s + be) * seq_len + 1 + if end_tok > val_tokens.numel(): + continue + local = val_tokens[start_tok:end_tok].to(device=device, dtype=torch.int64) + x_flat = local[:-1] + y_flat = local[1:] + optimizer.zero_grad(set_to_none=True) + with torch.enable_grad(): + with torch.autocast(device_type="cuda", dtype=torch.bfloat16): + if h.global_ttt_respect_doc_boundaries: + bos_pos = (x_flat == BOS_ID).nonzero(as_tuple=True)[0].tolist() + cu_seqlens, max_seqlen = _build_cu_seqlens( + bos_pos, x_flat.numel(), x_flat.device, h.eval_seq_len, 64 + ) + loss = base_model( + x_flat[None], + y_flat[None], + cu_seqlens=cu_seqlens, + max_seqlen=max_seqlen, + ) + else: + x = x_flat.reshape(-1, seq_len) + y = y_flat.reshape(-1, seq_len) + loss = base_model(x, y) + loss.backward() + if dist.is_available() and dist.is_initialized(): + for p in ttt_params: + if p.grad is not None: + dist.all_reduce(p.grad, op=dist.ReduceOp.SUM) + p.grad.mul_(1.0 / h.world_size) + if h.global_ttt_grad_clip > 0: + torch.nn.utils.clip_grad_norm_(ttt_params, h.global_ttt_grad_clip) + optimizer.step() + base_model.eval() + if h.rank == 0: + elapsed = time.perf_counter() - t_start + log( + f"tttg: c{ci+1}/{num_chunks} lr:{lr_now:.6f} t:{elapsed:.1f}s" + ) + for p in base_model.parameters(): + p.requires_grad_(True) + base_model.eval() +def eval_val_ttt_phased(h, base_model, device, val_data, forward_ttt_train): + global BOS_ID + if BOS_ID is None: + BOS_ID = 1 + base_model.eval() + for p in base_model.parameters(): + p.requires_grad_(False) + all_tokens = val_data.val_tokens + all_tokens_idx = all_tokens.to(torch.int32) + docs = _find_docs(all_tokens) + doc_entries = _select_ttt_doc_entries(docs, h) + prefix_doc_limit = max(0, min(len(doc_entries), int(h.phased_ttt_prefix_docs))) + num_phases = max(1, int(h.phased_ttt_num_phases)) + phase_boundaries = [] + for pi in range(num_phases): + boundary = prefix_doc_limit * (pi + 1) // num_phases + phase_boundaries.append(boundary) + current_phase = 0 + current_phase_boundary = phase_boundaries[0] + log( + "ttt_phased:" + f" total_docs:{len(doc_entries)} prefix_docs:{prefix_doc_limit} " + f"suffix_docs:{len(doc_entries) - prefix_doc_limit}" + f" num_phases:{num_phases} boundaries:{phase_boundaries}" + ) + chunk_size, eval_seq_len = h.ttt_chunk_size, h.ttt_eval_seq_len + eval_batch_set = None + if h.ttt_eval_batches: + eval_batch_set = set(int(x) for x in h.ttt_eval_batches.split(",") if x.strip()) + use_ascending = eval_batch_set is not None + global_batches_sorted = _build_ttt_global_batches( + doc_entries, h, 
ascending=use_ascending + ) + queue_len = len(global_batches_sorted) + counter_path = f"/tmp/ttt_counter_{h.run_id}" + prefix_counter_path = f"/tmp/ttt_prefix_counter_{h.run_id}" + pause_flag_path = f"/tmp/ttt_pause_flag_{h.run_id}" + if h.rank == 0: + _init_batch_counter(counter_path) + _init_int64_counter(prefix_counter_path) + try: + os.remove(pause_flag_path) + except FileNotFoundError: + pass + if dist.is_available() and dist.is_initialized(): + path_list = [counter_path, prefix_counter_path, pause_flag_path] + dist.broadcast_object_list(path_list, src=0) + counter_path, prefix_counter_path, pause_flag_path = path_list + dist.barrier() + loss_sum = torch.zeros((), device=device, dtype=torch.float64) + byte_sum = torch.zeros((), device=device, dtype=torch.float64) + token_count = torch.zeros((), device=device, dtype=torch.float64) + t_start = time.perf_counter() + reusable_lora = BatchedTTTLoRA( + h.ttt_batch_size, base_model, h.ttt_lora_rank, + k_lora=h.ttt_k_lora, mlp_lora=h.ttt_mlp_lora, o_lora=h.ttt_o_lora, + ).to(device) + def _build_opt(lora): + return torch.optim.AdamW( + lora.parameters(), lr=h.ttt_lora_lr, + betas=(h.ttt_beta1, h.ttt_beta2), + eps=1e-10, weight_decay=h.ttt_weight_decay, fused=True, + ) + reusable_opt = _build_opt(reusable_lora) + local_scored_docs = [] + global_ttt_done = prefix_doc_limit == 0 + while True: + queue_idx = _claim_next_batch(counter_path, queue_len) + if queue_idx >= queue_len: + break + orig_batch_idx, batch_entries = global_batches_sorted[queue_idx] + batch = [doc for _, doc in batch_entries] + bsz = len(batch) + prev_loss = loss_sum.item() + prev_bytes = byte_sum.item() + prev_tokens = token_count.item() + if bsz == reusable_lora.bsz: + reusable_lora.reset() + for s in reusable_opt.state.values(): + for k, v in s.items(): + if isinstance(v, torch.Tensor): + v.zero_() + elif k == "step": + s[k] = 0 + cur_lora = reusable_lora + cur_opt = reusable_opt + else: + cur_lora = BatchedTTTLoRA( + bsz, base_model, h.ttt_lora_rank, + k_lora=h.ttt_k_lora, mlp_lora=h.ttt_mlp_lora, o_lora=h.ttt_o_lora, + ).to(device) + cur_opt = _build_opt(cur_lora) + pred_lens = [doc_len - 1 for _, doc_len in batch] + num_chunks = [(pl + chunk_size - 1) // chunk_size for pl in pred_lens] + max_nc = max(num_chunks) + num_chunks_t = torch.tensor(num_chunks, dtype=torch.int64, device=device) + for ci in range(max_nc): + active = [ci < nc for nc in num_chunks] + needs_train = any(ci < nc - 1 for nc in num_chunks) + tok_starts = torch.zeros(bsz, dtype=torch.int64) + tok_wls = torch.zeros(bsz, dtype=torch.int64) + chunk_offsets_cpu = torch.zeros(bsz, dtype=torch.int64) + chunk_lens_cpu = torch.zeros(bsz, dtype=torch.int64) + for b in range(bsz): + if not active[b]: + continue + doc_start, doc_len = batch[b] + win_start, win_len, chunk_offset, chunk_len = _compute_chunk_window( + ci, pred_lens[b], num_chunks[b], chunk_size, eval_seq_len + ) + tok_starts[b] = doc_start + win_start + tok_wls[b] = win_len + chunk_offsets_cpu[b] = chunk_offset + chunk_lens_cpu[b] = chunk_len + _, context_size, chunk_offset, _ = _compute_chunk_window( + ci, (ci + 1) * chunk_size, ci + 1, chunk_size, eval_seq_len + ) + col_idx = torch.arange(context_size + 1) + idx = tok_starts.unsqueeze(1) + col_idx.unsqueeze(0) + idx.clamp_(max=all_tokens.numel() - 1) + gathered_gpu = all_tokens_idx[idx].to( + device=device, dtype=torch.int64, non_blocking=True + ) + valid = (col_idx[:context_size].unsqueeze(0) < tok_wls.unsqueeze(1)).to( + device, non_blocking=True + ) + chunk_offsets = chunk_offsets_cpu.to(device, 
non_blocking=True) + chunk_lens = chunk_lens_cpu.to(device, non_blocking=True) + x = torch.where(valid, gathered_gpu[:, :context_size], 0) + y = torch.where(valid, gathered_gpu[:, 1 : context_size + 1], 0) + ctx_pos = torch.arange(context_size, device=device, dtype=torch.int64) + with torch.autocast(device_type="cuda", dtype=torch.bfloat16): + per_tok_loss = forward_ttt_train(x, y, lora=cur_lora) + y_bytes_arg = None + if val_data.caseops_enabled and val_data.val_bytes is not None: + y_idx = ( + tok_starts.unsqueeze(1) + + 1 + + col_idx[:context_size].unsqueeze(0) + ) + y_idx = y_idx.clamp_(max=val_data.val_bytes.numel() - 1) + y_bytes_arg = val_data.val_bytes[y_idx].to( + device=device, dtype=torch.int32, non_blocking=True + ) + y_bytes_arg = torch.where( + valid, y_bytes_arg, torch.zeros_like(y_bytes_arg) + ) + with torch.no_grad(): + _accumulate_bpb( + per_tok_loss, + x, + y, + chunk_offsets, + chunk_lens, + ctx_pos, + val_data.base_bytes_lut, + val_data.has_leading_space_lut, + val_data.is_boundary_token_lut, + loss_sum, + byte_sum, + token_count, + y_bytes=y_bytes_arg, + ) + if needs_train: + activate_chunk_mask = (num_chunks_t - 1 > ci).float() + for gi in range(h.ttt_grad_steps): + if gi > 0: + with torch.autocast(device_type="cuda", dtype=torch.bfloat16): + per_tok_loss = forward_ttt_train(x, y, lora=cur_lora) + per_doc = per_tok_loss[ + :, chunk_offset : chunk_offset + chunk_size + ].mean(dim=-1) + cur_opt.zero_grad(set_to_none=True) + (per_doc * activate_chunk_mask).sum().backward() + cur_opt.step() + else: + del per_tok_loss + batch_num = orig_batch_idx + 1 + doc_lens = [dl for _, dl in batch] + should_report = batch_num in eval_batch_set if eval_batch_set is not None else True + if should_report: + cur_tokens = token_count.item() + cur_loss_val = loss_sum.item() + cur_bytes_val = byte_sum.item() + dt = cur_tokens - prev_tokens + db = cur_bytes_val - prev_bytes + if dt > 0 and db > 0: + b_loss = (cur_loss_val - prev_loss) / dt + b_bpb = b_loss / math.log(2.0) * (dt / db) + else: + b_loss = b_bpb = 0.0 + r_loss = cur_loss_val / max(cur_tokens, 1) + r_bpb = r_loss / math.log(2.0) * (cur_tokens / max(cur_bytes_val, 1)) + elapsed = time.perf_counter() - t_start + log( + f"ttp: b{batch_num}/{queue_len} bl:{b_loss:.4f} bb:{b_bpb:.4f} " + f"rl:{r_loss:.4f} rb:{r_bpb:.4f} dl:{min(doc_lens)}-{max(doc_lens)} " + f"gd:{int(global_ttt_done)}" + ) + if not global_ttt_done: + local_scored_docs.extend( + (orig_batch_idx, pos, doc_start, doc_len) + for pos, (doc_start, doc_len) in enumerate(batch) + ) + prefix_done = _add_to_counter(prefix_counter_path, len(batch_entries)) + if prefix_done >= current_phase_boundary: + try: + with open(pause_flag_path, "x"): + pass + except FileExistsError: + pass + should_pause = os.path.exists(pause_flag_path) + if should_pause: + if dist.is_available() and dist.is_initialized(): + dist.barrier() + gathered_scored_docs = [None] * h.world_size + if dist.is_available() and dist.is_initialized(): + dist.all_gather_object(gathered_scored_docs, local_scored_docs) + else: + gathered_scored_docs = [local_scored_docs] + scored_docs_for_global = [] + for rank_docs in gathered_scored_docs: + if rank_docs: + scored_docs_for_global.extend(rank_docs) + scored_docs_for_global.sort(key=lambda x: (x[0], x[1])) + scored_docs_for_global = scored_docs_for_global[:current_phase_boundary] + scored_token_chunks = [ + val_data.val_tokens[doc_start : doc_start + doc_len] + for _, _, doc_start, doc_len in scored_docs_for_global + ] + if scored_token_chunks: + global_ttt_tokens = 
torch.cat(scored_token_chunks) + else: + global_ttt_tokens = val_data.val_tokens[:0] + if h.rank == 0: + prefix_done = 0 + try: + with open(prefix_counter_path, "rb") as f: + prefix_done = int.from_bytes( + f.read(8), "little", signed=True + ) + except FileNotFoundError: + pass + log( + f"ttpp: phase:{current_phase + 1}/{num_phases} pd:{prefix_done} " + f"gd:{len(scored_docs_for_global)} " + f"t:{time.perf_counter() - t_start:.1f}s" + ) + train_val_ttt_global_sgd_distributed( + h, device, val_data, base_model, global_ttt_tokens + ) + for p in base_model.parameters(): + p.requires_grad_(False) + reusable_lora = BatchedTTTLoRA( + h.ttt_batch_size, base_model, h.ttt_lora_rank, + k_lora=h.ttt_k_lora, mlp_lora=h.ttt_mlp_lora, o_lora=h.ttt_o_lora, + ).to(device) + reusable_opt = _build_opt(reusable_lora) + current_phase += 1 + if current_phase >= num_phases: + global_ttt_done = True + else: + current_phase_boundary = phase_boundaries[current_phase] + if h.rank == 0: + try: + os.remove(pause_flag_path) + except FileNotFoundError: + pass + if dist.is_available() and dist.is_initialized(): + dist.barrier() + if h.rank == 0: + log(f"ttpr: phase:{current_phase}/{num_phases} t:{time.perf_counter() - t_start:.1f}s") + del cur_lora, cur_opt + if dist.is_available() and dist.is_initialized(): + dist.all_reduce(loss_sum, op=dist.ReduceOp.SUM) + dist.all_reduce(byte_sum, op=dist.ReduceOp.SUM) + dist.all_reduce(token_count, op=dist.ReduceOp.SUM) + for p in base_model.parameters(): + p.requires_grad_(True) + base_model.train() + return _loss_bpb(loss_sum, token_count, byte_sum) +def timed_eval(label, fn, *args, **kwargs): + torch.cuda.synchronize() + t0 = time.perf_counter() + val_loss, val_bpb = fn(*args, **kwargs) + torch.cuda.synchronize() + elapsed_ms = 1e3 * (time.perf_counter() - t0) + log( + f"{label} val_loss:{val_loss:.8f} val_bpb:{val_bpb:.8f} eval_time:{elapsed_ms:.0f}ms" + ) + return val_loss, val_bpb +def train_model(h, device, val_data): + base_model = GPT(h).to(device).bfloat16() + restore_fp32_params(base_model) + compiled_model = torch.compile(base_model, dynamic=False, fullgraph=True) + compiled_forward_logits = torch.compile( + base_model.forward_logits, dynamic=False, fullgraph=True + ) + model = compiled_model + log(f"model_params:{sum(p.numel()for p in base_model.parameters())}") + optimizers = Optimizers(h, base_model) + train_loader = DocumentPackingLoader(h, device) + max_wallclock_ms = ( + 1e3 * h.max_wallclock_seconds if h.max_wallclock_seconds > 0 else None + ) + if max_wallclock_ms is not None: + max_wallclock_ms -= h.gptq_reserve_seconds * 1e3 + log( + f"gptq:reserving {h.gptq_reserve_seconds:.0f}s, effective={max_wallclock_ms:.0f}ms" + ) + def training_frac(step, elapsed_ms): + if max_wallclock_ms is None: + return step / max(h.iterations, 1) + return elapsed_ms / max(max_wallclock_ms, 1e-09) + def lr_mul(frac): + if h.warmdown_frac <= 0: + return 1.0 + if frac >= 1.0 - h.warmdown_frac: + return max((1.0 - frac) / h.warmdown_frac, h.min_lr) + return 1.0 + def step_fn(step, lr_scale): + optimizers.zero_grad_all() + train_loss = torch.zeros((), device=device) + for micro_step in range(h.grad_accum_steps): + x, y, cu_seqlens, _max_seqlen = train_loader.next_batch( + h.train_batch_tokens, h.grad_accum_steps + ) + with torch.autocast(device_type="cuda", dtype=torch.bfloat16, enabled=True): + loss = model(x, y, cu_seqlens=cu_seqlens, max_seqlen=h.train_seq_len) + train_loss += loss.detach() + (loss / h.grad_accum_steps).backward() + train_loss /= h.grad_accum_steps + frac = ( + 
min(step / h.muon_momentum_warmup_steps, 1.0) + if h.muon_momentum_warmup_steps > 0 + else 1.0 + ) + muon_momentum = ( + 1 - frac + ) * h.muon_momentum_warmup_start + frac * h.muon_momentum + for group in optimizers.optimizer_muon.param_groups: + group["momentum"] = muon_momentum + for opt in optimizers: + for group in opt.param_groups: + group["lr"] = group["base_lr"] * lr_scale + if h.grad_clip_norm > 0: + torch.nn.utils.clip_grad_norm_(base_model.parameters(), h.grad_clip_norm) + optimizers.step(distributed=h.distributed) + return train_loss + if h.warmup_steps > 0: + initial_model_state = { + name: tensor.detach().cpu().clone() + for (name, tensor) in base_model.state_dict().items() + } + initial_optimizer_states = [ + copy.deepcopy(opt.state_dict()) for opt in optimizers + ] + model.train() + num_tokens_local = h.train_batch_tokens // h.world_size + for blk in base_model.blocks: + blk.attn.rotary(num_tokens_local, device, torch.bfloat16) + cu_bucket_size = train_loader.cu_bucket_size + warmup_cu_buckets = tuple(cu_bucket_size * i for i in range(1, 5)) + warmup_cu_iters = 3 + x, y, cu_seqlens, _ = train_loader.next_batch( + h.train_batch_tokens, h.grad_accum_steps + ) + log(f"warmup_cu_buckets:{','.join(str(b) for b in warmup_cu_buckets)} iters_each:{warmup_cu_iters}") + def _run_cu_bucket_warmup(): + for bucket_len in warmup_cu_buckets: + boundaries = list(range(0, x.size(1), max(h.train_seq_len, 1))) + if boundaries[-1] != x.size(1): + boundaries.append(x.size(1)) + cu = torch.full((bucket_len,), x.size(1), dtype=torch.int32, device=device) + cu[: len(boundaries)] = torch.tensor(boundaries, dtype=torch.int32, device=device) + for _ in range(warmup_cu_iters): + optimizers.zero_grad_all() + with torch.autocast(device_type="cuda", dtype=torch.bfloat16, enabled=True): + wloss = model(x, y, cu_seqlens=cu, max_seqlen=h.train_seq_len) + (wloss / h.grad_accum_steps).backward() + optimizers.zero_grad_all() + _run_cu_bucket_warmup() + if h.num_loops > 0: + base_model.looping_active = True + _run_cu_bucket_warmup() + base_model.looping_active = False + for warmup_step in range(h.warmup_steps): + step_fn(warmup_step, 1.0) + if ( + warmup_step <= 5 + or (warmup_step + 1) % 10 == 0 + or warmup_step + 1 == h.warmup_steps + ): + log(f"warmup_step: {warmup_step+1}/{h.warmup_steps}") + if h.num_loops > 0: + base_model.looping_active = True + log( + f"loop_warmup:enabled encoder:{base_model.encoder_indices} decoder:{base_model.decoder_indices}" + ) + for warmup_step in range(h.warmup_steps): + step_fn(warmup_step, 1.0) + if ( + warmup_step <= 5 + or (warmup_step + 1) % 10 == 0 + or warmup_step + 1 == h.warmup_steps + ): + log(f"loop_warmup_step: {warmup_step+1}/{h.warmup_steps}") + base_model.looping_active = False + base_model.load_state_dict(initial_model_state, strict=True) + for (opt, state) in zip(optimizers, initial_optimizer_states, strict=True): + opt.load_state_dict(state) + optimizers.zero_grad_all() + train_loader = DocumentPackingLoader(h, device) + ema_state = { + name: t.detach().float().clone() + for (name, t) in base_model.state_dict().items() + } + ema_decay = h.ema_decay + training_time_ms = 0.0 + stop_after_step = None + torch.cuda.synchronize() + t0 = time.perf_counter() + step = 0 + while True: + last_step = ( + step == h.iterations + or stop_after_step is not None + and step >= stop_after_step + ) + should_validate = ( + last_step or h.val_loss_every > 0 and step % h.val_loss_every == 0 + ) + if should_validate: + torch.cuda.synchronize() + training_time_ms += 1e3 * 
(time.perf_counter() - t0) + val_loss, val_bpb = eval_val( + h, device, val_data, model, compiled_forward_logits + ) + log( + f"{step}/{h.iterations} val_loss: {val_loss:.4f} val_bpb: {val_bpb:.4f}" + ) + torch.cuda.synchronize() + t0 = time.perf_counter() + if last_step: + if stop_after_step is not None and step < h.iterations: + log( + f"stopping_early: wallclock_cap train_time: {training_time_ms:.0f}ms step: {step}/{h.iterations}" + ) + break + elapsed_ms = training_time_ms + 1e3 * (time.perf_counter() - t0) + frac = training_frac(step, elapsed_ms) + scale = lr_mul(frac) + if ( + h.num_loops > 0 + and not base_model.looping_active + and frac >= h.enable_looping_at + ): + base_model.looping_active = True + log( + f"layer_loop:enabled step:{step} frac:{frac:.3f} encoder:{base_model.encoder_indices} decoder:{base_model.decoder_indices}" + ) + train_loss = step_fn(step, scale) + with torch.no_grad(): + for (name, t) in base_model.state_dict().items(): + ema_state[name].mul_(ema_decay).add_( + t.detach().float(), alpha=1.0 - ema_decay + ) + step += 1 + approx_training_time_ms = training_time_ms + 1e3 * (time.perf_counter() - t0) + should_log_train = h.train_log_every > 0 and ( + step <= 5 or step % h.train_log_every == 0 or stop_after_step is not None + ) + if should_log_train: + tok_per_sec = step * h.train_batch_tokens / (approx_training_time_ms / 1e3) + log( + f"{step}/{h.iterations} train_loss: {train_loss.item():.4f} train_time: {approx_training_time_ms/60000:.1f}m tok/s: {tok_per_sec:.0f}" + ) + reached_cap = ( + max_wallclock_ms is not None and approx_training_time_ms >= max_wallclock_ms + ) + if h.distributed and max_wallclock_ms is not None: + reached_cap_tensor = torch.tensor(int(reached_cap), device=device) + dist.all_reduce(reached_cap_tensor, op=dist.ReduceOp.MAX) + reached_cap = bool(reached_cap_tensor.item()) + if stop_after_step is None and reached_cap: + stop_after_step = step + log( + f"peak memory allocated: {torch.cuda.max_memory_allocated()//1024//1024} MiB reserved: {torch.cuda.max_memory_reserved()//1024//1024} MiB" + ) + log("ema:applying EMA weights") + current_state = base_model.state_dict() + avg_state = { + name: t.to(dtype=current_state[name].dtype) for (name, t) in ema_state.items() + } + base_model.load_state_dict(avg_state, strict=True) + return base_model, compiled_model, compiled_forward_logits +def train_and_eval(h, device): + random.seed(h.seed) + np.random.seed(h.seed) + torch.manual_seed(h.seed) + torch.cuda.manual_seed_all(h.seed) + if h.artifact_dir and h.is_main_process: + os.makedirs(h.artifact_dir, exist_ok=True) + val_data = ValidationData(h, device) + log( + f"train_shards: {len(list(Path(h.datasets_dir).resolve().glob('fineweb_train_*.bin')))}" + ) + log(f"val_tokens: {val_data.val_tokens.numel()-1}") + ttt_eval_only = os.environ.get("TTT_EVAL_ONLY", "0") == "1" + if ttt_eval_only: + log("TTT_EVAL_ONLY=1 — skipping training + GPTQ, loading saved artifact for TTT eval") + log(f"ttt_lora_alpha: {BatchedLinearLoRA._ALPHA}") + log(f"ttt_warm_start_a: {BatchedLinearLoRA._WARM_START_A}") + log(f"ttt_weight_decay: {h.ttt_weight_decay}") + else: + base_model, compiled_model, compiled_forward_logits = train_model( + h, device, val_data + ) + torch._dynamo.reset() + timed_eval( + "diagnostic pre-quantization post-ema", + eval_val, + h, + device, + val_data, + compiled_model, + compiled_forward_logits, + ) + if os.environ.get("PREQUANT_ONLY", "0") == "1": + log("PREQUANT_ONLY=1 — skipping serialize/GPTQ/post-quant eval/TTT") + return + serialize(h, 
base_model, Path(__file__).read_text(encoding="utf-8")) + if h.distributed: + dist.barrier() + eval_model = deserialize(h, device) + if h.num_loops > 0: + eval_model.looping_active = True + if not ttt_eval_only: + compiled_model = torch.compile(eval_model, dynamic=False, fullgraph=True) + compiled_forward_logits = torch.compile( + eval_model.forward_logits, dynamic=False, fullgraph=True + ) + timed_eval( + "diagnostic quantized", + eval_val, + h, + device, + val_data, + compiled_model, + compiled_forward_logits, + ) + del eval_model + if h.ttt_enabled: + if not ttt_eval_only: + del compiled_model + if ttt_eval_only: + del eval_model + torch._dynamo.reset() + torch.cuda.empty_cache() + ttt_model = deserialize(h, device) + if h.num_loops > 0: + ttt_model.looping_active = True + for p in ttt_model.parameters(): + p.requires_grad_(False) + if h.rope_yarn: + _yarn_seqlen = h.train_batch_tokens // h.grad_accum_steps + for block in ttt_model.blocks: + block.attn.rotary(_yarn_seqlen, device, torch.bfloat16) + else: + for block in ttt_model.blocks: + block.attn.rotary._cos_cached = None + block.attn.rotary._sin_cached = None + block.attn.rotary._seq_len_cached = 0 + block.attn.rotary(h.ttt_eval_seq_len, device, torch.bfloat16) + def _fwd_ttt_inner(input_ids, target_ids, lora): + return ttt_model.forward_ttt(input_ids, target_ids, lora=lora) + _fwd_ttt_compiled_inner = None + def _fwd_ttt(input_ids, target_ids, lora): + nonlocal _fwd_ttt_compiled_inner + if _fwd_ttt_compiled_inner is None: + _fwd_ttt_compiled_inner = torch.compile(_fwd_ttt_inner, dynamic=True) + return _fwd_ttt_compiled_inner(input_ids, target_ids, lora=lora) + fwd_ttt_compiled = _fwd_ttt + log(f"ttt_lora:warming up compile (random tokens, no val data)") + global BOS_ID + if BOS_ID is None: + BOS_ID = 1 + t_warmup = time.perf_counter() + warmup_bszes = [h.ttt_batch_size] + for bsz in warmup_bszes: + wl = BatchedTTTLoRA( + bsz, ttt_model, h.ttt_lora_rank, + k_lora=h.ttt_k_lora, mlp_lora=h.ttt_mlp_lora, o_lora=h.ttt_o_lora, + ).to(device) + wo = torch.optim.AdamW( + wl.parameters(), + lr=h.ttt_lora_lr, + betas=(h.ttt_beta1, h.ttt_beta2), + eps=1e-10, + weight_decay=h.ttt_weight_decay, + fused=True, + ) + for ctx_len in (h.ttt_chunk_size, h.ttt_eval_seq_len): + xw = torch.randint(0, h.vocab_size, (bsz, ctx_len), device=device, dtype=torch.int64) + yw = torch.randint(0, h.vocab_size, (bsz, ctx_len), device=device, dtype=torch.int64) + with torch.autocast(device_type="cuda", dtype=torch.bfloat16): + ptl = fwd_ttt_compiled(xw, yw, lora=wl) + ptl[:, : min(h.ttt_chunk_size, ctx_len)].mean(dim=-1).sum().backward() + wo.step() + wo.zero_grad(set_to_none=True) + del wl, wo + torch.cuda.empty_cache() + compile_elapsed = time.perf_counter() - t_warmup + log(f"ttt_lora:compile warmup done ({compile_elapsed:.1f}s)") + log("\nbeginning TTT eval timer") + torch.cuda.synchronize() + t_ttt = time.perf_counter() + ttt_val_loss, ttt_val_bpb = eval_val_ttt_phased( + h, ttt_model, device, val_data, forward_ttt_train=fwd_ttt_compiled + ) + torch.cuda.synchronize() + ttt_eval_elapsed = time.perf_counter() - t_ttt + log( + "quantized_ttt_phased " + f"val_loss:{ttt_val_loss:.8f} val_bpb:{ttt_val_bpb:.8f} " + f"eval_time:{1e3*ttt_eval_elapsed:.0f}ms" + ) + log(f"total_eval_time:{ttt_eval_elapsed:.1f}s") + del ttt_model +def main(): + world_size = int(os.environ.get("WORLD_SIZE", "1")) + local_rank = int(os.environ.get("LOCAL_RANK", "0")) + distributed = "RANK" in os.environ and "WORLD_SIZE" in os.environ + if not torch.cuda.is_available(): + raise 
RuntimeError("CUDA is required") + if world_size <= 0: + raise ValueError(f"WORLD_SIZE must be positive, got {world_size}") + if 8 % world_size != 0: + raise ValueError( + f"WORLD_SIZE={world_size} must divide 8 so grad_accum_steps stays integral" + ) + device = torch.device("cuda", local_rank) + torch.cuda.set_device(device) + if distributed: + dist.init_process_group(backend="nccl", device_id=device) + dist.barrier() + torch.backends.cuda.matmul.allow_tf32 = True + torch.backends.cudnn.allow_tf32 = True + torch.set_float32_matmul_precision("high") + from torch.backends.cuda import ( + enable_cudnn_sdp, + enable_flash_sdp, + enable_math_sdp, + enable_mem_efficient_sdp, + ) + enable_cudnn_sdp(False) + enable_flash_sdp(True) + enable_mem_efficient_sdp(False) + enable_math_sdp(False) + torch._dynamo.config.optimize_ddp = False + torch._dynamo.config.cache_size_limit = 16 + h = Hyperparameters() + set_logging_hparams(h) + if h.is_main_process: + os.makedirs(h.artifact_dir if h.artifact_dir else "logs", exist_ok=True) + os.makedirs(os.path.dirname(h.model_path) or "models", exist_ok=True) + os.makedirs(os.path.dirname(h.quantized_model_path) or "models", exist_ok=True) + train_and_eval(h, device) + if distributed: + dist.destroy_process_group() +if __name__ == "__main__": + main() diff --git a/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/train_seed1334.log b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/train_seed1334.log new file mode 100644 index 0000000000..7f6821ad3c --- /dev/null +++ b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/train_seed1334.log @@ -0,0 +1,1185 @@ +W0429 17:52:36.768000 100862 torch/distributed/run.py:803] +W0429 17:52:36.768000 100862 torch/distributed/run.py:803] ***************************************** +W0429 17:52:36.768000 100862 torch/distributed/run.py:803] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed. 
+W0429 17:52:36.768000 100862 torch/distributed/run.py:803] ***************************************** +train_shards: 80 +val_tokens: 47851520 +model_params:35945671 +gptq:reserving 16s, effective=584000ms +warmup_cu_buckets:64,128,192,256 iters_each:3 +warmup_step: 1/20 +warmup_step: 2/20 +warmup_step: 3/20 +warmup_step: 4/20 +warmup_step: 5/20 +warmup_step: 6/20 +warmup_step: 10/20 +warmup_step: 20/20 +loop_warmup:enabled encoder:[0, 1, 2, 3, 4, 5, 3, 4] decoder:[5, 3, 4, 5, 6, 7, 8, 9, 10] +loop_warmup_step: 1/20 +loop_warmup_step: 2/20 +loop_warmup_step: 3/20 +loop_warmup_step: 4/20 +loop_warmup_step: 5/20 +loop_warmup_step: 6/20 +loop_warmup_step: 10/20 +loop_warmup_step: 20/20 +0/20000 val_loss: 9.0007 val_bpb: 4.1127 +1/20000 train_loss: 9.0016 train_time: 0.0m tok/s: 12275171 +2/20000 train_loss: 12.8589 train_time: 0.0m tok/s: 11509645 +3/20000 train_loss: 10.1873 train_time: 0.0m tok/s: 10242533 +4/20000 train_loss: 8.6642 train_time: 0.0m tok/s: 9731774 +5/20000 train_loss: 7.9253 train_time: 0.0m tok/s: 9432809 +500/20000 train_loss: 2.5705 train_time: 0.8m tok/s: 8293624 +1000/20000 train_loss: 2.7960 train_time: 1.6m tok/s: 8266250 +1500/20000 train_loss: 2.6373 train_time: 2.4m tok/s: 8257371 +2000/20000 train_loss: 2.6638 train_time: 3.2m tok/s: 8255091 +layer_loop:enabled step:2146 frac:0.350 encoder:[0, 1, 2, 3, 4, 5, 3, 4] decoder:[5, 3, 4, 5, 6, 7, 8, 9, 10] +2500/20000 train_loss: 2.5502 train_time: 4.2m tok/s: 7737102 +3000/20000 train_loss: 2.5623 train_time: 5.4m tok/s: 7276150 +3500/20000 train_loss: 2.5652 train_time: 6.6m tok/s: 6957307 +4000/20000 train_loss: 2.4035 train_time: 7.8m tok/s: 6754329 +4000/20000 val_loss: 2.4284 val_bpb: 1.1096 +4500/20000 train_loss: 2.2782 train_time: 9.0m tok/s: 6584803 +4833/20000 val_loss: 2.3584 val_bpb: 1.0776 +stopping_early: wallclock_cap train_time: 584153ms step: 4833/20000 +peak memory allocated: 41709 MiB reserved: 47026 MiB +ema:applying EMA weights +diagnostic pre-quantization post-ema val_loss:2.33536211 val_bpb:1.06710006 eval_time:6770ms +Serialized model: 135419669 bytes +Code size (uncompressed): 138859 bytes +Code size (compressed): 31095 bytes +GPTQ:collecting Hessians from calibration data... 
+GPTQ:collected 67 Hessians in 3.5s +Quantized weights: + gate_int8_row: blocks.attn.attn_gate_w + gptq (int6): blocks.attn.c_k.weight, blocks.attn.c_q.weight, blocks.attn.c_v.weight, blocks.attn.proj.weight, blocks.mlp.fc.weight, blocks.mlp.proj.weight + gptq (int7)+lqer_asym: tok_emb.weight + passthrough (float16): blocks.attn.q_gain, blocks.attn_scale, blocks.mlp_scale, blocks.resid_mix, parallel_post_lambdas, parallel_resid_lambdas, skip_gates, skip_weights, smear_gate.weight, smear_lambda +GPTQ:quantized weights in 12.13s +Serialized model quantized+brotli: 15916569 bytes +Total submission size quantized+brotli: 15947664 bytes +diagnostic quantized val_loss:2.35445830 val_bpb:1.07582570 eval_time:10246ms +ttt_lora:warming up compile (random tokens, no val data) +ttt_lora:compile warmup done (171.6s) + +beginning TTT eval timer +ttt_phased: total_docs:50000 prefix_docs:2000 suffix_docs:48000 num_phases:3 boundaries:[666, 1333, 2000] +ttp: b3121/3125 bl:2.2891 bb:1.1129 rl:2.2891 rb:1.1129 dl:17258-19362 gd:0 +ttp: b3114/3125 bl:2.2192 bb:1.0703 rl:2.2623 rb:1.0965 dl:11159-11603 gd:0 +ttp: b3108/3125 bl:2.3281 bb:1.0769 rl:2.2778 rb:1.0917 dl:9027-9229 gd:0 +ttp: b3100/3125 bl:2.3313 bb:1.0745 rl:2.2864 rb:1.0888 dl:7313-7524 gd:0 +ttp: b3093/3125 bl:2.3174 bb:1.0906 rl:2.2903 rb:1.0891 dl:6447-6522 gd:0 +ttp: b3088/3125 bl:2.3090 bb:1.1104 rl:2.2922 rb:1.0912 dl:6007-6095 gd:0 +ttp: b3081/3125 bl:2.3322 bb:1.0253 rl:2.2956 rb:1.0851 dl:5523-5586 gd:0 +ttpp: phase:1/3 pd:784 gd:666 t:205.8s +tttg: c1/158 lr:0.001000 t:0.3s +tttg: c2/158 lr:0.001000 t:0.3s +tttg: c3/158 lr:0.001000 t:0.4s +tttg: c4/158 lr:0.000999 t:0.5s +tttg: c5/158 lr:0.000998 t:0.5s +tttg: c6/158 lr:0.000997 t:0.6s +tttg: c7/158 lr:0.000996 t:0.7s +tttg: c8/158 lr:0.000995 t:0.7s +tttg: c9/158 lr:0.000994 t:0.8s +tttg: c10/158 lr:0.000992 t:0.9s +tttg: c11/158 lr:0.000990 t:0.9s +tttg: c12/158 lr:0.000988 t:1.0s +tttg: c13/158 lr:0.000986 t:1.1s +tttg: c14/158 lr:0.000983 t:1.1s +tttg: c15/158 lr:0.000981 t:1.2s +tttg: c16/158 lr:0.000978 t:1.3s +tttg: c17/158 lr:0.000975 t:1.3s +tttg: c18/158 lr:0.000971 t:1.4s +tttg: c19/158 lr:0.000968 t:1.5s +tttg: c20/158 lr:0.000964 t:1.5s +tttg: c21/158 lr:0.000960 t:1.6s +tttg: c22/158 lr:0.000957 t:1.7s +tttg: c23/158 lr:0.000952 t:1.7s +tttg: c24/158 lr:0.000948 t:1.8s +tttg: c25/158 lr:0.000943 t:1.8s +tttg: c26/158 lr:0.000939 t:1.9s +tttg: c27/158 lr:0.000934 t:2.0s +tttg: c28/158 lr:0.000929 t:2.0s +tttg: c29/158 lr:0.000924 t:2.1s +tttg: c30/158 lr:0.000918 t:2.2s +tttg: c31/158 lr:0.000913 t:2.2s +tttg: c32/158 lr:0.000907 t:2.3s +tttg: c33/158 lr:0.000901 t:2.4s +tttg: c34/158 lr:0.000895 t:2.4s +tttg: c35/158 lr:0.000889 t:2.5s +tttg: c36/158 lr:0.000882 t:2.6s +tttg: c37/158 lr:0.000876 t:2.6s +tttg: c38/158 lr:0.000869 t:2.7s +tttg: c39/158 lr:0.000862 t:2.8s +tttg: c40/158 lr:0.000855 t:2.8s +tttg: c41/158 lr:0.000848 t:2.9s +tttg: c42/158 lr:0.000841 t:3.0s +tttg: c43/158 lr:0.000834 t:3.0s +tttg: c44/158 lr:0.000826 t:3.1s +tttg: c45/158 lr:0.000818 t:3.2s +tttg: c46/158 lr:0.000811 t:3.2s +tttg: c47/158 lr:0.000803 t:3.3s +tttg: c48/158 lr:0.000795 t:3.3s +tttg: c49/158 lr:0.000787 t:3.4s +tttg: c50/158 lr:0.000778 t:3.5s +tttg: c51/158 lr:0.000770 t:3.5s +tttg: c52/158 lr:0.000761 t:3.6s +tttg: c53/158 lr:0.000753 t:3.7s +tttg: c54/158 lr:0.000744 t:3.7s +tttg: c55/158 lr:0.000735 t:3.8s +tttg: c56/158 lr:0.000727 t:3.9s +tttg: c57/158 lr:0.000718 t:3.9s +tttg: c58/158 lr:0.000709 t:4.0s +tttg: c59/158 lr:0.000699 t:4.1s +tttg: c60/158 lr:0.000690 t:4.1s 
+tttg: c61/158 lr:0.000681 t:4.2s +tttg: c62/158 lr:0.000672 t:4.3s +tttg: c63/158 lr:0.000662 t:4.3s +tttg: c64/158 lr:0.000653 t:4.4s +tttg: c65/158 lr:0.000643 t:4.5s +tttg: c66/158 lr:0.000633 t:4.5s +tttg: c67/158 lr:0.000624 t:4.6s +tttg: c68/158 lr:0.000614 t:4.7s +tttg: c69/158 lr:0.000604 t:4.7s +tttg: c70/158 lr:0.000594 t:4.8s +tttg: c71/158 lr:0.000585 t:4.9s +tttg: c72/158 lr:0.000575 t:4.9s +tttg: c73/158 lr:0.000565 t:5.0s +tttg: c74/158 lr:0.000555 t:5.1s +tttg: c75/158 lr:0.000545 t:5.1s +tttg: c76/158 lr:0.000535 t:5.2s +tttg: c77/158 lr:0.000525 t:5.3s +tttg: c78/158 lr:0.000515 t:5.3s +tttg: c79/158 lr:0.000505 t:5.4s +tttg: c80/158 lr:0.000495 t:5.5s +tttg: c81/158 lr:0.000485 t:5.5s +tttg: c82/158 lr:0.000475 t:5.6s +tttg: c83/158 lr:0.000465 t:5.7s +tttg: c84/158 lr:0.000455 t:5.7s +tttg: c85/158 lr:0.000445 t:5.8s +tttg: c86/158 lr:0.000435 t:5.9s +tttg: c87/158 lr:0.000425 t:5.9s +tttg: c88/158 lr:0.000415 t:6.0s +tttg: c89/158 lr:0.000406 t:6.0s +tttg: c90/158 lr:0.000396 t:6.1s +tttg: c91/158 lr:0.000386 t:6.2s +tttg: c92/158 lr:0.000376 t:6.2s +tttg: c93/158 lr:0.000367 t:6.3s +tttg: c94/158 lr:0.000357 t:6.4s +tttg: c95/158 lr:0.000347 t:6.4s +tttg: c96/158 lr:0.000338 t:6.5s +tttg: c97/158 lr:0.000328 t:6.6s +tttg: c98/158 lr:0.000319 t:6.6s +tttg: c99/158 lr:0.000310 t:6.7s +tttg: c100/158 lr:0.000301 t:6.8s +tttg: c101/158 lr:0.000291 t:6.8s +tttg: c102/158 lr:0.000282 t:6.9s +tttg: c103/158 lr:0.000273 t:7.0s +tttg: c104/158 lr:0.000265 t:7.0s +tttg: c105/158 lr:0.000256 t:7.1s +tttg: c106/158 lr:0.000247 t:7.2s +tttg: c107/158 lr:0.000239 t:7.2s +tttg: c108/158 lr:0.000230 t:7.3s +tttg: c109/158 lr:0.000222 t:7.4s +tttg: c110/158 lr:0.000213 t:7.4s +tttg: c111/158 lr:0.000205 t:7.5s +tttg: c112/158 lr:0.000197 t:7.6s +tttg: c113/158 lr:0.000189 t:7.6s +tttg: c114/158 lr:0.000182 t:7.7s +tttg: c115/158 lr:0.000174 t:7.7s +tttg: c116/158 lr:0.000166 t:7.8s +tttg: c117/158 lr:0.000159 t:7.9s +tttg: c118/158 lr:0.000152 t:7.9s +tttg: c119/158 lr:0.000145 t:8.0s +tttg: c120/158 lr:0.000138 t:8.1s +tttg: c121/158 lr:0.000131 t:8.1s +tttg: c122/158 lr:0.000124 t:8.2s +tttg: c123/158 lr:0.000118 t:8.3s +tttg: c124/158 lr:0.000111 t:8.3s +tttg: c125/158 lr:0.000105 t:8.4s +tttg: c126/158 lr:0.000099 t:8.5s +tttg: c127/158 lr:0.000093 t:8.5s +tttg: c128/158 lr:0.000087 t:8.6s +tttg: c129/158 lr:0.000082 t:8.7s +tttg: c130/158 lr:0.000076 t:8.7s +tttg: c131/158 lr:0.000071 t:8.8s +tttg: c132/158 lr:0.000066 t:8.9s +tttg: c133/158 lr:0.000061 t:8.9s +tttg: c134/158 lr:0.000057 t:9.0s +tttg: c135/158 lr:0.000052 t:9.1s +tttg: c136/158 lr:0.000048 t:9.1s +tttg: c137/158 lr:0.000043 t:9.2s +tttg: c138/158 lr:0.000040 t:9.3s +tttg: c139/158 lr:0.000036 t:9.3s +tttg: c140/158 lr:0.000032 t:9.4s +tttg: c141/158 lr:0.000029 t:9.5s +tttg: c142/158 lr:0.000025 t:9.5s +tttg: c143/158 lr:0.000022 t:9.6s +tttg: c144/158 lr:0.000019 t:9.7s +tttg: c145/158 lr:0.000017 t:9.7s +tttg: c146/158 lr:0.000014 t:9.8s +tttg: c147/158 lr:0.000012 t:9.8s +tttg: c148/158 lr:0.000010 t:9.9s +tttg: c149/158 lr:0.000008 t:10.0s +tttg: c150/158 lr:0.000006 t:10.0s +tttg: c151/158 lr:0.000005 t:10.1s +tttg: c152/158 lr:0.000004 t:10.2s +tttg: c153/158 lr:0.000003 t:10.2s +tttg: c154/158 lr:0.000002 t:10.3s +tttg: c155/158 lr:0.000001 t:10.4s +tttg: c156/158 lr:0.000000 t:10.4s +tttg: c157/158 lr:0.000000 t:10.5s +ttpr: phase:1/3 t:216.7s +ttp: b3075/3125 bl:2.4111 bb:1.1553 rl:2.3043 rb:1.0903 dl:5190-5230 gd:0 +ttp: b3062/3125 bl:2.3011 bb:1.0738 rl:2.3041 rb:1.0893 dl:4558-4601 gd:0 +ttp: 
b3054/3125 bl:2.2352 bb:1.0817 rl:2.3003 rb:1.0889 dl:4309-4331 gd:0 +ttp: b3046/3125 bl:2.2456 bb:1.0413 rl:2.2976 rb:1.0865 dl:4058-4077 gd:0 +ttp: b3038/3125 bl:2.4709 bb:1.0720 rl:2.3053 rb:1.0858 dl:3843-3864 gd:0 +ttpp: phase:2/3 pd:1456 gd:1333 t:334.7s +tttg: c1/247 lr:0.001000 t:0.1s +tttg: c2/247 lr:0.001000 t:0.1s +tttg: c3/247 lr:0.001000 t:0.2s +tttg: c4/247 lr:0.001000 t:0.3s +tttg: c5/247 lr:0.000999 t:0.3s +tttg: c6/247 lr:0.000999 t:0.4s +tttg: c7/247 lr:0.000999 t:0.5s +tttg: c8/247 lr:0.000998 t:0.5s +tttg: c9/247 lr:0.000997 t:0.6s +tttg: c10/247 lr:0.000997 t:0.7s +tttg: c11/247 lr:0.000996 t:0.7s +tttg: c12/247 lr:0.000995 t:0.8s +tttg: c13/247 lr:0.000994 t:0.8s +tttg: c14/247 lr:0.000993 t:0.9s +tttg: c15/247 lr:0.000992 t:1.0s +tttg: c16/247 lr:0.000991 t:1.0s +tttg: c17/247 lr:0.000990 t:1.1s +tttg: c18/247 lr:0.000988 t:1.2s +tttg: c19/247 lr:0.000987 t:1.2s +tttg: c20/247 lr:0.000985 t:1.3s +tttg: c21/247 lr:0.000984 t:1.4s +tttg: c22/247 lr:0.000982 t:1.4s +tttg: c23/247 lr:0.000980 t:1.5s +tttg: c24/247 lr:0.000979 t:1.6s +tttg: c25/247 lr:0.000977 t:1.6s +tttg: c26/247 lr:0.000975 t:1.7s +tttg: c27/247 lr:0.000973 t:1.8s +tttg: c28/247 lr:0.000971 t:1.8s +tttg: c29/247 lr:0.000968 t:1.9s +tttg: c30/247 lr:0.000966 t:1.9s +tttg: c31/247 lr:0.000964 t:2.0s +tttg: c32/247 lr:0.000961 t:2.1s +tttg: c33/247 lr:0.000959 t:2.1s +tttg: c34/247 lr:0.000956 t:2.2s +tttg: c35/247 lr:0.000954 t:2.3s +tttg: c36/247 lr:0.000951 t:2.3s +tttg: c37/247 lr:0.000948 t:2.4s +tttg: c38/247 lr:0.000945 t:2.5s +tttg: c39/247 lr:0.000942 t:2.5s +tttg: c40/247 lr:0.000939 t:2.6s +tttg: c41/247 lr:0.000936 t:2.7s +tttg: c42/247 lr:0.000933 t:2.7s +tttg: c43/247 lr:0.000930 t:2.8s +tttg: c44/247 lr:0.000926 t:2.8s +tttg: c45/247 lr:0.000923 t:2.9s +tttg: c46/247 lr:0.000920 t:3.0s +tttg: c47/247 lr:0.000916 t:3.0s +tttg: c48/247 lr:0.000913 t:3.1s +tttg: c49/247 lr:0.000909 t:3.2s +tttg: c50/247 lr:0.000905 t:3.2s +tttg: c51/247 lr:0.000901 t:3.3s +tttg: c52/247 lr:0.000898 t:3.4s +tttg: c53/247 lr:0.000894 t:3.4s +tttg: c54/247 lr:0.000890 t:3.5s +tttg: c55/247 lr:0.000886 t:3.6s +tttg: c56/247 lr:0.000882 t:3.6s +tttg: c57/247 lr:0.000877 t:3.7s +tttg: c58/247 lr:0.000873 t:3.8s +tttg: c59/247 lr:0.000869 t:3.8s +tttg: c60/247 lr:0.000865 t:3.9s +tttg: c61/247 lr:0.000860 t:3.9s +tttg: c62/247 lr:0.000856 t:4.0s +tttg: c63/247 lr:0.000851 t:4.1s +tttg: c64/247 lr:0.000847 t:4.1s +tttg: c65/247 lr:0.000842 t:4.2s +tttg: c66/247 lr:0.000837 t:4.3s +tttg: c67/247 lr:0.000833 t:4.3s +tttg: c68/247 lr:0.000828 t:4.4s +tttg: c69/247 lr:0.000823 t:4.5s +tttg: c70/247 lr:0.000818 t:4.5s +tttg: c71/247 lr:0.000813 t:4.6s +tttg: c72/247 lr:0.000808 t:4.7s +tttg: c73/247 lr:0.000803 t:4.7s +tttg: c74/247 lr:0.000798 t:4.8s +tttg: c75/247 lr:0.000793 t:4.8s +tttg: c76/247 lr:0.000788 t:4.9s +tttg: c77/247 lr:0.000782 t:5.0s +tttg: c78/247 lr:0.000777 t:5.0s +tttg: c79/247 lr:0.000772 t:5.1s +tttg: c80/247 lr:0.000766 t:5.2s +tttg: c81/247 lr:0.000761 t:5.2s +tttg: c82/247 lr:0.000756 t:5.3s +tttg: c83/247 lr:0.000750 t:5.4s +tttg: c84/247 lr:0.000744 t:5.4s +tttg: c85/247 lr:0.000739 t:5.5s +tttg: c86/247 lr:0.000733 t:5.6s +tttg: c87/247 lr:0.000728 t:5.6s +tttg: c88/247 lr:0.000722 t:5.7s +tttg: c89/247 lr:0.000716 t:5.7s +tttg: c90/247 lr:0.000710 t:5.8s +tttg: c91/247 lr:0.000705 t:5.9s +tttg: c92/247 lr:0.000699 t:5.9s +tttg: c93/247 lr:0.000693 t:6.0s +tttg: c94/247 lr:0.000687 t:6.1s +tttg: c95/247 lr:0.000681 t:6.1s +tttg: c96/247 lr:0.000675 t:6.2s +tttg: c97/247 lr:0.000669 t:6.3s 
+tttg: c98/247 lr:0.000663 t:6.3s +tttg: c99/247 lr:0.000657 t:6.4s +tttg: c100/247 lr:0.000651 t:6.5s +tttg: c101/247 lr:0.000645 t:6.5s +tttg: c102/247 lr:0.000639 t:6.6s +tttg: c103/247 lr:0.000632 t:6.7s +tttg: c104/247 lr:0.000626 t:6.7s +tttg: c105/247 lr:0.000620 t:6.8s +tttg: c106/247 lr:0.000614 t:6.8s +tttg: c107/247 lr:0.000608 t:6.9s +tttg: c108/247 lr:0.000601 t:7.0s +tttg: c109/247 lr:0.000595 t:7.0s +tttg: c110/247 lr:0.000589 t:7.1s +tttg: c111/247 lr:0.000583 t:7.2s +tttg: c112/247 lr:0.000576 t:7.2s +tttg: c113/247 lr:0.000570 t:7.3s +tttg: c114/247 lr:0.000564 t:7.4s +tttg: c115/247 lr:0.000557 t:7.4s +tttg: c116/247 lr:0.000551 t:7.5s +tttg: c117/247 lr:0.000545 t:7.6s +tttg: c118/247 lr:0.000538 t:7.6s +tttg: c119/247 lr:0.000532 t:7.7s +tttg: c120/247 lr:0.000526 t:7.7s +tttg: c121/247 lr:0.000519 t:7.8s +tttg: c122/247 lr:0.000513 t:7.9s +tttg: c123/247 lr:0.000506 t:7.9s +tttg: c124/247 lr:0.000500 t:8.0s +tttg: c125/247 lr:0.000494 t:8.1s +tttg: c126/247 lr:0.000487 t:8.1s +tttg: c127/247 lr:0.000481 t:8.2s +tttg: c128/247 lr:0.000474 t:8.3s +tttg: c129/247 lr:0.000468 t:8.3s +tttg: c130/247 lr:0.000462 t:8.4s +tttg: c131/247 lr:0.000455 t:8.5s +tttg: c132/247 lr:0.000449 t:8.5s +tttg: c133/247 lr:0.000443 t:8.6s +tttg: c134/247 lr:0.000436 t:8.6s +tttg: c135/247 lr:0.000430 t:8.7s +tttg: c136/247 lr:0.000424 t:8.8s +tttg: c137/247 lr:0.000417 t:8.8s +tttg: c138/247 lr:0.000411 t:8.9s +tttg: c139/247 lr:0.000405 t:9.0s +tttg: c140/247 lr:0.000399 t:9.0s +tttg: c141/247 lr:0.000392 t:9.1s +tttg: c142/247 lr:0.000386 t:9.2s +tttg: c143/247 lr:0.000380 t:9.2s +tttg: c144/247 lr:0.000374 t:9.3s +tttg: c145/247 lr:0.000368 t:9.4s +tttg: c146/247 lr:0.000361 t:9.4s +tttg: c147/247 lr:0.000355 t:9.5s +tttg: c148/247 lr:0.000349 t:9.5s +tttg: c149/247 lr:0.000343 t:9.6s +tttg: c150/247 lr:0.000337 t:9.7s +tttg: c151/247 lr:0.000331 t:9.7s +tttg: c152/247 lr:0.000325 t:9.8s +tttg: c153/247 lr:0.000319 t:9.9s +tttg: c154/247 lr:0.000313 t:9.9s +tttg: c155/247 lr:0.000307 t:10.0s +tttg: c156/247 lr:0.000301 t:10.1s +tttg: c157/247 lr:0.000295 t:10.1s +tttg: c158/247 lr:0.000290 t:10.2s +tttg: c159/247 lr:0.000284 t:10.3s +tttg: c160/247 lr:0.000278 t:10.3s +tttg: c161/247 lr:0.000272 t:10.4s +tttg: c162/247 lr:0.000267 t:10.4s +tttg: c163/247 lr:0.000261 t:10.5s +tttg: c164/247 lr:0.000256 t:10.6s +tttg: c165/247 lr:0.000250 t:10.6s +tttg: c166/247 lr:0.000244 t:10.7s +tttg: c167/247 lr:0.000239 t:10.8s +tttg: c168/247 lr:0.000234 t:10.8s +tttg: c169/247 lr:0.000228 t:10.9s +tttg: c170/247 lr:0.000223 t:11.0s +tttg: c171/247 lr:0.000218 t:11.0s +tttg: c172/247 lr:0.000212 t:11.1s +tttg: c173/247 lr:0.000207 t:11.2s +tttg: c174/247 lr:0.000202 t:11.2s +tttg: c175/247 lr:0.000197 t:11.3s +tttg: c176/247 lr:0.000192 t:11.3s +tttg: c177/247 lr:0.000187 t:11.4s +tttg: c178/247 lr:0.000182 t:11.5s +tttg: c179/247 lr:0.000177 t:11.5s +tttg: c180/247 lr:0.000172 t:11.6s +tttg: c181/247 lr:0.000167 t:11.7s +tttg: c182/247 lr:0.000163 t:11.7s +tttg: c183/247 lr:0.000158 t:11.8s +tttg: c184/247 lr:0.000153 t:11.9s +tttg: c185/247 lr:0.000149 t:11.9s +tttg: c186/247 lr:0.000144 t:12.0s +tttg: c187/247 lr:0.000140 t:12.1s +tttg: c188/247 lr:0.000135 t:12.1s +tttg: c189/247 lr:0.000131 t:12.2s +tttg: c190/247 lr:0.000127 t:12.2s +tttg: c191/247 lr:0.000123 t:12.3s +tttg: c192/247 lr:0.000118 t:12.4s +tttg: c193/247 lr:0.000114 t:12.4s +tttg: c194/247 lr:0.000110 t:12.5s +tttg: c195/247 lr:0.000106 t:12.6s +tttg: c196/247 lr:0.000102 t:12.6s +tttg: c197/247 lr:0.000099 t:12.7s +tttg: 
c198/247 lr:0.000095 t:12.8s +tttg: c199/247 lr:0.000091 t:12.8s +tttg: c200/247 lr:0.000087 t:12.9s +tttg: c201/247 lr:0.000084 t:13.0s +tttg: c202/247 lr:0.000080 t:13.0s +tttg: c203/247 lr:0.000077 t:13.1s +tttg: c204/247 lr:0.000074 t:13.1s +tttg: c205/247 lr:0.000070 t:13.2s +tttg: c206/247 lr:0.000067 t:13.3s +tttg: c207/247 lr:0.000064 t:13.3s +tttg: c208/247 lr:0.000061 t:13.4s +tttg: c209/247 lr:0.000058 t:13.5s +tttg: c210/247 lr:0.000055 t:13.5s +tttg: c211/247 lr:0.000052 t:13.6s +tttg: c212/247 lr:0.000049 t:13.7s +tttg: c213/247 lr:0.000046 t:13.7s +tttg: c214/247 lr:0.000044 t:13.8s +tttg: c215/247 lr:0.000041 t:13.9s +tttg: c216/247 lr:0.000039 t:13.9s +tttg: c217/247 lr:0.000036 t:14.0s +tttg: c218/247 lr:0.000034 t:14.1s +tttg: c219/247 lr:0.000032 t:14.1s +tttg: c220/247 lr:0.000029 t:14.2s +tttg: c221/247 lr:0.000027 t:14.2s +tttg: c222/247 lr:0.000025 t:14.3s +tttg: c223/247 lr:0.000023 t:14.4s +tttg: c224/247 lr:0.000021 t:14.4s +tttg: c225/247 lr:0.000020 t:14.5s +tttg: c226/247 lr:0.000018 t:14.6s +tttg: c227/247 lr:0.000016 t:14.6s +tttg: c228/247 lr:0.000015 t:14.7s +tttg: c229/247 lr:0.000013 t:14.8s +tttg: c230/247 lr:0.000012 t:14.8s +tttg: c231/247 lr:0.000010 t:14.9s +tttg: c232/247 lr:0.000009 t:15.0s +tttg: c233/247 lr:0.000008 t:15.0s +tttg: c234/247 lr:0.000007 t:15.1s +tttg: c235/247 lr:0.000006 t:15.1s +tttg: c236/247 lr:0.000005 t:15.2s +tttg: c237/247 lr:0.000004 t:15.3s +tttg: c238/247 lr:0.000003 t:15.3s +tttg: c239/247 lr:0.000003 t:15.4s +tttg: c240/247 lr:0.000002 t:15.5s +tttg: c241/247 lr:0.000001 t:15.5s +tttg: c242/247 lr:0.000001 t:15.6s +tttg: c243/247 lr:0.000001 t:15.7s +tttg: c244/247 lr:0.000000 t:15.7s +tttg: c245/247 lr:0.000000 t:15.8s +tttg: c246/247 lr:0.000000 t:15.9s +ttpr: phase:2/3 t:351.1s +ttp: b3031/3125 bl:2.3761 bb:1.1262 rl:2.3082 rb:1.0874 dl:3682-3715 gd:0 +ttp: b3021/3125 bl:2.3235 bb:1.0397 rl:2.3088 rb:1.0856 dl:3466-3487 gd:0 +ttp: b3016/3125 bl:2.3276 bb:1.1052 rl:2.3095 rb:1.0862 dl:3387-3397 gd:0 +ttp: b3005/3125 bl:2.0798 bb:1.0030 rl:2.3021 rb:1.0836 dl:3222-3232 gd:0 +ttp: b3001/3125 bl:2.2894 bb:1.0168 rl:2.3017 rb:1.0815 dl:3150-3167 gd:0 +ttpp: phase:3/3 pd:2112 gd:2000 t:369.9s +tttg: c1/319 lr:0.001000 t:0.1s +tttg: c2/319 lr:0.001000 t:0.1s +tttg: c3/319 lr:0.001000 t:0.2s +tttg: c4/319 lr:0.001000 t:0.3s +tttg: c5/319 lr:0.001000 t:0.3s +tttg: c6/319 lr:0.000999 t:0.4s +tttg: c7/319 lr:0.000999 t:0.5s +tttg: c8/319 lr:0.000999 t:0.5s +tttg: c9/319 lr:0.000998 t:0.6s +tttg: c10/319 lr:0.000998 t:0.7s +tttg: c11/319 lr:0.000998 t:0.7s +tttg: c12/319 lr:0.000997 t:0.8s +tttg: c13/319 lr:0.000996 t:0.9s +tttg: c14/319 lr:0.000996 t:0.9s +tttg: c15/319 lr:0.000995 t:1.0s +tttg: c16/319 lr:0.000995 t:1.0s +tttg: c17/319 lr:0.000994 t:1.1s +tttg: c18/319 lr:0.000993 t:1.2s +tttg: c19/319 lr:0.000992 t:1.2s +tttg: c20/319 lr:0.000991 t:1.3s +tttg: c21/319 lr:0.000990 t:1.4s +tttg: c22/319 lr:0.000989 t:1.4s +tttg: c23/319 lr:0.000988 t:1.5s +tttg: c24/319 lr:0.000987 t:1.6s +tttg: c25/319 lr:0.000986 t:1.6s +tttg: c26/319 lr:0.000985 t:1.7s +tttg: c27/319 lr:0.000984 t:1.8s +tttg: c28/319 lr:0.000982 t:1.8s +tttg: c29/319 lr:0.000981 t:1.9s +tttg: c30/319 lr:0.000980 t:2.0s +tttg: c31/319 lr:0.000978 t:2.0s +tttg: c32/319 lr:0.000977 t:2.1s +tttg: c33/319 lr:0.000975 t:2.1s +tttg: c34/319 lr:0.000974 t:2.2s +tttg: c35/319 lr:0.000972 t:2.3s +tttg: c36/319 lr:0.000970 t:2.3s +tttg: c37/319 lr:0.000969 t:2.4s +tttg: c38/319 lr:0.000967 t:2.5s +tttg: c39/319 lr:0.000965 t:2.5s +tttg: c40/319 lr:0.000963 t:2.6s 
+tttg: c41/319 lr:0.000961 t:2.7s +tttg: c42/319 lr:0.000960 t:2.7s +tttg: c43/319 lr:0.000958 t:2.8s +tttg: c44/319 lr:0.000956 t:2.9s +tttg: c45/319 lr:0.000954 t:2.9s +tttg: c46/319 lr:0.000951 t:3.0s +tttg: c47/319 lr:0.000949 t:3.0s +tttg: c48/319 lr:0.000947 t:3.1s +tttg: c49/319 lr:0.000945 t:3.2s +tttg: c50/319 lr:0.000943 t:3.2s +tttg: c51/319 lr:0.000940 t:3.3s +tttg: c52/319 lr:0.000938 t:3.4s +tttg: c53/319 lr:0.000935 t:3.4s +tttg: c54/319 lr:0.000933 t:3.5s +tttg: c55/319 lr:0.000931 t:3.6s +tttg: c56/319 lr:0.000928 t:3.6s +tttg: c57/319 lr:0.000925 t:3.7s +tttg: c58/319 lr:0.000923 t:3.8s +tttg: c59/319 lr:0.000920 t:3.8s +tttg: c60/319 lr:0.000917 t:3.9s +tttg: c61/319 lr:0.000915 t:4.0s +tttg: c62/319 lr:0.000912 t:4.0s +tttg: c63/319 lr:0.000909 t:4.1s +tttg: c64/319 lr:0.000906 t:4.2s +tttg: c65/319 lr:0.000903 t:4.2s +tttg: c66/319 lr:0.000900 t:4.3s +tttg: c67/319 lr:0.000897 t:4.4s +tttg: c68/319 lr:0.000894 t:4.4s +tttg: c69/319 lr:0.000891 t:4.5s +tttg: c70/319 lr:0.000888 t:4.5s +tttg: c71/319 lr:0.000885 t:4.6s +tttg: c72/319 lr:0.000882 t:4.7s +tttg: c73/319 lr:0.000879 t:4.7s +tttg: c74/319 lr:0.000876 t:4.8s +tttg: c75/319 lr:0.000872 t:4.9s +tttg: c76/319 lr:0.000869 t:4.9s +tttg: c77/319 lr:0.000866 t:5.0s +tttg: c78/319 lr:0.000862 t:5.1s +tttg: c79/319 lr:0.000859 t:5.1s +tttg: c80/319 lr:0.000855 t:5.2s +tttg: c81/319 lr:0.000852 t:5.3s +tttg: c82/319 lr:0.000848 t:5.3s +tttg: c83/319 lr:0.000845 t:5.4s +tttg: c84/319 lr:0.000841 t:5.5s +tttg: c85/319 lr:0.000837 t:5.5s +tttg: c86/319 lr:0.000834 t:5.6s +tttg: c87/319 lr:0.000830 t:5.6s +tttg: c88/319 lr:0.000826 t:5.7s +tttg: c89/319 lr:0.000823 t:5.8s +tttg: c90/319 lr:0.000819 t:5.8s +tttg: c91/319 lr:0.000815 t:5.9s +tttg: c92/319 lr:0.000811 t:6.0s +tttg: c93/319 lr:0.000807 t:6.0s +tttg: c94/319 lr:0.000803 t:6.1s +tttg: c95/319 lr:0.000799 t:6.2s +tttg: c96/319 lr:0.000795 t:6.2s +tttg: c97/319 lr:0.000791 t:6.3s +tttg: c98/319 lr:0.000787 t:6.4s +tttg: c99/319 lr:0.000783 t:6.4s +tttg: c100/319 lr:0.000779 t:6.5s +tttg: c101/319 lr:0.000775 t:6.6s +tttg: c102/319 lr:0.000771 t:6.6s +tttg: c103/319 lr:0.000767 t:6.7s +tttg: c104/319 lr:0.000763 t:6.7s +tttg: c105/319 lr:0.000759 t:6.8s +tttg: c106/319 lr:0.000754 t:6.9s +tttg: c107/319 lr:0.000750 t:6.9s +tttg: c108/319 lr:0.000746 t:7.0s +tttg: c109/319 lr:0.000741 t:7.1s +tttg: c110/319 lr:0.000737 t:7.1s +tttg: c111/319 lr:0.000733 t:7.2s +tttg: c112/319 lr:0.000728 t:7.3s +tttg: c113/319 lr:0.000724 t:7.3s +tttg: c114/319 lr:0.000719 t:7.4s +tttg: c115/319 lr:0.000715 t:7.5s +tttg: c116/319 lr:0.000711 t:7.5s +tttg: c117/319 lr:0.000706 t:7.6s +tttg: c118/319 lr:0.000702 t:7.7s +tttg: c119/319 lr:0.000697 t:7.7s +tttg: c120/319 lr:0.000692 t:7.8s +tttg: c121/319 lr:0.000688 t:7.8s +tttg: c122/319 lr:0.000683 t:7.9s +tttg: c123/319 lr:0.000679 t:8.0s +tttg: c124/319 lr:0.000674 t:8.0s +tttg: c125/319 lr:0.000669 t:8.1s +tttg: c126/319 lr:0.000665 t:8.2s +tttg: c127/319 lr:0.000660 t:8.2s +tttg: c128/319 lr:0.000655 t:8.3s +tttg: c129/319 lr:0.000651 t:8.4s +tttg: c130/319 lr:0.000646 t:8.4s +tttg: c131/319 lr:0.000641 t:8.5s +tttg: c132/319 lr:0.000637 t:8.6s +tttg: c133/319 lr:0.000632 t:8.6s +tttg: c134/319 lr:0.000627 t:8.7s +tttg: c135/319 lr:0.000622 t:8.8s +tttg: c136/319 lr:0.000617 t:8.8s +tttg: c137/319 lr:0.000613 t:8.9s +tttg: c138/319 lr:0.000608 t:8.9s +tttg: c139/319 lr:0.000603 t:9.0s +tttg: c140/319 lr:0.000598 t:9.1s +tttg: c141/319 lr:0.000593 t:9.1s +tttg: c142/319 lr:0.000588 t:9.2s +tttg: c143/319 lr:0.000584 t:9.3s +tttg: 
c144/319 lr:0.000579 t:9.3s +tttg: c145/319 lr:0.000574 t:9.4s +tttg: c146/319 lr:0.000569 t:9.5s +tttg: c147/319 lr:0.000564 t:9.5s +tttg: c148/319 lr:0.000559 t:9.6s +tttg: c149/319 lr:0.000554 t:9.7s +tttg: c150/319 lr:0.000549 t:9.7s +tttg: c151/319 lr:0.000544 t:9.8s +tttg: c152/319 lr:0.000539 t:9.9s +tttg: c153/319 lr:0.000535 t:9.9s +tttg: c154/319 lr:0.000530 t:10.0s +tttg: c155/319 lr:0.000525 t:10.0s +tttg: c156/319 lr:0.000520 t:10.1s +tttg: c157/319 lr:0.000515 t:10.2s +tttg: c158/319 lr:0.000510 t:10.2s +tttg: c159/319 lr:0.000505 t:10.3s +tttg: c160/319 lr:0.000500 t:10.4s +tttg: c161/319 lr:0.000495 t:10.4s +tttg: c162/319 lr:0.000490 t:10.5s +tttg: c163/319 lr:0.000485 t:10.6s +tttg: c164/319 lr:0.000480 t:10.6s +tttg: c165/319 lr:0.000475 t:10.7s +tttg: c166/319 lr:0.000470 t:10.8s +tttg: c167/319 lr:0.000465 t:10.8s +tttg: c168/319 lr:0.000461 t:10.9s +tttg: c169/319 lr:0.000456 t:11.0s +tttg: c170/319 lr:0.000451 t:11.0s +tttg: c171/319 lr:0.000446 t:11.1s +tttg: c172/319 lr:0.000441 t:11.2s +tttg: c173/319 lr:0.000436 t:11.2s +tttg: c174/319 lr:0.000431 t:11.3s +tttg: c175/319 lr:0.000426 t:11.3s +tttg: c176/319 lr:0.000421 t:11.4s +tttg: c177/319 lr:0.000416 t:11.5s +tttg: c178/319 lr:0.000412 t:11.5s +tttg: c179/319 lr:0.000407 t:11.6s +tttg: c180/319 lr:0.000402 t:11.7s +tttg: c181/319 lr:0.000397 t:11.7s +tttg: c182/319 lr:0.000392 t:11.8s +tttg: c183/319 lr:0.000387 t:11.9s +tttg: c184/319 lr:0.000383 t:11.9s +tttg: c185/319 lr:0.000378 t:12.0s +tttg: c186/319 lr:0.000373 t:12.1s +tttg: c187/319 lr:0.000368 t:12.1s +tttg: c188/319 lr:0.000363 t:12.2s +tttg: c189/319 lr:0.000359 t:12.3s +tttg: c190/319 lr:0.000354 t:12.3s +tttg: c191/319 lr:0.000349 t:12.4s +tttg: c192/319 lr:0.000345 t:12.5s +tttg: c193/319 lr:0.000340 t:12.5s +tttg: c194/319 lr:0.000335 t:12.6s +tttg: c195/319 lr:0.000331 t:12.6s +tttg: c196/319 lr:0.000326 t:12.7s +tttg: c197/319 lr:0.000321 t:12.8s +tttg: c198/319 lr:0.000317 t:12.8s +tttg: c199/319 lr:0.000312 t:12.9s +tttg: c200/319 lr:0.000308 t:13.0s +tttg: c201/319 lr:0.000303 t:13.0s +tttg: c202/319 lr:0.000298 t:13.1s +tttg: c203/319 lr:0.000294 t:13.2s +tttg: c204/319 lr:0.000289 t:13.2s +tttg: c205/319 lr:0.000285 t:13.3s +tttg: c206/319 lr:0.000281 t:13.4s +tttg: c207/319 lr:0.000276 t:13.4s +tttg: c208/319 lr:0.000272 t:13.5s +tttg: c209/319 lr:0.000267 t:13.6s +tttg: c210/319 lr:0.000263 t:13.6s +tttg: c211/319 lr:0.000259 t:13.7s +tttg: c212/319 lr:0.000254 t:13.7s +tttg: c213/319 lr:0.000250 t:13.8s +tttg: c214/319 lr:0.000246 t:13.9s +tttg: c215/319 lr:0.000241 t:13.9s +tttg: c216/319 lr:0.000237 t:14.0s +tttg: c217/319 lr:0.000233 t:14.1s +tttg: c218/319 lr:0.000229 t:14.1s +tttg: c219/319 lr:0.000225 t:14.2s +tttg: c220/319 lr:0.000221 t:14.3s +tttg: c221/319 lr:0.000217 t:14.3s +tttg: c222/319 lr:0.000213 t:14.4s +tttg: c223/319 lr:0.000209 t:14.5s +tttg: c224/319 lr:0.000205 t:14.5s +tttg: c225/319 lr:0.000201 t:14.6s +tttg: c226/319 lr:0.000197 t:14.7s +tttg: c227/319 lr:0.000193 t:14.7s +tttg: c228/319 lr:0.000189 t:14.8s +tttg: c229/319 lr:0.000185 t:14.8s +tttg: c230/319 lr:0.000181 t:14.9s +tttg: c231/319 lr:0.000177 t:15.0s +tttg: c232/319 lr:0.000174 t:15.0s +tttg: c233/319 lr:0.000170 t:15.1s +tttg: c234/319 lr:0.000166 t:15.2s +tttg: c235/319 lr:0.000163 t:15.2s +tttg: c236/319 lr:0.000159 t:15.3s +tttg: c237/319 lr:0.000155 t:15.4s +tttg: c238/319 lr:0.000152 t:15.4s +tttg: c239/319 lr:0.000148 t:15.5s +tttg: c240/319 lr:0.000145 t:15.6s +tttg: c241/319 lr:0.000141 t:15.6s +tttg: c242/319 lr:0.000138 t:15.7s +tttg: 
c243/319 lr:0.000134 t:15.7s +tttg: c244/319 lr:0.000131 t:15.8s +tttg: c245/319 lr:0.000128 t:15.9s +tttg: c246/319 lr:0.000124 t:15.9s +tttg: c247/319 lr:0.000121 t:16.0s +tttg: c248/319 lr:0.000118 t:16.1s +tttg: c249/319 lr:0.000115 t:16.1s +tttg: c250/319 lr:0.000112 t:16.2s +tttg: c251/319 lr:0.000109 t:16.3s +tttg: c252/319 lr:0.000106 t:16.3s +tttg: c253/319 lr:0.000103 t:16.4s +tttg: c254/319 lr:0.000100 t:16.5s +tttg: c255/319 lr:0.000097 t:16.5s +tttg: c256/319 lr:0.000094 t:16.6s +tttg: c257/319 lr:0.000091 t:16.7s +tttg: c258/319 lr:0.000088 t:16.7s +tttg: c259/319 lr:0.000085 t:16.8s +tttg: c260/319 lr:0.000083 t:16.9s +tttg: c261/319 lr:0.000080 t:16.9s +tttg: c262/319 lr:0.000077 t:17.0s +tttg: c263/319 lr:0.000075 t:17.0s +tttg: c264/319 lr:0.000072 t:17.1s +tttg: c265/319 lr:0.000069 t:17.2s +tttg: c266/319 lr:0.000067 t:17.2s +tttg: c267/319 lr:0.000065 t:17.3s +tttg: c268/319 lr:0.000062 t:17.4s +tttg: c269/319 lr:0.000060 t:17.4s +tttg: c270/319 lr:0.000057 t:17.5s +tttg: c271/319 lr:0.000055 t:17.6s +tttg: c272/319 lr:0.000053 t:17.6s +tttg: c273/319 lr:0.000051 t:17.7s +tttg: c274/319 lr:0.000049 t:17.8s +tttg: c275/319 lr:0.000046 t:17.8s +tttg: c276/319 lr:0.000044 t:17.9s +tttg: c277/319 lr:0.000042 t:18.0s +tttg: c278/319 lr:0.000040 t:18.0s +tttg: c279/319 lr:0.000039 t:18.1s +tttg: c280/319 lr:0.000037 t:18.2s +tttg: c281/319 lr:0.000035 t:18.2s +tttg: c282/319 lr:0.000033 t:18.3s +tttg: c283/319 lr:0.000031 t:18.4s +tttg: c284/319 lr:0.000030 t:18.4s +tttg: c285/319 lr:0.000028 t:18.5s +tttg: c286/319 lr:0.000026 t:18.5s +tttg: c287/319 lr:0.000025 t:18.6s +tttg: c288/319 lr:0.000023 t:18.7s +tttg: c289/319 lr:0.000022 t:18.7s +tttg: c290/319 lr:0.000020 t:18.8s +tttg: c291/319 lr:0.000019 t:18.9s +tttg: c292/319 lr:0.000018 t:18.9s +tttg: c293/319 lr:0.000016 t:19.0s +tttg: c294/319 lr:0.000015 t:19.1s +tttg: c295/319 lr:0.000014 t:19.1s +tttg: c296/319 lr:0.000013 t:19.2s +tttg: c297/319 lr:0.000012 t:19.3s +tttg: c298/319 lr:0.000011 t:19.3s +tttg: c299/319 lr:0.000010 t:19.4s +tttg: c300/319 lr:0.000009 t:19.5s +tttg: c301/319 lr:0.000008 t:19.5s +tttg: c302/319 lr:0.000007 t:19.6s +tttg: c303/319 lr:0.000006 t:19.6s +tttg: c304/319 lr:0.000005 t:19.7s +tttg: c305/319 lr:0.000005 t:19.8s +tttg: c306/319 lr:0.000004 t:19.8s +tttg: c307/319 lr:0.000004 t:19.9s +tttg: c308/319 lr:0.000003 t:20.0s +tttg: c309/319 lr:0.000002 t:20.0s +tttg: c310/319 lr:0.000002 t:20.1s +tttg: c311/319 lr:0.000002 t:20.2s +tttg: c312/319 lr:0.000001 t:20.2s +tttg: c313/319 lr:0.000001 t:20.3s +tttg: c314/319 lr:0.000001 t:20.4s +tttg: c315/319 lr:0.000000 t:20.4s +tttg: c316/319 lr:0.000000 t:20.5s +tttg: c317/319 lr:0.000000 t:20.5s +tttg: c318/319 lr:0.000000 t:20.6s +ttpr: phase:3/3 t:391.0s +ttp: b2993/3125 bl:2.4104 bb:1.0373 rl:2.3048 rb:1.0801 dl:3039-3050 gd:1 +ttp: b2979/3125 bl:2.1271 bb:0.9821 rl:2.3001 rb:1.0775 dl:2861-2872 gd:1 +ttp: b2975/3125 bl:2.4479 bb:1.1364 rl:2.3039 rb:1.0790 dl:2819-2830 gd:1 +ttp: b2968/3125 bl:2.3282 bb:1.0619 rl:2.3044 rb:1.0786 dl:2753-2762 gd:1 +ttp: b2956/3125 bl:2.3096 bb:1.0529 rl:2.3046 rb:1.0780 dl:2644-2652 gd:1 +ttp: b2947/3125 bl:2.2187 bb:0.9902 rl:2.3027 rb:1.0760 dl:2565-2572 gd:1 +ttp: b2942/3125 bl:2.2875 bb:1.0683 rl:2.3024 rb:1.0758 dl:2534-2538 gd:1 +ttp: b2934/3125 bl:2.2519 bb:1.0560 rl:2.3014 rb:1.0754 dl:2475-2480 gd:1 +ttp: b2927/3125 bl:2.3775 bb:1.1454 rl:2.3029 rb:1.0767 dl:2430-2435 gd:1 +ttp: b2919/3125 bl:2.3453 bb:1.0313 rl:2.3036 rb:1.0759 dl:2363-2367 gd:1 +ttp: b2908/3125 bl:2.3733 bb:1.0859 rl:2.3048 
rb:1.0760 dl:2301-2305 gd:1 +ttp: b2903/3125 bl:2.4509 bb:1.0857 rl:2.3073 rb:1.0762 dl:2267-2272 gd:1 +ttp: b2892/3125 bl:2.5682 bb:1.1172 rl:2.3115 rb:1.0769 dl:2199-2203 gd:1 +ttp: b2888/3125 bl:2.3718 bb:1.1171 rl:2.3125 rb:1.0775 dl:2181-2185 gd:1 +ttp: b2881/3125 bl:2.3135 bb:0.9806 rl:2.3125 rb:1.0759 dl:2144-2147 gd:1 +ttp: b2872/3125 bl:2.2122 bb:1.0104 rl:2.3110 rb:1.0749 dl:2101-2106 gd:1 +ttp: b2866/3125 bl:2.2573 bb:0.9773 rl:2.3102 rb:1.0734 dl:2072-2078 gd:1 +ttp: b2857/3125 bl:2.4723 bb:1.0405 rl:2.3125 rb:1.0729 dl:2036-2039 gd:1 +ttp: b2838/3125 bl:2.3727 bb:1.0490 rl:2.3133 rb:1.0726 dl:1956-1958 gd:1 +ttp: b2830/3125 bl:2.2246 bb:0.9987 rl:2.3121 rb:1.0716 dl:1927-1929 gd:1 +ttp: b2822/3125 bl:2.2856 bb:1.0279 rl:2.3118 rb:1.0711 dl:1901-1903 gd:1 +ttp: b2815/3125 bl:2.3021 bb:1.0667 rl:2.3117 rb:1.0710 dl:1881-1883 gd:1 +ttp: b2807/3125 bl:2.6291 bb:1.1359 rl:2.3154 rb:1.0718 dl:1853-1855 gd:1 +ttp: b2799/3125 bl:2.2416 bb:0.9866 rl:2.3146 rb:1.0708 dl:1830-1832 gd:1 +ttp: b2792/3125 bl:2.2562 bb:1.0088 rl:2.3139 rb:1.0701 dl:1812-1814 gd:1 +ttp: b2786/3125 bl:2.3699 bb:1.0415 rl:2.3146 rb:1.0698 dl:1792-1795 gd:1 +ttp: b2779/3125 bl:2.4459 bb:1.1173 rl:2.3160 rb:1.0703 dl:1774-1777 gd:1 +ttp: b2771/3125 bl:2.4224 bb:1.0131 rl:2.3171 rb:1.0696 dl:1751-1755 gd:1 +ttp: b2763/3125 bl:2.4630 bb:1.0988 rl:2.3186 rb:1.0699 dl:1731-1733 gd:1 +ttp: b2755/3125 bl:2.5274 bb:1.0696 rl:2.3207 rb:1.0699 dl:1710-1712 gd:1 +ttp: b2747/3125 bl:2.2627 bb:1.0205 rl:2.3201 rb:1.0694 dl:1690-1693 gd:1 +ttp: b2739/3125 bl:2.3682 bb:1.0467 rl:2.3206 rb:1.0692 dl:1670-1673 gd:1 +ttp: b2731/3125 bl:2.1984 bb:1.0397 rl:2.3195 rb:1.0689 dl:1651-1654 gd:1 +ttp: b2723/3125 bl:2.2892 bb:1.0485 rl:2.3192 rb:1.0687 dl:1632-1635 gd:1 +ttp: b2715/3125 bl:2.2350 bb:1.0384 rl:2.3184 rb:1.0685 dl:1614-1616 gd:1 +ttp: b2707/3125 bl:2.2028 bb:0.9956 rl:2.3174 rb:1.0678 dl:1598-1600 gd:1 +ttp: b2699/3125 bl:2.3491 bb:1.0553 rl:2.3177 rb:1.0677 dl:1583-1584 gd:1 +ttp: b2691/3125 bl:2.2546 bb:1.0631 rl:2.3171 rb:1.0676 dl:1566-1569 gd:1 +ttp: b2683/3125 bl:2.3761 bb:1.0401 rl:2.3176 rb:1.0674 dl:1549-1551 gd:1 +ttp: b2675/3125 bl:2.2812 bb:1.0214 rl:2.3173 rb:1.0670 dl:1533-1535 gd:1 +ttp: b2667/3125 bl:2.3483 bb:1.0478 rl:2.3176 rb:1.0668 dl:1518-1519 gd:1 +ttp: b2659/3125 bl:2.3463 bb:1.0234 rl:2.3178 rb:1.0665 dl:1503-1505 gd:1 +ttp: b2651/3125 bl:2.1927 bb:0.9941 rl:2.3168 rb:1.0659 dl:1489-1491 gd:1 +ttp: b2643/3125 bl:2.3864 bb:1.0747 rl:2.3173 rb:1.0660 dl:1477-1479 gd:1 +ttp: b2635/3125 bl:2.3693 bb:1.0045 rl:2.3177 rb:1.0655 dl:1462-1464 gd:1 +ttp: b2628/3125 bl:2.3126 bb:1.0788 rl:2.3177 rb:1.0656 dl:1451-1452 gd:1 +ttp: b2621/3125 bl:2.3568 bb:1.1274 rl:2.3180 rb:1.0660 dl:1439-1440 gd:1 +ttp: b2613/3125 bl:2.3254 bb:1.0575 rl:2.3180 rb:1.0659 dl:1425-1427 gd:1 +ttp: b2605/3125 bl:2.2527 bb:1.0584 rl:2.3176 rb:1.0659 dl:1411-1413 gd:1 +ttp: b2597/3125 bl:2.2807 bb:1.0213 rl:2.3173 rb:1.0656 dl:1398-1399 gd:1 +ttp: b2589/3125 bl:2.3439 bb:0.9903 rl:2.3175 rb:1.0650 dl:1387-1388 gd:1 +ttp: b2581/3125 bl:2.2590 bb:1.0197 rl:2.3171 rb:1.0647 dl:1375-1376 gd:1 +ttp: b2573/3125 bl:2.3147 bb:1.0396 rl:2.3171 rb:1.0645 dl:1362-1363 gd:1 +ttp: b2565/3125 bl:2.3503 bb:1.0858 rl:2.3173 rb:1.0647 dl:1349-1351 gd:1 +ttp: b2557/3125 bl:2.2172 bb:1.0777 rl:2.3167 rb:1.0647 dl:1337-1338 gd:1 +ttp: b2549/3125 bl:2.2864 bb:1.0542 rl:2.3165 rb:1.0647 dl:1325-1327 gd:1 +ttp: b2541/3125 bl:2.4271 bb:1.0481 rl:2.3172 rb:1.0646 dl:1314-1316 gd:1 +ttp: b2534/3125 bl:2.3803 bb:0.9925 rl:2.3176 rb:1.0641 dl:1304-1305 gd:1 
+ttp: b2526/3125 bl:2.2977 bb:1.0181 rl:2.3174 rb:1.0638 dl:1292-1293 gd:1 +ttp: b2518/3125 bl:2.4016 bb:1.0127 rl:2.3179 rb:1.0634 dl:1281-1283 gd:1 +ttp: b2510/3125 bl:2.2795 bb:1.0089 rl:2.3177 rb:1.0631 dl:1272-1273 gd:1 +ttp: b2502/3125 bl:2.3481 bb:1.0595 rl:2.3179 rb:1.0631 dl:1261-1262 gd:1 +ttp: b2494/3125 bl:2.2171 bb:1.0688 rl:2.3173 rb:1.0631 dl:1250-1252 gd:1 +ttp: b2484/3125 bl:2.4155 bb:1.0432 rl:2.3179 rb:1.0630 dl:1236-1237 gd:1 +ttp: b2476/3125 bl:2.1755 bb:0.9966 rl:2.3171 rb:1.0626 dl:1225-1226 gd:1 +ttp: b2468/3125 bl:2.4389 bb:1.0265 rl:2.3177 rb:1.0624 dl:1215-1216 gd:1 +ttp: b2460/3125 bl:2.5189 bb:1.0912 rl:2.3188 rb:1.0626 dl:1204-1205 gd:1 +ttp: b2452/3125 bl:2.3025 bb:1.0228 rl:2.3187 rb:1.0624 dl:1194-1195 gd:1 +ttp: b2444/3125 bl:2.2775 bb:1.0025 rl:2.3185 rb:1.0620 dl:1185-1186 gd:1 +ttp: b2436/3125 bl:2.3055 bb:0.9823 rl:2.3185 rb:1.0616 dl:1176-1177 gd:1 +ttp: b2428/3125 bl:2.2522 bb:0.9981 rl:2.3181 rb:1.0612 dl:1167-1168 gd:1 +ttp: b2420/3125 bl:2.2672 bb:0.9935 rl:2.3179 rb:1.0609 dl:1158-1159 gd:1 +ttp: b2412/3125 bl:2.5948 bb:1.0597 rl:2.3192 rb:1.0609 dl:1149-1150 gd:1 +ttp: b2404/3125 bl:2.2097 bb:0.9885 rl:2.3187 rb:1.0605 dl:1140-1141 gd:1 +ttp: b2396/3125 bl:2.4441 bb:1.0875 rl:2.3193 rb:1.0606 dl:1132-1133 gd:1 +ttp: b2388/3125 bl:2.2622 bb:1.0054 rl:2.3190 rb:1.0604 dl:1124-1124 gd:1 +ttp: b2381/3125 bl:2.1805 bb:1.0813 rl:2.3184 rb:1.0605 dl:1115-1115 gd:1 +ttp: b2371/3125 bl:2.2654 bb:0.9932 rl:2.3181 rb:1.0601 dl:1105-1106 gd:1 +ttp: b2363/3125 bl:2.4996 bb:1.1389 rl:2.3190 rb:1.0605 dl:1096-1097 gd:1 +ttp: b2355/3125 bl:2.4610 bb:1.0918 rl:2.3196 rb:1.0606 dl:1088-1088 gd:1 +ttp: b2346/3125 bl:2.3536 bb:0.9998 rl:2.3198 rb:1.0603 dl:1077-1079 gd:1 +ttp: b2336/3125 bl:2.3644 bb:1.0423 rl:2.3200 rb:1.0603 dl:1067-1069 gd:1 +ttp: b2329/3125 bl:2.3581 bb:1.0283 rl:2.3201 rb:1.0601 dl:1060-1061 gd:1 +ttp: b2322/3125 bl:2.2953 bb:1.0167 rl:2.3200 rb:1.0599 dl:1053-1054 gd:1 +ttp: b2314/3125 bl:2.5350 bb:1.0786 rl:2.3210 rb:1.0600 dl:1045-1046 gd:1 +ttp: b2308/3125 bl:2.0679 bb:0.9679 rl:2.3199 rb:1.0596 dl:1040-1041 gd:1 +ttp: b2301/3125 bl:2.2953 bb:1.0480 rl:2.3198 rb:1.0596 dl:1033-1034 gd:1 +ttp: b2291/3125 bl:2.4000 bb:1.0679 rl:2.3201 rb:1.0596 dl:1023-1024 gd:1 +ttp: b2283/3125 bl:2.1999 bb:1.0216 rl:2.3196 rb:1.0595 dl:1015-1016 gd:1 +ttp: b2273/3125 bl:2.3072 bb:1.0267 rl:2.3196 rb:1.0593 dl:1007-1008 gd:1 +ttp: b2267/3125 bl:2.1293 bb:0.9581 rl:2.3188 rb:1.0589 dl:1003-1003 gd:1 +ttp: b2259/3125 bl:2.3709 bb:0.9876 rl:2.3190 rb:1.0586 dl:995-996 gd:1 +ttp: b2252/3125 bl:2.1793 bb:1.0138 rl:2.3185 rb:1.0584 dl:989-990 gd:1 +ttp: b2244/3125 bl:2.3188 bb:1.0583 rl:2.3185 rb:1.0584 dl:982-983 gd:1 +ttp: b2215/3125 bl:2.3507 bb:1.0802 rl:2.3186 rb:1.0585 dl:957-958 gd:1 +ttp: b2207/3125 bl:2.2527 bb:1.0162 rl:2.3183 rb:1.0583 dl:951-951 gd:1 +ttp: b2198/3125 bl:2.2216 bb:1.0369 rl:2.3180 rb:1.0583 dl:944-945 gd:1 +ttp: b2191/3125 bl:2.0702 bb:1.0042 rl:2.3171 rb:1.0581 dl:937-938 gd:1 +ttp: b2185/3125 bl:2.2855 bb:1.0174 rl:2.3169 rb:1.0579 dl:934-934 gd:1 +ttp: b2177/3125 bl:2.2162 bb:1.0072 rl:2.3166 rb:1.0577 dl:927-928 gd:1 +ttp: b2170/3125 bl:2.3431 bb:1.1243 rl:2.3167 rb:1.0580 dl:922-923 gd:1 +ttp: b2161/3125 bl:2.2350 bb:1.0508 rl:2.3164 rb:1.0579 dl:915-916 gd:1 +ttp: b2154/3125 bl:2.3626 bb:1.0197 rl:2.3165 rb:1.0578 dl:909-910 gd:1 +ttp: b2146/3125 bl:2.3057 bb:1.0864 rl:2.3165 rb:1.0579 dl:903-904 gd:1 +ttp: b2140/3125 bl:2.5930 bb:1.1609 rl:2.3175 rb:1.0583 dl:898-899 gd:1 +ttp: b2133/3125 bl:2.3745 bb:1.0426 rl:2.3177 rb:1.0582 
dl:893-893 gd:1 +ttp: b2125/3125 bl:2.3694 bb:1.0772 rl:2.3178 rb:1.0583 dl:887-888 gd:1 +ttp: b2120/3125 bl:2.2735 bb:1.0760 rl:2.3177 rb:1.0583 dl:884-884 gd:1 +ttp: b2112/3125 bl:2.4356 bb:1.0566 rl:2.3181 rb:1.0583 dl:877-878 gd:1 +ttp: b2104/3125 bl:2.3075 bb:1.0867 rl:2.3180 rb:1.0584 dl:871-872 gd:1 +ttp: b2096/3125 bl:2.3021 bb:1.0298 rl:2.3180 rb:1.0583 dl:865-866 gd:1 +ttp: b2088/3125 bl:2.3182 bb:1.0746 rl:2.3180 rb:1.0584 dl:859-860 gd:1 +ttp: b2079/3125 bl:2.3190 bb:0.9343 rl:2.3180 rb:1.0579 dl:853-854 gd:1 +ttp: b2073/3125 bl:2.1910 bb:1.0328 rl:2.3176 rb:1.0578 dl:850-850 gd:1 +ttp: b2065/3125 bl:2.4303 bb:1.0682 rl:2.3179 rb:1.0579 dl:843-844 gd:1 +ttp: b2058/3125 bl:2.3633 bb:1.0342 rl:2.3181 rb:1.0578 dl:839-839 gd:1 +ttp: b2049/3125 bl:2.3006 bb:1.0391 rl:2.3180 rb:1.0577 dl:832-833 gd:1 +ttp: b2040/3125 bl:2.4078 bb:1.1077 rl:2.3183 rb:1.0579 dl:825-826 gd:1 +ttp: b2032/3125 bl:2.4484 bb:1.0602 rl:2.3187 rb:1.0579 dl:819-820 gd:1 +ttp: b2025/3125 bl:2.3684 bb:1.0344 rl:2.3188 rb:1.0578 dl:814-815 gd:1 +ttp: b2017/3125 bl:2.3323 bb:1.0391 rl:2.3189 rb:1.0578 dl:809-810 gd:1 +ttp: b2010/3125 bl:2.5207 bb:1.0954 rl:2.3195 rb:1.0579 dl:805-805 gd:1 +ttp: b2001/3125 bl:2.3562 bb:0.9914 rl:2.3196 rb:1.0577 dl:799-800 gd:1 +ttp: b1992/3125 bl:2.2890 bb:1.0360 rl:2.3195 rb:1.0576 dl:793-794 gd:1 +ttp: b1983/3125 bl:2.3055 bb:1.0459 rl:2.3194 rb:1.0576 dl:787-788 gd:1 +ttp: b1975/3125 bl:2.4487 bb:1.0736 rl:2.3198 rb:1.0576 dl:782-782 gd:1 +ttp: b1966/3125 bl:2.2429 bb:1.0221 rl:2.3196 rb:1.0575 dl:776-777 gd:1 +ttp: b1957/3125 bl:2.4190 bb:1.0817 rl:2.3199 rb:1.0576 dl:771-772 gd:1 +ttp: b1950/3125 bl:2.3258 bb:1.0470 rl:2.3199 rb:1.0576 dl:767-768 gd:1 +ttp: b1941/3125 bl:2.5040 bb:1.1045 rl:2.3204 rb:1.0577 dl:761-762 gd:1 +ttp: b1933/3125 bl:2.4539 bb:1.0424 rl:2.3207 rb:1.0576 dl:756-757 gd:1 +ttp: b1926/3125 bl:2.3136 bb:1.0665 rl:2.3207 rb:1.0577 dl:753-753 gd:1 +ttp: b1918/3125 bl:2.4641 bb:1.1217 rl:2.3211 rb:1.0578 dl:747-748 gd:1 +ttp: b1909/3125 bl:2.2464 bb:1.0684 rl:2.3209 rb:1.0579 dl:742-743 gd:1 +ttp: b1903/3125 bl:2.3353 bb:1.0988 rl:2.3209 rb:1.0580 dl:739-739 gd:1 +ttp: b1894/3125 bl:2.1971 bb:1.0108 rl:2.3206 rb:1.0579 dl:734-734 gd:1 +ttp: b1886/3125 bl:2.3846 bb:1.1064 rl:2.3208 rb:1.0580 dl:729-729 gd:1 +ttp: b1879/3125 bl:2.3815 bb:1.0549 rl:2.3209 rb:1.0580 dl:725-725 gd:1 +ttp: b1869/3125 bl:2.2905 bb:1.0332 rl:2.3209 rb:1.0579 dl:719-720 gd:1 +ttp: b1861/3125 bl:2.3051 bb:0.9902 rl:2.3208 rb:1.0577 dl:714-715 gd:1 +ttp: b1853/3125 bl:2.2466 bb:1.0700 rl:2.3206 rb:1.0578 dl:710-711 gd:1 +ttp: b1847/3125 bl:2.3455 bb:1.0503 rl:2.3207 rb:1.0577 dl:707-707 gd:1 +ttp: b1838/3125 bl:2.1767 bb:1.0419 rl:2.3204 rb:1.0577 dl:701-702 gd:1 +ttp: b1831/3125 bl:2.1088 bb:1.0229 rl:2.3198 rb:1.0576 dl:699-699 gd:1 +ttp: b1821/3125 bl:2.4165 bb:1.0486 rl:2.3201 rb:1.0576 dl:693-694 gd:1 +ttp: b1813/3125 bl:2.2268 bb:1.0627 rl:2.3199 rb:1.0576 dl:689-690 gd:1 +ttp: b1806/3125 bl:2.2508 bb:1.0239 rl:2.3197 rb:1.0575 dl:685-686 gd:1 +ttp: b1800/3125 bl:2.4417 bb:1.0296 rl:2.3200 rb:1.0575 dl:682-682 gd:1 +ttp: b1791/3125 bl:2.3867 bb:1.0271 rl:2.3201 rb:1.0574 dl:677-678 gd:1 +ttp: b1786/3125 bl:2.4325 bb:1.0925 rl:2.3204 rb:1.0575 dl:675-675 gd:1 +ttp: b1779/3125 bl:2.4877 bb:1.1475 rl:2.3208 rb:1.0577 dl:671-671 gd:1 +ttp: b1771/3125 bl:2.1648 bb:1.0106 rl:2.3204 rb:1.0576 dl:667-667 gd:1 +ttp: b1763/3125 bl:2.4460 bb:1.0911 rl:2.3207 rb:1.0576 dl:663-663 gd:1 +ttp: b1754/3125 bl:2.3014 bb:1.0042 rl:2.3207 rb:1.0575 dl:657-658 gd:1 +ttp: b1745/3125 bl:2.3478 
bb:1.0024 rl:2.3207 rb:1.0574 dl:653-654 gd:1 +ttp: b1739/3125 bl:2.3068 bb:1.0695 rl:2.3207 rb:1.0574 dl:650-650 gd:1 +ttp: b1730/3125 bl:2.1541 bb:1.0674 rl:2.3203 rb:1.0574 dl:646-646 gd:1 +ttp: b1724/3125 bl:2.3396 bb:1.0102 rl:2.3204 rb:1.0573 dl:643-643 gd:1 +ttp: b1713/3125 bl:2.4381 bb:1.1532 rl:2.3206 rb:1.0575 dl:638-639 gd:1 +ttp: b1705/3125 bl:2.3204 bb:1.0828 rl:2.3206 rb:1.0576 dl:634-635 gd:1 +ttp: b1697/3125 bl:2.2660 bb:0.9980 rl:2.3205 rb:1.0574 dl:630-631 gd:1 +ttp: b1689/3125 bl:2.3697 bb:1.0532 rl:2.3206 rb:1.0574 dl:626-627 gd:1 +ttp: b1681/3125 bl:2.1614 bb:0.9934 rl:2.3203 rb:1.0573 dl:622-623 gd:1 +ttp: b1673/3125 bl:2.2929 bb:1.0637 rl:2.3202 rb:1.0573 dl:618-619 gd:1 +ttp: b1667/3125 bl:2.2612 bb:1.0172 rl:2.3201 rb:1.0572 dl:616-616 gd:1 +ttp: b1658/3125 bl:2.3273 bb:1.0894 rl:2.3201 rb:1.0573 dl:612-612 gd:1 +ttp: b1652/3125 bl:2.4563 bb:1.0859 rl:2.3204 rb:1.0574 dl:609-609 gd:1 +ttp: b1644/3125 bl:2.3630 bb:1.0592 rl:2.3205 rb:1.0574 dl:605-605 gd:1 +ttp: b1634/3125 bl:2.2856 bb:1.0154 rl:2.3204 rb:1.0573 dl:599-600 gd:1 +ttp: b1625/3125 bl:2.0984 bb:1.0021 rl:2.3200 rb:1.0572 dl:595-596 gd:1 +ttp: b1620/3125 bl:2.3876 bb:1.0713 rl:2.3201 rb:1.0572 dl:593-593 gd:1 +ttp: b1608/3125 bl:2.2636 bb:0.9424 rl:2.3200 rb:1.0570 dl:587-588 gd:1 +ttp: b1600/3125 bl:2.2902 bb:1.0408 rl:2.3199 rb:1.0569 dl:584-584 gd:1 +ttp: b1592/3125 bl:2.3277 bb:1.0138 rl:2.3200 rb:1.0569 dl:580-581 gd:1 +ttp: b1587/3125 bl:2.3067 bb:1.0476 rl:2.3199 rb:1.0568 dl:578-578 gd:1 +ttp: b1577/3125 bl:2.2335 bb:1.0307 rl:2.3198 rb:1.0568 dl:573-574 gd:1 +ttp: b1568/3125 bl:2.0075 bb:0.9841 rl:2.3192 rb:1.0567 dl:569-570 gd:1 +ttp: b1562/3125 bl:2.3189 bb:1.0208 rl:2.3192 rb:1.0566 dl:566-567 gd:1 +ttp: b1553/3125 bl:2.1039 bb:0.9695 rl:2.3188 rb:1.0564 dl:563-563 gd:1 +ttp: b1544/3125 bl:2.3930 bb:1.1010 rl:2.3190 rb:1.0565 dl:558-559 gd:1 +ttp: b1537/3125 bl:2.3495 bb:1.0556 rl:2.3190 rb:1.0565 dl:555-556 gd:1 +ttp: b1531/3125 bl:2.2843 bb:1.0802 rl:2.3189 rb:1.0566 dl:553-553 gd:1 +ttp: b1520/3125 bl:2.3130 bb:1.0500 rl:2.3189 rb:1.0566 dl:548-549 gd:1 +ttp: b1513/3125 bl:2.3644 bb:1.0481 rl:2.3190 rb:1.0565 dl:545-546 gd:1 +ttp: b1506/3125 bl:2.1761 bb:1.0447 rl:2.3188 rb:1.0565 dl:543-543 gd:1 +ttp: b1497/3125 bl:2.5798 bb:1.1528 rl:2.3192 rb:1.0567 dl:538-539 gd:1 +ttp: b1490/3125 bl:2.5028 bb:1.0892 rl:2.3195 rb:1.0567 dl:536-536 gd:1 +ttp: b1484/3125 bl:2.2227 bb:1.1088 rl:2.3194 rb:1.0568 dl:533-533 gd:1 +ttp: b1473/3125 bl:2.4021 bb:1.1381 rl:2.3195 rb:1.0569 dl:528-529 gd:1 +ttp: b1467/3125 bl:2.2015 bb:1.0561 rl:2.3193 rb:1.0569 dl:526-526 gd:1 +ttp: b1455/3125 bl:2.4520 bb:1.0869 rl:2.3195 rb:1.0570 dl:521-522 gd:1 +ttp: b1449/3125 bl:2.3814 bb:1.0597 rl:2.3196 rb:1.0570 dl:518-519 gd:1 +ttp: b1443/3125 bl:2.3398 bb:1.1529 rl:2.3196 rb:1.0571 dl:516-516 gd:1 +ttp: b1432/3125 bl:2.4195 bb:1.0585 rl:2.3198 rb:1.0571 dl:511-512 gd:1 +ttp: b1423/3125 bl:2.5237 bb:1.1403 rl:2.3201 rb:1.0573 dl:507-508 gd:1 +ttp: b1417/3125 bl:2.2998 bb:1.1026 rl:2.3201 rb:1.0573 dl:504-505 gd:1 +ttp: b1407/3125 bl:2.4601 bb:1.1267 rl:2.3203 rb:1.0575 dl:500-501 gd:1 +ttp: b1404/3125 bl:2.4204 bb:1.0943 rl:2.3205 rb:1.0575 dl:499-499 gd:1 +ttp: b1392/3125 bl:2.3811 bb:1.0362 rl:2.3206 rb:1.0575 dl:495-495 gd:1 +ttp: b1383/3125 bl:2.4178 bb:1.0367 rl:2.3207 rb:1.0574 dl:491-492 gd:1 +ttp: b1375/3125 bl:2.2892 bb:1.0502 rl:2.3207 rb:1.0574 dl:488-489 gd:1 +ttp: b1367/3125 bl:2.2358 bb:1.0080 rl:2.3205 rb:1.0574 dl:485-486 gd:1 +ttp: b1359/3125 bl:2.4269 bb:1.0692 rl:2.3207 rb:1.0574 dl:482-483 gd:1 
+ttp: b1351/3125 bl:2.5089 bb:1.1546 rl:2.3210 rb:1.0575 dl:480-480 gd:1 +ttp: b1343/3125 bl:2.4741 bb:1.1418 rl:2.3212 rb:1.0576 dl:476-477 gd:1 +ttp: b1336/3125 bl:2.3698 bb:1.0758 rl:2.3213 rb:1.0577 dl:474-474 gd:1 +ttp: b1328/3125 bl:2.2915 bb:1.0667 rl:2.3212 rb:1.0577 dl:471-471 gd:1 +ttp: b1321/3125 bl:2.2183 bb:1.0491 rl:2.3211 rb:1.0577 dl:468-468 gd:1 +ttp: b1313/3125 bl:2.2856 bb:1.1270 rl:2.3210 rb:1.0578 dl:465-465 gd:1 +ttp: b1303/3125 bl:2.2117 bb:0.9764 rl:2.3209 rb:1.0576 dl:462-462 gd:1 +ttp: b1294/3125 bl:2.2041 bb:1.0424 rl:2.3207 rb:1.0576 dl:458-459 gd:1 +ttp: b1291/3125 bl:2.5059 bb:1.1453 rl:2.3210 rb:1.0577 dl:457-457 gd:1 +ttp: b1280/3125 bl:2.2684 bb:1.0710 rl:2.3209 rb:1.0578 dl:453-453 gd:1 +ttp: b1274/3125 bl:2.5185 bb:1.1709 rl:2.3212 rb:1.0579 dl:450-450 gd:1 +ttp: b1262/3125 bl:2.3681 bb:1.0928 rl:2.3212 rb:1.0580 dl:445-446 gd:1 +ttp: b1258/3125 bl:2.4033 bb:1.0229 rl:2.3213 rb:1.0579 dl:444-444 gd:1 +ttp: b1250/3125 bl:2.3318 bb:1.0786 rl:2.3213 rb:1.0579 dl:441-441 gd:1 +ttp: b1240/3125 bl:2.1224 bb:1.0719 rl:2.3211 rb:1.0579 dl:437-438 gd:1 +ttp: b1232/3125 bl:2.3347 bb:1.0956 rl:2.3211 rb:1.0580 dl:434-435 gd:1 +ttp: b1224/3125 bl:2.4349 bb:1.0634 rl:2.3212 rb:1.0580 dl:431-432 gd:1 +ttp: b1217/3125 bl:2.3407 bb:1.1392 rl:2.3213 rb:1.0581 dl:429-429 gd:1 +ttp: b1209/3125 bl:2.4324 bb:1.0752 rl:2.3214 rb:1.0581 dl:426-426 gd:1 +ttp: b1198/3125 bl:2.2279 bb:1.0199 rl:2.3213 rb:1.0581 dl:421-422 gd:1 +ttp: b1194/3125 bl:2.4252 bb:1.1974 rl:2.3214 rb:1.0582 dl:420-420 gd:1 +ttp: b1186/3125 bl:2.4433 bb:1.0870 rl:2.3216 rb:1.0583 dl:417-417 gd:1 +ttp: b1178/3125 bl:2.1840 bb:1.0573 rl:2.3214 rb:1.0583 dl:414-414 gd:1 +ttp: b1166/3125 bl:2.4184 bb:1.1192 rl:2.3215 rb:1.0583 dl:409-410 gd:1 +ttp: b1158/3125 bl:2.2721 bb:1.0593 rl:2.3215 rb:1.0583 dl:406-407 gd:1 +ttp: b1154/3125 bl:2.2492 bb:1.0109 rl:2.3214 rb:1.0583 dl:405-405 gd:1 +ttp: b1146/3125 bl:2.5026 bb:1.1759 rl:2.3216 rb:1.0584 dl:402-402 gd:1 +ttp: b1135/3125 bl:2.4201 bb:1.1126 rl:2.3217 rb:1.0585 dl:398-399 gd:1 +ttp: b1128/3125 bl:2.1811 bb:1.0899 rl:2.3215 rb:1.0585 dl:396-396 gd:1 +ttp: b1122/3125 bl:2.2924 bb:1.0743 rl:2.3215 rb:1.0585 dl:394-394 gd:1 +ttp: b1111/3125 bl:2.3261 bb:1.0165 rl:2.3215 rb:1.0585 dl:390-391 gd:1 +ttp: b1108/3125 bl:2.2548 bb:1.0825 rl:2.3214 rb:1.0585 dl:389-389 gd:1 +ttp: b1098/3125 bl:2.4838 bb:1.1020 rl:2.3216 rb:1.0586 dl:386-386 gd:1 +ttp: b1093/3125 bl:2.3541 bb:1.0697 rl:2.3217 rb:1.0586 dl:384-384 gd:1 +ttp: b1085/3125 bl:2.3862 bb:1.0595 rl:2.3217 rb:1.0586 dl:382-382 gd:1 +ttp: b1081/3125 bl:2.5418 bb:1.1463 rl:2.3220 rb:1.0587 dl:380-380 gd:1 +ttp: b1074/3125 bl:2.5215 bb:1.1158 rl:2.3222 rb:1.0587 dl:378-378 gd:1 +ttp: b1067/3125 bl:2.5386 bb:1.2029 rl:2.3224 rb:1.0589 dl:376-376 gd:1 +ttp: b1061/3125 bl:2.2736 bb:1.1180 rl:2.3224 rb:1.0589 dl:374-374 gd:1 +ttp: b1055/3125 bl:2.4026 bb:1.1183 rl:2.3225 rb:1.0590 dl:372-372 gd:1 +ttp: b1049/3125 bl:2.4526 bb:1.1100 rl:2.3226 rb:1.0591 dl:370-370 gd:1 +ttp: b1038/3125 bl:2.4157 bb:1.0806 rl:2.3227 rb:1.0591 dl:366-367 gd:1 +ttp: b1033/3125 bl:2.3333 bb:1.0821 rl:2.3227 rb:1.0591 dl:365-365 gd:1 +ttp: b1023/3125 bl:2.4524 bb:1.1028 rl:2.3229 rb:1.0592 dl:361-362 gd:1 +ttp: b1017/3125 bl:2.3651 bb:1.0780 rl:2.3229 rb:1.0592 dl:360-360 gd:1 +ttp: b1006/3125 bl:2.3007 bb:1.0354 rl:2.3229 rb:1.0592 dl:356-357 gd:1 +ttp: b999/3125 bl:2.1018 bb:1.0294 rl:2.3226 rb:1.0591 dl:354-355 gd:1 +ttp: b995/3125 bl:2.5869 bb:1.1179 rl:2.3229 rb:1.0592 dl:353-353 gd:1 +ttp: b984/3125 bl:2.4025 bb:1.0793 rl:2.3230 
rb:1.0592 dl:350-350 gd:1 +ttp: b974/3125 bl:2.2635 bb:1.1194 rl:2.3229 rb:1.0593 dl:346-347 gd:1 +ttp: b969/3125 bl:2.4981 bb:1.1263 rl:2.3231 rb:1.0593 dl:345-345 gd:1 +ttp: b959/3125 bl:2.3096 bb:1.0738 rl:2.3231 rb:1.0593 dl:342-342 gd:1 +ttp: b954/3125 bl:2.5120 bb:1.1507 rl:2.3233 rb:1.0594 dl:340-340 gd:1 +ttp: b943/3125 bl:2.2057 bb:1.0290 rl:2.3232 rb:1.0594 dl:336-337 gd:1 +ttp: b942/3125 bl:2.3774 bb:1.0806 rl:2.3232 rb:1.0594 dl:336-336 gd:1 +ttp: b935/3125 bl:2.3304 bb:1.1092 rl:2.3232 rb:1.0595 dl:334-334 gd:1 +ttp: b927/3125 bl:2.3783 bb:1.0615 rl:2.3233 rb:1.0595 dl:332-332 gd:1 +ttp: b920/3125 bl:2.4715 bb:1.1683 rl:2.3234 rb:1.0596 dl:330-330 gd:1 +ttp: b908/3125 bl:2.5397 bb:1.2484 rl:2.3236 rb:1.0597 dl:326-327 gd:1 +ttp: b901/3125 bl:2.3381 bb:1.0187 rl:2.3236 rb:1.0597 dl:324-325 gd:1 +ttp: b894/3125 bl:2.5773 bb:1.1972 rl:2.3239 rb:1.0598 dl:322-323 gd:1 +ttp: b888/3125 bl:2.3814 bb:1.1035 rl:2.3239 rb:1.0599 dl:321-321 gd:1 +ttp: b883/3125 bl:2.2975 bb:1.0589 rl:2.3239 rb:1.0599 dl:319-319 gd:1 +ttp: b870/3125 bl:2.5789 bb:1.0972 rl:2.3241 rb:1.0599 dl:315-316 gd:1 +ttp: b864/3125 bl:2.6307 bb:1.2001 rl:2.3244 rb:1.0600 dl:314-314 gd:1 +ttp: b858/3125 bl:2.4209 bb:1.1513 rl:2.3245 rb:1.0601 dl:312-312 gd:1 +ttp: b845/3125 bl:2.3132 bb:0.9991 rl:2.3245 rb:1.0600 dl:308-309 gd:1 +ttp: b838/3125 bl:2.3051 bb:1.1071 rl:2.3245 rb:1.0601 dl:306-307 gd:1 +ttp: b830/3125 bl:2.5503 bb:1.1451 rl:2.3246 rb:1.0602 dl:304-305 gd:1 +ttp: b823/3125 bl:2.2760 bb:1.0400 rl:2.3246 rb:1.0601 dl:302-303 gd:1 +ttp: b751/3125 bl:2.2824 bb:1.0568 rl:2.3246 rb:1.0601 dl:283-283 gd:1 +ttp: b745/3125 bl:2.3684 bb:1.1096 rl:2.3246 rb:1.0602 dl:281-281 gd:1 +ttp: b737/3125 bl:2.4034 bb:1.0454 rl:2.3247 rb:1.0602 dl:279-279 gd:1 +ttp: b727/3125 bl:2.3324 bb:1.0940 rl:2.3247 rb:1.0602 dl:277-277 gd:1 +ttp: b721/3125 bl:2.3250 bb:1.1513 rl:2.3247 rb:1.0603 dl:275-275 gd:1 +ttp: b713/3125 bl:2.1843 bb:1.0484 rl:2.3246 rb:1.0602 dl:273-273 gd:1 +ttp: b704/3125 bl:2.4947 bb:1.2714 rl:2.3247 rb:1.0604 dl:271-271 gd:1 +ttp: b696/3125 bl:2.5160 bb:1.1952 rl:2.3248 rb:1.0605 dl:269-269 gd:1 +ttp: b687/3125 bl:2.5075 bb:1.1468 rl:2.3250 rb:1.0605 dl:267-267 gd:1 +ttp: b680/3125 bl:2.4904 bb:1.1746 rl:2.3251 rb:1.0606 dl:265-265 gd:1 +ttp: b670/3125 bl:2.4106 bb:1.1568 rl:2.3252 rb:1.0607 dl:263-263 gd:1 +ttp: b662/3125 bl:2.6596 bb:1.1645 rl:2.3254 rb:1.0608 dl:261-261 gd:1 +ttp: b653/3125 bl:2.6084 bb:1.2352 rl:2.3256 rb:1.0609 dl:259-259 gd:1 +ttp: b647/3125 bl:2.4113 bb:1.0619 rl:2.3257 rb:1.0609 dl:257-257 gd:1 +ttp: b639/3125 bl:2.4253 bb:1.1244 rl:2.3257 rb:1.0609 dl:255-255 gd:1 +ttp: b629/3125 bl:2.1482 bb:0.9755 rl:2.3256 rb:1.0609 dl:253-253 gd:1 +ttp: b621/3125 bl:2.4137 bb:1.2234 rl:2.3257 rb:1.0610 dl:251-251 gd:1 +ttp: b611/3125 bl:2.3286 bb:1.0597 rl:2.3257 rb:1.0610 dl:248-249 gd:1 +ttp: b610/3125 bl:2.4399 bb:1.1092 rl:2.3258 rb:1.0610 dl:248-248 gd:1 +ttp: b602/3125 bl:2.5042 bb:1.0901 rl:2.3259 rb:1.0610 dl:246-246 gd:1 +ttp: b592/3125 bl:2.1603 bb:1.0253 rl:2.3258 rb:1.0610 dl:244-244 gd:1 +ttp: b582/3125 bl:2.5587 bb:1.2231 rl:2.3259 rb:1.0611 dl:241-242 gd:1 +ttp: b580/3125 bl:2.6011 bb:1.2198 rl:2.3261 rb:1.0612 dl:241-241 gd:1 +ttp: b571/3125 bl:2.4181 bb:1.1682 rl:2.3262 rb:1.0613 dl:239-239 gd:1 +ttp: b569/3125 bl:2.4798 bb:1.1430 rl:2.3263 rb:1.0613 dl:238-238 gd:1 +ttp: b559/3125 bl:2.4095 bb:1.1065 rl:2.3263 rb:1.0614 dl:236-236 gd:1 +ttp: b556/3125 bl:2.4242 bb:1.1802 rl:2.3264 rb:1.0614 dl:235-235 gd:1 +ttp: b544/3125 bl:2.3532 bb:1.1376 rl:2.3264 rb:1.0615 dl:232-233 gd:1 
+ttp: b541/3125 bl:2.3598 bb:1.1425 rl:2.3264 rb:1.0615 dl:232-232 gd:1 +ttp: b539/3125 bl:2.4337 bb:1.1930 rl:2.3265 rb:1.0616 dl:231-231 gd:1 +ttp: b528/3125 bl:2.4049 bb:1.1397 rl:2.3265 rb:1.0617 dl:229-229 gd:1 +ttp: b518/3125 bl:2.6316 bb:1.2040 rl:2.3267 rb:1.0618 dl:226-227 gd:1 +ttp: b517/3125 bl:2.3125 bb:1.0444 rl:2.3267 rb:1.0617 dl:226-226 gd:1 +ttp: b507/3125 bl:2.4896 bb:1.1283 rl:2.3268 rb:1.0618 dl:224-224 gd:1 +ttp: b496/3125 bl:2.2633 bb:1.1219 rl:2.3268 rb:1.0618 dl:221-222 gd:1 +ttp: b495/3125 bl:2.5895 bb:1.2862 rl:2.3270 rb:1.0619 dl:221-221 gd:1 +ttp: b486/3125 bl:2.4256 bb:1.0764 rl:2.3270 rb:1.0620 dl:219-219 gd:1 +ttp: b475/3125 bl:2.3540 bb:1.1388 rl:2.3270 rb:1.0620 dl:216-217 gd:1 +ttp: b466/3125 bl:2.5826 bb:1.2505 rl:2.3272 rb:1.0621 dl:214-215 gd:1 +ttp: b465/3125 bl:2.4364 bb:1.2379 rl:2.3272 rb:1.0622 dl:214-214 gd:1 +ttp: b456/3125 bl:2.4365 bb:1.1941 rl:2.3273 rb:1.0623 dl:212-212 gd:1 +ttp: b446/3125 bl:2.5426 bb:1.2351 rl:2.3274 rb:1.0624 dl:210-210 gd:1 +ttp: b437/3125 bl:2.3292 bb:1.0541 rl:2.3274 rb:1.0624 dl:208-208 gd:1 +ttp: b435/3125 bl:2.4625 bb:1.1759 rl:2.3275 rb:1.0624 dl:207-207 gd:1 +ttp: b425/3125 bl:2.3908 bb:1.1897 rl:2.3275 rb:1.0625 dl:205-205 gd:1 +ttp: b422/3125 bl:2.4456 bb:1.2136 rl:2.3276 rb:1.0626 dl:204-204 gd:1 +ttp: b409/3125 bl:2.3710 bb:1.2227 rl:2.3276 rb:1.0626 dl:202-202 gd:1 +ttp: b407/3125 bl:2.7307 bb:1.2643 rl:2.3279 rb:1.0627 dl:201-201 gd:1 +ttp: b397/3125 bl:2.5125 bb:1.2601 rl:2.3280 rb:1.0628 dl:199-199 gd:1 +ttp: b386/3125 bl:2.4983 bb:1.1981 rl:2.3280 rb:1.0629 dl:196-197 gd:1 +ttp: b384/3125 bl:2.3730 bb:1.1321 rl:2.3281 rb:1.0629 dl:196-196 gd:1 +ttp: b375/3125 bl:2.6272 bb:1.2261 rl:2.3282 rb:1.0630 dl:194-194 gd:1 +ttp: b367/3125 bl:2.4560 bb:1.1808 rl:2.3283 rb:1.0631 dl:192-192 gd:1 +ttp: b356/3125 bl:2.5497 bb:1.1714 rl:2.3284 rb:1.0631 dl:189-190 gd:1 +ttp: b354/3125 bl:2.3583 bb:1.0864 rl:2.3284 rb:1.0632 dl:189-189 gd:1 +ttp: b344/3125 bl:2.5413 bb:1.1545 rl:2.3285 rb:1.0632 dl:187-187 gd:1 +ttp: b342/3125 bl:2.4769 bb:1.1598 rl:2.3286 rb:1.0632 dl:186-186 gd:1 +ttp: b331/3125 bl:2.4787 bb:1.1264 rl:2.3287 rb:1.0633 dl:184-184 gd:1 +ttp: b321/3125 bl:2.4532 bb:1.1312 rl:2.3287 rb:1.0633 dl:182-182 gd:1 +ttp: b313/3125 bl:2.5533 bb:1.2366 rl:2.3288 rb:1.0634 dl:180-180 gd:1 +ttp: b303/3125 bl:2.3509 bb:1.1719 rl:2.3289 rb:1.0634 dl:178-178 gd:1 +ttp: b301/3125 bl:2.5332 bb:1.1833 rl:2.3290 rb:1.0635 dl:177-177 gd:1 +ttp: b293/3125 bl:2.3908 bb:1.1007 rl:2.3290 rb:1.0635 dl:175-175 gd:1 +ttp: b282/3125 bl:2.4407 bb:1.1544 rl:2.3290 rb:1.0636 dl:173-173 gd:1 +ttp: b270/3125 bl:2.5386 bb:1.2794 rl:2.3291 rb:1.0636 dl:170-171 gd:1 +ttp: b269/3125 bl:2.3845 bb:1.1095 rl:2.3292 rb:1.0637 dl:170-170 gd:1 +ttp: b259/3125 bl:2.4676 bb:1.1846 rl:2.3292 rb:1.0637 dl:168-168 gd:1 +ttp: b251/3125 bl:2.7448 bb:1.2903 rl:2.3294 rb:1.0638 dl:166-166 gd:1 +ttp: b242/3125 bl:2.3539 bb:1.2561 rl:2.3294 rb:1.0639 dl:164-164 gd:1 +ttp: b233/3125 bl:2.4537 bb:1.0542 rl:2.3295 rb:1.0639 dl:162-162 gd:1 +ttp: b225/3125 bl:2.4754 bb:1.1887 rl:2.3295 rb:1.0639 dl:160-160 gd:1 +ttp: b217/3125 bl:2.3019 bb:1.0532 rl:2.3295 rb:1.0639 dl:158-158 gd:1 +ttp: b206/3125 bl:2.7452 bb:1.2357 rl:2.3297 rb:1.0640 dl:155-156 gd:1 +ttp: b199/3125 bl:2.4747 bb:1.2712 rl:2.3298 rb:1.0641 dl:153-154 gd:1 +ttp: b191/3125 bl:2.4681 bb:1.2076 rl:2.3298 rb:1.0641 dl:151-152 gd:1 +ttp: b183/3125 bl:2.5375 bb:1.2484 rl:2.3299 rb:1.0642 dl:150-150 gd:1 +ttp: b182/3125 bl:2.6826 bb:1.2145 rl:2.3300 rb:1.0643 dl:149-149 gd:1 +ttp: b167/3125 bl:2.4455 
bb:1.2259 rl:2.3301 rb:1.0643 dl:146-146 gd:1 +ttp: b166/3125 bl:2.4147 bb:1.1653 rl:2.3301 rb:1.0644 dl:145-145 gd:1 +ttp: b156/3125 bl:2.5026 bb:1.1769 rl:2.3302 rb:1.0644 dl:143-143 gd:1 +ttp: b142/3125 bl:2.5552 bb:1.1953 rl:2.3303 rb:1.0645 dl:139-140 gd:1 +ttp: b134/3125 bl:2.6231 bb:1.1573 rl:2.3304 rb:1.0645 dl:137-138 gd:1 +ttp: b126/3125 bl:2.5872 bb:1.2076 rl:2.3305 rb:1.0645 dl:135-136 gd:1 +ttp: b118/3125 bl:2.5778 bb:1.2853 rl:2.3306 rb:1.0646 dl:133-134 gd:1 +ttp: b117/3125 bl:2.6042 bb:1.2163 rl:2.3307 rb:1.0647 dl:133-133 gd:1 +ttp: b103/3125 bl:2.7223 bb:1.4525 rl:2.3308 rb:1.0648 dl:129-130 gd:1 +ttp: b96/3125 bl:2.5811 bb:1.1935 rl:2.3309 rb:1.0648 dl:128-128 gd:1 +ttp: b90/3125 bl:2.7589 bb:1.2690 rl:2.3310 rb:1.0649 dl:126-126 gd:1 +ttp: b84/3125 bl:2.7225 bb:1.1818 rl:2.3312 rb:1.0649 dl:124-124 gd:1 +ttp: b73/3125 bl:2.4937 bb:1.1617 rl:2.3312 rb:1.0650 dl:121-121 gd:1 +ttp: b63/3125 bl:2.6582 bb:1.3073 rl:2.3313 rb:1.0650 dl:117-118 gd:1 +ttp: b54/3125 bl:2.6367 bb:1.2221 rl:2.3314 rb:1.0651 dl:114-115 gd:1 +ttp: b51/3125 bl:2.6176 bb:1.1493 rl:2.3315 rb:1.0651 dl:113-113 gd:1 +ttp: b41/3125 bl:2.5702 bb:1.1419 rl:2.3316 rb:1.0651 dl:109-109 gd:1 +ttp: b31/3125 bl:2.7817 bb:1.2884 rl:2.3317 rb:1.0652 dl:104-105 gd:1 +ttp: b26/3125 bl:2.6482 bb:1.1438 rl:2.3318 rb:1.0652 dl:102-102 gd:1 +ttp: b17/3125 bl:2.6717 bb:1.2231 rl:2.3319 rb:1.0653 dl:96-97 gd:1 +ttp: b12/3125 bl:2.4581 bb:1.1265 rl:2.3319 rb:1.0653 dl:92-93 gd:1 +ttp: b3/3125 bl:2.8697 bb:1.1969 rl:2.3320 rb:1.0653 dl:79-81 gd:1 +quantized_ttt_phased val_loss:2.32529863 val_bpb:1.06257073 eval_time:587671ms +total_eval_time:587.7s diff --git a/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/train_seed42.log b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/train_seed42.log new file mode 100644 index 0000000000..edce3a6db6 --- /dev/null +++ b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/train_seed42.log @@ -0,0 +1,209 @@ +ttp: b1662/3125 bl:2.3431 bb:1.0303 rl:2.3159 rb:1.0611 dl:613-614 gd:1 +ttp: b1656/3125 bl:2.2861 bb:1.0830 rl:2.3159 rb:1.0612 dl:611-611 gd:1 +ttp: b1649/3125 bl:2.5226 bb:1.0977 rl:2.3163 rb:1.0612 dl:607-607 gd:1 +ttp: b1641/3125 bl:2.3883 bb:1.0702 rl:2.3164 rb:1.0612 dl:603-603 gd:1 +ttp: b1631/3125 bl:2.4157 bb:1.1188 rl:2.3166 rb:1.0614 dl:598-598 gd:1 +ttp: b1624/3125 bl:2.4566 bb:1.1213 rl:2.3169 rb:1.0615 dl:595-595 gd:1 +ttp: b1614/3125 bl:2.4457 bb:1.1402 rl:2.3171 rb:1.0616 dl:590-591 gd:1 +ttp: b1605/3125 bl:2.2000 bb:1.0407 rl:2.3169 rb:1.0616 dl:586-587 gd:1 +ttp: b1597/3125 bl:2.3293 bb:1.0208 rl:2.3169 rb:1.0615 dl:582-583 gd:1 +ttp: b1591/3125 bl:2.2107 bb:0.9882 rl:2.3167 rb:1.0614 dl:579-580 gd:1 +ttp: b1581/3125 bl:2.2376 bb:1.0543 rl:2.3166 rb:1.0613 dl:575-576 gd:1 +ttp: b1576/3125 bl:2.1508 bb:0.9542 rl:2.3163 rb:1.0611 dl:573-573 gd:1 +ttp: b1566/3125 bl:2.3691 bb:1.0311 rl:2.3164 rb:1.0611 dl:568-569 gd:1 +ttp: b1557/3125 bl:2.3040 bb:1.0233 rl:2.3164 rb:1.0610 dl:564-565 gd:1 +ttp: b1548/3125 bl:2.3628 bb:1.0866 rl:2.3164 rb:1.0611 dl:560-561 gd:1 +ttp: b1543/3125 bl:2.3261 bb:1.0801 rl:2.3165 rb:1.0611 dl:558-558 gd:1 +ttp: b1532/3125 bl:2.3406 bb:1.0348 rl:2.3165 rb:1.0610 dl:553-554 gd:1 +ttp: b1527/3125 bl:2.3565 bb:1.1016 rl:2.3166 rb:1.0611 dl:551-551 gd:1 +ttp: b1515/3125 bl:2.5268 bb:1.1096 rl:2.3169 rb:1.0612 dl:546-547 gd:1 +ttp: b1508/3125 bl:2.1477 bb:1.0322 
rl:2.3166 rb:1.0612 dl:543-544 gd:1 +ttp: b1499/3125 bl:2.3812 bb:1.0358 rl:2.3167 rb:1.0611 dl:539-540 gd:1 +ttp: b1493/3125 bl:2.2299 bb:1.0269 rl:2.3166 rb:1.0611 dl:537-537 gd:1 +ttp: b1486/3125 bl:2.2294 bb:0.9855 rl:2.3165 rb:1.0609 dl:534-534 gd:1 +ttp: b1475/3125 bl:2.3243 bb:1.0522 rl:2.3165 rb:1.0609 dl:529-530 gd:1 +ttp: b1469/3125 bl:2.5323 bb:1.1565 rl:2.3168 rb:1.0611 dl:527-527 gd:1 +ttp: b1460/3125 bl:2.2912 bb:1.0318 rl:2.3168 rb:1.0610 dl:523-524 gd:1 +ttp: b1453/3125 bl:2.3124 bb:1.0708 rl:2.3168 rb:1.0610 dl:521-521 gd:1 +ttp: b1445/3125 bl:2.5046 bb:1.1158 rl:2.3171 rb:1.0611 dl:517-517 gd:1 +ttp: b1438/3125 bl:2.4646 bb:1.1084 rl:2.3173 rb:1.0612 dl:514-514 gd:1 +ttp: b1428/3125 bl:2.2503 bb:1.0333 rl:2.3172 rb:1.0612 dl:509-510 gd:1 +ttp: b1422/3125 bl:2.2288 bb:1.0219 rl:2.3171 rb:1.0611 dl:507-507 gd:1 +ttp: b1414/3125 bl:2.0854 bb:0.9660 rl:2.3167 rb:1.0609 dl:503-503 gd:1 +ttp: b1401/3125 bl:2.3524 bb:1.0738 rl:2.3168 rb:1.0610 dl:498-499 gd:1 +ttp: b1394/3125 bl:2.1788 bb:1.0005 rl:2.3166 rb:1.0609 dl:495-496 gd:1 +ttp: b1390/3125 bl:2.2522 bb:1.0148 rl:2.3165 rb:1.0608 dl:494-494 gd:1 +ttp: b1382/3125 bl:2.5724 bb:1.1214 rl:2.3168 rb:1.0609 dl:491-491 gd:1 +ttp: b1373/3125 bl:2.4415 bb:1.0475 rl:2.3170 rb:1.0609 dl:488-488 gd:1 +ttp: b1364/3125 bl:2.3296 bb:1.0906 rl:2.3170 rb:1.0609 dl:484-485 gd:1 +ttp: b1360/3125 bl:2.5676 bb:1.1304 rl:2.3174 rb:1.0610 dl:483-483 gd:1 +ttp: b1351/3125 bl:2.5126 bb:1.1563 rl:2.3177 rb:1.0612 dl:480-480 gd:1 +ttp: b1343/3125 bl:2.4695 bb:1.1397 rl:2.3179 rb:1.0613 dl:476-477 gd:1 +ttp: b1335/3125 bl:2.3794 bb:1.0657 rl:2.3180 rb:1.0613 dl:473-474 gd:1 +ttp: b1332/3125 bl:2.4201 bb:1.0788 rl:2.3182 rb:1.0613 dl:472-472 gd:1 +ttp: b1323/3125 bl:2.4053 bb:1.1092 rl:2.3183 rb:1.0614 dl:469-469 gd:1 +ttp: b1316/3125 bl:2.3342 bb:1.0921 rl:2.3183 rb:1.0614 dl:466-466 gd:1 +ttp: b1306/3125 bl:2.1627 bb:1.0190 rl:2.3181 rb:1.0614 dl:462-463 gd:1 +ttp: b1301/3125 bl:2.2725 bb:1.0798 rl:2.3180 rb:1.0614 dl:461-461 gd:1 +ttp: b1293/3125 bl:2.4554 bb:1.1106 rl:2.3182 rb:1.0615 dl:458-458 gd:1 +ttp: b1285/3125 bl:2.3385 bb:1.0558 rl:2.3182 rb:1.0615 dl:455-455 gd:1 +ttp: b1275/3125 bl:2.1772 bb:0.9617 rl:2.3180 rb:1.0613 dl:450-451 gd:1 +ttp: b1268/3125 bl:2.2848 bb:1.0769 rl:2.3180 rb:1.0613 dl:448-448 gd:1 +ttp: b1260/3125 bl:2.5005 bb:1.1251 rl:2.3182 rb:1.0614 dl:445-445 gd:1 +ttp: b1253/3125 bl:2.2552 bb:1.0932 rl:2.3182 rb:1.0615 dl:442-442 gd:1 +ttp: b1244/3125 bl:2.4107 bb:1.0742 rl:2.3183 rb:1.0615 dl:439-439 gd:1 +ttp: b1234/3125 bl:2.3185 bb:1.1172 rl:2.3183 rb:1.0615 dl:435-436 gd:1 +ttp: b1227/3125 bl:2.3082 bb:1.2320 rl:2.3183 rb:1.0617 dl:433-433 gd:1 +ttp: b1220/3125 bl:2.3473 bb:1.0632 rl:2.3183 rb:1.0617 dl:430-430 gd:1 +ttp: b1212/3125 bl:2.3243 bb:1.1033 rl:2.3183 rb:1.0618 dl:427-427 gd:1 +ttp: b1204/3125 bl:2.3192 bb:1.1169 rl:2.3183 rb:1.0619 dl:424-424 gd:1 +ttp: b1196/3125 bl:2.3544 bb:1.0616 rl:2.3184 rb:1.0619 dl:421-421 gd:1 +ttp: b1187/3125 bl:2.4534 bb:1.1534 rl:2.3185 rb:1.0620 dl:417-418 gd:1 +ttp: b1179/3125 bl:2.2654 bb:1.0186 rl:2.3185 rb:1.0619 dl:414-415 gd:1 +ttp: b1173/3125 bl:2.1594 bb:1.0237 rl:2.3183 rb:1.0619 dl:412-412 gd:1 +ttp: b1164/3125 bl:2.3513 bb:1.0775 rl:2.3183 rb:1.0619 dl:408-409 gd:1 +ttp: b1160/3125 bl:2.5243 bb:1.0951 rl:2.3186 rb:1.0619 dl:407-407 gd:1 +ttp: b1154/3125 bl:2.2594 bb:1.0155 rl:2.3185 rb:1.0619 dl:405-405 gd:1 +ttp: b1144/3125 bl:2.3849 bb:1.1077 rl:2.3186 rb:1.0619 dl:401-402 gd:1 +ttp: b1138/3125 bl:2.3859 bb:1.0942 rl:2.3186 rb:1.0620 dl:400-400 gd:1 +ttp: 
b1134/3125 bl:2.4784 bb:1.1173 rl:2.3188 rb:1.0620 dl:398-398 gd:1 +ttp: b1124/3125 bl:2.3191 bb:1.1677 rl:2.3188 rb:1.0621 dl:395-395 gd:1 +ttp: b1119/3125 bl:2.2882 bb:1.0946 rl:2.3188 rb:1.0622 dl:393-393 gd:1 +ttp: b1109/3125 bl:2.1137 bb:1.0467 rl:2.3186 rb:1.0622 dl:389-390 gd:1 +ttp: b1105/3125 bl:2.2860 bb:1.1030 rl:2.3185 rb:1.0622 dl:388-388 gd:1 +ttp: b1095/3125 bl:2.2831 bb:1.0841 rl:2.3185 rb:1.0622 dl:385-385 gd:1 +ttp: b1088/3125 bl:2.2647 bb:1.0819 rl:2.3184 rb:1.0622 dl:383-383 gd:1 +ttp: b1079/3125 bl:2.2230 bb:1.0149 rl:2.3183 rb:1.0622 dl:379-380 gd:1 +ttp: b1073/3125 bl:2.2358 bb:1.0912 rl:2.3182 rb:1.0622 dl:378-378 gd:1 +ttp: b1067/3125 bl:2.5242 bb:1.1961 rl:2.3185 rb:1.0624 dl:376-376 gd:1 +ttp: b1062/3125 bl:2.4379 bb:1.1520 rl:2.3186 rb:1.0625 dl:374-374 gd:1 +ttp: b1056/3125 bl:2.4395 bb:1.0933 rl:2.3187 rb:1.0625 dl:372-372 gd:1 +ttp: b1043/3125 bl:2.4482 bb:1.1499 rl:2.3188 rb:1.0626 dl:368-369 gd:1 +ttp: b1040/3125 bl:2.2472 bb:1.0367 rl:2.3188 rb:1.0626 dl:367-367 gd:1 +ttp: b1028/3125 bl:2.4296 bb:1.0870 rl:2.3189 rb:1.0626 dl:363-364 gd:1 +ttp: b990/3125 bl:2.2782 bb:1.1744 rl:2.3188 rb:1.0627 dl:352-352 gd:1 +ttp: b985/3125 bl:2.4312 bb:1.0917 rl:2.3190 rb:1.0627 dl:350-350 gd:1 +ttp: b974/3125 bl:2.2495 bb:1.1124 rl:2.3189 rb:1.0628 dl:346-347 gd:1 +ttp: b969/3125 bl:2.4888 bb:1.1221 rl:2.3191 rb:1.0628 dl:345-345 gd:1 +ttp: b959/3125 bl:2.3213 bb:1.0792 rl:2.3191 rb:1.0628 dl:342-342 gd:1 +ttp: b954/3125 bl:2.5120 bb:1.1507 rl:2.3193 rb:1.0629 dl:340-340 gd:1 +ttp: b944/3125 bl:2.4403 bb:1.1338 rl:2.3194 rb:1.0630 dl:337-337 gd:1 +ttp: b938/3125 bl:2.3248 bb:1.1348 rl:2.3194 rb:1.0631 dl:335-335 gd:1 +ttp: b931/3125 bl:2.3943 bb:1.1416 rl:2.3194 rb:1.0631 dl:333-333 gd:1 +ttp: b918/3125 bl:2.5492 bb:1.1622 rl:2.3197 rb:1.0632 dl:329-330 gd:1 +ttp: b915/3125 bl:2.3237 bb:1.1212 rl:2.3197 rb:1.0633 dl:328-328 gd:1 +ttp: b901/3125 bl:2.3313 bb:1.0157 rl:2.3197 rb:1.0632 dl:324-325 gd:1 +ttp: b895/3125 bl:2.3854 bb:1.1577 rl:2.3197 rb:1.0633 dl:323-323 gd:1 +ttp: b889/3125 bl:2.2918 bb:1.1047 rl:2.3197 rb:1.0633 dl:321-321 gd:1 +ttp: b877/3125 bl:2.6204 bb:1.2088 rl:2.3200 rb:1.0635 dl:317-318 gd:1 +ttp: b871/3125 bl:2.2826 bb:1.0657 rl:2.3199 rb:1.0635 dl:316-316 gd:1 +ttp: b865/3125 bl:2.5567 bb:1.1390 rl:2.3202 rb:1.0635 dl:314-314 gd:1 +ttp: b859/3125 bl:2.3186 bb:1.0581 rl:2.3202 rb:1.0635 dl:312-312 gd:1 +ttp: b846/3125 bl:2.4703 bb:1.0791 rl:2.3203 rb:1.0636 dl:309-309 gd:1 +ttp: b839/3125 bl:2.3482 bb:1.0486 rl:2.3203 rb:1.0635 dl:307-307 gd:1 +ttp: b831/3125 bl:2.2835 bb:1.1285 rl:2.3203 rb:1.0636 dl:305-305 gd:1 +ttp: b824/3125 bl:2.4834 bb:1.1261 rl:2.3204 rb:1.0636 dl:303-303 gd:1 +ttp: b817/3125 bl:2.5059 bb:1.1847 rl:2.3206 rb:1.0637 dl:301-301 gd:1 +ttp: b803/3125 bl:2.4235 bb:1.0751 rl:2.3207 rb:1.0638 dl:297-298 gd:1 +ttp: b796/3125 bl:2.4641 bb:1.2265 rl:2.3208 rb:1.0639 dl:295-296 gd:1 +ttp: b789/3125 bl:2.4018 bb:1.0631 rl:2.3209 rb:1.0639 dl:294-294 gd:1 +ttp: b784/3125 bl:2.3745 bb:1.0554 rl:2.3209 rb:1.0639 dl:292-292 gd:1 +ttp: b772/3125 bl:2.2475 bb:1.0358 rl:2.3208 rb:1.0638 dl:288-289 gd:1 +ttp: b771/3125 bl:2.3079 bb:1.1478 rl:2.3208 rb:1.0639 dl:288-288 gd:1 +ttp: b758/3125 bl:2.4884 bb:1.2018 rl:2.3210 rb:1.0640 dl:285-285 gd:1 +ttp: b752/3125 bl:2.3916 bb:1.1762 rl:2.3210 rb:1.0641 dl:283-283 gd:1 +ttp: b745/3125 bl:2.3747 bb:1.1125 rl:2.3211 rb:1.0641 dl:281-281 gd:1 +ttp: b737/3125 bl:2.4062 bb:1.0466 rl:2.3211 rb:1.0641 dl:279-279 gd:1 +ttp: b727/3125 bl:2.3157 bb:1.0861 rl:2.3211 rb:1.0641 dl:277-277 gd:1 +ttp: b721/3125 
bl:2.3372 bb:1.1574 rl:2.3211 rb:1.0642 dl:275-275 gd:1 +ttp: b713/3125 bl:2.1878 bb:1.0501 rl:2.3210 rb:1.0642 dl:273-273 gd:1 +ttp: b704/3125 bl:2.4847 bb:1.2663 rl:2.3212 rb:1.0643 dl:271-271 gd:1 +ttp: b697/3125 bl:2.4145 bb:1.1813 rl:2.3212 rb:1.0644 dl:269-269 gd:1 +ttp: b688/3125 bl:2.4633 bb:1.1026 rl:2.3213 rb:1.0644 dl:267-267 gd:1 +ttp: b673/3125 bl:2.2663 bb:1.0035 rl:2.3213 rb:1.0644 dl:263-264 gd:1 +ttp: b671/3125 bl:2.3505 bb:1.1534 rl:2.3213 rb:1.0645 dl:263-263 gd:1 +ttp: b663/3125 bl:2.5697 bb:1.1356 rl:2.3215 rb:1.0645 dl:261-261 gd:1 +ttp: b654/3125 bl:2.4284 bb:1.2494 rl:2.3216 rb:1.0646 dl:259-259 gd:1 +ttp: b648/3125 bl:2.2740 bb:1.0628 rl:2.3215 rb:1.0646 dl:257-257 gd:1 +ttp: b639/3125 bl:2.4391 bb:1.1309 rl:2.3216 rb:1.0647 dl:255-255 gd:1 +ttp: b629/3125 bl:2.1573 bb:0.9796 rl:2.3215 rb:1.0646 dl:253-253 gd:1 +ttp: b621/3125 bl:2.4101 bb:1.2216 rl:2.3216 rb:1.0647 dl:251-251 gd:1 +ttp: b612/3125 bl:2.1494 bb:1.0099 rl:2.3214 rb:1.0647 dl:249-249 gd:1 +ttp: b605/3125 bl:2.2390 bb:1.1376 rl:2.3214 rb:1.0647 dl:247-247 gd:1 +ttp: b597/3125 bl:2.1864 bb:1.0811 rl:2.3213 rb:1.0647 dl:245-245 gd:1 +ttp: b589/3125 bl:2.4903 bb:1.1740 rl:2.3214 rb:1.0648 dl:243-243 gd:1 +ttp: b579/3125 bl:2.4009 bb:1.0539 rl:2.3215 rb:1.0648 dl:241-241 gd:1 +ttp: b571/3125 bl:2.4242 bb:1.1711 rl:2.3215 rb:1.0649 dl:239-239 gd:1 +ttp: b563/3125 bl:2.5703 bb:1.2677 rl:2.3217 rb:1.0650 dl:237-237 gd:1 +ttp: b554/3125 bl:2.3968 bb:1.0552 rl:2.3217 rb:1.0650 dl:235-235 gd:1 +ttp: b544/3125 bl:2.3237 bb:1.1233 rl:2.3217 rb:1.0650 dl:232-233 gd:1 +ttp: b543/3125 bl:2.5079 bb:1.1281 rl:2.3219 rb:1.0651 dl:232-232 gd:1 +ttp: b533/3125 bl:2.5242 bb:1.2074 rl:2.3220 rb:1.0651 dl:230-230 gd:1 +ttp: b523/3125 bl:2.3085 bb:1.0670 rl:2.3220 rb:1.0651 dl:228-228 gd:1 +ttp: b515/3125 bl:2.4513 bb:1.1581 rl:2.3221 rb:1.0652 dl:226-226 gd:1 +ttp: b506/3125 bl:2.4327 bb:1.2245 rl:2.3221 rb:1.0653 dl:224-224 gd:1 +ttp: b496/3125 bl:2.2742 bb:1.1273 rl:2.3221 rb:1.0653 dl:221-222 gd:1 +ttp: b489/3125 bl:2.2420 bb:1.1033 rl:2.3221 rb:1.0653 dl:219-220 gd:1 +ttp: b488/3125 bl:2.4497 bb:1.1877 rl:2.3221 rb:1.0654 dl:219-219 gd:1 +ttp: b477/3125 bl:2.3048 bb:1.1590 rl:2.3221 rb:1.0655 dl:217-217 gd:1 +ttp: b468/3125 bl:2.4146 bb:1.1508 rl:2.3222 rb:1.0655 dl:215-215 gd:1 +ttp: b459/3125 bl:2.3898 bb:1.1225 rl:2.3222 rb:1.0655 dl:213-213 gd:1 +ttp: b449/3125 bl:2.6808 bb:1.1958 rl:2.3224 rb:1.0656 dl:210-211 gd:1 +ttp: b448/3125 bl:2.4382 bb:1.1972 rl:2.3225 rb:1.0657 dl:210-210 gd:1 +ttp: b440/3125 bl:2.3568 bb:1.1675 rl:2.3225 rb:1.0657 dl:208-208 gd:1 +ttp: b431/3125 bl:2.3986 bb:1.1778 rl:2.3225 rb:1.0658 dl:206-206 gd:1 +ttp: b420/3125 bl:2.2975 bb:1.1692 rl:2.3225 rb:1.0659 dl:204-204 gd:1 +ttp: b409/3125 bl:2.3750 bb:1.2248 rl:2.3226 rb:1.0659 dl:202-202 gd:1 +ttp: b399/3125 bl:2.1993 bb:1.0590 rl:2.3225 rb:1.0659 dl:199-200 gd:1 +ttp: b391/3125 bl:2.8510 bb:1.2927 rl:2.3228 rb:1.0661 dl:197-198 gd:1 +ttp: b389/3125 bl:2.5632 bb:1.1914 rl:2.3229 rb:1.0661 dl:197-197 gd:1 +ttp: b380/3125 bl:2.2377 bb:1.0549 rl:2.3229 rb:1.0661 dl:195-195 gd:1 +ttp: b372/3125 bl:2.4119 bb:1.1479 rl:2.3229 rb:1.0662 dl:193-193 gd:1 +ttp: b362/3125 bl:2.6049 bb:1.2181 rl:2.3231 rb:1.0662 dl:191-191 gd:1 +ttp: b352/3125 bl:2.5586 bb:1.1907 rl:2.3232 rb:1.0663 dl:188-189 gd:1 +ttp: b343/3125 bl:2.5025 bb:1.2124 rl:2.3233 rb:1.0664 dl:186-187 gd:1 +ttp: b341/3125 bl:2.3292 bb:1.0265 rl:2.3233 rb:1.0663 dl:186-186 gd:1 +ttp: b330/3125 bl:2.4169 bb:1.1451 rl:2.3233 rb:1.0664 dl:184-184 gd:1 +ttp: b320/3125 bl:2.3082 bb:1.1173 
rl:2.3233 rb:1.0664 dl:181-182 gd:1 +ttp: b312/3125 bl:2.5593 bb:1.1864 rl:2.3234 rb:1.0665 dl:179-180 gd:1 +ttp: b311/3125 bl:2.7077 bb:1.2690 rl:2.3236 rb:1.0666 dl:179-179 gd:1 +ttp: b299/3125 bl:2.6807 bb:1.1698 rl:2.3238 rb:1.0666 dl:177-177 gd:1 +ttp: b291/3125 bl:2.5337 bb:1.2193 rl:2.3239 rb:1.0667 dl:175-175 gd:1 +ttp: b280/3125 bl:2.5293 bb:1.2524 rl:2.3240 rb:1.0668 dl:173-173 gd:1 +ttp: b277/3125 bl:2.4487 bb:1.2155 rl:2.3240 rb:1.0668 dl:172-172 gd:1 +ttp: b267/3125 bl:2.5752 bb:1.1436 rl:2.3242 rb:1.0669 dl:170-170 gd:1 +ttp: b257/3125 bl:2.4073 bb:1.1574 rl:2.3242 rb:1.0669 dl:167-168 gd:1 +ttp: b249/3125 bl:2.5115 bb:1.2392 rl:2.3243 rb:1.0670 dl:166-166 gd:1 +ttp: b240/3125 bl:2.3887 bb:1.1993 rl:2.3243 rb:1.0670 dl:163-164 gd:1 +ttp: b239/3125 bl:2.5651 bb:1.2031 rl:2.3244 rb:1.0671 dl:163-163 gd:1 +ttp: b224/3125 bl:2.4648 bb:1.2291 rl:2.3245 rb:1.0672 dl:159-160 gd:1 +ttp: b217/3125 bl:2.3094 bb:1.0566 rl:2.3245 rb:1.0671 dl:158-158 gd:1 +ttp: b207/3125 bl:2.8197 bb:1.2780 rl:2.3247 rb:1.0672 dl:156-156 gd:1 +ttp: b200/3125 bl:2.3880 bb:1.1586 rl:2.3247 rb:1.0673 dl:154-154 gd:1 +ttp: b192/3125 bl:2.4836 bb:1.2025 rl:2.3248 rb:1.0673 dl:152-152 gd:1 +ttp: b184/3125 bl:2.4547 bb:1.2111 rl:2.3248 rb:1.0674 dl:150-150 gd:1 +ttp: b174/3125 bl:2.5150 bb:1.1488 rl:2.3249 rb:1.0674 dl:147-148 gd:1 +ttp: b168/3125 bl:2.5988 bb:1.2656 rl:2.3250 rb:1.0675 dl:146-146 gd:1 +ttp: b159/3125 bl:2.7331 bb:1.1812 rl:2.3252 rb:1.0675 dl:144-144 gd:1 +ttp: b157/3125 bl:2.6037 bb:1.1770 rl:2.3253 rb:1.0676 dl:143-143 gd:1 +ttp: b142/3125 bl:2.5552 bb:1.1953 rl:2.3253 rb:1.0676 dl:139-140 gd:1 +ttp: b135/3125 bl:2.5705 bb:1.1720 rl:2.3254 rb:1.0677 dl:138-138 gd:1 +ttp: b127/3125 bl:2.6560 bb:1.2047 rl:2.3256 rb:1.0677 dl:136-136 gd:1 +ttp: b121/3125 bl:2.4368 bb:1.1636 rl:2.3256 rb:1.0677 dl:134-134 gd:1 +ttp: b113/3125 bl:2.6515 bb:1.2039 rl:2.3257 rb:1.0678 dl:132-132 gd:1 +ttp: b106/3125 bl:2.7688 bb:1.3074 rl:2.3259 rb:1.0679 dl:130-130 gd:1 +ttp: b98/3125 bl:2.6451 bb:1.1708 rl:2.3260 rb:1.0679 dl:128-128 gd:1 +ttp: b85/3125 bl:2.5506 bb:1.2863 rl:2.3260 rb:1.0680 dl:124-125 gd:1 +ttp: b79/3125 bl:2.6908 bb:1.2276 rl:2.3262 rb:1.0680 dl:123-123 gd:1 +ttp: b69/3125 bl:2.5022 bb:1.1188 rl:2.3262 rb:1.0680 dl:119-120 gd:1 +ttp: b65/3125 bl:2.6165 bb:1.2971 rl:2.3263 rb:1.0681 dl:118-118 gd:1 +ttp: b56/3125 bl:2.5371 bb:1.1769 rl:2.3264 rb:1.0681 dl:115-115 gd:1 +ttp: b48/3125 bl:2.5898 bb:1.1359 rl:2.3265 rb:1.0682 dl:112-112 gd:1 +ttp: b38/3125 bl:2.6268 bb:1.2013 rl:2.3265 rb:1.0682 dl:107-108 gd:1 +ttp: b33/3125 bl:2.8351 bb:1.2748 rl:2.3267 rb:1.0683 dl:105-105 gd:1 +ttp: b24/3125 bl:2.6617 bb:1.1626 rl:2.3268 rb:1.0683 dl:101-101 gd:1 +ttp: b14/3125 bl:2.7732 bb:1.2727 rl:2.3269 rb:1.0683 dl:94-95 gd:1 +ttp: b9/3125 bl:2.7567 bb:1.1938 rl:2.3270 rb:1.0684 dl:89-90 gd:1 +ttp: b1/3125 bl:2.8249 bb:1.1866 rl:2.3271 rb:1.0684 dl:27-75 gd:1 +quantized_ttt_phased val_loss:2.32475990 val_bpb:1.06232455 eval_time:485616ms +total_eval_time:485.6s diff --git a/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/train_seed999.log b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/train_seed999.log new file mode 100644 index 0000000000..68eaae3486 --- /dev/null +++ b/records/track_10min_16mb/2026-04-29_ReLU-Slope_Reverse-Cholesky-GPTQ_SP8192_CaseOps_PhasedTTT_PolarNS_FusedCE/train_seed999.log @@ -0,0 +1,1188 @@ +W0429 18:43:07.243000 249914 torch/distributed/run.py:803] +W0429 
18:43:07.243000 249914 torch/distributed/run.py:803] *****************************************
+W0429 18:43:07.243000 249914 torch/distributed/run.py:803] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
+W0429 18:43:07.243000 249914 torch/distributed/run.py:803] *****************************************
+train_shards: 80
+val_tokens: 47851520
+model_params:35945671
+gptq:reserving 16s, effective=584000ms
+warmup_cu_buckets:64,128,192,256 iters_each:3
+warmup_step: 1/20
+warmup_step: 2/20
+warmup_step: 3/20
+warmup_step: 4/20
+warmup_step: 5/20
+warmup_step: 6/20
+warmup_step: 10/20
+warmup_step: 20/20
+loop_warmup:enabled encoder:[0, 1, 2, 3, 4, 5, 3, 4] decoder:[5, 3, 4, 5, 6, 7, 8, 9, 10]
+loop_warmup_step: 1/20
+loop_warmup_step: 2/20
+loop_warmup_step: 3/20
+loop_warmup_step: 4/20
+loop_warmup_step: 5/20
+loop_warmup_step: 6/20
+loop_warmup_step: 10/20
+loop_warmup_step: 20/20
+0/20000 val_loss: 9.0149 val_bpb: 4.1192
+1/20000 train_loss: 9.0152 train_time: 0.0m tok/s: 12476399
+2/20000 train_loss: 12.8832 train_time: 0.0m tok/s: 11464028
+3/20000 train_loss: 10.1976 train_time: 0.0m tok/s: 10222934
+4/20000 train_loss: 8.6674 train_time: 0.0m tok/s: 9673623
+5/20000 train_loss: 7.9217 train_time: 0.0m tok/s: 9385556
+500/20000 train_loss: 2.5712 train_time: 0.8m tok/s: 8290006
+1000/20000 train_loss: 2.7985 train_time: 1.6m tok/s: 8272115
+1500/20000 train_loss: 2.6277 train_time: 2.4m tok/s: 8266041
+2000/20000 train_loss: 2.6622 train_time: 3.2m tok/s: 8261486
+layer_loop:enabled step:2147 frac:0.350 encoder:[0, 1, 2, 3, 4, 5, 3, 4] decoder:[5, 3, 4, 5, 6, 7, 8, 9, 10]
+2500/20000 train_loss: 2.5521 train_time: 4.2m tok/s: 7742735
+3000/20000 train_loss: 2.5630 train_time: 5.4m tok/s: 7279323
+3500/20000 train_loss: 2.5649 train_time: 6.6m tok/s: 6960793
+4000/20000 train_loss: 2.4087 train_time: 7.8m tok/s: 6757643
+4000/20000 val_loss: 2.4278 val_bpb: 1.1093
+4500/20000 train_loss: 2.2761 train_time: 8.9m tok/s: 6608036
+4846/20000 val_loss: 2.3578 val_bpb: 1.0774
+stopping_early: wallclock_cap train_time: 584084ms step: 4846/20000
+peak memory allocated: 41709 MiB reserved: 47026 MiB
+ema:applying EMA weights
+diagnostic pre-quantization post-ema val_loss:2.33492623 val_bpb:1.06690089 eval_time:6763ms
+Serialized model: 135419597 bytes
+Code size (uncompressed): 138859 bytes
+Code size (compressed): 31095 bytes
+GPTQ:collecting Hessians from calibration data...
+GPTQ:collected 67 Hessians in 3.5s +Quantized weights: + gate_int8_row: blocks.attn.attn_gate_w + gptq (int6): blocks.attn.c_k.weight, blocks.attn.c_q.weight, blocks.attn.c_v.weight, blocks.attn.proj.weight, blocks.mlp.fc.weight, blocks.mlp.proj.weight + gptq (int7)+lqer_asym: tok_emb.weight + passthrough (float16): blocks.attn.q_gain, blocks.attn_scale, blocks.mlp_scale, blocks.resid_mix, parallel_post_lambdas, parallel_resid_lambdas, skip_gates, skip_weights, smear_gate.weight, smear_lambda +GPTQ:quantized weights in 12.15s +Serialized model quantized+brotli: 15915437 bytes +Total submission size quantized+brotli: 15946532 bytes +diagnostic quantized val_loss:2.35389768 val_bpb:1.07556954 eval_time:10311ms +ttt_lora:warming up compile (random tokens, no val data) +ttt_lora:compile warmup done (133.9s) + +beginning TTT eval timer +ttt_phased: total_docs:50000 prefix_docs:2000 suffix_docs:48000 num_phases:3 boundaries:[666, 1333, 2000] +ttp: b3124/3125 bl:2.1816 bb:1.0808 rl:2.1816 rb:1.0808 dl:24255-30330 gd:0 +ttp: b3107/3125 bl:2.3408 bb:1.1003 rl:2.2210 rb:1.0859 dl:8823-8998 gd:0 +ttp: b3101/3125 bl:2.1964 bb:1.0606 rl:2.2167 rb:1.0814 dl:7534-7751 gd:0 +ttp: b3094/3125 bl:2.3777 bb:1.1127 rl:2.2378 rb:1.0857 dl:6523-6670 gd:0 +ttp: b3087/3125 bl:2.2163 bb:1.0335 rl:2.2355 rb:1.0799 dl:5910-5996 gd:0 +ttp: b3078/3125 bl:2.0107 bb:0.9559 rl:2.2159 rb:1.0690 dl:5341-5395 gd:0 +ttpp: phase:1/3 pd:784 gd:666 t:156.9s +tttg: c1/158 lr:0.001000 t:0.3s +tttg: c2/158 lr:0.001000 t:0.3s +tttg: c3/158 lr:0.001000 t:0.4s +tttg: c4/158 lr:0.000999 t:0.5s +tttg: c5/158 lr:0.000998 t:0.5s +tttg: c6/158 lr:0.000997 t:0.6s +tttg: c7/158 lr:0.000996 t:0.7s +tttg: c8/158 lr:0.000995 t:0.7s +tttg: c9/158 lr:0.000994 t:0.8s +tttg: c10/158 lr:0.000992 t:0.9s +tttg: c11/158 lr:0.000990 t:1.0s +tttg: c12/158 lr:0.000988 t:1.0s +tttg: c13/158 lr:0.000986 t:1.1s +tttg: c14/158 lr:0.000983 t:1.2s +tttg: c15/158 lr:0.000981 t:1.2s +tttg: c16/158 lr:0.000978 t:1.3s +tttg: c17/158 lr:0.000975 t:1.4s +tttg: c18/158 lr:0.000971 t:1.4s +tttg: c19/158 lr:0.000968 t:1.5s +tttg: c20/158 lr:0.000964 t:1.6s +tttg: c21/158 lr:0.000960 t:1.6s +tttg: c22/158 lr:0.000957 t:1.7s +tttg: c23/158 lr:0.000952 t:1.8s +tttg: c24/158 lr:0.000948 t:1.8s +tttg: c25/158 lr:0.000943 t:1.9s +tttg: c26/158 lr:0.000939 t:2.0s +tttg: c27/158 lr:0.000934 t:2.1s +tttg: c28/158 lr:0.000929 t:2.1s +tttg: c29/158 lr:0.000924 t:2.2s +tttg: c30/158 lr:0.000918 t:2.3s +tttg: c31/158 lr:0.000913 t:2.3s +tttg: c32/158 lr:0.000907 t:2.4s +tttg: c33/158 lr:0.000901 t:2.5s +tttg: c34/158 lr:0.000895 t:2.5s +tttg: c35/158 lr:0.000889 t:2.6s +tttg: c36/158 lr:0.000882 t:2.7s +tttg: c37/158 lr:0.000876 t:2.7s +tttg: c38/158 lr:0.000869 t:2.8s +tttg: c39/158 lr:0.000862 t:2.9s +tttg: c40/158 lr:0.000855 t:3.0s +tttg: c41/158 lr:0.000848 t:3.0s +tttg: c42/158 lr:0.000841 t:3.1s +tttg: c43/158 lr:0.000834 t:3.2s +tttg: c44/158 lr:0.000826 t:3.2s +tttg: c45/158 lr:0.000818 t:3.3s +tttg: c46/158 lr:0.000811 t:3.4s +tttg: c47/158 lr:0.000803 t:3.4s +tttg: c48/158 lr:0.000795 t:3.5s +tttg: c49/158 lr:0.000787 t:3.6s +tttg: c50/158 lr:0.000778 t:3.6s +tttg: c51/158 lr:0.000770 t:3.7s +tttg: c52/158 lr:0.000761 t:3.8s +tttg: c53/158 lr:0.000753 t:3.9s +tttg: c54/158 lr:0.000744 t:3.9s +tttg: c55/158 lr:0.000735 t:4.0s +tttg: c56/158 lr:0.000727 t:4.1s +tttg: c57/158 lr:0.000718 t:4.1s +tttg: c58/158 lr:0.000709 t:4.2s +tttg: c59/158 lr:0.000699 t:4.3s +tttg: c60/158 lr:0.000690 t:4.3s +tttg: c61/158 lr:0.000681 t:4.4s +tttg: c62/158 lr:0.000672 t:4.5s +tttg: 
c63/158 lr:0.000662 t:4.5s
...
tttg: c157/158 lr:0.000000 t:11.0s
ttpr: phase:1/3 t:168.4s
ttp: b3073/3125 bl:2.3394 bb:1.0608 rl:2.2254 rb:1.0683 dl:5097-5130 gd:0
...
ttp: b3037/3125 bl:2.2410 bb:1.0223 rl:2.2344 rb:1.0644 dl:3817-3837 gd:0
ttpp: phase:2/3 pd:1456 gd:1333 t:237.9s
tttg: c1/247 lr:0.001000 t:0.1s
...
tttg: c246/247 lr:0.000000 t:16.7s
ttpr: phase:2/3 t:255.1s
ttp: b3033/3125 bl:2.3738 bb:1.0907 rl:2.2404 rb:1.0655 dl:3741-3760 gd:0
...
ttp: b3000/3125 bl:2.4149 bb:1.1294 rl:2.2680 rb:1.0697 dl:3135-3149 gd:0
ttpp: phase:3/3 pd:2112 gd:2000 t:273.9s
tttg: c1/319 lr:0.001000 t:0.1s
...
tttg: c318/319 lr:0.000000 t:21.5s
ttpr: phase:3/3 t:295.9s
ttp: b2993/3125 bl:2.4095 bb:1.0369 rl:2.2722 rb:1.0686 dl:3039-3050 gd:1
...
ttp: b7/3125 bl:2.9315 bb:1.2624 rl:2.3267 rb:1.0651 dl:86-88 gd:1
quantized_ttt_phased val_loss:2.32484879 val_bpb:1.06236517 eval_time:482574ms
total_eval_time:482.6s
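
The per-step `tttg` learning rates above are consistent with a plain cosine decay from the phase's base LR of 1e-3 at step 1 down to ~0 at the phase's final gradient step (158 / 247 / 319 steps for phases 1-3). A minimal sketch under that assumption (the function name and indexing are illustrative, not taken from `train_gpt.py`):

```python
import math

def ttt_phase_lr(step: int, total_steps: int, base_lr: float = 1e-3) -> float:
    """Cosine decay from base_lr (step 1) to ~0 (final step) within one TTT phase.

    Illustrative reconstruction of the schedule implied by the `tttg` log lines;
    the actual scheduler in train_gpt.py may be implemented differently.
    """
    progress = (step - 1) / (total_steps - 1)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# Spot-check against logged phase-2 values (247 steps):
# c1 -> 0.001000, c62 -> 0.000856, c124 -> 0.000500, c246 -> 0.000000
for step in (1, 62, 124, 246):
    print(f"c{step}/247 lr:{ttt_phase_lr(step, 247):.6f}")
```

The closing `quantized_ttt_phased` line reports the post-TTT validation loss in nats per token next to the bpb metric: 2.32484879 / ln 2 ≈ 3.354 bits per token, which matches the reported 1.06237 bpb at roughly 3.16 bytes per token on the validation stream (assuming bpb is computed as bits per token divided by bytes per token).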