feat: introduce dynamic mining pass price DEV-1084 #65

Open

iHiteshAgrawal wants to merge 3 commits into staging from task/DEV-1084

Conversation

iHiteshAgrawal (Member) commented Feb 11, 2026

Summary by CodeRabbit

  • New Features

    • Dynamic mining pass pricing with per‑tier configurable fees, base fees and a fee floor; new CLI tasks to apply fees and start seasons; init script to configure dynamic pricing.
  • Analytics

    • New simulation and reporting tools for pricing, ROI, utilization and unprofitable-user analysis; CSV export and summary reports.
  • Tests

    • New unit and fork tests covering dynamic pricing, upgrades and storage layout.
  • Chores

    • Editor/workspace settings, package manager/dependency added, and updated ignore patterns.


coderabbitai bot commented Feb 11, 2026

📝 Walkthrough

Adds dynamic mining-pass pricing across the system: new on‑chain pricing APIs and storage, initializer values, Hardhat tasks to compute/apply dynamic fees, multiple analysis scripts, updated deployment artifacts, and comprehensive tests including fork-based storage verification.

Changes

Cohort / File(s) Summary
Config & Tooling
.gitignore, .vscode/settings.json, package.json
Add ignore patterns (.DS_Store, .archive/), VSCode Python/conda settings, packageManager field and viem dependency.
Storage & Initializers
src/libraries/AppStorage.sol, src/upgradeInitializers/DiamondInit.sol, src/upgradeInitializers/FujiMigrationInit.sol
Explicit storage slot annotations, added fields for mining-pass fees, base fees, fee floor, deposit limits, fee receivers/shares; new Fuji migration initializer to realign storage and populate fee receiver arrays.
Diamond Manager Facet
src/facets/DiamondManagerFacet.sol
New errors/events and external APIs to set/get mining pass fees (batch + per‑tier), set base fees, set fee floor, and read fee/deposit limit data with floor enforcement and validations.
Hardhat Tasks & Init Script
tasks/startSeason.ts, tasks/diamondInfo.ts, scripts/initDynamicPricing.ts, hardhat.config.ts
New tasks: season:start (compute/apply dynamic fees, optionally start season), diamond:info (display fee/season/user data); init script applies base fees and fee floor; hardhat config loads new tasks.
Simulation & Analysis Scripts
scripts/simulation/...
exportSeasonAnalysis.ts, miningPassPricing.ts, miningPassROIAnalysis.ts, analyzeUtilization.ts, unprofitableAnalysis.ts
Five new analysis tools for CSV export, pricing simulations (multiplier & ROI‑controlled), season ROI analysis, utilization parsing, and unprofitable-user breakdowns; use subgraph and price APIs.
Tests & Fork Tests
test/foundry/Facets/MiningPassDynamicPricing.t.sol, test/foundry/fork/FujiForkMiningPass.t.sol, test/foundry/utils/DiamondTest.sol
New Forge tests covering getters/setters, floor enforcement, ownership checks, dynamic pricing behaviors, and a Fuji fork test verifying storage layout and end‑to‑end purchase flows; diamond selector list extended.
Deployment Artifacts
deployments/*.json (multiple files)
Updated artifacts: new/changed ABI entries (errors, getters), updated addresses, skipIfAlreadyDeployed toggles, and recompiled bytecode in several facet artifacts.
Miscellaneous
.gitignore, package.json, hardhat.config.ts
Minor formatting/import additions and tooling adjustments to ensure tasks and new dependencies load.

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant CLI as Hardhat Task (season:start)
    participant PriceAPI as CoinGecko / Subgraph
    participant Diamond as LiquidMiningDiamond (DiamondManagerFacet)
    CLI->>PriceAPI: fetch VAPE price & optional historical points
    PriceAPI-->>CLI: return price / metrics
    CLI->>Diamond: read base fees, deposit limits, fee floor
    Diamond-->>CLI: return base fees / limits / floor
    CLI->>CLI: compute dynamicFees (multiplier + floor)
    CLI->>Diamond: setMiningPassFees(dynamicFees)
    Diamond-->>CLI: emit MiningPassFeesUpdated
    CLI->>Diamond: startNewSeasonWithEndTimestamp(..., rewards)
    Diamond-->>CLI: confirm season started
```
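The fee-computation step in the diagram can be sketched as below. The function name, the reference price, and the basis-point floor are illustrative assumptions; the actual logic lives in tasks/startSeason.ts and the facet.

```typescript
// Sketch of "compute dynamicFees (multiplier + floor)", under assumptions:
// the fee scales with the current VAPE price relative to a reference price,
// and a floor expressed in basis points of the base fee is enforced.
const TOTAL_SHARES = 10_000 // basis-point denominator (assumed)

function computeDynamicFee(
    baseFeeUSDC: number,
    vapePrice: number,
    referencePrice: number,
    floorBps: number
): number {
    const scaled = baseFeeUSDC * (vapePrice / referencePrice) // multiplier step
    const floor = (baseFeeUSDC * floorBps) / TOTAL_SHARES // floor step
    return Math.max(scaled, floor)
}
```

For example, with a base fee of $10, a floor of 5000 bps, and a VAPE price well below the reference, the floor binds and the fee stays at $5.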

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Poem

🐰 A little hop, a code-sprung cheer,

Fees that flex as prices veer,
Scripts that count and tests that prove,
Slots aligned and floors to soothe,
I nibble bugs and plant the view—dynamic fees, anew!

🚥 Pre-merge checks | ✅ 2 passed | ❌ 1 failed

❌ Failed checks (1 warning)
  • Docstring Coverage ⚠️ Warning: docstring coverage is 0.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions that are missing them.

✅ Passed checks (2 passed)
  • Description Check ✅ Passed: check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed: the title 'feat: introduce dynamic mining pass price DEV-1084' directly and clearly describes the main change, introducing dynamic pricing for mining passes.


Warning

Review ran into problems

🔥 Problems

Errors were encountered while retrieving linked issues.

Errors (1)
  • JIRA integration encountered authorization issues. Please disconnect and reconnect the integration in the CodeRabbit UI.

Git: Failed to clone repository. Please run the @coderabbitai full review command to re-trigger a full review. If the issue persists, set path_filters to include or exclude specific files.


coderabbitai bot left a comment
Actionable comments posted: 12

🤖 Fix all issues with AI agents
In `@.vscode/settings.json`:
- Around line 1-5: Remove the committed VSCode workspace settings and ignore
them going forward: delete the settings file from the PR (unstage/remove it from
the repository commit, e.g., git rm --cached) and add the .vscode/ directory to
.gitignore so IDE settings are not tracked; specifically remove
".vscode/settings.json" from the repo and add a ".vscode/" entry to .gitignore,
then commit the updated .gitignore.

In `@scripts/simulation/analyzeUtilization.ts`:
- Around line 67-69: The pctNotMaxed calculation can produce NaN when
totalNotProfit is 0; update the code that computes pctNotMaxed (currently using
totalNotProfit and notProfitableNotMaxed) to guard against division by zero by
computing pctNotMaxed only if totalNotProfit > 0 (e.g., use a conditional or
ternary) and otherwise set a safe formatted value like "0.0" (so toFixed is not
called on NaN), then use that safe value in the console.log.
- Around line 38-40: The current calculation for pctOfLimit divides deposited by
limit and can produce Infinity/NaN when tierLimits[limitStr] is undefined (limit
=== 0); update the logic around the variables limit, limitStr, tierLimits,
deposited and pctOfLimit to guard against a zero/undefined limit: check if limit
is a positive number before doing (deposited / limit) and if not, set pctOfLimit
to a safe default string (e.g., "0.0" or "N/A") or handle it explicitly so no
division occurs.

In `@scripts/simulation/exportSeasonAnalysis.ts`:
- Around line 230-242: Guard all divisions in the summary by checking
denominators before computing ratios: when building rows with totalOriginalFees,
totalDynamicFees, usersWithPaidPass, and when computing ROI/percentages for
originalProfitable/dynamicProfitable use conditional logic to avoid dividing by
zero (e.g., if totalOriginalFees or totalDynamicFees is 0, output "0.00x" or
"N/A" for ROI; if usersWithPaidPass is 0, output "0 / 0 (0.0%)" or "N/A" for
percentages and improvement), updating the code that pushes rows (references:
rows.push lines that use totalOriginalFees, totalDynamicFees, totalVAPEValue,
usersWithPaidPass, originalProfitable, dynamicProfitable) and apply the same
guards to the console.log statements around the later summary output (console
logs referenced at ~277–279) so no division-by-zero produces NaN/Infinity.

In `@scripts/simulation/miningPassPricing.ts`:
- Around line 188-196: The minimum-price clamp is forcing tier 0 (free) passes
to cost $0.01; update the logic in miningPassPricing.ts so the Math.max(...,
0.01) floor is only applied for paid tiers. Specifically, where dynamicPriceUSDC
is set (and in calculateSimpleDynamicPrice), change the assignment so that if
tier === 0 you keep dynamicPriceUSDC (or originalPrice) as 0, otherwise apply
the Math.max(..., 0.01) clamp; adjust any related fields (e.g.,
dynamicPriceUSDC, originalPriceUSDC) so tier 0 remains zero-priced and paid
tiers retain the $0.01 minimum.
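The suggested tier-0 exemption can be sketched as follows; the constant value and function shape follow the review's description of the script, not its actual code:

```typescript
// Sketch of the fix: apply the $0.01 minimum only to paid tiers, so a
// tier with an original fee of 0 (the free pass) is never charged.
const REFERENCE_VAPE_PRICE = 0.606 // reference price cited in the review

function calculateDynamicFee(originalFee: number, vapePrice: number): number {
    if (originalFee === 0) return 0 // free tier stays free
    const scaled = originalFee * (vapePrice / REFERENCE_VAPE_PRICE)
    return Math.max(scaled, 0.01) // $0.01 floor for paid tiers only
}
```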

In `@scripts/simulation/miningPassROIAnalysis.ts`:
- Around line 155-159: The calculateDynamicFee function applies a $0.01 minimum
to all tiers, which incorrectly charges tier 0 (original fee 0); update
calculateDynamicFee (and use ORIGINAL_TIER_FEES_USDC and REFERENCE_VAPE_PRICE)
to return 0 immediately if the originalFee is 0, otherwise compute originalFee *
(vapePrice / REFERENCE_VAPE_PRICE) and then apply Math.max(..., 0.01) so only
paid tiers get the $0.01 floor.
- Around line 393-398: Avoid division-by-zero when computing percentages and
improvement: guard uses of totalUsers and totalProfitableOrig before dividing
(the console.log lines that compute percentages with (totalProfitableOrig /
totalUsers) and (totalProfitableDyn / totalUsers) and the improvement
calculation using (totalProfitableDyn - totalProfitableOrig) /
totalProfitableOrig). If totalUsers is zero, print a safe placeholder (e.g.,
"N/A" or "0.0%") instead of performing the division; if totalProfitableOrig is
zero, skip or format the improvement as "N/A" (or show the absolute delta only)
rather than dividing by zero. Update the blocks that compute the percentage
strings and the improvement variable (references: totalProfitableOrig,
totalProfitableDyn, totalUsers, improvement) to perform these guards before
formatting and logging.

In `@scripts/simulation/unprofitableAnalysis.ts`:
- Around line 93-104: The final console.log divides grandTotalLoss by
unprofitableUsers.length which can be zero; guard that division by computing an
averageLoss variable like avgPerUser = unprofitableUsers.length ? grandTotalLoss
/ unprofitableUsers.length : 0 and use that in the TOTAL line formatting so you
never produce NaN; update the TOTAL console.log to use avgPerUser (formatted
toFixed(2) and padStart as before) and reference grandTotalLoss and
unprofitableUsers in the same scope.
- Around line 45-47: The calculation of utilization can divide by zero when
tierLimits[limitStr] is missing (limit becomes 0), causing Infinity; update the
logic around const limit = tierLimits[limitStr] || 0 and the utilization
calculation to guard against zero/NaN: compute deposited from depositedStr as
before, check if limit is a positive number (e.g., > 0) before doing (deposited
/ limit) * 100 and otherwise set utilization to 0 (or null/NaN per project
convention), mirroring the safe handling used in analyzeUtilization.ts and
ensure any downstream code can handle the fallback value.

In `@src/facets/DiamondManagerFacet.sol`:
- Around line 164-185: The emitted MiningPassFeesUpdated from
setMiningPassTierFee doesn't include which tier changed; update the contract to
emit a tier-aware event: either add a new event like
MiningPassTierFeeUpdated(uint256 indexed tier, uint256 fee) and emit it from
setMiningPassTierFee, or change the existing MiningPassFeesUpdated signature to
include an indexed tier and emit that with (tier, fee); ensure you update any
places that listen for MiningPassFeesUpdated and adjust tests to expect the new
event and payload, while keeping the storage write to
s.miningPassTierToFee[tier] as-is.
- Around line 187-195: The new floor set by setMiningPassFeeFloor can invalidate
existing tier fees in s.miningPassTierToFee, so modify setMiningPassFeeFloor to
either (A) iterate all existing mining pass tiers (the same index/collection
used by setMiningPassFees/setMiningPassTierFee) and revert if any stored fee <
floorBps, or (B) add a runtime clamp in the purchase path (where mining pass
price is read) to use max(storedFee, s.miningPassFeeFloorBps). Pick one
approach: for write-time validation, loop the stored tiers in
setMiningPassFeeFloor and revert with DiamondManagerFacet__Invalid_Input() if
any fee < floorBps; for runtime safety, change the purchase logic to compute
price = max(s.miningPassTierToFee[tier], s.miningPassFeeFloorBps) before
charging. Ensure you reference and update the same storage keys
(s.miningPassTierToFee, s.miningPassFeeFloorBps) and related functions
(setMiningPassFees, setMiningPassTierFee, and the purchase function)
accordingly.
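Option (A), write-time validation, can be sketched off-chain in TypeScript (a stand-in for the Solidity loop; names mirror the storage fields described above):

```typescript
// Sketch of write-time floor validation: before accepting a new floor (in
// basis points of each tier's base fee), check every stored tier fee and
// reject the update if any fee would fall below its new floor.
function setFeeFloorChecked(
    tierFees: Map<number, number>,
    baseFees: Map<number, number>,
    floorBps: number,
    totalShares = 10_000 // assumed bps denominator
): number {
    for (const [tier, fee] of tierFees) {
        const floor = ((baseFees.get(tier) ?? 0) * floorBps) / totalShares
        if (fee < floor) throw new Error(`tier ${tier}: fee ${fee} below new floor ${floor}`)
    }
    return floorBps // accepted
}
```

Option (B) is the runtime equivalent: read `Math.max(storedFee, floor)` at purchase time instead of validating on write.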

In `@src/libraries/AppStorage.sol`:
- Around line 80-83: The two new storage fields baseMiningPassTierToFee and
miningPassFeeFloorBps were inserted mid-struct in AppStorage causing a storage
layout collision; move these declarations out of the MINING PASS/GENERAL region
and append them after the last existing storage variable isSeasonClaimed in the
AppStorage struct so they occupy new slots, update any nearby comments to
reflect their new position, and ensure no other fields are reordered or removed.
🧹 Nitpick comments (8)
package.json (1)

71-73: viem should be a devDependency, not a production dependency.

viem is only consumed by simulation/analysis scripts under scripts/simulation/. Placing it in dependencies means it will be installed by consumers of this package. Move it to devDependencies.

Proposed fix
-  "dependencies": {
-    "viem": "2.45.1"
-  }
+  "devDependencies": {
+    ...
+    "viem": "2.45.1",
+    ...
+  }
scripts/simulation/exportSeasonAnalysis.ts (1)

91-110: No HTTP status check before parsing the response.

If the subgraph returns a non-200 response, response.json() may throw or return unexpected data. Check response.ok before parsing.

Proposed fix
             const response = await fetch(SUBGRAPH_URL, {
                 method: 'POST',
                 headers: { 'Content-Type': 'application/json' },
                 body: JSON.stringify({ query }),
             })
+            if (!response.ok) {
+                console.log(`Subgraph returned ${response.status}, stopping pagination.`)
+                hasMore = false
+                continue
+            }
             const data = await response.json()
scripts/simulation/miningPassPricing.ts (2)

5-5: Unused constant VAPE_TOKEN.

VAPE_TOKEN is declared but never referenced in this file.


198-212: Default referenceVAPEPrice = 1.0 is misleading.

The constant REFERENCE_VAPE_PRICE is 0.606, but the function defaults to 1.0 if the caller omits the parameter. This inconsistency could produce incorrect pricing if the function is later called without an explicit reference price.

Proposed fix
 function calculateSimpleDynamicPrice(
     tier: number,
     vapePrice: number,
-    referenceVAPEPrice: number = 1.0
+    referenceVAPEPrice: number = REFERENCE_VAPE_PRICE
 ): { originalPrice: number; dynamicPrice: number; multiplier: number } {
scripts/simulation/miningPassROIAnalysis.ts (1)

104-132: fetchUsersForSeason is limited to 1000 users with no pagination.

Unlike the paginated fetchAllUsersForSeason in exportSeasonAnalysis.ts, this function fetches at most 1000 users. If a season has more participants, the analysis will be based on incomplete data, silently skewing ROI metrics.

Consider adding pagination or at least logging a warning when the result hits the limit.

Proposed minimal warning
         const data = await response.json()
-        return data.data?.users?.items || []
+        const items = data.data?.users?.items || []
+        if (items.length === 1000) {
+            console.warn(`⚠️  Season ${seasonId}: hit 1000 user limit — results may be incomplete.`)
+        }
+        return items
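A paginated variant, mirroring the pattern the review attributes to exportSeasonAnalysis.ts, can isolate the loop from the subgraph details (the helper name is hypothetical):

```typescript
// Hypothetical pagination helper: the page fetcher is injected, so the
// subgraph query stays outside the loop. Pages are requested until a
// short page signals the end of the result set.
async function fetchAllPages<T>(
    fetchPage: (skip: number, pageSize: number) => Promise<T[]>,
    pageSize = 1000
): Promise<T[]> {
    const all: T[] = []
    let skip = 0
    while (true) {
        const items = await fetchPage(skip, pageSize)
        all.push(...items)
        if (items.length < pageSize) break // short page: last page reached
        skip += pageSize
    }
    return all
}
```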
src/facets/DiamondManagerFacet.sol (1)

456-499: Getters lack tier bounds validation, unlike their setter counterparts.

getMiningPassTierFee, getBaseMiningPassTierFee, getMiningPassTierFloorFee, and getMiningPassTierDepositLimit all accept an unbounded tier parameter and silently return 0 (default storage) for out-of-range values. The setters enforce tier <= 10. Off-chain consumers may misinterpret a 0 return for tier 99 as valid data.

Consider adding a tier > 10 revert (matching setMiningPassTierFee) or documenting the behavior explicitly.

♻️ Example bounds check for getters
 function getMiningPassTierFee(uint256 tier) external view returns (uint256) {
+    if (tier > 10) revert DiamondManagerFacet__InvalidTier();
     return s.miningPassTierToFee[tier];
 }

 function getBaseMiningPassTierFee(uint256 tier) external view returns (uint256) {
+    if (tier > 10) revert DiamondManagerFacet__InvalidTier();
     return s.baseMiningPassTierToFee[tier];
 }

 function getMiningPassTierFloorFee(uint256 tier) external view returns (uint256) {
+    if (tier > 10) revert DiamondManagerFacet__InvalidTier();
     uint256 baseFee = s.baseMiningPassTierToFee[tier];
     return (baseFee * s.miningPassFeeFloorBps) / TOTAL_SHARES;
 }

 function getMiningPassTierDepositLimit(uint256 tier) external view returns (uint256) {
+    if (tier > 10) revert DiamondManagerFacet__InvalidTier();
     return s.miningPassTierToDepositLimit[tier];
 }
test/foundry/Facets/MiningPassDynamicPricing.t.sol (2)

98-120: Consider verifying event emissions in setter tests.

The setMiningPassFees, setMiningPassTierFee, and setMiningPassFeeFloor functions all emit events, but none of the tests use vm.expectEmit to verify them. Adding event assertions (especially for MiningPassFeesUpdated and MiningPassFeeFloorUpdated) would strengthen coverage and catch regressions in event parameters.


140-153: Floor enforcement test for bulk set relies on tier 1 reverting first — consider documenting or adding an isolated test.

This test constructs fees at 24% across all tiers. The revert will trigger on tier 1 (the first non-zero tier in the loop). If the loop order or skip logic changes, the test may become misleading. The behavior is correct today, but it's worth noting this is an implicit ordering dependency.

Comment on lines +1 to +5 (.vscode/settings.json)

```json
{
"python-envs.defaultEnvManager": "ms-python.python:conda",
"python-envs.defaultPackageManager": "ms-python.python:conda",
"python-envs.pythonProjects": []
}
```

(No newline at end of file.)

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

```shell
# Check if .vscode is in .gitignore
if [ -f .gitignore ]; then
  echo "=== .gitignore content (searching for .vscode) ==="
  rg -n '\.vscode' .gitignore || echo "No .vscode found in .gitignore"
  echo ""
  echo "=== Full .gitignore content (first 50 lines) ==="
  head -50 .gitignore
else
  echo ".gitignore file not found"
fi
```

Repository: VaporFi/liquid-mining

Length of output: 461


🏁 Script executed:

```shell
# Check if there are other .vscode files in the repo
echo "=== All .vscode files in the repository ==="
find . -path './.git' -prune -o -type f -path '*/.vscode/*' -print 2>/dev/null | head -20
```

Repository: VaporFi/liquid-mining

Length of output: 132


🏁 Script executed:

```shell
# Check git diff to see what was actually added in this commit/PR
echo "=== Check if .vscode/settings.json is a new file ==="
git ls-files .vscode/ 2>/dev/null || echo "No .vscode files tracked in git"
```

Repository: VaporFi/liquid-mining

Length of output: 139


🏁 Script executed:

```shell
# Check for Python files in the repository
echo "=== Python files in repository ==="
find . -path './.git' -prune -o -type f \( -name "*.py" \) -print 2>/dev/null | head -20
```

Repository: VaporFi/liquid-mining

Length of output: 137


🏁 Script executed:

```shell
# Check the PR/recent git context to understand the scope
echo "=== Repository structure (top level) ==="
ls -la | grep -v '^d' | head -20
echo ""
echo "=== Directory structure ==="
find . -maxdepth 2 -type d ! -path '*/\.git*' ! -path '*/.vscode*' ! -path '*/node_modules*' | sort | head -30
```

Repository: VaporFi/liquid-mining

Length of output: 1712


Add .vscode/ to .gitignore and remove .vscode/settings.json from this PR.

The repository already ignores IDE configurations (.idea is in .gitignore). VSCode workspace settings should follow the same pattern rather than being committed. With only one Python script in a Solidity-focused project, this appears to be personal developer configuration, not a project requirement.


Comment on lines +38 to +40 (scripts/simulation/analyzeUtilization.ts)

```typescript
const limit = tierLimits[limitStr] || 0
const deposited = parseFloat(depositedStr.replace(/,/g, ''))
const pctOfLimit = ((deposited / limit) * 100).toFixed(1)
```

⚠️ Potential issue | 🟡 Minor

Division by zero when limitStr doesn't match any key in tierLimits.

When tierLimits[limitStr] is undefined, limit becomes 0, and the division on line 40 produces Infinity/NaN. Guard against this.

Proposed fix
     const limit = tierLimits[limitStr] || 0
+    if (limit === 0) continue
     const deposited = parseFloat(depositedStr.replace(/,/g, ''))
     const pctOfLimit = ((deposited / limit) * 100).toFixed(1)

Comment on lines +67 to +69 (scripts/simulation/analyzeUtilization.ts)

```typescript
const totalNotProfit = notProfitableMaxed + notProfitableNotMaxed
const pctNotMaxed = ((notProfitableNotMaxed / totalNotProfit) * 100).toFixed(1)
console.log(`\n📊 Of ${totalNotProfit} unprofitable users, ${notProfitableNotMaxed} (${pctNotMaxed}%) did NOT max their tier limit`)
```

⚠️ Potential issue | 🟡 Minor

Division by zero if no unprofitable users exist.

If totalNotProfit is 0, the percentage calculation on line 68 will produce NaN.

Proposed fix
 const totalNotProfit = notProfitableMaxed + notProfitableNotMaxed
-const pctNotMaxed = ((notProfitableNotMaxed / totalNotProfit) * 100).toFixed(1)
-console.log(`\n📊 Of ${totalNotProfit} unprofitable users, ${notProfitableNotMaxed} (${pctNotMaxed}%) did NOT max their tier limit`)
+if (totalNotProfit > 0) {
+    const pctNotMaxed = ((notProfitableNotMaxed / totalNotProfit) * 100).toFixed(1)
+    console.log(`\n📊 Of ${totalNotProfit} unprofitable users, ${notProfitableNotMaxed} (${pctNotMaxed}%) did NOT max their tier limit`)
+} else {
+    console.log(`\n📊 No unprofitable users found.`)
+}

Comment on lines +230 to +242 (scripts/simulation/exportSeasonAnalysis.ts)

```typescript
rows.push(['Total Original Fees Collected', `$${totalOriginalFees.toFixed(2)}`])
rows.push(['Total Dynamic Fees (Proposed)', `$${totalDynamicFees.toFixed(2)}`])
rows.push(['Fee Reduction', `${((1 - totalDynamicFees / totalOriginalFees) * 100).toFixed(1)}%`])
rows.push([])
rows.push(['Total VAPE Value Distributed', `$${totalVAPEValue.toFixed(2)}`])
rows.push([])
rows.push(['PROFITABILITY ANALYSIS (Paid Pass Users Only)'])
rows.push(['Original Pricing - Profitable Users', `${originalProfitable} / ${usersWithPaidPass} (${(originalProfitable / usersWithPaidPass * 100).toFixed(1)}%)`])
rows.push(['Dynamic Pricing - Profitable Users', `${dynamicProfitable} / ${usersWithPaidPass} (${(dynamicProfitable / usersWithPaidPass * 100).toFixed(1)}%)`])
rows.push(['Improvement', `+${dynamicProfitable - originalProfitable} users (+${((dynamicProfitable - originalProfitable) / usersWithPaidPass * 100).toFixed(1)}%)`])
rows.push([])
rows.push(['Average Original ROI', `${(totalVAPEValue / totalOriginalFees).toFixed(2)}x`])
rows.push(['Average Dynamic ROI', `${(totalVAPEValue / totalDynamicFees).toFixed(2)}x`])
```

⚠️ Potential issue | 🟡 Minor

Multiple division-by-zero risks in summary statistics.

Lines 232, 237–239, and 241–242 all divide by totalOriginalFees, usersWithPaidPass, or totalDynamicFees. If no paid-pass users exist, all these values are 0, producing NaN/Infinity in the CSV output. Guard these divisions.

Example guard
-    rows.push(['Fee Reduction', `${((1 - totalDynamicFees / totalOriginalFees) * 100).toFixed(1)}%`])
+    rows.push(['Fee Reduction', totalOriginalFees > 0 ? `${((1 - totalDynamicFees / totalOriginalFees) * 100).toFixed(1)}%` : 'N/A'])

Apply similar guards for usersWithPaidPass divisions on lines 237–239, 241–242, and the console logs on 277–279.


Comment on lines +188 to +196 (scripts/simulation/miningPassPricing.ts)

```typescript
tier,
originalPriceUSDC: originalPrice,
dynamicPriceUSDC: Math.max(dynamicPriceUSDC, 0.01),
expectedVAPE: expectedVAPENumber,
expectedValueUSD,
estimatedROI: actualROI,
depositLimitVPND: tier === 10 ? 'Unlimited (using 50M for calc)' : formatUnits(depositLimit, 18),
}
}
```

⚠️ Potential issue | 🟡 Minor

Tier 0 (free) users incorrectly get a minimum fee of $0.01.

Math.max(dynamicPriceUSDC, 0.01) on line 190 forces a non-zero price even for tier 0, where originalPrice is 0 and users shouldn't be charged. The same issue exists in calculateSimpleDynamicPrice at line 209.

Proposed fix for both functions
 // In calculateDynamicPrice (line 190):
-        dynamicPriceUSDC: Math.max(dynamicPriceUSDC, 0.01),
+        dynamicPriceUSDC: originalPrice === 0 ? 0 : Math.max(dynamicPriceUSDC, 0.01),

 // In calculateSimpleDynamicPrice (line 209):
-        dynamicPrice: Math.max(dynamicPrice, 0.01),
+        dynamicPrice: originalPrice === 0 ? 0 : Math.max(dynamicPrice, 0.01),

Comment on lines +45 to +47 (scripts/simulation/unprofitableAnalysis.ts)

```typescript
const limit = tierLimits[limitStr] || 0
const deposited = parseFloat(depositedStr.replace(/,/g, ''))
const utilization = (deposited / limit) * 100
```

⚠️ Potential issue | 🟡 Minor

Division by zero when limit is 0.

Same pattern as in analyzeUtilization.ts: if tierLimits[limitStr] doesn't match, limit is 0 and utilization becomes Infinity.

Proposed fix
     const limit = tierLimits[limitStr] || 0
+    if (limit === 0) continue
     const deposited = parseFloat(depositedStr.replace(/,/g, ''))
📝 Committable suggestion


Suggested change
Before:
const limit = tierLimits[limitStr] || 0
const deposited = parseFloat(depositedStr.replace(/,/g, ''))
const utilization = (deposited / limit) * 100
After:
const limit = tierLimits[limitStr] || 0
if (limit === 0) continue
const deposited = parseFloat(depositedStr.replace(/,/g, ''))
const utilization = (deposited / limit) * 100
🤖 Prompt for AI Agents
In `@scripts/simulation/unprofitableAnalysis.ts` around lines 45-47: the
calculation of utilization can divide by zero when tierLimits[limitStr] is
missing (limit becomes 0), causing Infinity; update the logic around const limit
= tierLimits[limitStr] || 0 and the utilization calculation to guard against
zero/NaN: compute deposited from depositedStr as before, check if limit is a
positive number (e.g., > 0) before doing (deposited / limit) * 100 and otherwise
set utilization to 0 (or null/NaN per project convention), mirroring the safe
handling used in analyzeUtilization.ts and ensure any downstream code can handle
the fallback value.
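The guarded computation might look like this as a standalone helper; the names mirror the discussion (limit, depositedStr), but the helper itself is illustrative:

```typescript
// Illustrative guard for the division-by-zero case flagged above: an unknown
// tier limit maps to 0, so return 0% utilization instead of Infinity, and
// treat an unparseable deposit string the same way.
function computeUtilization(limit: number, depositedStr: string): number {
  const deposited = parseFloat(depositedStr.replace(/,/g, ''))
  if (!(limit > 0) || Number.isNaN(deposited)) return 0
  return (deposited / limit) * 100
}
```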

Comment on lines +93 to +104
let grandTotalLoss = 0
for (const tier of Object.keys(tierStats).map(Number).sort((a, b) => a - b)) {
const stats = tierStats[tier]
const avgROI = (stats.rois.reduce((a, b) => a + b, 0) / stats.count).toFixed(2)
const avgUtil = (stats.utils.reduce((a, b) => a + b, 0) / stats.count).toFixed(1)
const avgLoss = (stats.totalLoss / stats.count).toFixed(2)
grandTotalLoss += stats.totalLoss
console.log(` ${tier} | ${String(stats.count).padStart(2)} | $${stats.totalLoss.toFixed(2).padStart(7)} | $${avgLoss.padStart(5)} | ${avgROI}x | ${avgUtil}%`)
}

console.log('─'.repeat(70))
console.log(`TOTAL | ${unprofitableUsers.length} | $${grandTotalLoss.toFixed(2).padStart(7)} | $${(grandTotalLoss / unprofitableUsers.length).toFixed(2).padStart(5)} |`)

⚠️ Potential issue | 🟡 Minor

Division by zero if unprofitableUsers is empty.

Line 104 divides grandTotalLoss by unprofitableUsers.length. If the array is empty (e.g., all users are profitable), this produces NaN.

Proposed fix
-console.log(`TOTAL |   ${unprofitableUsers.length}  |   $${grandTotalLoss.toFixed(2).padStart(7)} |  $${(grandTotalLoss / unprofitableUsers.length).toFixed(2).padStart(5)} |`)
+const avgLossTotal = unprofitableUsers.length > 0 ? (grandTotalLoss / unprofitableUsers.length).toFixed(2) : '0.00'
+console.log(`TOTAL |   ${unprofitableUsers.length}  |   $${grandTotalLoss.toFixed(2).padStart(7)} |  $${avgLossTotal.padStart(5)} |`)
📝 Committable suggestion


Suggested change
Before:
let grandTotalLoss = 0
for (const tier of Object.keys(tierStats).map(Number).sort((a, b) => a - b)) {
const stats = tierStats[tier]
const avgROI = (stats.rois.reduce((a, b) => a + b, 0) / stats.count).toFixed(2)
const avgUtil = (stats.utils.reduce((a, b) => a + b, 0) / stats.count).toFixed(1)
const avgLoss = (stats.totalLoss / stats.count).toFixed(2)
grandTotalLoss += stats.totalLoss
console.log(` ${tier} | ${String(stats.count).padStart(2)} | $${stats.totalLoss.toFixed(2).padStart(7)} | $${avgLoss.padStart(5)} | ${avgROI}x | ${avgUtil}%`)
}
console.log('─'.repeat(70))
console.log(`TOTAL | ${unprofitableUsers.length} | $${grandTotalLoss.toFixed(2).padStart(7)} | $${(grandTotalLoss / unprofitableUsers.length).toFixed(2).padStart(5)} |`)
After:
let grandTotalLoss = 0
for (const tier of Object.keys(tierStats).map(Number).sort((a, b) => a - b)) {
const stats = tierStats[tier]
const avgROI = (stats.rois.reduce((a, b) => a + b, 0) / stats.count).toFixed(2)
const avgUtil = (stats.utils.reduce((a, b) => a + b, 0) / stats.count).toFixed(1)
const avgLoss = (stats.totalLoss / stats.count).toFixed(2)
grandTotalLoss += stats.totalLoss
console.log(` ${tier} | ${String(stats.count).padStart(2)} | $${stats.totalLoss.toFixed(2).padStart(7)} | $${avgLoss.padStart(5)} | ${avgROI}x | ${avgUtil}%`)
}
console.log('─'.repeat(70))
const avgLossTotal = unprofitableUsers.length > 0 ? (grandTotalLoss / unprofitableUsers.length).toFixed(2) : '0.00'
console.log(`TOTAL | ${unprofitableUsers.length} | $${grandTotalLoss.toFixed(2).padStart(7)} | $${avgLossTotal.padStart(5)} |`)
🤖 Prompt for AI Agents
In `@scripts/simulation/unprofitableAnalysis.ts` around lines 93-104: the final
console.log divides grandTotalLoss by unprofitableUsers.length which can be
zero; guard that division by computing an averageLoss variable like avgPerUser =
unprofitableUsers.length ? grandTotalLoss / unprofitableUsers.length : 0 and use
that in the TOTAL line formatting so you never produce NaN; update the TOTAL
console.log to use avgPerUser (formatted toFixed(2) and padStart as before) and
reference grandTotalLoss and unprofitableUsers in the same scope.
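A minimal sketch of the empty-array guard, extracted as a helper for clarity (the helper name does not exist in the script):

```typescript
// Hypothetical helper: average loss per user, falling back to '0.00' when the
// unprofitable-user list is empty so the TOTAL row never prints NaN.
function averageLoss(totalLoss: number, userCount: number): string {
  return userCount > 0 ? (totalLoss / userCount).toFixed(2) : '0.00'
}
```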

Comment on lines +164 to +185
/// @notice Update a single mining pass tier fee
/// @param tier The tier to update (0-10)
/// @param fee The new fee in USDC (6 decimals)
function setMiningPassTierFee(uint256 tier, uint256 fee) external onlyOwner {
if (tier > 10) {
revert DiamondManagerFacet__InvalidTier();
}

uint256 baseFee = s.baseMiningPassTierToFee[tier];
uint256 floorFee = (baseFee * s.miningPassFeeFloorBps) / TOTAL_SHARES;

// Fee must be >= floor price (but tier 0 is always free)
if (tier > 0 && fee < floorFee) {
revert DiamondManagerFacet__FeeBelowFloor();
}

s.miningPassTierToFee[tier] = fee;

uint256[] memory fees = new uint256[](1);
fees[0] = fee;
emit MiningPassFeesUpdated(fees);
}

⚠️ Potential issue | 🟠 Major

MiningPassFeesUpdated event from single-tier update lacks tier identification.

When setMiningPassTierFee emits MiningPassFeesUpdated, it creates a 1-element array containing only the new fee. Off-chain consumers (indexers, UIs) cannot determine which tier was updated. This makes the event ambiguous — it's indistinguishable from a bulk update with 1 entry, and the tier index is lost.

Consider either:

  1. Adding a dedicated event like MiningPassTierFeeUpdated(uint256 tier, uint256 fee), or
  2. Including the tier in the existing event signature, e.g., MiningPassFeesUpdated(uint256 indexed tier, uint256 fee) for single-tier updates.
🤖 Prompt for AI Agents
In `@src/facets/DiamondManagerFacet.sol` around lines 164-185: the emitted
MiningPassFeesUpdated from setMiningPassTierFee doesn't include which tier
changed; update the contract to emit a tier-aware event: either add a new event
like MiningPassTierFeeUpdated(uint256 indexed tier, uint256 fee) and emit it
from setMiningPassTierFee, or change the existing MiningPassFeesUpdated
signature to include an indexed tier and emit that with (tier, fee); ensure you
update any places that listen for MiningPassFeesUpdated and adjust tests to
expect the new event and payload, while keeping the storage write to
s.miningPassTierToFee[tier] as-is.

Comment on lines +187 to +195
/// @notice Update the floor price percentage for mining pass fees
/// @param floorBps Floor price in basis points (e.g., 2500 = 25%)
function setMiningPassFeeFloor(uint256 floorBps) external onlyOwner {
if (floorBps > TOTAL_SHARES) {
revert DiamondManagerFacet__Invalid_Input();
}
s.miningPassFeeFloorBps = floorBps;
emit MiningPassFeeFloorUpdated(floorBps);
}

⚠️ Potential issue | 🟠 Major

Raising the floor does not retroactively validate existing tier fees.

If setMiningPassFeeFloor increases floorBps, previously stored fees in s.miningPassTierToFee may now be below the new floor. Since the floor is only enforced at fee-setting time (not at purchase time), purchases will continue at below-floor prices until setMiningPassFees or setMiningPassTierFee is called again.

Consider either:

  1. Validating all current fees against the new floor within this function and reverting if any would be violated, or
  2. Enforcing the floor at purchase time as well (i.e., max(storedFee, floorFee)).

Option 1 keeps the check at write time while preventing inconsistent state. Option 2 provides a runtime safety net.

🤖 Prompt for AI Agents
In `@src/facets/DiamondManagerFacet.sol` around lines 187-195: the new floor set
by setMiningPassFeeFloor can invalidate existing tier fees in
s.miningPassTierToFee, so modify setMiningPassFeeFloor to either (A) iterate all
existing mining pass tiers (the same index/collection used by
setMiningPassFees/setMiningPassTierFee) and revert if any stored fee < floorBps,
or (B) add a runtime clamp in the purchase path (where mining pass price is
read) to use max(storedFee, s.miningPassFeeFloorBps). Pick one approach: for
write-time validation, loop the stored tiers in setMiningPassFeeFloor and revert
with DiamondManagerFacet__Invalid_Input() if any fee < floorBps; for runtime
safety, change the purchase logic to compute price =
max(s.miningPassTierToFee[tier], s.miningPassFeeFloorBps) before charging.
Ensure you reference and update the same storage keys (s.miningPassTierToFee,
s.miningPassFeeFloorBps) and related functions (setMiningPassFees,
setMiningPassTierFee, and the purchase function) accordingly.
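The write-time validation in option (A) can also be rehearsed off-chain before sending the transaction. A minimal sketch, assuming the facet's `baseFee * floorBps / TOTAL_SHARES` arithmetic with a 10,000 basis-point denominator; the helper name and example values are hypothetical:

```typescript
// Mirrors the on-chain floor check so an operator can audit stored fees
// before raising miningPassFeeFloorBps. Fee values are example 6-decimal USDC
// amounts; this is not the facet's actual code.
const TOTAL_SHARES = 10_000n // basis-point denominator, as in the facet

function feesBelowFloor(baseFees: bigint[], currentFees: bigint[], floorBps: bigint): number[] {
  const violations: number[] = []
  for (let tier = 0; tier < baseFees.length; tier++) {
    const floorFee = (baseFees[tier] * floorBps) / TOTAL_SHARES
    // Tier 0 is always free, matching the facet's `tier > 0` guard.
    if (tier > 0 && currentFees[tier] < floorFee) violations.push(tier)
  }
  return violations
}
```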

Comment on lines 80 to 83
/// @dev Base fees (original fees set at deployment, used for floor price calculation)
mapping(uint256 => uint256) baseMiningPassTierToFee;
/// @dev Floor price percentage in basis points (e.g., 2500 = 25%)
uint256 miningPassFeeFloorBps;

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

# First, let's find and read the AppStorage.sol file to verify the struct layout
fd AppStorage.sol --type f

Repository: VaporFi/liquid-mining

Length of output: 93


🏁 Script executed:

# Once we have the file, let's check its full structure around the storage fields
git ls-files | grep -i appstorage

Repository: VaporFi/liquid-mining

Length of output: 264


🏁 Script executed:

# Search for deployment references and Season 30 mentions
rg "Season 30|0x[A-Fa-f0-9]{40}" --type-add 'sol:*.sol' --type-add 'ts:*.ts' --type-add 'json:*.json' -g '!node_modules' | head -50

Repository: VaporFi/liquid-mining

Length of output: 50378


🏁 Script executed:

# Read AppStorage.sol to see the struct definition and field positions
wc -l src/libraries/AppStorage.sol && head -120 src/libraries/AppStorage.sol | tail -50

Repository: VaporFi/liquid-mining

Length of output: 1950


🏁 Script executed:

# Get the full struct around the problematic lines
sed -n '50,110p' src/libraries/AppStorage.sol

Repository: VaporFi/liquid-mining

Length of output: 1899


Critical: Storage layout collision — new fields must be appended at the end of the struct, not inserted in the middle.

These two new fields (baseMiningPassTierToFee, miningPassFeeFloorBps) are inserted between the MINING PASS and GENERAL sections. In the Diamond proxy pattern, all facets share a single storage layout via AppStorage. Since this diamond is already deployed (confirmed by active Season 30 data and references in multiple script files), inserting fields mid-struct shifts the storage slots of every subsequent field (depositToken, rewardToken, feeToken, stratosphereAddress, reentrancyGuardStatus, emissionsManager, gelatoExecutor, isSeasonClaimed), causing silent data corruption on upgrade.

New storage fields must be appended after the last field in the struct (after isSeasonClaimed).

Proposed fix
    address[] miningPassFeeReceivers;
    uint256[] miningPassFeeReceiversShares;
-   /// @dev Base fees (original fees set at deployment, used for floor price calculation)
-   mapping(uint256 => uint256) baseMiningPassTierToFee;
-   /// @dev Floor price percentage in basis points (e.g., 2500 = 25%)
-   uint256 miningPassFeeFloorBps;
    ///////////////
    /// GENERAL ///
    ///////////////
    address depositToken;
    ...
    mapping(uint256 => bool) isSeasonClaimed;
+   ///////////////////////////
+   /// MINING PASS PRICING ///
+   ///////////////////////////
+   /// @dev Base fees (original fees set at deployment, used for floor price calculation)
+   mapping(uint256 => uint256) baseMiningPassTierToFee;
+   /// @dev Floor price percentage in basis points (e.g., 2500 = 25%)
+   uint256 miningPassFeeFloorBps;
}
🤖 Prompt for AI Agents
In `@src/libraries/AppStorage.sol` around lines 80-83: the two new storage
fields baseMiningPassTierToFee and miningPassFeeFloorBps were inserted
mid-struct in AppStorage causing a storage layout collision; move these
declarations out of the MINING PASS/GENERAL region and append them after the
last existing storage variable isSeasonClaimed in the AppStorage struct so they
occupy new slots, update any nearby comments to reflect their new position, and
ensure no other fields are reordered or removed.
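Why mid-struct insertion corrupts state can be shown with a toy model of Solidity's rule that struct fields occupy consecutive storage slots in declaration order (a mapping reserves one slot for its head). The field lists below are abbreviated from AppStorage, and slot numbers are relative to the struct, not the diamond's absolute slots:

```typescript
// Toy model of Solidity's sequential slot assignment: each field takes the
// next slot in declaration order, so inserting a field mid-struct shifts
// every later field, while appending leaves existing slots untouched.
function assignSlots(fields: string[], base = 0): Map<string, number> {
  return new Map(fields.map((f, i) => [f, base + i]))
}

const before = assignSlots(['miningPassTierToFee', 'depositToken', 'rewardToken'])
// Mid-struct insert: depositToken silently moves from slot 1 to slot 2.
const inserted = assignSlots(['miningPassTierToFee', 'baseMiningPassTierToFee', 'depositToken', 'rewardToken'])
// Append at the end: every pre-existing field keeps its slot.
const appended = assignSlots(['miningPassTierToFee', 'depositToken', 'rewardToken', 'baseMiningPassTierToFee'])
```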

Comment on lines 55 to 57
mapping(uint256 => uint256) __deprecated_unlockFeeDiscount; // slot 8
mapping(uint256 => uint256) __deprecated_depositDiscount; // slot 9
mapping(uint256 => uint256) __deprecated_restakeDiscount; // slot 10
Member Author

comment for self: try to remove it while fixing collision on fuji

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 5

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
deployments/DepositFacet.json (1)

84-159: ⚠️ Potential issue | 🟡 Minor

Hardhat ABI is missing the DepositFacet__Less_Than_One_Day error defined in the source code.

The source code defines this error at line 17 and uses it at line 77, but the hardhat artifact ABI is missing it. This inconsistency will cause decoding failures when this error is reverted in local tests. Update the hardhat artifact to match the current source code.

deployments/MiningPassFacet.json (1)

133-234: ⚠️ Potential issue | 🟠 Major

Hardhat ABI is missing the new getMiningPassFee function and MiningPassFacet__InvalidFeeReceivers error.

The fuji and avalanche sections both include the new MiningPassFacet__InvalidFeeReceivers error and getMiningPassFee(address,uint256) function, but the hardhat section still has the old ABI without these entries. This means local development and Hardhat-based tests consuming this artifact won't have access to the new dynamic pricing interface.

Please regenerate or update the hardhat deployment artifact to match the fuji/avalanche ABI.

🤖 Fix all issues with AI agents
In `@deployments/MiningPassFacet.json`:
- Around lines 121-122: Fuji's deployed bytecode references storage slot 0x19
(depositToken) instead of 0x17 for s.miningPassTierToFee[_tier], indicating
compilation against an out-of-date AppStorage layout; recompile the
MiningPassFacet (symbol: s.miningPassTierToFee and the contract/source
MiningPassFacet) against the current AppStorage.sol, regenerate the Fuji
artifact (replace the bytecode in MiningPassFacet.json), ensure identical
compiler version, settings and ABI across networks, compare the resulting
storage layout mapping (slots for miningPassTierToFee and depositToken) with
Avalanche's artifact, and redeploy Fuji only after confirming on-chain storage
expectations match the new bytecode to avoid storage collisions.

In `@src/facets/DiamondManagerFacet.sol`:
- Around lines 188-205: the current setBaseMiningPassFees function updates
s.baseMiningPassTierToFee without ensuring existing s.miningPassTierToFee
entries still meet the new floor (computed as baseFee * floorBps / 10000),
leading to inconsistent state; update setBaseMiningPassFees to, for each tier i,
compute uint256 floor = fees[i] * s.floorBps / 10000 and either revert if
s.miningPassTierToFee[i] < floor (preferred) or set s.miningPassTierToFee[i] =
floor to enforce it immediately; reference the functions/vars
setBaseMiningPassFees, s.baseMiningPassTierToFee, s.miningPassTierToFee,
floorBps and keep behavior consistent with
setMiningPassTierFee/setMiningPassFees expectations.

In `@tasks/diamondInfo.ts`:
- Around lines 114-117: the destructured variable name depositTimestamp is
misleading because dm.getUserDepositAmount(user, seasonId) returns
(depositAmount, depositPoints); update the destructuring in tasks/diamondInfo.ts
to use depositPoints (or drop the second element with an underscore if unused)
so it matches the returned semantics from getUserDepositAmount and avoid
confusion for future maintainers; ensure you reference the function name
getUserDepositAmount and the variables depositAmount/depositPoints when making
the change.

In `@tasks/startSeason.ts`:
- Around lines 5-7: the local constant FLOOR_BPS (2500) in tasks/startSeason.ts
can drift from the on-chain miningPassFeeFloorBps; update the flow to fetch the
canonical value from the contract via getMiningPassFeeFloorBps() and use that
returned value in calculateDynamicFees instead of the hardcoded FLOOR_BPS; make
sure to call the contract getter before running startSeason logic (or pass the
fetched floor into calculateDynamicFees), handle the async call and any possible
failure, and remove or deprecate the hardcoded FLOOR_BPS constant so all fee
floor checks use the on-chain value.

In `@test/foundry/fork/FujiForkMiningPass.t.sol`:
- Around lines 126-151: _upgradeFacets currently only uses
IDiamondCut.FacetCutAction.Replace for newDMF/newMPF so any brand-new selectors
(the 10 dynamic-pricing functions like setMiningPassFees, setBaseMiningPassFees,
setMiningPassFeeFloor, getMiningPassTierFee, getBaseMiningPassTierFee,
getMiningPassTierFloorFee, getMiningPassFeeFloorBps, getAllMiningPassFees,
getMiningPassTierDepositLimit, setMiningPassTierFee) are never registered and
later calls in _reinitializeMiningPassData and tests will revert. Fix: update
_upgradeFacets to include an additional IDiamondCut.FacetCut with action
IDiamondCut.FacetCutAction.Add that registers the new function selectors from
the new facet instances (newDMF/newMPF); adjust the cuts array size (e.g., from
2 to 3) and populate the Add cut with the bytes4[] of the new selectors for
whichever facet defines those functions so the diamond actually adds those
functions instead of only replacing existing selectors.
🧹 Nitpick comments (5)
scripts/initDynamicPricing.ts (1)

19-47: Consider adding an idempotency check before overwriting on-chain state.

Re-running this script will silently overwrite existing pricing configuration. For a production initialization script, consider checking current values first (e.g., reading getBaseMiningPassTierFee(1)) and prompting or aborting if already set.

src/facets/DiamondManagerFacet.sol (1)

472-519: Getter functions don't validate tier bounds.

getMiningPassTierFee, getBaseMiningPassTierFee, getMiningPassTierFloorFee, and getMiningPassTierDepositLimit will silently return 0 (or a computed 0) for any tier > 10. This is harmless for view functions but could mislead off-chain consumers into thinking tier 11+ exists with zero fees.

Consider adding the same if (tier > 10) revert DiamondManagerFacet__InvalidTier(); guard used in setMiningPassTierFee.

tasks/startSeason.ts (2)

9-17: fetchVAPEPrice has no timeout or retry — can hang indefinitely.

If CoinGecko is slow or unreachable, the task will block forever. Consider adding an AbortController timeout:

Proposed fix
 async function fetchVAPEPrice(): Promise<number> {
+    const controller = new AbortController()
+    const timeout = setTimeout(() => controller.abort(), 10_000)
     const response = await fetch(
-        'https://api.coingecko.com/api/v3/simple/price?ids=vaporfi&vs_currencies=usd'
+        'https://api.coingecko.com/api/v3/simple/price?ids=vaporfi&vs_currencies=usd',
+        { signal: controller.signal }
     )
+    clearTimeout(timeout)

108-131: No error handling around on-chain transactions.

If setMiningPassFees succeeds but startNewSeasonWithEndTimestamp reverts, the fees are updated but no season is started — leaving the system in an inconsistent state. Consider wrapping the transaction sequence in a try/catch with clear messaging about partial completion, or at minimum log which step failed.

tasks/diamondInfo.ts (1)

72-86: N+1 query pattern — 11 iterations each making 4 RPC calls.

Each tier iteration makes 4 separate contract calls. For a CLI info tool this is acceptable, but getAllMiningPassFees() is already available and could reduce calls. Consider batching where possible for better UX on slow RPC endpoints.

Comment on lines +188 to +205
/// @notice Update base mining pass fees for all tiers
/// @dev Base fees are used as the reference for floor price calculation
/// @param fees Array of 11 base fees for tiers 0-10 (in USDC with 6 decimals)
function setBaseMiningPassFees(uint256[] calldata fees) external onlyOwner {
if (fees.length != 11) {
revert DiamondManagerFacet__Invalid_Input();
}

for (uint256 i; i < 11;) {
s.baseMiningPassTierToFee[i] = fees[i];

unchecked {
i++;
}
}

emit BaseMiningPassFeesUpdated(fees);
}

⚠️ Potential issue | 🟠 Major

setBaseMiningPassFees can silently invalidate existing tier fees against the floor.

Raising base fees increases the computed floor (baseFee * floorBps / 10000), but the function does not check whether currently stored miningPassTierToFee[i] values still satisfy the new floor. A subsequent call to setMiningPassTierFee or setMiningPassFees would then be required to "fix" them, but until then the system is in an inconsistent state where active fees are below the floor.

Consider either:

  1. Validating existing tier fees against the new floor inside this function, or
  2. Enforcing the floor at purchase time via max(storedFee, floorFee).
🤖 Prompt for AI Agents
In `@src/facets/DiamondManagerFacet.sol` around lines 188-205: the current
setBaseMiningPassFees function updates s.baseMiningPassTierToFee without
ensuring existing s.miningPassTierToFee entries still meet the new floor
(computed as baseFee * floorBps / 10000), leading to inconsistent state; update
setBaseMiningPassFees to, for each tier i, compute uint256 floor = fees[i] *
s.floorBps / 10000 and either revert if s.miningPassTierToFee[i] < floor
(preferred) or set s.miningPassTierToFee[i] = floor to enforce it immediately;
reference the functions/vars setBaseMiningPassFees, s.baseMiningPassTierToFee,
s.miningPassTierToFee, floorBps and keep behavior consistent with
setMiningPassTierFee/setMiningPassFees expectations.

Comment on lines +114 to +117
const [depositAmount, depositTimestamp] = await dm.getUserDepositAmount(
user,
seasonId
)

⚠️ Potential issue | 🟡 Minor

Misleading destructured variable name: depositTimestamp is actually depositPoints.

getUserDepositAmount returns (depositAmount, depositPoints) (see DiamondManagerFacet.sol Line 349-351), not a timestamp. The variable depositTimestamp is unused so there's no functional bug, but it could confuse future maintainers.

Proposed fix
-            const [depositAmount, depositTimestamp] = await dm.getUserDepositAmount(
+            const [depositAmount, _depositPoints] = await dm.getUserDepositAmount(
                 user,
                 seasonId
             )
🤖 Prompt for AI Agents
In `@tasks/diamondInfo.ts` around lines 114-117: the destructured variable name
depositTimestamp is misleading because dm.getUserDepositAmount(user, seasonId)
returns (depositAmount, depositPoints); update the destructuring in
tasks/diamondInfo.ts to use depositPoints (or drop the second element with an
underscore if unused) so it matches the returned semantics from
getUserDepositAmount and avoid confusion for future maintainers; ensure you
reference the function name getUserDepositAmount and the variables
depositAmount/depositPoints when making the change.

Comment on lines +5 to +7
const REFERENCE_VAPE_PRICE = 0.606
const FLOOR_BPS = 2500 // 25%
const NUM_TIERS = 11

⚠️ Potential issue | 🟡 Minor

FLOOR_BPS is hardcoded and may diverge from on-chain value.

The task uses FLOOR_BPS = 2500 locally in calculateDynamicFees, but the on-chain contract has its own miningPassFeeFloorBps. If the on-chain floor is changed independently, the locally computed fees might pass the local floor check but fail the on-chain check (or vice versa — pass on-chain but not represent the intended floor locally).

Consider reading getMiningPassFeeFloorBps() from the contract instead:

Proposed fix
-const FLOOR_BPS = 2500 // 25%
 ...
+        // ── 2. Read base fees and floor from contract ────────────────
+        const floorBps = Number(await DiamondManagerFacet.getMiningPassFeeFloorBps())
         const baseFees: bigint[] = []
         ...
-        const dynamicFees = calculateDynamicFees(baseFees, currentVAPEPrice, FLOOR_BPS)
+        const dynamicFees = calculateDynamicFees(baseFees, currentVAPEPrice, floorBps)
🤖 Prompt for AI Agents
In `@tasks/startSeason.ts` around lines 5-7: the local constant FLOOR_BPS (2500)
in tasks/startSeason.ts can drift from the on-chain miningPassFeeFloorBps;
update the flow to fetch the canonical value from the contract via
getMiningPassFeeFloorBps() and use that returned value in calculateDynamicFees
instead of the hardcoded FLOOR_BPS; make sure to call the contract getter before
running startSeason logic (or pass the fetched floor into calculateDynamicFees),
handle the async call and any possible failure, and remove or deprecate the
hardcoded FLOOR_BPS constant so all fee floor checks use the on-chain value.
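A sketch of how the task could combine the on-chain floor with its scaling step, with `floorBps` fetched once via the proposed `getMiningPassFeeFloorBps()` getter and passed in. The price-ratio scaling below is an assumption standing in for the task's actual VAPE-price logic:

```typescript
// Illustrative dynamic-fee computation: scale each base fee by a price ratio
// (in basis points), then clamp paid tiers to the floor read from chain.
// Tier 0 stays free regardless of the floor.
function calculateDynamicFees(baseFees: bigint[], priceRatioBps: bigint, floorBps: bigint): bigint[] {
  return baseFees.map((base, tier) => {
    if (tier === 0) return 0n
    const scaled = (base * priceRatioBps) / 10_000n
    const floor = (base * floorBps) / 10_000n
    return scaled > floor ? scaled : floor
  })
}
```

Because the floor comes from the contract at call time, a governance change to `miningPassFeeFloorBps` is picked up automatically instead of drifting from a hardcoded 2500.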

Comment on lines 126 to 151
function _upgradeFacets() internal {
DiamondManagerFacet newDMF = new DiamondManagerFacet();
MiningPassFacet newMPF = new MiningPassFacet();

// Use the loupe to find existing facet addresses via known selectors
address oldDmfAddr = loupe.facetAddress(DiamondManagerFacet.setDepositToken.selector);
address oldMpfAddr = loupe.facetAddress(MiningPassFacet.purchase.selector);

// Get ALL registered selectors for those facets
bytes4[] memory dmfSelectors = loupe.facetFunctionSelectors(oldDmfAddr);
bytes4[] memory mpfSelectors = loupe.facetFunctionSelectors(oldMpfAddr);

console.log("DMF selectors to replace:", dmfSelectors.length);
console.log("MPF selectors to replace:", mpfSelectors.length);

IDiamondCut.FacetCut[] memory cuts = new IDiamondCut.FacetCut[](2);
cuts[0] = IDiamondCut.FacetCut({
facetAddress: address(newDMF), action: IDiamondCut.FacetCutAction.Replace, functionSelectors: dmfSelectors
});
cuts[1] = IDiamondCut.FacetCut({
facetAddress: address(newMPF), action: IDiamondCut.FacetCutAction.Replace, functionSelectors: mpfSelectors
});

vm.prank(owner);
IDiamondCut(DIAMOND).diamondCut(cuts, address(0), "");
}

⚠️ Potential issue | 🔴 Critical

_upgradeFacets only replaces existing selectors — new dynamic pricing selectors are never added.

FacetCutAction.Replace only updates the facet address for selectors already registered in the diamond. The 10 new functions (setMiningPassFees, setBaseMiningPassFees, setMiningPassFeeFloor, getMiningPassTierFee, getBaseMiningPassTierFee, getMiningPassTierFloorFee, getMiningPassFeeFloorBps, getAllMiningPassFees, getMiningPassTierDepositLimit, setMiningPassTierFee) don't exist on the deployed Fuji diamond, so they won't be registered. Calls to these functions in _reinitializeMiningPassData (line 168-170) and assertions (line 77, 81) will revert with "Diamond: Function does not exist".

You need an additional FacetCutAction.Add cut for the new selectors:

Proposed fix
-        IDiamondCut.FacetCut[] memory cuts = new IDiamondCut.FacetCut[](2);
+        // New selectors that don't exist on the current diamond
+        bytes4[] memory newSelectors = new bytes4[](10);
+        newSelectors[0] = DiamondManagerFacet.setMiningPassFees.selector;
+        newSelectors[1] = DiamondManagerFacet.setMiningPassTierFee.selector;
+        newSelectors[2] = DiamondManagerFacet.setMiningPassFeeFloor.selector;
+        newSelectors[3] = DiamondManagerFacet.getMiningPassTierFee.selector;
+        newSelectors[4] = DiamondManagerFacet.getBaseMiningPassTierFee.selector;
+        newSelectors[5] = DiamondManagerFacet.getMiningPassTierFloorFee.selector;
+        newSelectors[6] = DiamondManagerFacet.getMiningPassFeeFloorBps.selector;
+        newSelectors[7] = DiamondManagerFacet.getAllMiningPassFees.selector;
+        newSelectors[8] = DiamondManagerFacet.getMiningPassTierDepositLimit.selector;
+        newSelectors[9] = DiamondManagerFacet.setBaseMiningPassFees.selector;
+
+        IDiamondCut.FacetCut[] memory cuts = new IDiamondCut.FacetCut[](3);
         cuts[0] = IDiamondCut.FacetCut({
             facetAddress: address(newDMF), action: IDiamondCut.FacetCutAction.Replace, functionSelectors: dmfSelectors
         });
         cuts[1] = IDiamondCut.FacetCut({
             facetAddress: address(newMPF), action: IDiamondCut.FacetCutAction.Replace, functionSelectors: mpfSelectors
         });
+        cuts[2] = IDiamondCut.FacetCut({
+            facetAddress: address(newDMF), action: IDiamondCut.FacetCutAction.Add, functionSelectors: newSelectors
+        });
📝 Committable suggestion


Suggested change
Before:
function _upgradeFacets() internal {
DiamondManagerFacet newDMF = new DiamondManagerFacet();
MiningPassFacet newMPF = new MiningPassFacet();
// Use the loupe to find existing facet addresses via known selectors
address oldDmfAddr = loupe.facetAddress(DiamondManagerFacet.setDepositToken.selector);
address oldMpfAddr = loupe.facetAddress(MiningPassFacet.purchase.selector);
// Get ALL registered selectors for those facets
bytes4[] memory dmfSelectors = loupe.facetFunctionSelectors(oldDmfAddr);
bytes4[] memory mpfSelectors = loupe.facetFunctionSelectors(oldMpfAddr);
console.log("DMF selectors to replace:", dmfSelectors.length);
console.log("MPF selectors to replace:", mpfSelectors.length);
IDiamondCut.FacetCut[] memory cuts = new IDiamondCut.FacetCut[](2);
cuts[0] = IDiamondCut.FacetCut({
facetAddress: address(newDMF), action: IDiamondCut.FacetCutAction.Replace, functionSelectors: dmfSelectors
});
cuts[1] = IDiamondCut.FacetCut({
facetAddress: address(newMPF), action: IDiamondCut.FacetCutAction.Replace, functionSelectors: mpfSelectors
});
vm.prank(owner);
IDiamondCut(DIAMOND).diamondCut(cuts, address(0), "");
}
function _upgradeFacets() internal {
DiamondManagerFacet newDMF = new DiamondManagerFacet();
MiningPassFacet newMPF = new MiningPassFacet();
// Use the loupe to find existing facet addresses via known selectors
address oldDmfAddr = loupe.facetAddress(DiamondManagerFacet.setDepositToken.selector);
address oldMpfAddr = loupe.facetAddress(MiningPassFacet.purchase.selector);
// Get ALL registered selectors for those facets
bytes4[] memory dmfSelectors = loupe.facetFunctionSelectors(oldDmfAddr);
bytes4[] memory mpfSelectors = loupe.facetFunctionSelectors(oldMpfAddr);
console.log("DMF selectors to replace:", dmfSelectors.length);
console.log("MPF selectors to replace:", mpfSelectors.length);
// New selectors that don't exist on the current diamond
bytes4[] memory newSelectors = new bytes4[](10);
newSelectors[0] = DiamondManagerFacet.setMiningPassFees.selector;
newSelectors[1] = DiamondManagerFacet.setMiningPassTierFee.selector;
newSelectors[2] = DiamondManagerFacet.setMiningPassFeeFloor.selector;
newSelectors[3] = DiamondManagerFacet.getMiningPassTierFee.selector;
newSelectors[4] = DiamondManagerFacet.getBaseMiningPassTierFee.selector;
newSelectors[5] = DiamondManagerFacet.getMiningPassTierFloorFee.selector;
newSelectors[6] = DiamondManagerFacet.getMiningPassFeeFloorBps.selector;
newSelectors[7] = DiamondManagerFacet.getAllMiningPassFees.selector;
newSelectors[8] = DiamondManagerFacet.getMiningPassTierDepositLimit.selector;
newSelectors[9] = DiamondManagerFacet.setBaseMiningPassFees.selector;
IDiamondCut.FacetCut[] memory cuts = new IDiamondCut.FacetCut[](3);
cuts[0] = IDiamondCut.FacetCut({
facetAddress: address(newDMF), action: IDiamondCut.FacetCutAction.Replace, functionSelectors: dmfSelectors
});
cuts[1] = IDiamondCut.FacetCut({
facetAddress: address(newMPF), action: IDiamondCut.FacetCutAction.Replace, functionSelectors: mpfSelectors
});
cuts[2] = IDiamondCut.FacetCut({
facetAddress: address(newDMF), action: IDiamondCut.FacetCutAction.Add, functionSelectors: newSelectors
});
vm.prank(owner);
IDiamondCut(DIAMOND).diamondCut(cuts, address(0), "");
}
🤖 Prompt for AI Agents
In `@test/foundry/fork/FujiForkMiningPass.t.sol` around lines 126 - 151, _issue_:
_upgradeFacets currently only uses IDiamondCut.FacetCutAction.Replace for
newDMF/newMPF so any brand-new selectors (the 10 dynamic-pricing functions like
setMiningPassFees, setBaseMiningPassFees, setMiningPassFeeFloor,
getMiningPassTierFee, getBaseMiningPassTierFee, getMiningPassTierFloorFee,
getMiningPassFeeFloorBps, getAllMiningPassFees, getMiningPassTierDepositLimit,
setMiningPassTierFee) are never registered and later calls in
_reinitializeMiningPassData and tests will revert. Fix: update _upgradeFacets to
include an additional IDiamondCut.FacetCut with action
IDiamondCut.FacetCutAction.Add that registers the new function selectors from
the new facet instances (newDMF/newMPF); adjust the cuts array size (e.g., from
2 to 3) and populate the Add cut with the bytes4[] of the new selectors for
whichever facet defines those functions so the diamond actually adds those
functions instead of only replacing existing selectors.
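The Replace/Add split described in the prompt above can be precomputed off-chain before the cuts array is built. A minimal TypeScript sketch, using hypothetical selector values — on a live diamond the old list comes from `IDiamondLoupe.facetFunctionSelectors` and the new list from the freshly compiled facet's ABI:

```typescript
// Hypothetical selectors; real values come from the loupe (old facet)
// and the new facet's compiled ABI.
const oldSelectors: string[] = ["0x1f931c1c", "0xcdffacc6", "0x52ef6b2c"];
const newFacetSelectors: string[] = [
  "0x1f931c1c", "0xcdffacc6", "0x52ef6b2c", // already registered on the diamond
  "0x01234567", "0x89abcdef",               // brand-new dynamic-pricing functions
];

const oldSet = new Set(oldSelectors);
// Selectors the diamond already knows must be cut with Replace;
// selectors it has never seen must be cut with Add.
const replaceSelectors = newFacetSelectors.filter((s) => oldSet.has(s));
const addSelectors = newFacetSelectors.filter((s) => !oldSet.has(s));
```

Putting a never-registered selector in a Replace cut (or a registered one in an Add cut) is exactly what makes `diamondCut` revert, so computing the split up front avoids the silent-miss failure mode the prompt describes.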

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 3

🧹 Nitpick comments (4)
src/upgradeInitializers/FujiMigrationInit.sol (3)

63-93: Slot 28 is not zeroed in the assembly cleanup block.

Slots 27, 29, and 30 are explicitly cleared, but slot 28 is skipped with a comment that miningPassFeeFloorBps "will be set below" (line 164). Because the transaction is atomic, the stale stratosphereAddress value in slot 28 is overwritten before the migration completes, but for consistency and defensive coding it is cleaner to zero slot 28 alongside its neighbors, especially since this is a one-shot migration and the cost is a single sstore.

Proposed fix
             // Clear old slots that are no longer used as scalars
             // Slot 27 is now baseMiningPassTierToFee (mapping base = 0)
             sstore(27, 0)
             // Slot 28 is now miningPassFeeFloorBps (will be set below)
+            sstore(28, 0)
             // Slot 29 is now gelatoExecutor (address, leave 0 for now)
             sstore(29, 0)
             // Slot 30 is now isSeasonClaimed (mapping base = 0)
             sstore(30, 0)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/upgradeInitializers/FujiMigrationInit.sol` around lines 63 - 93, The
assembly cleanup block in FujiMigrationInit.sol currently clears storage slots
27, 29, and 30 but skips slot 28 (commenting that miningPassFeeFloorBps will be
set later), leaving stratosphereAddress stale until overwritten; add an sstore
to explicitly zero slot 28 in the same assembly cleanup sequence so slot 28 is
cleared defensively along with 27/29/30 before subsequent writes to
miningPassFeeFloorBps or stratosphereAddress, referencing the existing assembly
block that performs sstore for slots 27, 29, and 30 and the storage variables
stratosphereAddress and miningPassFeeFloorBps.

39-41: No re-execution guard for this one-time migration.

The comment states this must be called "exactly once," but there is no on-chain mechanism (e.g., a boolean flag checked and set at the top of init) to enforce that. While the diamond owner controls when diamondCut is called, an accidental re-execution would silently overwrite storage—particularly risky for the scalar slot relocations, which would read already-migrated (or zeroed) values and propagate them.

For a testnet migration this is low-risk, but consider adding a simple guard.

Example guard
 function init(Args memory _args) external {
+    // Prevent accidental re-execution
+    require(s.miningPassFeeFloorBps == 0, "FujiMigration: already executed");
+
     // ─── Step 1: Migrate GENERAL scalars ...

This works because miningPassFeeFloorBps is a new field that starts at 0 and gets set to 2500 at the end of init. Any re-invocation would see 2500 and revert.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/upgradeInitializers/FujiMigrationInit.sol` around lines 39 - 41, The init
function in FujiMigrationInit.sol is missing a one-time execution guard, so
re-running the migration can silently overwrite storage; add a guard at the top
of init (e.g., a boolean _fujiMigrationInitialized or check the new scalar
miningPassFeeFloorBps) that reverts if the migration already ran, and set that
flag (or set miningPassFeeFloorBps to its post-migration value like 2500) at the
end of init; reference the init function and the miningPassFeeFloorBps storage
slot (or a new private bool) to implement the check-and-set pattern to prevent
re-execution.
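On the 2500 sentinel used above: miningPassFeeFloorBps is denominated in basis points (2500 = 25%). The actual clamp lives in DiamondManagerFacet rather than this initializer, so the following is only a hedged sketch of the usual bps-floor pattern, with an assumed formula `floor = baseFee * floorBps / 10_000`:

```typescript
// Assumed formula: floor = baseFee * floorBps / 10_000. This mirrors the
// common basis-points pattern; it is an illustration, not the facet's code.
const BPS_DENOMINATOR = 10_000n;

function clampToFloor(dynamicFee: bigint, baseFee: bigint, floorBps: bigint): bigint {
  const floor = (baseFee * floorBps) / BPS_DENOMINATOR;
  return dynamicFee < floor ? floor : dynamicFee;
}
```

Under that assumption, with floorBps = 2500n a tier whose base fee is 8 USDC (8_000_000n in 6-decimal units) could never be discounted below 2 USDC.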

124-148: Fee tier values are duplicated between miningPassTierToFee and baseMiningPassTierToFee.

Lines 125-135 and 138-148 are identical value assignments to two different mappings. This is intentional (base fees serve as the reference for dynamic pricing), but consider extracting a helper or loop to avoid the copy-paste and ensure they stay in sync if values change.

Example helper approach
+    // Helper: set fee for both current and base mappings
+    uint256[11] memory fees = [
+        uint256(0), 0.5 * 1e6, 1 * 1e6, 2 * 1e6, 4 * 1e6,
+        8 * 1e6, 15 * 1e6, 30 * 1e6, 50 * 1e6, 75 * 1e6, 100 * 1e6
+    ];
+    for (uint256 i = 0; i <= 10; i++) {
+        s.miningPassTierToFee[i] = fees[i];
+        s.baseMiningPassTierToFee[i] = fees[i];
+    }
-    // Re-initialize mining pass fees (prod slot 17, old Fuji slot 23)
-    s.miningPassTierToFee[0] = 0;
-    ... (all 22 lines)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/upgradeInitializers/FujiMigrationInit.sol` around lines 124 - 148, The
assignments to miningPassTierToFee and baseMiningPassTierToFee are duplicated;
extract a single helper (e.g., internal function initializeTierFees or
_setTierFees) that accepts the tier keys and fee values and sets both mappings
in one loop, then call that helper from the initializer instead of repeating the
same value assignments so both mappings remain in sync.
test/foundry/fork/FujiForkMiningPass.t.sol (1)

220-237: Overly broad catch in _ensureActiveSeason can silently swallow post-migration errors.

Any revert from getSeasonEndTimestamp — including ABI mismatches, storage layout corruption, or out-of-gas — is silently caught and the helper just starts a new season. A more targeted check (e.g. catch the known "no season" revert string) would avoid masking real post-upgrade breakage.

♻️ Suggested narrowing
         try diamond.getSeasonEndTimestamp(seasonId) returns (uint256 endTs) {
             if (endTs <= block.timestamp) {
                 vm.prank(owner);
                 diamond.startNewSeasonWithEndTimestamp(1000 * 1e18, block.timestamp + 30 days);
             }
-        } catch {
+        } catch Error(string memory reason) {
+            // Only proceed if the season simply doesn't exist yet
+            bytes32 h = keccak256(bytes(reason));
+            require(
+                h == keccak256("Season does not exist") || h == keccak256("no active season"),
+                string.concat("_ensureActiveSeason: unexpected revert: ", reason)
+            );
             vm.prank(owner);
             diamond.startNewSeasonWithEndTimestamp(1000 * 1e18, block.timestamp + 30 days);
         }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/foundry/fork/FujiForkMiningPass.t.sol` around lines 220 - 237, The catch
in _ensureActiveSeason is too broad and may swallow real post-migration failures
from getSeasonEndTimestamp; change the handler to only absorb the expected "no
season" (or the exact revert string/selector your contract emits) and rethrow
any other errors—locate _ensureActiveSeason and the call to
getSeasonEndTimestamp and update the catch to inspect the error (e.g., check
error.message or error.selector for the known "no season" marker)
returning/starting a new season only when that specific condition matches,
otherwise propagate the error so unexpected ABI/storage/OOG issues surface.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Duplicate comments:
In `@src/libraries/AppStorage.sol`:
- Around line 90-101: The new fields baseMiningPassTierToFee and
miningPassFeeFloorBps were inserted before existing storage entries causing
gelatoExecutor and isSeasonClaimed to shift and collide with deployed storage;
move the two new variables to be declared after isSeasonClaimed in
AppStorage.sol so the original ordering (gelatoExecutor then isSeasonClaimed at
their original slots) remains unchanged, update any comments/struct layout
notes, and run the provided verification script to confirm slot ordering; ensure
no other reordering occurred for emissionsManager/baseMiningPass or related
fields.

In `@test/foundry/fork/FujiForkMiningPass.t.sol`:
- Around line 177-201: _deployAndReplace currently only emits
FacetCutAction.Replace using loupe.facetFunctionSelectors(oldFacet), so any
selectors that are new in the upgraded DiamondManagerFacet (e.g.
getMiningPassTierFee, getMiningPassFeeFloorBps) never get registered and calls
revert; fix by computing the selector delta between the new facet's selector
list and the old facet's list (or use a hard-coded new-selector list) and append
a FacetCutAction.Add entry for that delta before making the cut array;
specifically update _deployAndReplace to call
loupe.facetFunctionSelectors(oldFacet), obtain newSelectors from the compiled
DiamondManagerFacet, diff newSelectors - oldSelectors (or use the hard-coded
list), and push a {facetAddress: newFacet, action: FacetCutAction.Add,
functionSelectors: delta} entry in addition to the existing Replace entry so the
new getters (getMiningPassTierFee, getMiningPassFeeFloorBps) are registered.


Comment on lines +99 to +122
// Unlock fee receivers (prod slots 9-10, old Fuji slots 12-13)
// Clear any stale length at slots 9-10 first
_clearArray(9);
_clearArray(10);
for (uint256 i = 0; i < _args.unlockFeeReceivers.length; i++) {
s.unlockFeeReceivers.push(_args.unlockFeeReceivers[i]);
s.unlockFeeReceiversShares.push(_args.unlockFeeReceiversShares[i]);
}

// Boost fee receivers (prod slots 14-15, old Fuji slots 17-18)
_clearArray(14);
_clearArray(15);
for (uint256 i = 0; i < _args.boostFeeReceivers.length; i++) {
s.boostFeeReceivers.push(_args.boostFeeReceivers[i]);
s.boostFeeReceiversShares.push(_args.boostFeeReceiversShares[i]);
}

// Mining pass fee receivers (prod slots 19-20, never existed in 14f06c0)
_clearArray(19);
_clearArray(20);
for (uint256 i = 0; i < _args.miningPassFeeReceivers.length; i++) {
s.miningPassFeeReceivers.push(_args.miningPassFeeReceivers[i]);
s.miningPassFeeReceiversShares.push(_args.miningPassFeeReceiversShares[i]);
}

⚠️ Potential issue | 🟠 Major

Missing array-length validation between fee receivers and their shares.

All three loops iterate over _args.*FeeReceivers.length and index into the corresponding *FeeReceiversShares array at the same index. If shares.length < receivers.length, the transaction reverts with a generic Panic(0x32) (index out of bounds) instead of a descriptive error. If shares.length > receivers.length, the extra shares are silently dropped, producing an inconsistent configuration.

Add explicit length checks before each loop.

Proposed fix
+        require(
+            _args.unlockFeeReceivers.length == _args.unlockFeeReceiversShares.length,
+            "FujiMigration: unlock receivers/shares length mismatch"
+        );
         // Unlock fee receivers (prod slots 9-10, old Fuji slots 12-13)
         _clearArray(9);
         _clearArray(10);
         for (uint256 i = 0; i < _args.unlockFeeReceivers.length; i++) {
             s.unlockFeeReceivers.push(_args.unlockFeeReceivers[i]);
             s.unlockFeeReceiversShares.push(_args.unlockFeeReceiversShares[i]);
         }
 
+        require(
+            _args.boostFeeReceivers.length == _args.boostFeeReceiversShares.length,
+            "FujiMigration: boost receivers/shares length mismatch"
+        );
         // Boost fee receivers (prod slots 14-15, old Fuji slots 17-18)
         _clearArray(14);
         _clearArray(15);
         for (uint256 i = 0; i < _args.boostFeeReceivers.length; i++) {
             s.boostFeeReceivers.push(_args.boostFeeReceivers[i]);
             s.boostFeeReceiversShares.push(_args.boostFeeReceiversShares[i]);
         }
 
+        require(
+            _args.miningPassFeeReceivers.length == _args.miningPassFeeReceiversShares.length,
+            "FujiMigration: miningPass receivers/shares length mismatch"
+        );
         // Mining pass fee receivers (prod slots 19-20, never existed in 14f06c0)
         _clearArray(19);
         _clearArray(20);
         for (uint256 i = 0; i < _args.miningPassFeeReceivers.length; i++) {
             s.miningPassFeeReceivers.push(_args.miningPassFeeReceivers[i]);
             s.miningPassFeeReceiversShares.push(_args.miningPassFeeReceiversShares[i]);
         }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/upgradeInitializers/FujiMigrationInit.sol` around lines 99 - 122, The
three loops that iterate over _args.*FeeReceivers and index into the matching
*FeeReceiversShares arrays must validate that the receiver and share arrays have
identical lengths before looping; add explicit checks (e.g.,
require(_args.partnerFeeReceivers.length ==
_args.partnerFeeReceiversShares.length, "partner fee receivers/shares length
mismatch")) for each pair (partnerFeeReceivers vs partnerFeeReceiversShares,
platformFeeReceivers vs platformFeeReceiversShares, referralFeeReceivers vs
referralFeeReceiversShares) in the FujiMigrationInit migration function so you
revert with a clear error when lengths differ rather than allowing out-of-bounds
panics or silently dropping extra shares.
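The same receivers-vs-shares invariant can also be enforced in the deployment tooling before the migration transaction is ever sent. A small TypeScript sketch — `zipReceiversAndShares` is a hypothetical helper, not part of the repo:

```typescript
// Fail fast off-chain with a descriptive error instead of hitting
// Panic(0x32) on-chain or silently dropping extra shares.
function zipReceiversAndShares(
  receivers: string[],
  shares: bigint[],
): Array<[string, bigint]> {
  if (receivers.length !== shares.length) {
    throw new Error(
      `receivers/shares length mismatch: ${receivers.length} vs ${shares.length}`,
    );
  }
  return receivers.map((r, i) => [r, shares[i]]);
}
```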

Comment on lines +64 to +87
function test_Fork_PreMigration_OldSlotLayout() public {
// Old Fuji 14f06c0 layout: GENERAL at slots 25-28
address slot25 = _readAddress(25);
address slot26 = _readAddress(26);
address slot27 = _readAddress(27);
address slot28 = _readAddress(28);

console.log("Pre-migration (old Fuji slots 25-28):");
console.log(" slot 25:", slot25);
console.log(" slot 26:", slot26);
console.log(" slot 27:", slot27);
console.log(" slot 28:", slot28);

assertEq(slot25, VPND, "old slot 25 should be VPND (depositToken)");
assertEq(slot26, VAPE, "old slot 26 should be VAPE (rewardToken)");
assertEq(slot27, USDC, "old slot 27 should be USDC (feeToken)");
assertEq(slot28, STRATOSPHERE, "old slot 28 should be stratosphere");

// Production slots 21-24 should NOT have these values yet
address slot21 = _readAddress(21);
address slot23 = _readAddress(23);
assertTrue(slot21 != VPND, "prod slot 21 should NOT have VPND yet");
assertTrue(slot23 != USDC, "prod slot 23 should NOT have USDC yet");
}

⚠️ Potential issue | 🟡 Minor

test_Fork_PreMigration_OldSlotLayout is permanently broken once the migration runs on Fuji.

vm.createSelectFork("fuji") without a block number always forks the current chain head. As soon as this PR's migration is applied to the Fuji testnet, slots 25–28 will no longer hold the old VPND/VAPE/USDC/STRATOSPHERE values, causing every assertion in this test to fail on every subsequent CI run.

Pin the fork to a block number captured before the migration to keep the pre-migration assertion stable:

-        vm.createSelectFork("fuji");
+        vm.createSelectFork("fuji", PRE_MIGRATION_BLOCK);   // e.g. uint256 constant PRE_MIGRATION_BLOCK = 12345678;
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/foundry/fork/FujiForkMiningPass.t.sol` around lines 64 - 87, The test
test_Fork_PreMigration_OldSlotLayout currently uses vm.createSelectFork("fuji")
which forks the current head and will break once the migration runs; update the
test to pin the fork to a specific block number captured before the migration by
calling vm.createSelectFork("fuji", PRE_MIGRATION_BLOCK) (replace
PRE_MIGRATION_BLOCK with the explicit block number taken from the chain before
the migration) so the assertions about slots 25–28 remain stable across CI runs.
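For anyone reproducing the test's `_readAddress(slot)` checks off-chain, `eth_getStorageAt` expects the slot index as a 0x-prefixed 32-byte hex key. A tiny sketch of that encoding (`slotKey` is an illustrative name, not an existing helper):

```typescript
// Encode a decimal storage-slot index as the 0x-prefixed 32-byte key
// that eth_getStorageAt expects (e.g. slot 28 -> 0x00...1c).
function slotKey(slot: number): string {
  return "0x" + slot.toString(16).padStart(64, "0");
}
```

slotKey(28) can then be passed to any RPC client to confirm whether the migration has relocated the GENERAL scalars out of slots 25–28.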

Comment on lines +191 to +201
function _deployAndReplace(address newFacet, bytes4 knownSelector)
internal
view
returns (IDiamondCut.FacetCut memory)
{
address oldFacet = loupe.facetAddress(knownSelector);
bytes4[] memory selectors = loupe.facetFunctionSelectors(oldFacet);
return IDiamondCut.FacetCut({
facetAddress: newFacet, action: IDiamondCut.FacetCutAction.Replace, functionSelectors: selectors
});
}

⚠️ Potential issue | 🟡 Minor

Missing guard for address(0) returned by the loupe — results in a cryptic diamondCut revert.

If knownSelector is not registered on the current Fuji diamond, loupe.facetAddress(knownSelector) returns address(0), and loupe.facetFunctionSelectors(address(0)) returns an empty array. Passing an empty functionSelectors array to diamondCut violates EIP-2535 and will revert with "LibDiamondCut: No selectors in facet to cut" — with no indication of which facet/selector caused the problem.

🛡️ Proposed fix
 function _deployAndReplace(address newFacet, bytes4 knownSelector)
     internal
     view
     returns (IDiamondCut.FacetCut memory)
 {
     address oldFacet = loupe.facetAddress(knownSelector);
+    require(oldFacet != address(0), "_deployAndReplace: knownSelector not registered on diamond");
     bytes4[] memory selectors = loupe.facetFunctionSelectors(oldFacet);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/foundry/fork/FujiForkMiningPass.t.sol` around lines 191 - 201, The test
currently calls loupe.facetFunctionSelectors(loupe.facetAddress(knownSelector))
without checking for address(0), which causes diamondCut to fail cryptically if
knownSelector isn't registered; modify the test in FujiForkMiningPass.t.sol to
first call address facetAddr = loupe.facetAddress(knownSelector) and check
facetAddr != address(0) before calling loupe.facetFunctionSelectors, and if it's
address(0) either skip/assert-fail with a clear message identifying
knownSelector or throw a descriptive revert/log so you never pass an empty
selectors array into diamondCut.
