Memoize transaction-related hashes #14752
Now it could shave ~30%, I think. With the increased number of events it could be more than that (needs measurement).
Current state: connected to mainnet, it took 4 minutes to load the frontier (down from 5m11s on the 3.0.1 release). A few more optimizations to reduce hashing could be performed (as described by the issue). Alternatively, some smart parallelization strategies could be employed: run 2-3 hashing procedures in parallel after synchronizing 2-3 ledgers, instead of the single ledger we currently synchronize for the bootstrap. With 6-core parallelism, the catchup lower bound could be cut down to 90 seconds, which is acceptable.
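The batching idea above (hash several synchronized ledgers concurrently instead of one at a time) could be sketched as follows. This is an illustrative Python sketch only, not Mina code: `hash_ledger` and the SHA-256 digest are assumed stand-ins for the actual per-ledger hashing pass.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

def hash_ledger(txs):
    # Stand-in for one ledger's hashing pass: fold every transaction of the
    # ledger into a single running digest, sequentially.
    h = hashlib.sha256()
    for tx in txs:
        h.update(tx)
    return h.hexdigest()

# Batch a few synchronized ledgers, then run their hashing passes in
# parallel, instead of hashing each ledger as soon as it is synchronized.
ledgers = [[b"tx1", b"tx2"], [b"tx3"], [b"tx4", b"tx5"]]
with ThreadPoolExecutor(max_workers=3) as pool:
    digests = list(pool.map(hash_ledger, ledgers))
```

Because each ledger's digest is independent of the others, the map is embarrassingly parallel; with 6 workers the wall-clock cost approaches the cost of the slowest single ledger.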
While examining the performance of `Staged_ledger.update_coinbase_stack_and_get_data`, I noticed that even after the optimizations culminating in PR #14643, up to 50% of the time is spent in the first and second passes of transaction application. This was measured on blocks of 128 txs, each a 9-account-update zkapp (deploying 8 new accounts). No events or memo were used, which might have affected the results.
When I tried to make a breakdown of cost centers, I noticed the following pieces take a significant part of the cost:

- `derive_token_id` (in `Account_id`)
- `Zkapp_account.Event.hash` (and `Zkapp_account.Make_events.push_hash`)
- `hash_zkapp_uri` (in `Zkapp_account`)

These hashing routines (unlike the account hash and merge hash used in merkle tree building) depend solely on the transaction in question, hence they can be performed before block creation (and the hashes memoized).
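The memoization idea could be sketched as below. This is a hypothetical Python sketch, not Mina's actual OCaml implementation: the function names mirror the identifiers mentioned above, but the SHA-256 digests are plain stand-ins for Mina's Poseidon-based hashes.

```python
import hashlib
from functools import lru_cache

@lru_cache(maxsize=None)
def derive_token_id(owner: str) -> bytes:
    # Stand-in for Account_id.derive_token_id: deterministic in the account
    # id alone, so the result is safe to cache across blocks.
    return hashlib.sha256(b"token:" + owner.encode()).digest()

@lru_cache(maxsize=None)
def hash_event(event: bytes) -> bytes:
    # Stand-in for Zkapp_account.Event.hash: depends only on the event data
    # carried by the transaction, so it can be computed ahead of time.
    return hashlib.sha256(b"event:" + event).digest()

def push_event_hash(acc: bytes, event: bytes) -> bytes:
    # Stand-in for Zkapp_account.Make_events.push_hash: fold the (cached)
    # event hash into a running accumulator.
    return hashlib.sha256(acc + hash_event(event)).digest()
```

Since the inputs are fixed once the transaction exists, these hashes can be computed (and cached) when the transaction enters the pool, so block application only pays for the merkle-tree hashes that genuinely depend on ledger state.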
This would reduce the demand on the block window duration, hopefully cutting the time of `update_coinbase_stack_and_get_data` roughly two-fold.