Conversation

@EricHayter
Contributor

This commit adds chunking functionality for the load/save operations of bloom filters. Additional information is added to the serialization of each filter: when saving a filter, its total size is written first, followed by the filter data in chunks of at most 64 MB each.

Addresses #5314.
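The on-disk layout described above can be sketched as follows. This is a minimal illustration of the "total size, then bounded chunks" format, not Dragonfly's actual serialization code; the function names and the in-memory `std::string` sink are assumptions for the example.

```cpp
#include <algorithm>
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Hypothetical sketch of the chunked format: [u64 total_size][chunk_0][chunk_1]...
// where each chunk is at most kMaxChunkSize bytes.
constexpr size_t kMaxChunkSize = 64ULL * 1024 * 1024;  // 64 MB

// Writes the filter's total size followed by its bytes in bounded chunks.
void SaveChunked(const std::vector<uint8_t>& filter, std::string* out,
                 size_t max_chunk = kMaxChunkSize) {
  uint64_t total = filter.size();
  out->append(reinterpret_cast<const char*>(&total), sizeof(total));
  for (size_t off = 0; off < filter.size(); off += max_chunk) {
    size_t len = std::min(max_chunk, filter.size() - off);
    out->append(reinterpret_cast<const char*>(filter.data() + off), len);
  }
}

// Reads the total size, then consumes chunks until that many bytes are restored.
std::vector<uint8_t> LoadChunked(const std::string& in,
                                 size_t max_chunk = kMaxChunkSize) {
  uint64_t total = 0;
  std::memcpy(&total, in.data(), sizeof(total));
  std::vector<uint8_t> filter;
  filter.reserve(total);
  size_t off = sizeof(total);
  while (filter.size() < total) {
    size_t len = std::min<size_t>(max_chunk, total - filter.size());
    filter.insert(filter.end(), in.begin() + off, in.begin() + off + len);
    off += len;
  }
  return filter;
}
```

Writing the total size up front lets the loader allocate once and know exactly how many chunk bytes to expect.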

Signed-off-by: Eric <[email protected]>
@EricHayter EricHayter force-pushed the large-sbf-serialization branch from 40e6b5b to 0fbd8dd Compare October 23, 2025 01:04
@EricHayter
Contributor Author

I've done some manual testing to determine what capacity and error rate would induce chunking on filters. However, when I run the same sequence of commands in a unit test, specifically BF.RESERVE for a filter that large (> 64 MB), the Run call fails:
E20251022 20:57:24.968197 8972 rdb_load.cc:2152] Error while calling HandleAux(): Out of memory, or used memory is too high

What would be a good way of testing the functionality in the unit tests?
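For context, a rough way to estimate which capacity/error-rate pairs cross the 64 MB threshold is the textbook bloom filter sizing formula m = -n·ln(p)/(ln 2)² bits. This is a general-purpose sketch; Dragonfly's actual sizing (e.g. for scaling sub-filters) may differ, and `BloomBytes` is a hypothetical helper name.

```cpp
#include <cmath>
#include <cstdint>

// Textbook bloom filter sizing: m = -n * ln(p) / (ln 2)^2 bits for
// capacity n and false-positive rate p; returned here in bytes.
uint64_t BloomBytes(uint64_t capacity, double error_rate) {
  double bits = -static_cast<double>(capacity) * std::log(error_rate) /
                (std::log(2.0) * std::log(2.0));
  return static_cast<uint64_t>(std::ceil(bits / 8.0));
}
```

By this estimate, a capacity of 100 million items at a 1% error rate needs well over 100 MB of bit array, comfortably past the 64 MB chunk limit, while small capacities stay far below it.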

@romange
Collaborator

romange commented Oct 23, 2025

I see the tests pass; can you please push the test that fails so I can advise?
I think explicitly overriding the max_memory_limit might help...

@EricHayter
Contributor Author

Got the test working. max_memory_limit fixed it. Thanks.

@EricHayter EricHayter marked this pull request as ready for review October 23, 2025 14:45