Merged
2 changes: 1 addition & 1 deletion python/ray/data/context.py
@@ -36,7 +36,7 @@
# We will attempt to slice blocks whose size exceeds this factor *
# target_max_block_size. We will warn the user if slicing fails and we produce
# blocks larger than this threshold.
-MAX_SAFE_BLOCK_SIZE_FACTOR = 1.5
+MAX_SAFE_BLOCK_SIZE_FACTOR = float(os.environ.get("RAY_DATA_MAX_SAFE_BLOCK_SIZE_FACTOR", "1.5"))
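Per the comment above the constant, this factor gates block slicing: Ray Data attempts to slice blocks larger than `MAX_SAFE_BLOCK_SIZE_FACTOR * target_max_block_size` and warns if slicing fails. A minimal sketch of that threshold check under the new env-var override (the `should_slice` helper name is hypothetical, not Ray Data's actual internal API):

```python
import os

# Mirrors the diff above: defaults to 1.5, overridable via environment variable.
MAX_SAFE_BLOCK_SIZE_FACTOR = float(
    os.environ.get("RAY_DATA_MAX_SAFE_BLOCK_SIZE_FACTOR", "1.5")
)


def should_slice(block_size_bytes: int, target_max_block_size: int) -> bool:
    """Return True when a block exceeds the safe-size threshold and
    should be sliced (hypothetical helper illustrating the check)."""
    return block_size_bytes > MAX_SAFE_BLOCK_SIZE_FACTOR * target_max_block_size
```

With the default factor of 1.5 and a 128 MiB target max block size, blocks over 192 MiB would be candidates for slicing; raising the env var loosens that threshold without a code change.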

Reviewer: Is it possible to make this part of the DataContext configuration rather than using an environment variable?

Author: It is not part of DataContext today, so that may be harder to do, and the environment variable serves the same purpose here.

Reviewer: Mainly for consistency with how Ray Data exposes its other configuration.
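The reviewer's suggestion, moving the knob into a context object, could look roughly like the sketch below. This is an illustration only: the field name and the standalone dataclass are assumptions, not Ray Data's actual `DataContext` API.

```python
import os
from dataclasses import dataclass, field


@dataclass
class DataContext:
    """Hypothetical sketch of a context-level setting that still honors the
    environment variable as its default, so both override paths work."""

    max_safe_block_size_factor: float = field(
        default_factory=lambda: float(
            os.environ.get("RAY_DATA_MAX_SAFE_BLOCK_SIZE_FACTOR", "1.5")
        )
    )
```

This keeps the env-var escape hatch while letting code configure the factor programmatically (e.g. `ctx.max_safe_block_size_factor = 2.0`), which is the consistency the reviewer is asking for.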


# Dataset will avoid creating blocks smaller than this size in bytes on read.
# This takes precedence over DEFAULT_MIN_PARALLELISM.