fix: pass scale to DF round in spark_round #1341

Open. Wants to merge 7 commits into base branch main.
native/spark-expr/src/math_funcs/round.rs: 9 changes (5 additions, 4 deletions)
@@ -85,9 +85,10 @@ pub fn spark_round(
                 let (precision, scale) = get_precision_scale(data_type);
                 make_decimal_array(array, precision, scale, &f)
             }
-            DataType::Float32 | DataType::Float64 => {
-                Ok(ColumnarValue::Array(round(&[Arc::clone(array)])?))
-            }
+            DataType::Float32 | DataType::Float64 => Ok(ColumnarValue::Array(round(&[
+                Arc::clone(array),
+                args[1].to_array(array.len())?,
+            ])?)),
             dt => exec_err!("Not supported datatype for ROUND: {dt}"),
         },
         ColumnarValue::Scalar(a) => match a {

Review comments on the DataType::Float32 | DataType::Float64 arm:

Contributor:
FYI, round for float is disabled: https://github.com/apache/datafusion-comet/blob/main/spark/src/main/scala/org/apache/comet/serde/QueryPlanSerde.scala#L1739

BTW, shouldn't we be able to use point instead of args[1]?

Author:
point is bound to the scalar value on this line, and we want to pass the columnar value to DataFusion:

    let ColumnarValue::Scalar(ScalarValue::Int64(Some(point))) = point else 

Author:
In my use case I'm not using Comet's query planning, so it should be fine.

Member:
Since this isn't being tested end-to-end, could you add a Rust unit test so that we protect against regressions in the future?

Contributor:
Still, for the reason mentioned in the link, it may not work in some cases.

Author:
Added tests for floats.
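The distinction drawn in the thread above can be shown with a small standalone snippet: the scale arrives as a ColumnarValue scalar, while the array-based round kernel used here takes ArrayRef inputs, so the argument is broadcast with to_array rather than handed over as the already-unwrapped i64 in point. This is an illustration only, not code from the PR; the import paths are assumptions and may need adjusting to the crate's actual dependencies.

    use std::sync::Arc;

    use arrow::array::{Array, ArrayRef, Float64Array, Int64Array};
    use datafusion::common::ScalarValue;
    use datafusion::logical_expr::ColumnarValue;

    fn main() -> datafusion::error::Result<()> {
        // The value column and the scale argument, shaped as they reach spark_round.
        let values: ArrayRef = Arc::new(Float64Array::from(vec![1.234, 5.678]));
        let scale = ColumnarValue::Scalar(ScalarValue::Int64(Some(2)));

        // Broadcast the scalar scale to one entry per row, which is the shape
        // the round kernel expects for its second argument.
        let scale_array: ArrayRef = scale.to_array(values.len())?;
        let scale_array = scale_array.as_any().downcast_ref::<Int64Array>().unwrap();
        assert_eq!(scale_array.len(), values.len());
        assert_eq!(scale_array.value(0), 2);
        Ok(())
    }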
@@ -109,7 +110,7 @@
                 make_decimal_scalar(a, precision, scale, &f)
             }
             ScalarValue::Float32(_) | ScalarValue::Float64(_) => Ok(ColumnarValue::Scalar(
-                ScalarValue::try_from_array(&round(&[a.to_array()?])?, 0)?,
+                ScalarValue::try_from_array(&round(&[a.to_array()?, args[1].to_array(1)?])?, 0)?,
             )),
             dt => exec_err!("Not supported datatype for ROUND: {dt}"),
         },
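A regression test along the lines requested in the thread might look roughly like the sketch below. The spark_round signature (an args slice plus a data type), the import paths, and the test values are assumptions inferred from what the diff shows, not the actual tests added in the PR, so treat this as an outline rather than the implementation.

    #[cfg(test)]
    mod tests {
        use std::sync::Arc;

        use arrow::array::{Array, ArrayRef, Float64Array};
        use arrow::datatypes::DataType;
        use datafusion::common::ScalarValue;
        use datafusion::logical_expr::ColumnarValue;

        // Assumes this module lives alongside spark_round in round.rs.
        use super::*;

        // Array path: round a Float64 column to 2 decimal places.
        #[test]
        fn round_float64_array_uses_scale() -> datafusion::error::Result<()> {
            let values: ArrayRef = Arc::new(Float64Array::from(vec![1.2345, 2.5678]));
            let args = [
                ColumnarValue::Array(Arc::clone(&values)),
                ColumnarValue::Scalar(ScalarValue::Int64(Some(2))),
            ];
            let result = spark_round(&args, &DataType::Float64)?;
            let ColumnarValue::Array(out) = result else {
                panic!("expected an array result");
            };
            let out = out.as_any().downcast_ref::<Float64Array>().unwrap();
            assert!((out.value(0) - 1.23).abs() < 1e-9);
            assert!((out.value(1) - 2.57).abs() < 1e-9);
            Ok(())
        }

        // Scalar path: a single Float64 literal should respect the scale too.
        #[test]
        fn round_float64_scalar_uses_scale() -> datafusion::error::Result<()> {
            let args = [
                ColumnarValue::Scalar(ScalarValue::Float64(Some(3.14159))),
                ColumnarValue::Scalar(ScalarValue::Int64(Some(3))),
            ];
            let result = spark_round(&args, &DataType::Float64)?;
            let ColumnarValue::Scalar(ScalarValue::Float64(Some(v))) = result else {
                panic!("expected a Float64 scalar result");
            };
            assert!((v - 3.142).abs() < 1e-9);
            Ok(())
        }
    }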