fix: pass scale to DF round in spark_round #1341
base: main
Conversation
-DataType::Float32 | DataType::Float64 => {
-    Ok(ColumnarValue::Array(round(&[Arc::clone(array)])?))
-}
+DataType::Float32 | DataType::Float64 => Ok(ColumnarValue::Array(round(&[
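For context on what forwarding the second argument changes: a one-argument `round` call rounds to 0 decimal places, while the fix passes the user-supplied scale through. A plain-Rust sketch of round-to-scale semantics (hypothetical `round_to_scale` helper, not Comet or DataFusion code; assumes half-away-from-zero rounding as `f64::round` provides):

```rust
// Hypothetical helper mirroring round(value, scale): round to `scale`
// decimal places using half-away-from-zero, as f64::round does.
fn round_to_scale(x: f64, scale: i32) -> f64 {
    let factor = 10f64.powi(scale);
    (x * factor).round() / factor
}

fn main() {
    // Without the scale forwarded, everything rounds to 0 decimals.
    println!("{}", round_to_scale(3.14159, 0)); // 3
    // With the scale passed through, the requested precision is kept.
    println!("{}", round_to_scale(3.14159, 2)); // 3.14
}
```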
FYI round for float is disabled https://github.com/apache/datafusion-comet/blob/main/spark/src/main/scala/org/apache/comet/serde/QueryPlanSerde.scala#L1739
BTW, we should be able to use `point` instead of `args[1]`?
`point` is being set to the scalar value on this line, and we want to pass the columnar value to DF:
let ColumnarValue::Scalar(ScalarValue::Int64(Some(point))) = point else
In my use case, I'm not using Comet's query planning, so it should be fine.
Since this isn't being tested end-to-end, could you add a Rust unit test so that we protect against regressions in the future?
Still, for the reason mentioned in the link, it may not work in some cases.
Added tests for floats.
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@             Coverage Diff              @@
##               main    #1341       +/-   ##
=============================================
- Coverage     56.12%   39.09%    -17.04%
- Complexity      976     2065      +1089
=============================================
  Files           119      260       +141
  Lines         11743    60269     +48526
  Branches       2251    12834     +10583
=============================================
+ Hits           6591    23562     +16971
- Misses         4012    32226     +28214
- Partials       1140     4481      +3341

☔ View full report in Codecov by Sentry.
I am OK to accept this PR since round is disabled for floating points. The current round behavior is not Spark-compatible, and we will change the logic to match Spark's behavior in the future, though.
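To illustrate the incompatibility: Spark rounds the decimal value (via `BigDecimal` semantics), while an f64 kernel rounds the binary approximation of that value, which often differs slightly from what the user wrote. A minimal, illustrative-only sketch of the representation issue:

```rust
// Sketch of why f64 rounding can diverge from Spark's decimal rounding:
// many decimal literals have no exact binary representation, so the value
// actually being rounded is not exactly the value the user wrote.
fn main() {
    // 0.285 cannot be stored exactly as an f64; the stored value is
    // slightly off (just below 0.285), so scale-2 rounding can go
    // down where decimal HALF_UP rounding would go up.
    println!("{:.20}", 0.285f64);
}
```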
#[test]
fn test_round_f32() -> Result<()> {
    let args = vec![
Ideally we should have scalar test cases as well as random-value tests, and ideally we need expected values from Spark results.
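One way to sketch such a test without a Spark dependency is to generate inputs from integers so the expected results are exact. Here `round_to_scale` is a hypothetical stand-in for the kernel under test, not the actual Comet function:

```rust
// Hypothetical stand-in for the kernel under test: round to `scale`
// decimal places with half-away-from-zero semantics.
fn round_to_scale(x: f64, scale: i32) -> f64 {
    let factor = 10f64.powi(scale);
    (x * factor).round() / factor
}

#[test]
fn test_round_scale2_fixed_points() {
    // Inputs derived from integers, so the expected value is known
    // exactly: values already at scale 2 must round-trip unchanged.
    for i in -1000i64..=1000 {
        let x = i as f64 / 100.0;
        assert_eq!(round_to_scale(x, 2), x);
    }
}
```

A follow-up step would be to replace the integer-derived inputs with seeded random values checked against reference results captured from Spark.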
Which issue does this PR close?
Closes #1340.
Rationale for this change
What changes are included in this PR?
How are these changes tested?