Decimal casting from a floating point value results in an incorrect value.
```scala
test("Incorrect decimal casting") {
  withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
    withTable("dynparttest2") {
      Seq[(Integer, Integer)](
        (1, 1),
        (1, 3),
        (2, 3),
        (3, 3),
        (4, null),
        (5, null)
      ).toDF("key", "value").createOrReplaceTempView("src")
      // decimal
      sql("create table dynparttest2 (value int) partitioned by (pdec decimal(5, 1))")
      sql(
        """
          |insert into table dynparttest2 partition(pdec)
          | select count(*), cast('100.12' as decimal(5, 1)) as pdec from src
        """.stripMargin)
      checkAnswer(
        sql("select * from dynparttest2"),
        Seq(Row(6, new java.math.BigDecimal("100.1"))))
    }
  }
}
```
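The expected answer in the test above (100.1) matches standard half-up rounding when the scale is reduced from 2 to 1, which is what Spark applies for this cast in non-ANSI mode. As a reference point (a standalone sketch, not Spark or Gluten code), the same result can be computed with `java.math.BigDecimal`:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalCastCheck {
    public static void main(String[] args) {
        // '100.12' cast to decimal(5, 1) keeps one fractional digit.
        // Rounding half-up, the second fractional digit (2) is dropped,
        // giving 100.1 -- the Spark answer. The Gluten answer 10.0 has
        // the digits shifted, not merely rounded differently.
        BigDecimal source = new BigDecimal("100.12");
        BigDecimal expected = source.setScale(1, RoundingMode.HALF_UP);
        System.out.println(expected); // prints 100.1
    }
}
```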
```
Results do not match for query:
Timezone: sun.util.calendar.ZoneInfo[id="America/Los_Angeles",offset=-28800000,dstSavings=3600000,useDaylight=true,transitions=185,lastRule=java.util.SimpleTimeZone[id=America/Los_Angeles,offset=-28800000,dstSavings=3600000,useDaylight=true,startYear=0,startMode=3,startMonth=2,startDay=8,startDayOfWeek=1,startTime=7200000,startTimeMode=0,endMode=3,endMonth=10,endDay=1,endDayOfWeek=1,endTime=7200000,endTimeMode=0]]
Timezone Env:

== Parsed Logical Plan ==
'Project [*]
+- 'UnresolvedRelation [dynparttest2], [], false

== Analyzed Logical Plan ==
value: int, pdec: decimal(5,1)
Project [value#80, pdec#81]
+- SubqueryAlias spark_catalog.default.dynparttest2
   +- Relation spark_catalog.default.dynparttest2[value#80,pdec#81] parquet

== Optimized Logical Plan ==
Relation spark_catalog.default.dynparttest2[value#80,pdec#81] parquet

== Physical Plan ==
VeloxColumnarToRow
+- ^(1) FileSourceScanExecTransformer parquet spark_catalog.default.dynparttest2[value#80,pdec#81] Batched: true, DataFilters: [], Format: Parquet, Location: CatalogFileIndex(1 paths)[...../incubator-gluten/spark-warehouse/org.apa..., PartitionFilters: [], PushedFilters: [], ReadSchema: struct<value:int> NativeFilters: []

== Results ==
!== Correct Answer - 1 ==   == Gluten Answer - 1 ==
 struct<>                   struct<>
![6,100.1]                  [6,10.0]
```
Backend
VL (Velox)
Bug description
`cast('100.12' as decimal(5, 1))`
Spark answer: 100.1
Gluten answer: 10.0
The same issue is happening with
Repro Testcase: see the test case above.
Test logs: see the plan and results diff above.
Gluten version
main branch
Spark version
spark-4.0.x
Spark configurations
No response
System information
No response
Relevant logs