[VL][BUG] Incorrect decimal casting from floating point value #11402

@Surbhi-Vijay

Description

Backend

VL (Velox)

Bug description

Casting a floating-point value to decimal produces an incorrect result.

cast('100.12' as decimal(5, 1))
Spark answer: 100.1
Gluten answer: 10.0
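For reference, a minimal sketch of the expected semantics outside Spark: vanilla Spark rounds half-up when casting to a decimal with a smaller scale, so `100.12` cast to `decimal(5, 1)` yields `100.1`. The class and variable names below are illustrative, not from the Gluten codebase.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Hypothetical standalone repro of the expected (Spark) behavior:
// decimal(5, 1) keeps one fractional digit, rounding half-up.
public class DecimalCastRepro {
    public static void main(String[] args) {
        BigDecimal input = new BigDecimal("100.12");
        // Scale 1 with HALF_UP mirrors Spark's decimal cast rounding.
        BigDecimal expected = input.setScale(1, RoundingMode.HALF_UP);
        System.out.println(expected); // prints 100.1
    }
}
```

The Gluten/Velox answer of `10.0` looks like the value lost a digit entirely, not like a rounding difference.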

The same issue occurs with

Repro Testcase:

  test("Incorrect decimal casting") {
    withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
      withTable("dynparttest2") {
        Seq[(Integer, Integer)](
          (1, 1),
          (1, 3),
          (2, 3),
          (3, 3),
          (4, null),
          (5, null)
        ).toDF("key", "value").createOrReplaceTempView("src")

        // decimal
        sql("create table dynparttest2 (value int) partitioned by (pdec decimal(5, 1))")
        sql(
          """
            |insert into table dynparttest2 partition(pdec)
            | select count(*), cast('100.12' as decimal(5, 1)) as pdec from src
          """.stripMargin)
        checkAnswer(
          sql("select * from dynparttest2"),
          Seq(Row(6, new java.math.BigDecimal("100.1"))))
      }
    }
  }

Test logs

Results do not match for query:
Timezone: sun.util.calendar.ZoneInfo[id="America/Los_Angeles",offset=-28800000,dstSavings=3600000,useDaylight=true,transitions=185,lastRule=java.util.SimpleTimeZone[id=America/Los_Angeles,offset=-28800000,dstSavings=3600000,useDaylight=true,startYear=0,startMode=3,startMonth=2,startDay=8,startDayOfWeek=1,startTime=7200000,startTimeMode=0,endMode=3,endMonth=10,endDay=1,endDayOfWeek=1,endTime=7200000,endTimeMode=0]]
Timezone Env: 

== Parsed Logical Plan ==
'Project [*]
+- 'UnresolvedRelation [dynparttest2], [], false

== Analyzed Logical Plan ==
value: int, pdec: decimal(5,1)
Project [value#80, pdec#81]
+- SubqueryAlias spark_catalog.default.dynparttest2
   +- Relation spark_catalog.default.dynparttest2[value#80,pdec#81] parquet

== Optimized Logical Plan ==
Relation spark_catalog.default.dynparttest2[value#80,pdec#81] parquet

== Physical Plan ==
VeloxColumnarToRow
+- ^(1) FileFileSourceScanExecTransformer parquet spark_catalog.default.dynparttest2[value#80,pdec#81] Batched: true, DataFilters: [], Format: Parquet, Location: CatalogFileIndex(1 paths)[...../incubator-gluten/spark-warehouse/org.apa..., PartitionFilters: [], PushedFilters: [], ReadSchema: struct<value:int> NativeFilters: []

== Results ==
!== Correct Answer - 1 ==   == Gluten Answer - 1 ==
 struct<>                   struct<>
![6,100.1]                  [6,10.0]

Gluten version

main branch

Spark version

spark-4.0.x

Spark configurations

No response

System information

No response

Relevant logs

Metadata

Labels: bug (Something isn't working)
Status: Done