Describe the bug
While testing array_repeat, the following statement fails:
checkSparkAnswerAndOperator(sql("SELECT array_repeat(_2, _4) from t1 where _4 is null"))
The error below is raised when null is passed as the count argument:
- array_repeat *** FAILED *** (5 seconds, 334 milliseconds)
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 2.0 failed 1 times, most recent failure: Lost task 0.0 in stage 2.0 (TID 2) (192.168.1.20 executor driver): org.apache.comet.CometNativeException: Compute error: concat requires input of at least one array
at org.apache.comet.Native.executePlan(Native Method)
at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$1(CometExecIterator.scala:127)
at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$1$adapted(CometExecIterator.scala:125)
at org.apache.comet.vector.NativeUtil.getNextBatch(NativeUtil.scala:157)
at org.apache.comet.CometExecIterator.getNextBatch(CometExecIterator.scala:125)
at org.apache.comet.CometExecIterator.hasNext(CometExecIterator.scala:146)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.cometcolumnartorow_nextBatch_0$(Unknown Source)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:760)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.util.Iterators$.size(Iterators.scala:29)
at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1953)
at org.apache.spark.rdd.RDD.$anonfun$count$1(RDD.scala:1269)
at org.apache.spark.rdd.RDD.$anonfun$count$1$adapted(RDD.scala:1269)
at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2303)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:92)
at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:161)
at org.apache.spark.scheduler.Task.run(Task.scala:139)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:554)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1529)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:557)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:842)
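A standalone sketch that should exercise the same path outside the test suite is shown below. This is an illustrative assumption, not taken from the suite: the table layout, column names (_2 as the element, _4 as the nullable count), and Comet configs are mine, and the data is round-tripped through Parquet since Comet typically accelerates Parquet scans.

```scala
import org.apache.spark.sql.SparkSession

object ArrayRepeatNullCountRepro {
  def main(args: Array[String]): Unit = {
    // Assumes the Comet jar is on the classpath; config keys per Comet docs.
    val spark = SparkSession.builder()
      .master("local[1]")
      .config("spark.plugins", "org.apache.spark.CometPlugin")
      .config("spark.comet.enabled", "true")
      .config("spark.comet.exec.enabled", "true")
      .getOrCreate()
    import spark.implicits._

    val path = java.nio.file.Files.createTempDirectory("t1").resolve("data").toString

    // _2 is the element column, _4 the nullable count column; one row has a null count.
    Seq(("a", Option.empty[Int]), ("b", Some(2)))
      .toDF("_2", "_4")
      .write.mode("overwrite").parquet(path)

    spark.read.parquet(path).createOrReplaceTempView("t1")

    // Expected: a single row with a null result; observed: the native error above.
    spark.sql("SELECT array_repeat(_2, _4) FROM t1 WHERE _4 IS NULL").show()
  }
}
```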
Steps to reproduce
No response
Expected behavior
The test should pass when null is supplied as the count argument.
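For reference, with Comet disabled, vanilla Spark returns a null result for a null count rather than failing; an illustrative snippet (not from the test suite):

```scala
// With spark.comet.enabled=false, array_repeat with a null count yields NULL.
spark.sql("SELECT array_repeat('a', CAST(NULL AS INT)) AS r").collect()
// expected: a single Row whose only field is null
```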
Additional context
No response