
[SPARK-49642][SQL] Remove the ANSI config suggestion in DATETIME_FIELD_OUT_OF_BOUNDS #49670

Draft · wants to merge 1 commit into master

Conversation

@the-sakthi (Member)

What changes were proposed in this pull request?

Removal of the ANSI turn-off suggestion from the DATETIME_FIELD_OUT_OF_BOUNDS error message.

Why are the changes needed?

Now that Spark 4.0.0 has ANSI mode on by default, we want to keep suggestions of this kind to a minimum.

Does this PR introduce any user-facing change?

Yes, the error message has changed.

How was this patch tested?

Existing tests.

Was this patch authored or co-authored using generative AI tooling?

No.

@github-actions github-actions bot added the SQL label Jan 25, 2025
@mihailom-db (Contributor)

@the-sakthi just linking the PR that addresses some issues around this error: #48242. Because it had been open for a long time, GitHub closed it. We need to make sure the suggestions here are aligned with it.

@mihailom-db (Contributor)

TL;DR: I would say it is better to first fix the Java-specific message and only then remove the ANSI suggestion.

@the-sakthi (Member, Author)

Ack, let me take a look at the linked PR @mihailom-db

@LuciferYang (Contributor) left a comment

Unless we are prepared to start abandoning support for non-ANSI mode, I don't think we should remove prompts like this. If there is an intention to abandon non-ANSI support, I suggest first starting a discussion on the dev mailing list.

@mihailom-db (Contributor)

@LuciferYang thanks for the concern, but there are a couple of reasons why we would want to do this, apart from abandoning non-ANSI behaviour.

In Spark 4.0.0, ANSI mode is turned on by default. Because of this, we need to make sure we do not suggest turning off the ANSI config so readily. Previously this suggestion made sense: users had to set the ANSI config explicitly, so suggesting they turn it off was suggesting they revert an explicit setting back to the default. Now we would be suggesting turning off (switching to a non-default value) a config that ensures Spark queries return proper results instead of unexpected nulls on erroneous inputs.

Additionally, once a user sets a config to a specific value, they usually stick with it and don't reconsider it until they run into problems. Switching off ANSI mode makes many different expressions return nulls, which is very hard to catch without inspecting the data, and that may not be something a user wants to do when Spark's default behaviour is now ANSI on. Also, by the time a query has already run, it can be almost impossible to go back and revert the change in the data without great pain.
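The null-versus-error contrast described above can be sketched in plain Python. This is only an illustration, not Spark's actual implementation: the `make_date` function and the error text here are made up to mirror the behaviour of Spark's `make_date` expression under the two modes.

```python
import datetime


def make_date(year, month, day, ansi_enabled=True):
    """Illustrative sketch (not Spark code) of ANSI vs non-ANSI handling
    of an out-of-bounds datetime field."""
    try:
        return datetime.date(year, month, day)
    except ValueError:
        if ansi_enabled:
            # ANSI mode: surface a clear error to the user.
            raise ValueError(
                f"[DATETIME_FIELD_OUT_OF_BOUNDS] Invalid date: "
                f"year={year}, month={month}, day={day}"
            )
        # Non-ANSI mode: silently return NULL (None), which is easy
        # to miss until the bad data has already propagated.
        return None


# Valid input behaves the same in both modes.
print(make_date(2025, 1, 25))                      # 2025-01-25

# Out-of-bounds day: non-ANSI yields a silent null.
print(make_date(2025, 2, 30, ansi_enabled=False))  # None
```

The point of the comment thread is the last case: with ANSI off, the erroneous input produces a `None` rather than an error, so the problem only shows up later, if at all, when someone inspects the data.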

So IMO we need to keep clear the difference between the change that is coming (the switch of the default value) and the newly proposed idea of abandoning non-ANSI behaviour.
