[SPARK-51254][PYTHON][CONNECT] Disallow --master with Spark Connect URL #50000
Conversation
68a02ed to 10a18b6
@@ -507,6 +507,11 @@
      "Variant binary is malformed. Please check the data source is valid."
    ]
  },
  "MASTER_URL_INVALID": {
Shouldn't we already have an error condition for this? How did we validate the master URL before, without Spark Connect?
We actually don't have the error class (and even if we did, it would have to be listed in the JVM's error-class JSON file). The error is thrown from `prepareSubmitEnvironment` in core/src/main/scala/org/apache/spark/deploy/SparkSubmit.
@@ -508,6 +508,11 @@ def getOrCreate(self) -> "SparkSession":

        if url is None and is_api_mode_connect:
            url = opts.get("spark.master", os.environ.get("MASTER", "local"))
            if url.startswith("sc://"):
Instead of a bandaid fix, can we refactor the code to have different branches for `--remote` and `--master`?
The problem is that this is Python-specific. The Spark Connect server is launched without Spark Submit itself, so for other cases the exceptions are already covered via `SparkSubmit.prepareSubmitEnvironment`.
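For reference, a minimal sketch of what the completed guard plausibly looks like (the `raise` itself is cut off in the hunk above; names such as `opts` and `is_api_mode_connect` mirror the hunk, the error condition comes from the test output in the PR description, and the constructor keyword casing of `PySparkRuntimeError` varies across Spark versions, so treat this as an approximation rather than the merged code):

```python
import os
from typing import Dict, Optional

from pyspark.errors import PySparkRuntimeError


def resolve_connect_url(opts: Dict[str, str], is_api_mode_connect: bool) -> Optional[str]:
    # The Connect URL normally comes from --remote / spark.remote.
    url = opts.get("spark.remote", os.environ.get("SPARK_REMOTE"))
    if url is None and is_api_mode_connect:
        # Fall back to --master / spark.master when only spark.api.mode=connect is set.
        url = opts.get("spark.master", os.environ.get("MASTER", "local"))
        if url.startswith("sc://"):
            # A Spark Connect URL is not a valid master URL; fail fast on the Python
            # side, since SparkSubmit.prepareSubmitEnvironment is never reached here.
            # (Keyword casing is errorClass/messageParameters in recent versions,
            # error_class/message_parameters in older ones.)
            raise PySparkRuntimeError(
                errorClass="MASTER_URL_INVALID",
                messageParameters={},
            )
    return url
```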
Merged to master and branch-4.0.
### What changes were proposed in this pull request?

This PR proposes to disallow Spark Connect strings in `--master` when the Spark API mode is `connect`. This is a Python-specific issue.

### Why are the changes needed?

Should work as documented in #49107.

### Does this PR introduce _any_ user-facing change?

Not yet, because the main change has not been released (#49107).

### How was this patch tested?

Manually tested:

```
./bin/pyspark --master "sc://localhost:15002" --conf spark.api.mode=connect
```

```
Python 3.11.9 (main, Apr 19 2024, 11:44:45) [Clang 14.0.6 ] on darwin
Type "help", "copyright", "credits" or "license" for more information.
/.../spark/python/pyspark/shell.py:77: UserWarning: Failed to initialize Spark session.
  warnings.warn("Failed to initialize Spark session.")
Traceback (most recent call last):
  File "/.../spark/python/pyspark/shell.py", line 52, in <module>
    spark = SparkSession.builder.getOrCreate()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/.../spark/python/pyspark/sql/session.py", line 512, in getOrCreate
    raise PySparkRuntimeError(
pyspark.errors.exceptions.base.PySparkRuntimeError: [MASTER_URL_INVALID] Master must either be yarn or start with spark, k8s, or local.
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #50000 from HyukjinKwon/SPARK-51254.

Authored-by: Hyukjin Kwon <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
(cherry picked from commit 6603a4e)
Signed-off-by: Hyukjin Kwon <[email protected]>
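As a complementary illustration (not part of the PR's own testing), the same error should also be reachable through the builder API, assuming the guard reads `spark.api.mode` and `spark.master` from builder options as the hunk above suggests; the snippet below is a hypothetical reproduction, not a documented test:

```python
from pyspark.errors import PySparkRuntimeError
from pyspark.sql import SparkSession

try:
    spark = (
        SparkSession.builder
        .config("spark.api.mode", "connect")
        .config("spark.master", "sc://localhost:15002")  # Connect URL in the master slot
        .getOrCreate()
    )
except PySparkRuntimeError as e:
    # Expected: [MASTER_URL_INVALID] Master must either be yarn or start with
    # spark, k8s, or local.
    print(e)
```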