build: Add Spark 4.0 to release build script #2514

Conversation
@parthchandra will the jars for Spark 3.x still work with JDK 11, or do we also need to update the target Java version in the Maven build?
Codecov Report: ✅ All modified and coverable lines are covered by tests.

```
@@             Coverage Diff              @@
##               main    #2514      +/-  ##
============================================
+ Coverage     56.12%   58.49%    +2.36%
- Complexity      976     1447      +471
============================================
  Files           119      146       +27
  Lines         11743    13550     +1807
  Branches       2251     2356      +105
============================================
+ Hits           6591     7926     +1335
- Misses         4012     4390      +378
- Partials       1140     1234       +94
```
Let me verify that.
The jars for Spark 3.x will work with JDK 11. The pom file already specifies
For the spark-4.0 profile this is overridden with
Also, the compiler plugin uses the configured version, so it is all consistent.
I also built with JDK 17, and the bytecode versions for the default profile (Spark 3.5) were compatible with JDK 8 (major version 52) for the classes I checked. With the spark-4.0 profile, the bytecode versions corresponded to JDK 17 (major version 61).
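For reference, this is the usual shape of that wiring in a Maven pom: a `java.version` property that the compiler plugin's `source`/`target` resolve against. This is an illustrative sketch with placeholder values, not the actual Comet pom:

```xml
<properties>
  <java.version>11</java.version>
</properties>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <source>${java.version}</source>
        <target>${java.version}</target>
      </configuration>
    </plugin>
  </plugins>
</build>
```

A profile can then override `java.version` in its `<properties>` and the compiler configuration picks the new value up without further changes.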
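As a quick cross-check of those numbers (a rule of thumb, not anything from the Maven docs): the class-file major version is the Java release number plus 44, so JDK 8 maps to 52 and JDK 17 to 61, matching the values above. You can read the actual major version of a compiled class with `javap -verbose <class> | grep "major version"`.

```shell
# Class-file major version = Java release + 44.
# JDK 8 -> 52, JDK 11 -> 55, JDK 17 -> 61.
for v in 8 11 17; do
  echo "Java $v -> class-file major version $((v + 44))"
done
```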
The jdk17 profile is activated automatically when building with JDK 17:

```xml
<profile>
  <id>jdk17</id>
  <activation>
    <jdk>17</jdk>
  </activation>
  <properties>
    <java.version>17</java.version>
    <maven.compiler.source>${java.version}</maven.compiler.source>
    <maven.compiler.target>${java.version}</maven.compiler.target>
  </properties>
</profile>
```

So if I build with JDK 17, then I get JDK 17 classes. If I try to run with JDK 11, I get:
I think we need to override the
Added JDK 11 explicitly for the Spark 3.4 and 3.5 profiles.
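A sketch of what such an override might look like, following the property names from the jdk17 profile quoted above (the profile id and exact placement here are assumptions; the actual change in the PR may differ):

```xml
<profile>
  <id>spark-3.5</id>
  <properties>
    <!-- Pin the Spark 3.x profiles to JDK 11 bytecode even when building with JDK 17. -->
    <java.version>11</java.version>
    <maven.compiler.source>${java.version}</maven.compiler.source>
    <maven.compiler.target>${java.version}</maven.compiler.target>
  </properties>
</profile>
```

Because these properties are set per-profile, building with a newer JDK still emits class files that an older runtime can load.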
Thanks @parthchandra. I tested this locally, and LGTM.
Which issue does this PR close?
Closes #1989.
Updates the build script to build Spark 4.0 jars.
The build now requires JDK 17 for all versions.
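Since the build now requires JDK 17, a release script could guard against an older JDK up front. A minimal sketch of such a check; the parsing assumes the usual `java -version` output format, and the sample string here is hard-coded for illustration (a real script would capture `java -version 2>&1 | head -n 1` instead):

```shell
# Parse the major version out of a "java -version"-style line and require 17+.
version_line='openjdk version "17.0.9" 2023-10-17'   # illustrative sample output
major=$(echo "$version_line" | sed -E 's/[^"]*"([0-9]+).*/\1/')
if [ "$major" -ge 17 ]; then
  echo "JDK $major is sufficient for the Spark 4.0 build"
else
  echo "JDK $major is too old; JDK 17 or later is required"
fi
```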