Conversation

parthchandra commented Oct 1, 2025

Which issue does this PR close?

Closes #1989

Updates the build script to build Spark 4.0 jars.
The build now requires JDK 17 for all versions.

parthchandra requested a review from andygrove October 1, 2025 22:52
andygrove commented Oct 1, 2025

@parthchandra will the jars for Spark 3.x still work with JDK 11, or do we also need to update the target java version in the Maven build?

codecov-commenter commented Oct 1, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 58.49%. Comparing base (f09f8af) to head (f347a53).
⚠️ Report is 568 commits behind head on main.

Additional details and impacted files
@@             Coverage Diff              @@
##               main    #2514      +/-   ##
============================================
+ Coverage     56.12%   58.49%   +2.36%     
- Complexity      976     1447     +471     
============================================
  Files           119      146      +27     
  Lines         11743    13550    +1807     
  Branches       2251     2356     +105     
============================================
+ Hits           6591     7926    +1335     
- Misses         4012     4390     +378     
- Partials       1140     1234      +94     


parthchandra (Contributor, Author)

> @parthchandra will the jars for Spark 3.x still work with JDK 11, or do we also need to update the target java version in the Maven build?

Let me verify that.

parthchandra (Contributor, Author)

The jars for Spark 3.x will work with JDK 11.

The pom file already specifies

    <java.version>11</java.version>
    <maven.compiler.source>${java.version}</maven.compiler.source>
    <maven.compiler.target>${java.version}</maven.compiler.target>

For the Spark 4.0 profile, this is overridden with

        <java.version>17</java.version>
        <maven.compiler.source>${java.version}</maven.compiler.source>
        <maven.compiler.target>${java.version}</maven.compiler.target>

The compiler plugin also uses the configured version, so it is all consistent.

        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-compiler-plugin</artifactId>
          <version>${maven-compiler-plugin.version}</version>
          <configuration>
            <source>${java.version}</source>
            <target>${java.version}</target>
            <skipMain>true</skipMain>
            <skip>true</skip>
          </configuration>
        </plugin>

I also built with JDK 17. For the classes I checked, the bytecode versions for the default profile (Spark 3.5) were compatible with JDK 8 (class file major version 52). With the spark-4.0 profile, the bytecode versions corresponded to JDK 17 (major version 61).

andygrove commented Oct 2, 2025

> The jars for Spark 3.x will work with JDK 11.
>
> The pom file already specifies
>
>     <java.version>11</java.version>
>     <maven.compiler.source>${java.version}</maven.compiler.source>
>     <maven.compiler.target>${java.version}</maven.compiler.target>

The java.version gets overridden by the JDK profiles though:

    <profile>
      <id>jdk17</id>
      <activation>
        <jdk>17</jdk>
      </activation>
      <properties>
        <java.version>17</java.version>
        <maven.compiler.source>${java.version}</maven.compiler.source>
        <maven.compiler.target>${java.version}</maven.compiler.target>
      </properties>
    </profile>

So if I build with JDK 17, then I get JDK 17 classes. If I try to run with JDK 11, I get:

java.lang.UnsupportedClassVersionError: org/apache/spark/sql/comet/execution/shuffle/CometBypassMergeSortShuffleWriter has been compiled by a more recent version of the Java Runtime (class file version 61.0), this version of the Java Runtime only recognizes class file versions up to 55.0

I think we need to override the maven.compiler.target (or java.version) for each Spark version to set it to the minimum supported JDK version for that Spark version.
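
A minimal sketch, assuming a profile named spark-3.5, of what such a per-Spark-version override could look like in the pom (the exact profile ids and property set in the repository may differ):

    <profile>
      <id>spark-3.5</id>
      <properties>
        <!-- Pin the compiled bytecode to the minimum JDK supported by this
             Spark line, even when the build itself runs on JDK 17. -->
        <java.version>11</java.version>
        <maven.compiler.source>${java.version}</maven.compiler.source>
        <maven.compiler.target>${java.version}</maven.compiler.target>
      </properties>
    </profile>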

parthchandra (Contributor, Author)

Added JDK 11 explicitly for the Spark 3.4 and 3.5 profiles.

andygrove left a comment

Thanks @parthchandra. I tested this locally, and LGTM.

andygrove merged commit 25745a5 into apache:main Oct 3, 2025
144 of 145 checks passed
Development

Successfully merging this pull request may close these issues.

Update release scripts to publish Comet jars for Spark 4.0.0