
Bump delta-spark from 4.0.1 to 4.1.0#57

Open
dependabot[bot] wants to merge 1 commit into main from dependabot/pip/delta-spark-4.1.0

Conversation

dependabot[bot] commented on behalf of GitHub, Mar 1, 2026

Bumps delta-spark from 4.0.1 to 4.1.0.

Release notes

Sourced from delta-spark's releases.

Delta Lake 4.1.0

We are excited to announce the release of Delta Lake 4.1.0! This release includes significant new features, performance improvements, and important platform upgrades.

Highlights

  • [Spark] Apache Spark 4.1.0 Support. The default build of Delta 4.1.0 leverages Apache Spark 4.1.0; however, it retains compatibility with Apache Spark 4.0.1.
  • [Spark] Catalog-managed table enhancements (preview): Support for Unity Catalog (UC) managed table creation, batch read/write, and streaming read/write.
  • [Spark] Spark V2 connector based on the Delta Kernel API: A new Spark DataSource V2 connector backed by Delta Kernel, supporting streaming reads for catalog-managed tables.
  • [Spark] Server-Side Planning (preview): Delegate scan planning to catalog servers following the Apache Iceberg REST Catalog API. Supported filters, projections, and limits are pushed down to the catalog server for query planning.
  • [Spark] Conflict-free feature enablement: Enable Deletion Vectors and Column Mapping on existing tables without blocking or conflicting with concurrent writes.
  • [Kernel] Full support for catalog-managed tables, enabling Kernel-based connectors to interact with catalog-managed Delta tables (e.g., via Unity Catalog).

Delta Spark

Delta Spark 4.1.0 is built against Apache Spark 4.1.0 and Apache Spark 4.0.1. As with Apache Spark, Maven artifacts are released for Scala 2.13.

Starting in Delta 4.1.0, Maven artifacts include a Spark version suffix (e.g., delta-spark_4.1_2.13 instead of delta-spark_2.13). Backward compatibility with the old names is preserved in this release, but updating dependencies is recommended. Separate artifacts are now published for Spark 4.1 and Spark 4.0 so users can choose the version matching their Spark runtime.
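The naming change above can be illustrated with a small helper. This function is purely illustrative (it is not part of delta-spark); it just assembles the artifact name for a given Spark line and Scala version following the scheme the release notes describe:

```python
def delta_spark_artifact(scala_version, spark_line=None):
    """Return the Maven artifact name for delta-spark.

    Starting with Delta 4.1.0 the artifact carries a Spark version
    suffix (e.g. delta-spark_4.1_2.13); earlier releases used only
    the Scala suffix (delta-spark_2.13).
    """
    if spark_line is None:
        return "delta-spark_{}".format(scala_version)
    return "delta-spark_{}_{}".format(spark_line, scala_version)

# New suffixed naming for the Spark 4.1 and Spark 4.0 builds:
print(delta_spark_artifact("2.13", "4.1"))  # delta-spark_4.1_2.13
print(delta_spark_artifact("2.13", "4.0"))  # delta-spark_4.0_2.13
# Pre-4.1 naming, still resolvable in this release for compatibility:
print(delta_spark_artifact("2.13"))         # delta-spark_2.13
```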

The key features of this release are:

  • Catalog-managed table enhancements: Delta Spark has added more support for Unity Catalog managed Delta tables via the catalogManaged feature, enabling table creation, batch and streaming reads/writes (including time travel and DML operations), history inspection, and OAuth-based authentication. This is still in preview and production usage is not recommended.
  • Delta v2 Spark Connector: A new Spark DataSource V2 connector backed by Delta Kernel, supporting streaming reads for catalog-managed tables.
  • Server-Side Planning (preview): Delegate scan planning to an external catalog server, with filter, projection, and limit pushdown and multi-cloud credential support.
  • Atomic CTAS: CREATE TABLE AS SELECT for UC managed Delta tables (MANAGED and EXTERNAL) is now fully atomic when used with UC 0.4.0. Other operations (including REPLACE TABLE, REPLACE TABLE AS SELECT, CREATE OR REPLACE TABLE, and Dynamic Partition Overwrite) now fail fast instead of running in best-effort mode.
  • Conflict-free Deletion Vector enablement: Enable Deletion Vectors on existing tables without conflicting with concurrent transactions or requiring a maintenance window.
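As a minimal sketch of the conflict-free enablement above, Deletion Vectors are turned on through the documented `delta.enableDeletionVectors` table property; the table name here is hypothetical, and this assumes a Spark session already configured with the Delta extensions:

```sql
-- Enable Deletion Vectors on an existing table (hypothetical table name);
-- with Delta 4.1.0 this no longer conflicts with concurrent writers
-- or requires a maintenance window.
ALTER TABLE my_catalog.my_schema.events
SET TBLPROPERTIES ('delta.enableDeletionVectors' = 'true');
```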

... (truncated)

Commits
  • 7d2762e Setting version to 4.1.0
  • d8d533d [Build] Remove the delta-hudi and delta-contribs spark suffix for artifacts. ...
  • 2d9a652 [kernel-spark] Refactor v1 connector to reuse schemaReadOptions constructor (...
  • 9bc0218 [ServerSidePlanning] Azure and GCS credential usage modifications (#6077)
  • ff90bc6 [Spark] Make null partition check use physical column names during lookup (#6...
  • ed554d0 [Protocol RFC] Specify collation readers and writers requirements (#3741)
  • 3daac0c [Spark] Remove unused TrackingGenericInMemoryCommitCoordinatorBuilder (#6068)
  • 7212314 [Protocol] Merging the catalog-managed tables to delta protocol (#6066)
  • b95698a [BUILD] Use per-variant SPARK_HOME for integration tests (#6067)
  • 488c916 [Spark] Enable QoL features for catalogManaged tables (#5841)
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [delta-spark](https://github.com/delta-io/delta) from 4.0.1 to 4.1.0.
- [Release notes](https://github.com/delta-io/delta/releases)
- [Commits](delta-io/delta@v4.0.1...v4.1.0)

---
updated-dependencies:
- dependency-name: delta-spark
  dependency-version: 4.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update python code) labels, Mar 1, 2026