Releases: confluentinc/confluent-kafka-python
v2.12.1
confluent-kafka-python v2.12.1
v2.12.1 is a maintenance release with the following fixes:
- Restored macOS binary compatibility with macOS 13
- `libversion()` now returns the (string, integer) tuple of version variants; use `version()` for a string-only response
- Added Python 3.14 support and dropped Python 3.7 support. Free-threaded capabilities are not fully supported yet
- Fixed `use.schema.id` in `sr.lookup_schema()`
- Removed `tomli` dependency from standard (non-documentation) requirements
- Fixed experimental asyncio example files to correctly use new capabilities
- Fixed invalid argument error on schema lookups on repeat requests
- Fixed documentation generation and added error checks for builds to prevent future breaks
v2.12.0
confluent-kafka-python v2.12.0
v2.12.0 is a feature release with the following enhancements:
KIP-848 – General Availability
Starting with confluent-kafka-python 2.12.0, the next generation consumer group rebalance protocol defined in KIP-848 is production-ready. Please refer to the migration guide for moving from the classic protocol to the consumer protocol.
Note: The new consumer group protocol defined in KIP-848 is not enabled by default. There are a few contract changes associated with the new protocol that might be breaking. The `group.protocol` configuration property dictates whether to use the new consumer protocol or the older classic protocol. It defaults to classic if not provided.
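As a minimal sketch, opting in is a single configuration property (the broker address and group id below are placeholder values):

```python
# Minimal consumer configuration opting into the KIP-848 protocol.
# "localhost:9092" and "orders-app" are placeholder values.
config = {
    "bootstrap.servers": "localhost:9092",
    "group.id": "orders-app",
    # "consumer" selects the KIP-848 protocol; omitting the property,
    # or setting it to "classic", keeps the older protocol.
    "group.protocol": "consumer",
}
# The dict is then passed to confluent_kafka.Consumer(config).
```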
AsyncIO Producer (experimental)
Introduces beta class AIOProducer for asynchronous message production in asyncio applications.
Added
- AsyncIO Producer (experimental): Introduces the beta class `AIOProducer` for
asynchronous message production in asyncio applications. This API offloads
blocking librdkafka calls to a thread pool and schedules common callbacks
(`error_cb`, `throttle_cb`, `stats_cb`, `oauth_cb`, `logger`) onto the event
loop for safe usage inside async frameworks.
Features
- Batched async produce: `await AIOProducer(...).produce(topic, value=...)` buffers messages and flushes when the buffer threshold or timeout is reached.
- Async lifecycle: `await producer.flush()`, `await producer.purge()`, and transactional operations (`init_transactions`, `begin_transaction`, `commit_transaction`, `abort_transaction`).
Limitations
- Per-message headers are not supported in the current batched async produce path. If headers are required, use the synchronous `Producer.produce(...)` or offload a sync produce call to a thread executor within your async app.
Guidance
- Use the AsyncIO Producer inside async apps/servers (FastAPI/Starlette, aiohttp, asyncio tasks) to avoid blocking the event loop.
- For batch jobs, scripts, or highest-throughput pipelines without an event loop, the synchronous `Producer` remains recommended.
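The API described above might be used as in this minimal sketch; the experimental import path, the broker address, and the topic name are assumptions, not confirmed by these notes:

```python
import asyncio

async def produce_one(topic: str, payload: bytes) -> None:
    # Imported inside the function so the sketch can be read without the
    # package installed; the experimental module path is an assumption.
    from confluent_kafka.experimental.aio import AIOProducer

    producer = AIOProducer({"bootstrap.servers": "localhost:9092"})
    # Batched async produce: the await buffers the message, and the
    # producer flushes when the buffer threshold or timeout is reached.
    await producer.produce(topic, value=payload)
    # Async lifecycle call: drain any buffered messages before exiting.
    await producer.flush()

# Inside an async app this would run on the existing event loop; in a
# script it could be driven with:
# asyncio.run(produce_one("greetings", b"hello"))
```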
Enhancements and Fixes
- Kafka OAuth/OIDC metadata based authentication examples with Azure IMDS (#2083).
confluent-kafka-python v2.12.0 is based on librdkafka v2.12.0, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.
v2.11.1
confluent-kafka-python v2.11.1
v2.11.1 is a maintenance release with the following enhancements:
confluent-kafka-python v2.11.1 is based on librdkafka v2.11.1, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.
v2.11.0
confluent-kafka-python v2.11.0
v2.11.0 is a feature release with the following enhancements:
confluent-kafka-python v2.11.0 is based on librdkafka v2.11.0, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.
v2.10.1
confluent-kafka-python v2.10.1
v2.10.1 is a maintenance release with the following fixes:
- Handled `None` value for optional `ctx` parameter in `ProtobufDeserializer` (#1939)
- Handled `None` value for optional `ctx` parameter in `AvroDeserializer` (#1973)
confluent-kafka-python v2.10.1 is based on librdkafka v2.10.1, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.
v2.10.0
confluent-kafka-python v2.10.0
v2.10.0 is a feature release with the following fixes and enhancements:
- [KIP-848] Group Config is now supported in AlterConfigs, IncrementalAlterConfigs and DescribeConfigs. (#1856)
- [KIP-848] `describe_consumer_groups()` now supports the KIP-848-introduced `consumer` groups. Two new fields for consumer group type and target assignment have also been added. Type defines whether the group is a `classic` or `consumer` group. Target assignment is only valid for the `consumer` protocol and defaults to NULL. (#1873)
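A sketch of reading the new fields via `AdminClient` (the broker address and group id are placeholders; the attribute names follow the field description above and should be checked against the API docs):

```python
def show_group_type(group_id: str) -> None:
    # Imported inside the function so the sketch is readable without the
    # package installed.
    from confluent_kafka.admin import AdminClient

    admin = AdminClient({"bootstrap.servers": "localhost:9092"})
    # Returns a dict mapping each requested group id to a future that
    # resolves to that group's description.
    futures = admin.describe_consumer_groups([group_id])
    desc = futures[group_id].result()
    # Fields introduced in v2.10.0: the group type ("classic" or
    # "consumer") and, for "consumer" groups only, the target
    # assignment (NULL/None for classic groups).
    print(group_id, desc.type, desc.target_assignment)
```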
confluent-kafka-python v2.10.0 is based on librdkafka v2.10.0, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.
v2.9.0
confluent-kafka-python v2.9.0
v2.9.0 is a feature release with the following fixes and enhancements:
- Add Client Credentials OAuth support for Schema Registry (#1919)
- Add custom OAuth support for Schema Registry (#1925)
confluent-kafka-python v2.9.0 is based on librdkafka v2.8.0, see the librdkafka release notes for a complete list of changes, enhancements, fixes and upgrade considerations.
v2.8.2
confluent-kafka-python v2.8.2
v2.8.2 is a maintenance release with the following fixes and enhancements:
- Fixed caching to ensure cached schema matches input. (#1922)
- Fixed handling of named Avro schemas (#1928)
confluent-kafka-python v2.8.2 is based on librdkafka v2.8.0, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.
Note: Version v2.8.1 is skipped due to a breaking change in that release.
Do not run software with v2.8.1 installed.
v2.8.0
confluent-kafka-python v2.8.0
v2.8.0 is a feature release with the following features, fixes and enhancements:
- Ensure algorithm query param is passed for CSFLE (#1889)
- DGS-19492 Handle records nested in arrays/maps when searching for tags (#1890)
confluent-kafka-python v2.8.0 is based on librdkafka v2.8.0, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.
v2.7.0
confluent-kafka-python v2.7.0
Note: As part of this release, we are deprecating v2.6.2 release and yanking it from PyPI. Please refrain from using v2.6.2. Use v2.7.0 instead.
Note: This release modifies the dependencies of the Schema Registry client.
If you are using the Schema Registry client, please ensure that you install the
extra dependencies using the following syntax:
pip install confluent-kafka[schemaregistry]
or
pip install confluent-kafka[avro,schemaregistry]
Please see the README.md for more information related to installing protobuf, jsonschema or rules dependencies.
v2.7.0 is a feature release with the following features, fixes and enhancements:
- Support for Data Contracts with Schema Registry, including
- Data Quality rules
- Data Transformation rules
- Client-Side Field Level Encryption (CSFLE)
- Schema Migration rules (requires Python 3.9+)
- Migrated the Schema Registry client from requests to httpx
- Add support for multiple URLs (#409)
- Allow configuring timeout (#622)
- Fix deletion semantics (#1127)
- Python deserializer can take SR client (#1174)
- Fix handling of Avro unions (#1562)
- Remove deprecated RefResolver for JSON (#1840)
- Support delete of subject version (#1851)
- Added missing dependency on googleapis-common-protos when using protobufs. (#1881, @Tenzer)
confluent-kafka-python v2.7.0 is based on librdkafka v2.6.1, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.