
Commit b88b020: Merge Master

PratRanj07 committed Dec 26, 2024 (2 parents: 0d8a453 + 4f25c8c)
Showing 109 changed files with 12,923 additions and 1,143 deletions.
43 changes: 43 additions & 0 deletions .github/pull_request_template.md
@@ -0,0 +1,43 @@
<!--
Suggested PR template: Fill/delete/add sections as needed. Optionally delete any commented block.
-->
What
----
<!--
Briefly describe **what** you have changed and **why**.
Optionally include implementation strategy.
-->

Checklist
------------------
- [ ] Contains customer-facing changes? Including API/behavior changes <!-- This helps identify whether any breaking changes were introduced -->
- [ ] Did you add sufficient unit test and/or integration test coverage for this PR?
- If not, please explain why it is not required

References
----------
JIRA:
<!--
Copy & paste links to the Jira ticket, other PRs, issues, Slack conversations...
For code bumps: link to PR, tag or GitHub `/compare/master...master`
-->

Test & Review
------------
<!--
Has it been tested? How?
Copy & paste any handy instructions, steps or requirements that can save time for the reviewer or any other reader.
-->

Open questions / Follow-ups
--------------------------
<!--
Optional: anything open to discussion for the reviewer, out of scope, or follow-ups.
-->

<!--
Review stakeholders
------------------
Optional: mention stakeholders or any special context required for review.
-->
1 change: 1 addition & 0 deletions .gitignore
@@ -29,3 +29,4 @@ tmp-KafkaCluster
.venv
venv_test
venv_examples
*Zone.Identifier
2 changes: 1 addition & 1 deletion .semaphore/semaphore.yml
@@ -8,7 +8,7 @@ execution_time_limit:
global_job_config:
env_vars:
- name: LIBRDKAFKA_VERSION
value: v2.6.0
value: v2.6.1
prologue:
commands:
- checkout
56 changes: 55 additions & 1 deletion CHANGELOG.md
@@ -1,10 +1,64 @@
# Confluent's Python client for Apache Kafka

## v2.7.0

v2.7.0 is a feature release with the features, fixes and enhancements present in v2.6.2, including the following fix:

- Added missing dependency on googleapis-common-protos when using protobufs. (#1881, @tenzer)

confluent-kafka-python v2.7.0 is based on librdkafka v2.6.1, see the
[librdkafka release notes](https://github.com/confluentinc/librdkafka/releases/tag/v2.6.1)
for a complete list of changes, enhancements, fixes and upgrade considerations.

## v2.6.2

> [!WARNING]
> Due to an error in which dependency changes were included in a recent patch release, Confluent recommends that users **refrain from upgrading to 2.6.2** of Confluent Kafka. Confluent will release a new minor version, 2.7.0, in which the dependency changes will be appropriately included. Users who have already upgraded to 2.6.2 and made the required dependency changes may remain on that version, and are advised to upgrade to 2.7.0 once it is available. Upon the release of 2.7.0, the 2.6.2 version will be marked deprecated.
> We apologize for the inconvenience and appreciate the feedback we have received from the community.

v2.6.2 is a feature release with the following features, fixes and enhancements:

Note: This release modifies the dependencies of the Schema Registry client.
If you are using the Schema Registry client, please ensure that you install the
extra dependencies using the following syntax:

```
pip install confluent-kafka[schemaregistry]
```

or

```
pip install confluent-kafka[avro,schemaregistry]
```

Please see the [README.md](README.md) for more information.

- Support for Data Contracts with Schema Registry, including
  - Data Quality rules
  - Data Transformation rules
  - Client-Side Field Level Encryption (CSFLE)
  - Schema Migration rules (requires Python 3.9+)
- Migrated the Schema Registry client from requests to httpx (a short usage sketch follows this list)
- Add support for multiple URLs (#409)
- Allow configuring timeout (#622)
- Fix deletion semantics (#1127)
- Python deserializer can take SR client (#1174)
- Fix handling of Avro unions (#1562)
- Remove deprecated RefResolver for JSON (#1840)
- Support delete of subject version (#1851)
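
For orientation, here is a minimal sketch of the httpx-based client in use. The comma-separated `url` value illustrating multi-URL support (#409) and the subject name are assumptions for illustration, not values taken from the release notes.

```python
from confluent_kafka.schema_registry import SchemaRegistryClient

# Hypothetical endpoints; multi-URL support (#409) is assumed here to take a
# comma-separated list, with later URLs acting as fallbacks.
sr_conf = {'url': 'https://sr-1.example.com,https://sr-2.example.com'}
client = SchemaRegistryClient(sr_conf)

# Fetch the latest schema registered under a hypothetical subject.
latest = client.get_latest_version('example_serde_avro-value')
print(latest.schema_id, latest.schema.schema_type)
```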

confluent-kafka-python is based on librdkafka v2.6.1, see the
[librdkafka release notes](https://github.com/confluentinc/librdkafka/releases/tag/v2.6.1)
for a complete list of changes, enhancements, fixes and upgrade considerations.


## v2.6.1

v2.6.1 is a maintenance release with the following fixes and enhancements:

- Migrated build system from `setup.py` to `pyproject.toml` in accordance with `PEP 517` and `PEP 518`, improving project configuration, build system requirements management, and compatibility with modern Python packaging tools like `pip` and `build`. (#1592)
- Removed Python 3.6 support. (#1592)
- Added an example of an OAuth OIDC producer with support for Confluent Cloud, sketched below. (#1769, @sarwarbhuiyan)
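
As a rough illustration of the configuration that example covers, here is a minimal OIDC producer sketch. The bootstrap server, token endpoint, client credentials and Confluent Cloud extension values are placeholders, not values taken from the example.

```python
from confluent_kafka import Producer

conf = {
    # Placeholder values throughout; replace with your own cluster and IdP details.
    'bootstrap.servers': 'pkc-xxxxx.us-west-2.aws.confluent.cloud:9092',
    'security.protocol': 'SASL_SSL',
    'sasl.mechanisms': 'OAUTHBEARER',
    'sasl.oauthbearer.method': 'oidc',
    'sasl.oauthbearer.client.id': '<client-id>',
    'sasl.oauthbearer.client.secret': '<client-secret>',
    'sasl.oauthbearer.token.endpoint.url': 'https://idp.example.com/oauth2/token',
    # Confluent Cloud typically expects these extensions (placeholder values).
    'sasl.oauthbearer.extensions': 'logicalCluster=lkc-xxxxx,identityPoolId=pool-xxxxx',
}

producer = Producer(conf)
producer.produce('example_topic', value=b'hello oidc')
producer.flush()
```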

confluent-kafka-python is based on librdkafka v2.6.1, see the
20 changes: 20 additions & 0 deletions README.md
@@ -1,3 +1,7 @@
> [!WARNING]
> Due to an error in which dependency changes were included in a recent patch release, Confluent recommends that users **refrain from upgrading to 2.6.2** of Confluent Kafka. Confluent will release a new minor version, 2.7.0, in which the dependency changes will be appropriately included. Users who have already upgraded to 2.6.2 and made the required dependency changes may remain on that version, and are advised to upgrade to 2.7.0 once it is available. Upon the release of 2.7.0, the 2.6.2 version will be marked deprecated.
> We apologize for the inconvenience and appreciate the feedback we have received from the community.

Confluent's Python Client for Apache Kafka<sup>TM</sup>
=======================================================

@@ -134,6 +138,22 @@ The `Producer`, `Consumer` and `AdminClient` are all thread safe.
confluent-kafka using the instructions in the
"Install from source" section below.

To use Schema Registry with the Avro serializer/deserializer:

$ pip install confluent-kafka[avro,schemaregistry]

To use Schema Registry with the JSON serializer/deserializer:

$ pip install confluent-kafka[json,schemaregistry]

To use Schema Registry with the Protobuf serializer/deserializer:

$ pip install confluent-kafka[protobuf,schemaregistry]

When using Data Contract rules (including CSFLE), add the `rules` extra, e.g.:

$ pip install confluent-kafka[avro,schemaregistry,rules]
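
A minimal sketch of the required setup once the `rules` extra is installed, assuming only AWS KMS is used; it mirrors the registration calls in `examples/avro_consumer_encryption.py` added in this release:

```python
from confluent_kafka.schema_registry.rules.encryption.encrypt_executor import \
    FieldEncryptionExecutor
from confluent_kafka.schema_registry.rules.encryption.awskms.aws_driver import \
    AwsKmsDriver

# Register the KMS driver(s) you actually use, plus the field-level encryption
# executor, before constructing serializers/deserializers.
AwsKmsDriver.register()
FieldEncryptionExecutor.register()
```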

**Install from source**

For source install, see the *Install from source* section in [INSTALL.md](INSTALL.md).
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -27,7 +27,7 @@
# built documents.
#
# The short X.Y version.
version = '2.6.0'
version = '2.7.0'
# The full version, including alpha/beta/rc tags.
release = version
######################################################################
3 changes: 2 additions & 1 deletion examples/avro/user_generic.avsc
@@ -4,7 +4,8 @@
"fields": [
{
"name": "name",
"type": "string"
"type": "string",
"confluent:tags": ["PII"]
},
{
"name": "favorite_number",
3 changes: 2 additions & 1 deletion examples/avro/user_specific.avsc
@@ -5,7 +5,8 @@
"fields": [
{
"name": "name",
"type": "string"
"type": "string",
"confluent:tags": ["PII"]
},
{
"name": "favorite_number",
145 changes: 145 additions & 0 deletions examples/avro_consumer_encryption.py
@@ -0,0 +1,145 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Copyright 2024 Confluent Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.


# A simple example demonstrating use of AvroDeserializer.

import argparse

from confluent_kafka.schema_registry.rules.encryption.encrypt_executor import \
FieldEncryptionExecutor

from confluent_kafka.schema_registry.rules.encryption.localkms.local_driver import \
LocalKmsDriver

from confluent_kafka.schema_registry.rules.encryption.hcvault.hcvault_driver import \
HcVaultKmsDriver

from confluent_kafka.schema_registry.rules.encryption.gcpkms.gcp_driver import \
GcpKmsDriver

from confluent_kafka.schema_registry.rules.encryption.azurekms.azure_driver import \
AzureKmsDriver

from confluent_kafka.schema_registry.rules.encryption.awskms.aws_driver import \
AwsKmsDriver

from confluent_kafka import Consumer
from confluent_kafka.serialization import SerializationContext, MessageField
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer


class User(object):
"""
User record
Args:
name (str): User's name
favorite_number (int): User's favorite number
favorite_color (str): User's favorite color
"""

def __init__(self, name=None, favorite_number=None, favorite_color=None):
self.name = name
self.favorite_number = favorite_number
self.favorite_color = favorite_color


def dict_to_user(obj, ctx):
"""
Converts object literal(dict) to a User instance.
Args:
obj (dict): Object literal(dict)
ctx (SerializationContext): Metadata pertaining to the serialization
operation.
"""

if obj is None:
return None

return User(name=obj['name'],
favorite_number=obj['favorite_number'],
favorite_color=obj['favorite_color'])


def main(args):
# Register the KMS drivers and the field-level encryption executor
AwsKmsDriver.register()
AzureKmsDriver.register()
GcpKmsDriver.register()
HcVaultKmsDriver.register()
LocalKmsDriver.register()
FieldEncryptionExecutor.register()

topic = args.topic

# When using Data Contract rules, a schema should not be passed to the
# AvroDeserializer. The schema is fetched from the Schema Registry.
schema_str = None

sr_conf = {'url': args.schema_registry}
schema_registry_client = SchemaRegistryClient(sr_conf)

avro_deserializer = AvroDeserializer(schema_registry_client,
schema_str,
dict_to_user)

consumer_conf = {'bootstrap.servers': args.bootstrap_servers,
'group.id': args.group,
'auto.offset.reset': "earliest"}

consumer = Consumer(consumer_conf)
consumer.subscribe([topic])

while True:
try:
# SIGINT can't be handled when polling, limit timeout to 1 second.
msg = consumer.poll(1.0)
if msg is None:
continue

user = avro_deserializer(msg.value(), SerializationContext(msg.topic(), MessageField.VALUE))
if user is not None:
print("User record {}: name: {}\n"
"\tfavorite_number: {}\n"
"\tfavorite_color: {}\n"
.format(msg.key(), user.name,
user.favorite_number,
user.favorite_color))
except KeyboardInterrupt:
break

consumer.close()


if __name__ == '__main__':
parser = argparse.ArgumentParser(description="AvroDeserializer example")
parser.add_argument('-b', dest="bootstrap_servers", required=True,
help="Bootstrap broker(s) (host[:port])")
parser.add_argument('-s', dest="schema_registry", required=True,
help="Schema Registry (http(s)://host[:port]")
parser.add_argument('-t', dest="topic", default="example_serde_avro",
help="Topic name")
parser.add_argument('-g', dest="group", default="example_serde_avro",
help="Consumer group")

main(parser.parse_args())
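
A hypothetical invocation of this example, assuming a local broker and Schema Registry and relying on the default topic and consumer group from the argument parser:

```
$ python avro_consumer_encryption.py -b localhost:9092 -s http://localhost:8081
```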