
changefeed: support changefeed in Azure #20552

Open

wants to merge 8 commits into base: release-8.1
5 changes: 3 additions & 2 deletions TOC-tidb-cloud.md
@@ -327,8 +327,9 @@
- [To TiDB Cloud Sink](/tidb-cloud/changefeed-sink-to-tidb-cloud.md)
- [To Cloud Storage](/tidb-cloud/changefeed-sink-to-cloud-storage.md)
- Reference
- [Set Up Self-Hosted Kafka Private Link Service in AWS](/tidb-cloud/setup-self-hosted-kafka-private-link-service.md)
- [Set Up Self-Hosted Kafka Private Service Connect in Google Cloud](/tidb-cloud/setup-self-hosted-kafka-private-service-connect.md)
- [Set Up Self-Hosted Kafka Private Link Service in AWS](/tidb-cloud/setup-aws-self-hosted-kafka-private-link-service.md)
- [Set Up Self-Hosted Kafka Private Service Connect in Google Cloud](/tidb-cloud/setup-gcp-self-hosted-kafka-private-service-connect.md)
- [Set Up Self-Hosted Kafka Private Link Service in Azure](/tidb-cloud/setup-azure-self-hosted-kafka-private-link-service.md)
- Disaster Recovery
- [Recovery Group Overview](/tidb-cloud/recovery-group-overview.md)
- [Get Started](/tidb-cloud/recovery-group-get-started.md)
65 changes: 48 additions & 17 deletions tidb-cloud/changefeed-sink-to-apache-kafka.md

Large diffs are not rendered by default.

@@ -132,8 +132,7 @@ Take the following steps to create the Kafka VPC.
- **Subnet name**: `bastion`
- **IPv4 subnet CIDR block**: `10.0.192.0/18`

4. Click **Create subnet**. The **Subnets Listing** page is displayed.
5. Configure the bastion subnet to the Public subnet.
4. Configure the bastion subnet as a public subnet.

1. Go to [VPC dashboard > Internet gateways](https://console.aws.amazon.com/vpcconsole/home#igws:). Create an Internet Gateway with the name `kafka-vpc-igw`.
2. On the **Internet gateways Detail** page, in **Actions**, click **Attach to VPC** to attach the Internet Gateway to the Kafka VPC.
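Before creating the subnets, the CIDR blocks can be sanity-checked offline. The sketch below is a minimal pure-bash helper (no AWS calls) that expands a CIDR into its address range, using the `10.0.192.0/18` bastion CIDR from the steps above as input:

```shell
#!/usr/bin/env bash
# Sanity-check an IPv4 CIDR block by printing its full address range
# and size. Pure bash arithmetic; no external tools or AWS APIs.

ip_to_int() {
  local IFS=.
  read -r a b c d <<< "$1"
  echo $(( (a << 24) | (b << 16) | (c << 8) | d ))
}

int_to_ip() {
  local n=$1
  echo "$(( (n >> 24) & 255 )).$(( (n >> 16) & 255 )).$(( (n >> 8) & 255 )).$(( n & 255 ))"
}

cidr_range() {
  local ip=${1%/*} prefix=${1#*/}
  local base size mask start
  base=$(ip_to_int "$ip")
  size=$(( 1 << (32 - prefix) ))                 # number of addresses
  mask=$(( 0xFFFFFFFF ^ (size - 1) ))            # network mask
  start=$(( base & mask ))                       # aligned network address
  echo "$(int_to_ip "$start") - $(int_to_ip $(( start + size - 1 ))) ($size addresses)"
}

# The bastion subnet CIDR from the steps above.
cidr_range "10.0.192.0/18"
# → 10.0.192.0 - 10.0.255.255 (16384 addresses)
```

This confirms the `/18` block stays within the `10.0.0.0/16` VPC range and does not collide with lower subnets.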
@@ -434,18 +433,18 @@ LOG_DIR=$KAFKA_LOG_DIR nohup $KAFKA_START_CMD "$KAFKA_CONFIG_DIR/server.properti

# Create a topic if it does not exist
create_topic() {
echo "Creating topic if it does not exist..."
$KAFKA_DIR/kafka-topics.sh --create --topic $TOPIC --bootstrap-server $BROKER_LIST --if-not-exists --partitions 3 --replication-factor 3
echo "Creating topic if it does not exist..."
$KAFKA_DIR/kafka-topics.sh --create --topic $TOPIC --bootstrap-server $BROKER_LIST --if-not-exists --partitions 3 --replication-factor 3
}

# Produce messages to the topic
produce_messages() {
echo "Producing messages to the topic..."
for ((chrono=1; chrono <= 10; chrono++)); do
message="Test message "$chrono
echo "Create "$message
echo $message | $KAFKA_DIR/kafka-console-producer.sh --broker-list $BROKER_LIST --topic $TOPIC
done
echo "Producing messages to the topic..."
for ((chrono=1; chrono <= 10; chrono++)); do
message="Test message "$chrono
echo "Create "$message
echo $message | $KAFKA_DIR/kafka-console-producer.sh --broker-list $BROKER_LIST --topic $TOPIC
done
}
create_topic
produce_messages
@@ -468,8 +467,8 @@ LOG_DIR=$KAFKA_LOG_DIR nohup $KAFKA_START_CMD "$KAFKA_CONFIG_DIR/server.properti
CONSUMER_GROUP="test-group"
# Consume messages from the topic
consume_messages() {
echo "Consuming messages from the topic..."
$KAFKA_DIR/kafka-console-consumer.sh --bootstrap-server $BROKER_LIST --topic $TOPIC --from-beginning --timeout-ms 5000 --consumer-property group.id=$CONSUMER_GROUP
echo "Consuming messages from the topic..."
$KAFKA_DIR/kafka-console-consumer.sh --bootstrap-server $BROKER_LIST --topic $TOPIC --from-beginning --timeout-ms 5000 --consumer-property group.id=$CONSUMER_GROUP
}
consume_messages
```
@@ -688,7 +687,7 @@ Do the following to set up the load balancer:
- `usw2-az2` with `broker-usw2-az2 subnet`
- `usw2-az3` with `broker-usw2-az3 subnet`
- **Security groups**: create a new security group with the following rules.
- Inbound rule allows all TCP from Kafka VPC: Type - `All TCP`; Source - `Anywhere-IPv4`
- Inbound rule that allows TCP from TiDB Cloud: Type - `Custom TCP`; Port range - `{ports of target groups}`, for example, `9092-9095`; Source - `{CIDR of TiDB Cloud}`. You can get the TiDB Cloud CIDR for the region in the TiDB Cloud console under `Project Settings` > `Network Access` > `Project CIDR` > `AWS`.
- Outbound rule allows all TCP to Kafka VPC: Type - `All TCP`; Destination - `Anywhere-IPv4`
- Listeners and routing:
- Protocol: `TCP`; Port: `9092`; Forward to: `bootstrap-target-group`
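When filling in the inbound rule's source, it can help to verify that an expected client address actually falls inside the TiDB Cloud project CIDR. The sketch below is a minimal bash check; the CIDR `172.16.0.0/21` and the IP addresses are placeholders, not real TiDB Cloud values — substitute the CIDR shown in your TiDB Cloud console:

```shell
#!/usr/bin/env bash
# Check whether a source IP falls inside a CIDR block, e.g. before adding
# it to a security group rule. Pure bash arithmetic; no external tools.

in_cidr() {
  local ip=$1 cidr=$2
  local net=${cidr%/*} prefix=${cidr#*/}
  local IFS=. a b c d ipn netn mask
  read -r a b c d <<< "$ip";  ipn=$((  (a<<24) | (b<<16) | (c<<8) | d ))
  read -r a b c d <<< "$net"; netn=$(( (a<<24) | (b<<16) | (c<<8) | d ))
  mask=$(( (0xFFFFFFFF << (32 - prefix)) & 0xFFFFFFFF ))
  # True (exit 0) when the IP's network bits match the CIDR's.
  (( (ipn & mask) == (netn & mask) ))
}

# Placeholder values for illustration only.
if in_cidr "172.16.5.9" "172.16.0.0/21"; then echo allowed; else echo blocked; fi
# → allowed
```

A source outside the block (for example `192.168.1.1` against the same CIDR) would print `blocked`, signaling that the rule would not admit it.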
539 changes: 539 additions & 0 deletions tidb-cloud/setup-azure-self-hosted-kafka-private-link-service.md

Large diffs are not rendered by default.

@@ -318,18 +318,18 @@ Go to the [VM instances](https://console.cloud.google.com/compute/instances) pag

# Create a topic if it does not exist
create_topic() {
echo "Creating topic if it does not exist..."
$KAFKA_DIR/kafka-topics.sh --create --topic $TOPIC --bootstrap-server $BROKER_LIST --if-not-exists --partitions 3 --replication-factor 3
echo "Creating topic if it does not exist..."
$KAFKA_DIR/kafka-topics.sh --create --topic $TOPIC --bootstrap-server $BROKER_LIST --if-not-exists --partitions 3 --replication-factor 3
}

# Produce messages to the topic
produce_messages() {
echo "Producing messages to the topic..."
for ((chrono=1; chrono <= 10; chrono++)); do
message="Test message "$chrono
echo "Create "$message
echo $message | $KAFKA_DIR/kafka-console-producer.sh --broker-list $BROKER_LIST --topic $TOPIC
done
echo "Producing messages to the topic..."
for ((chrono=1; chrono <= 10; chrono++)); do
message="Test message "$chrono
echo "Create "$message
echo $message | $KAFKA_DIR/kafka-console-producer.sh --broker-list $BROKER_LIST --topic $TOPIC
done
}
create_topic
produce_messages
@@ -352,8 +352,8 @@ Go to the [VM instances](https://console.cloud.google.com/compute/instances) pag
CONSUMER_GROUP="test-group"
# Consume messages from the topic
consume_messages() {
echo "Consuming messages from the topic..."
$KAFKA_DIR/kafka-console-consumer.sh --bootstrap-server $BROKER_LIST --topic $TOPIC --from-beginning --timeout-ms 5000 --consumer-property group.id=$CONSUMER_GROUP
echo "Consuming messages from the topic..."
$KAFKA_DIR/kafka-console-consumer.sh --bootstrap-server $BROKER_LIST --topic $TOPIC --from-beginning --timeout-ms 5000 --consumer-property group.id=$CONSUMER_GROUP
}
consume_messages
```
2 changes: 1 addition & 1 deletion tidb-cloud/tidb-cloud-billing-ticdc-rcu.md
@@ -36,4 +36,4 @@ To learn about the supported regions and the price of TiDB Cloud for each TiCDC

If you choose the **Private Link** or **Private Service Connect** network connectivity method, additional **Private Data Link** costs will be incurred. These charges fall under the [Data Transfer Cost](https://www.pingcap.com/tidb-dedicated-pricing-details/#data-transfer-cost) category.

The price of **Private Data Link** is **$0.01/GiB**, the same as **Data Processed** of [AWS Interface Endpoint pricing](https://aws.amazon.com/privatelink/pricing/#Interface_Endpoint_pricing) and **Consumer data processing** of [Google Cloud Private Service Connect pricing](https://cloud.google.com/vpc/pricing#psc-forwarding-rules).
The price of **Private Data Link** is **$0.01/GiB**, the same as **Data Processed** of [AWS Interface Endpoint pricing](https://aws.amazon.com/privatelink/pricing/#Interface_Endpoint_pricing), **Consumer data processing** of [Google Cloud Private Service Connect pricing](https://cloud.google.com/vpc/pricing#psc-forwarding-rules), and **Inbound/Outbound Data Processed** of [Azure Private Link pricing](https://azure.microsoft.com/en-us/pricing/details/private-link/).
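As a quick illustration of the $0.01/GiB rate, the one-liner below estimates the monthly Private Data Link charge; the 500 GiB figure is a hypothetical workload, not a quoted value:

```shell
#!/usr/bin/env bash
# Estimate the Private Data Link charge at $0.01 per GiB for a
# hypothetical 500 GiB of changefeed traffic in a month.
awk 'BEGIN { gib = 500; rate = 0.01; printf "$%.2f\n", gib * rate }'
# → $5.00
```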
4 changes: 2 additions & 2 deletions tidb-cloud/tidb-cloud-release-notes.md
@@ -61,9 +61,9 @@ This page lists the release notes of [TiDB Cloud](https://www.pingcap.com/tidb-c

Private Connect leverages Private Link or Private Service Connect technologies from cloud providers to enable changefeeds in the TiDB Cloud VPC to connect to Kafka in customers' VPCs using private IP addresses, as if those Kafka clusters were hosted directly within the TiDB Cloud VPC. This feature helps prevent VPC CIDR conflicts and meets security compliance requirements.

- For Apache Kafka in AWS, follow the instructions in [Set Up Self-Hosted Kafka Private Link Service in AWS](/tidb-cloud/setup-self-hosted-kafka-private-link-service.md) to configure the network connection.
- For Apache Kafka in AWS, follow the instructions in [Set Up Self-Hosted Kafka Private Link Service in AWS](/tidb-cloud/setup-aws-self-hosted-kafka-private-link-service.md) to configure the network connection.

- For Apache Kafka in Google Cloud, follow the instructions in [Set Up Self-Hosted Kafka Private Service Connect in Google Cloud](/tidb-cloud/setup-self-hosted-kafka-private-service-connect.md) to configure the network connection.
- For Apache Kafka in Google Cloud, follow the instructions in [Set Up Self-Hosted Kafka Private Service Connect in Google Cloud](/tidb-cloud/setup-gcp-self-hosted-kafka-private-service-connect.md) to configure the network connection.

Note that using this feature incurs additional [Private Data Link costs](/tidb-cloud/tidb-cloud-billing-ticdc-rcu.md#private-data-link-cost).
