DEVX-777: bump version to 5.2.0 (confluentinc#135)
ybyzek authored Mar 25, 2019
1 parent 1d0b98e commit 3d3e505
Showing 23 changed files with 34 additions and 34 deletions.
6 changes: 3 additions & 3 deletions README.md
@@ -32,9 +32,9 @@ cp-demo also comes with a playbook and video series, and is a great configuratio
 | [Hybrid cloud](ccloud/README.md) | [Y](ccloud/README.md) | [Y](ccloud/README.md) | Confluent Cloud | End-to-end demo of a hybrid Kafka Cluster between [Confluent Cloud](https://www.confluent.io/confluent-cloud/) and on-prem using Confluent Replicator
 | [KSQL UDF](https://github.com/confluentinc/demo-scene/blob/master/ksql-udf-advanced-example/README.md) | [Y](https://github.com/confluentinc/demo-scene/blob/master/ksql-udf-advanced-example/README.md) | N | Stream Processing | Advanced KSQL [UDF](https://www.confluent.io/blog/build-udf-udaf-ksql-5-0) use case for connected cars
 | [KSQL workshop](https://github.com/confluentinc/demo-scene/blob/master/ksql-workshop/) | N | [Y](https://github.com/confluentinc/demo-scene/blob/master/ksql-workshop/) | Stream Processing | showcases Kafka stream processing using KSQL and can run self-guided as a KSQL workshop
-| [Microservices ecosystem](microservices-orders/README.md) | [Y](microservices-orders/README.md) | N | Stream Processing | [Microservices Orders Demo Application](https://github.com/confluentinc/kafka-streams-examples/tree/5.1.0-post/src/main/java/io/confluent/examples/streams/microservices) integrated into the Confluent Platform
+| [Microservices ecosystem](microservices-orders/README.md) | [Y](microservices-orders/README.md) | N | Stream Processing | [Microservices Orders Demo Application](https://github.com/confluentinc/kafka-streams-examples/tree/5.2.0-post/src/main/java/io/confluent/examples/streams/microservices) integrated into the Confluent Platform
 | [MQTT](https://github.com/confluentinc/demo-scene/blob/master/mqtt-connect-connector-demo/README.md) | [Y](https://github.com/confluentinc/demo-scene/blob/master/mqtt-connect-connector-demo/README.md) | N | Data Pipeline | Internet of Things (IoT) integration example using Apache Kafka + Kafka Connect + MQTT Connector + Sensor Data
-| [Multi datacenter](https://github.com/confluentinc/cp-docker-images/tree/5.1.1-post/examples/multi-datacenter) | N | [Y](https://github.com/confluentinc/cp-docker-images/tree/5.1.1-post/examples/multi-datacenter) | Confluent Platform | This demo deploys an active-active multi-datacenter design, with two instances of Confluent Replicator copying data bidirectionally between the datacenters
+| [Multi datacenter](https://github.com/confluentinc/cp-docker-images/tree/5.2.0-post/examples/multi-datacenter) | N | [Y](https://github.com/confluentinc/cp-docker-images/tree/5.2.0-post/examples/multi-datacenter) | Confluent Platform | This demo deploys an active-active multi-datacenter design, with two instances of Confluent Replicator copying data bidirectionally between the datacenters
 | [Music demo](music/README.md) | [Y](music/README.md) | [Y](music/README.md) | Stream Processing | KSQL version of the [Kafka Streams Demo Application](https://docs.confluent.io/current/streams/kafka-streams-examples/docs/index.html)
 | [MySQL and Debezium](mysql-debezium/README.md) | [Y](mysql-debezium/README.md) | N | Data Pipelines | End-to-end streaming ETL with KSQL for stream processing using the [Debezium Connector for MySQL](http://debezium.io/docs/connectors/mysql/)
 | [Oracle, KSQL, Elasticsearch](https://github.com/confluentinc/demo-scene/blob/master/oracle-ksql-elasticsearch/oracle-ksql-elasticsearch-docker.adoc) | N | Y | Data Pipelines | Stream data from Oracle, enrich and filter with KSQL, and then stream into Elasticsearch
@@ -43,7 +43,7 @@ cp-demo also comes with a playbook and video series, and is a great configuratio
 
 For local installs:
 
-* [Confluent Platform 5.1](https://www.confluent.io/download/)
+* [Confluent Platform 5.2](https://www.confluent.io/download/)
 * Env var `CONFLUENT_HOME=/path/to/confluentplatform`
 * Env var `PATH` includes `$CONFLUENT_HOME/bin`
 * Each demo has its own set of prerequisites as well, documented in each demo's README
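The two env-var prerequisites in the list above amount to a couple of lines of shell setup. A minimal sketch (the install path is a placeholder; point it at your actual Confluent Platform directory):

```shell
# Minimal sketch of the local-install prerequisites above.
# /path/to/confluentplatform is a placeholder, as in the README itself.
export CONFLUENT_HOME=/path/to/confluentplatform
export PATH="$CONFLUENT_HOME/bin:$PATH"
```

Putting `$CONFLUENT_HOME/bin` first on `PATH` ensures the demos pick up this install's CLI tools over any others on the machine.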
2 changes: 1 addition & 1 deletion ccloud/README.md
@@ -50,7 +50,7 @@ It also includes a [script](ccloud-generate-cp-configs.sh) that reads the Conflu
 As with the other demos in this repo, you may run the entire demo end-to-end with `./start.sh`, and it runs on your local Confluent Platform install. This requires the following:
 
 * [Common demo prerequisites](https://github.com/confluentinc/examples#prerequisites)
-* [Confluent Platform 5.1](https://www.confluent.io/download/)
+* [Confluent Platform 5.2](https://www.confluent.io/download/)
 * [Confluent Cloud CLI](https://docs.confluent.io/current/cloud-quickstart.html#step-2-install-ccloud-cli)
 * [An initialized Confluent Cloud cluster used for development only](https://confluent.cloud)
 * Maven to compile the data generator, i.e. the `KafkaMusicExampleDriver` class
4 changes: 2 additions & 2 deletions ccloud/docker-compose.yml
@@ -210,7 +210,7 @@ services:
 CONNECT_INTERNAL_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
 CONNECT_INTERNAL_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
 CONNECT_PLUGIN_PATH: "/usr/share/java,/usr/share/confluent-hub-components"
-CLASSPATH: /usr/share/java/monitoring-interceptors/monitoring-interceptors-5.1.2.jar
+CLASSPATH: /usr/share/java/monitoring-interceptors/monitoring-interceptors-5.2.0.jar
 CONNECT_LOG4J_LOGGERS: org.apache.zookeeper=ERROR,org.I0Itec.zkclient=ERROR,org.reflections=ERROR
 # Connect producer
 CONNECT_PRODUCER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor"
@@ -264,7 +264,7 @@ services:
 CONNECT_PLUGIN_PATH: "/usr/share/java,/usr/share/confluent-hub-components"
 CONNECT_LOG4J_ROOT_LOGLEVEL: INFO
 CONNECT_LOG4J_LOGGERS: org.reflections=ERROR
-CLASSPATH: /usr/share/java/monitoring-interceptors/monitoring-interceptors-5.1.2.jar
+CLASSPATH: /usr/share/java/monitoring-interceptors/monitoring-interceptors-5.2.0.jar
 CONNECT_REQUEST_TIMEOUT_MS: 20000
 CONNECT_RETRY_BACKOFF_MS: 500
 # Connect worker
2 changes: 1 addition & 1 deletion ccloud/docs/index.rst
@@ -36,7 +36,7 @@ Run demo
 
 **Demo validated with:**
 
-- Confluent Platform 5.1
+- Confluent Platform 5.2
 - |ccloud|
 - |ccloud| CLI
 - Java version 1.8.0_162
2 changes: 1 addition & 1 deletion ccloud/start.sh
@@ -5,7 +5,7 @@
 
 check_env || exit 1
 check_jq || exit 1
-check_running_cp 5.1 || exit 1
+check_running_cp 5.2 || exit 1
 check_ccloud || exit 1
 
 if ! is_ce ; then
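Every `start.sh` in this bump guards on the installed CP version via `check_running_cp`, a helper sourced from `utils/helper.sh` that does not appear in this diff. As a hypothetical sketch only (the function name matches, but the body and the `CONFLUENT_VERSION` variable here are illustrative assumptions, not the repo's actual implementation), such a guard might compare major.minor versions like this:

```shell
# Hypothetical sketch of a version guard like check_running_cp.
# Assumption: the real helper in utils/helper.sh differs; here the installed
# version is read from an illustrative CONFLUENT_VERSION variable.
check_running_cp() {
  expected="$1"                     # e.g. "5.2" (major.minor)
  actual="${CONFLUENT_VERSION:-}"   # e.g. "5.2.0"
  if [ "${actual%.*}" != "$expected" ]; then
    echo "This demo requires Confluent Platform $expected (found: ${actual:-none})" >&2
    return 1
  fi
}

CONFLUENT_VERSION=5.2.0 check_running_cp 5.2 && echo "version check passed"
# prints "version check passed"
```

Gating every script this way is what makes a release bump like this one touch so many files: each `check_running_cp 5.1` call must be updated to `5.2` or the demos refuse to start.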
2 changes: 1 addition & 1 deletion clickstream/README.md
@@ -9,7 +9,7 @@ You can [run it using Docker](https://docs.confluent.io/current/ksql/docs/tutori
 ## Prerequisites
 
 * [Common demo prerequisites](https://github.com/confluentinc/examples#prerequisites)
-* [Confluent Platform 5.1](https://www.confluent.io/download/)
+* [Confluent Platform 5.2](https://www.confluent.io/download/)
 * `jq` installed on your machine
 * [Elasticsearch 5.6.5](https://www.elastic.co/downloads/past-releases/elasticsearch-5-6-5) to export data from Kafka
 * If you do not want to use Elasticsearch, comment out ``check_running_elasticsearch`` in the ``start.sh`` script
2 changes: 1 addition & 1 deletion clickstream/start.sh
@@ -5,7 +5,7 @@
 
 check_env || exit 1
 check_jq || exit 1
-check_running_cp 5.1 || exit 1
+check_running_cp 5.2 || exit 1
 check_running_elasticsearch 5.6.5 || exit 1
 check_running_grafana 5.0.3 || exit 1
 
2 changes: 1 addition & 1 deletion clients/README.md
@@ -19,7 +19,7 @@
 # Prerequisites
 
 * [Common demo prerequisites](https://github.com/confluentinc/examples#prerequisites)
-* [Confluent Platform 5.1](https://www.confluent.io/download/)
+* [Confluent Platform 5.2](https://www.confluent.io/download/)
 * [Confluent Cloud CLI](https://docs.confluent.io/current/cloud-quickstart.html#step-2-install-ccloud-cli)
 * [An initialized Confluent Cloud cluster used for development only](https://confluent.cloud)
 * Maven to compile the Java code
6 changes: 3 additions & 3 deletions clients/avro/pom.xml
@@ -7,11 +7,11 @@ http://maven.apache.org/xsd/maven-4.0.0.xsd">
 <parent>
 <groupId>io.confluent</groupId>
 <artifactId>rest-utils-parent</artifactId>
-<version>5.1.2</version>
+<version>5.2.0</version>
 </parent>
 
 <artifactId>java-client-avro-examples</artifactId>
-<version>5.1.2</version>
+<version>5.2.0</version>
 
 <packaging>jar</packaging>
 <properties>
@@ -99,7 +99,7 @@ http://maven.apache.org/xsd/maven-4.0.0.xsd">
 <plugin>
 <groupId>io.confluent</groupId>
 <artifactId>kafka-schema-registry-maven-plugin</artifactId>
-<version>5.1.2</version>
+<version>5.2.0</version>
 <configuration>
 <schemaRegistryUrls>
 <param>http://localhost:8081</param>
4 changes: 2 additions & 2 deletions clients/cloud/java/pom.xml
@@ -7,12 +7,12 @@
 <parent>
 <groupId>io.confluent</groupId>
 <artifactId>rest-utils-parent</artifactId>
-<version>5.1.2</version>
+<version>5.2.0</version>
 </parent>
 
 <artifactId>clients-example</artifactId>
 <packaging>jar</packaging>
-<version>5.1.2</version>
+<version>5.2.0</version>
 
 <organization>
 <name>Confluent, Inc.</name>
2 changes: 1 addition & 1 deletion connect-streams-pipeline/README.md
@@ -16,7 +16,7 @@ For more information, please read [this blogpost](https://www.confluent.io/blog/
 # Prerequisites
 
 * [Common demo prerequisites](https://github.com/confluentinc/examples#prerequisites)
-* [Confluent Platform 5.1](https://www.confluent.io/download/)
+* [Confluent Platform 5.2](https://www.confluent.io/download/)
 * Maven command `mvn` to compile Java code
 * By default the `timeout` command is available on most Linux distributions but not Mac OS. This `timeout` command is used by the bash scripts to terminate consumer processes after a period of time. To install it on a Mac:
 
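The macOS install snippet itself is truncated by this diff view, so it is not reproduced here. As a hedged illustration of what the scripts rely on: `timeout` bounds a command's runtime and, in the GNU coreutils implementation, exits with status 124 when it kills the command at the deadline (on macOS, GNU coreutils is commonly installed via Homebrew, which names the command `gtimeout`).

```shell
# Illustration only -- the repo's actual macOS install step is truncated in this diff.
# Assumption: GNU coreutils timeout semantics (exit status 124 on a hit deadline).
timeout 2 sleep 5
echo "exit code: $?"   # GNU timeout reports 124 when the deadline was hit
```

This is why the demos' consumer processes reliably terminate instead of blocking the scripts forever.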
6 changes: 3 additions & 3 deletions connect-streams-pipeline/pom.xml
@@ -7,11 +7,11 @@ http://maven.apache.org/xsd/maven-4.0.0.xsd">
 <parent>
 <groupId>io.confluent</groupId>
 <artifactId>rest-utils-parent</artifactId>
-<version>5.1.2</version>
+<version>5.2.0</version>
 </parent>
 
 <artifactId>connect-streams-examples</artifactId>
-<version>5.1.2</version>
+<version>5.2.0</version>
 <packaging>jar</packaging>
 <repositories>
 <repository>
@@ -21,7 +21,7 @@ http://maven.apache.org/xsd/maven-4.0.0.xsd">
 </repositories>
 <properties>
 <!-- Keep versions as properties to allow easy modification -->
-<licenses.version>5.1.0</licenses.version>
+<licenses.version>5.2.0</licenses.version>
 <avro.version>1.8.2</avro.version>
 <!-- Maven properties for compilation -->
 <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
2 changes: 1 addition & 1 deletion connect-streams-pipeline/start.sh
@@ -5,7 +5,7 @@
 
 check_env || exit 1
 check_mvn || exit 1
-check_running_cp 5.1 || exit
+check_running_cp 5.2 || exit
 
 ./stop.sh
 
2 changes: 1 addition & 1 deletion microservices-orders/docker-compose.yml
@@ -78,7 +78,7 @@ services:
 CONNECT_INTERNAL_KEY_CONVERTER: org.apache.kafka.connect.json.JsonConverter
 CONNECT_INTERNAL_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter
 CONNECT_ZOOKEEPER_CONNECT: 'zookeeper:2181'
-CLASSPATH: /usr/share/java/monitoring-interceptors/monitoring-interceptors-5.1.2.jar
+CLASSPATH: /usr/share/java/monitoring-interceptors/monitoring-interceptors-5.2.0.jar
 CONNECT_PRODUCER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor"
 CONNECT_CONSUMER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringConsumerInterceptor"
 CONNECT_PLUGIN_PATH: /usr/share/java
2 changes: 1 addition & 1 deletion microservices-orders/start.sh
@@ -9,7 +9,7 @@ check_jot || exit 1
 check_netstat || exit 1
 check_running_elasticsearch 5.6.5 || exit 1
 check_running_kibana || exit 1
-check_running_cp 5.1 || exit 1
+check_running_cp 5.2 || exit 1
 
 ./stop.sh
 
2 changes: 1 addition & 1 deletion music/README.md
@@ -16,7 +16,7 @@ Follow along with the video [Demo: Build a Streaming Application with KSQL](http
 
 As with the other demos in this repo, you may run the entire demo end-to-end with `./start.sh`, and it runs on your local Confluent Platform install. This requires the following:
 
-* [Confluent Platform 5.1](https://www.confluent.io/download/)
+* [Confluent Platform 5.2](https://www.confluent.io/download/)
 * Java 1.8 to compile the data generator, i.e. the `KafkaMusicExampleDriver` class
 * Maven to compile the data generator, i.e. the `KafkaMusicExampleDriver` class
 
8 changes: 4 additions & 4 deletions music/start.sh
@@ -5,17 +5,17 @@
 
 check_env || exit 1
 check_mvn || exit 1
-check_running_cp 5.1 || exit
+check_running_cp 5.2 || exit
 
 ./stop.sh
 
 echo "auto.offset.reset=earliest" >> $CONFLUENT_HOME/etc/ksql/ksql-server.properties
 confluent start
 
 [[ -d "kafka-streams-examples" ]] || git clone https://github.com/confluentinc/kafka-streams-examples.git
-(cd kafka-streams-examples && git checkout 5.1.2-post)
-[[ -f "kafka-streams-examples/target/kafka-streams-examples-5.1.2-standalone.jar" ]] || (cd kafka-streams-examples && mvn clean package -DskipTests)
-java -cp kafka-streams-examples/target/kafka-streams-examples-5.1.2-standalone.jar io.confluent.examples.streams.interactivequeries.kafkamusic.KafkaMusicExampleDriver &>/dev/null &
+(cd kafka-streams-examples && git checkout 5.2.0-post)
+[[ -f "kafka-streams-examples/target/kafka-streams-examples-5.2.0-standalone.jar" ]] || (cd kafka-streams-examples && mvn clean package -DskipTests)
+java -cp kafka-streams-examples/target/kafka-streams-examples-5.2.0-standalone.jar io.confluent.examples.streams.interactivequeries.kafkamusic.KafkaMusicExampleDriver &>/dev/null &
 
 sleep 5
 
2 changes: 1 addition & 1 deletion mysql-debezium/README.md
@@ -10,7 +10,7 @@ The MySQL Debezium demo shows an end-to-end streaming ETL with KSQL for stream p
 
 
 * [Common demo prerequisites](https://github.com/confluentinc/examples#prerequisites)
-* [Confluent Platform 5.1](https://www.confluent.io/download/)
+* [Confluent Platform 5.2](https://www.confluent.io/download/)
 * MySQL
 * [Binary log should be enabled](http://debezium.io/docs/connectors/mysql/)
 * [Elasticsearch 5.6.5](https://www.elastic.co/downloads/past-releases/elasticsearch-5-6-5) to export data from Kafka
2 changes: 1 addition & 1 deletion mysql-debezium/start.sh
@@ -4,7 +4,7 @@
 . ../utils/helper.sh
 
 check_env || exit 1
-check_running_cp 5.1 || exit
+check_running_cp 5.2 || exit
 check_mysql || exit
 check_running_elasticsearch 5.6.5 || exit 1
 check_running_kibana || exit 1
2 changes: 1 addition & 1 deletion pageviews/README.md
@@ -7,7 +7,7 @@ The Pageviews demo is the automated version of the [Confluent Platform Quickstar
 # Prerequisites
 
 * [Common demo prerequisites](https://github.com/confluentinc/examples#prerequisites)
-* [Confluent Platform 5.1](https://www.confluent.io/download/)
+* [Confluent Platform 5.2](https://www.confluent.io/download/)
 
 # What Should I see?
 
2 changes: 1 addition & 1 deletion pageviews/start.sh
@@ -4,7 +4,7 @@
 . ../utils/helper.sh
 
 check_env || exit 1
-check_running_cp 5.1 || exit
+check_running_cp 5.2 || exit
 
 ./stop.sh
 
2 changes: 1 addition & 1 deletion wikipedia/README.md
@@ -7,7 +7,7 @@ The Wikipedia demo is the non-Docker version of the [Confluent Platform Demo](ht
 # Prerequisites
 
 * [Common demo prerequisites](https://github.com/confluentinc/examples#prerequisites)
-* [Confluent Platform 5.1](https://www.confluent.io/download/)
+* [Confluent Platform 5.2](https://www.confluent.io/download/)
 * [Elasticsearch 5.6.5](https://www.elastic.co/downloads/past-releases/elasticsearch-5-6-5) to export data from Kafka
 * If you do not want to use Elasticsearch, comment out ``check_running_elasticsearch`` in the ``start.sh`` script
 * [Kibana 5.5.2](https://www.elastic.co/downloads/past-releases/kibana-5-5-2) to visualize data
2 changes: 1 addition & 1 deletion wikipedia/start.sh
@@ -4,7 +4,7 @@
 . ../utils/helper.sh
 
 check_env || exit 1
-check_running_cp 5.1 || exit
+check_running_cp 5.2 || exit
 check_running_elasticsearch 5.6.5 || exit 1
 check_running_kibana || exit 1
 
