Project to simulate IoT sensors sending events to a Kafka cluster.
- Spring Webflux - 2.5.5
- Mapstruct - 1.4.2.Final
- Immutables - 2.8.8
- Gradle - 7.2
- Java 11
- Kafka
- Docker
  - Only needed in case you want to run Kafka with Docker. If you want to use the Kafka scripts instead, you don't need it (see the sketch after this list).
- Java 11
  - JAVA_HOME needs to be set in case you want to execute via Gradle.
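If you go the Kafka scripts route instead of Docker, here is a minimal sketch of starting a single local broker with the scripts that ship with a Kafka download (paths are illustrative and assume a ZooKeeper-based Kafka release; the broker addresses the producer expects are described in the Kafka section below):

```shell
# Run from the root of a Kafka installation (illustrative paths).
# Start ZooKeeper first, then the broker in a second terminal.
bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties
```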
To build, execute at the project root folder:

```shell
./gradlew build
```

To run the tests, execute at the project root folder:

```shell
./gradlew test
```

To also generate the test coverage report:

```shell
./gradlew test jacocoTestReport
```

Path: ./build/reports/jacoco/jacocoRootReport/html/index.html
- Install Docker: https://docs.docker.com/engine/install/
  - On Linux, Docker Compose needs to be installed separately: https://docs.docker.com/compose/install/
- Run the docker compose file:

```shell
docker-compose -f "docker-compose.yml" up --build -d
```

It will start:

- The IoT Producer starts on localhost:8080 with Docker and on localhost:8080 when using the default profile (see the sketch below for running it locally).
- Check out the OpenApi to know about the endpoints and how to execute them.
📝 It also comes with Kafdrop, which is a web UI for viewing Kafka topics and browsing consumer groups. You can access it via browser: http://localhost:9000.
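To run the producer with the default profile (outside Docker), and assuming the standard Spring Boot Gradle plugin is applied (an assumption; the run task is not shown in this README), something like this should start it on localhost:8080:

```shell
# Assumes the Spring Boot Gradle plugin, which provides the bootRun task.
./gradlew bootRun
```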
The endpoints are documented using OpenApi and can be found at docs/api/openapi.yml
More about OpenAPI: https://www.openapis.org/.
To generate the sensor events we have two flows: via a single sensor or via a sensor cluster.
The single sensor endpoint provides the capability to set up a single sensor (with all the information provided by you) and then send as many requests as you want.
For example, let's say we have the sensor Living Room Temp and we want to simulate the events for TEMPERATURE every 5 seconds for 10 minutes. This is what the request would look like:
```shell
curl --location --request POST 'http://localhost:8080/producer-api/events' \
--header 'Content-Type: application/json' \
--data-raw '[
    {
        "total": 120,
        "type": "TEMPERATURE",
        "heartBeat": 5,
        "id": 1,
        "name": "Living Room Temp",
        "clusterId": "1"
    }
]'
```

📝 The id is a unique identifier of this sensor and is mandatory.
📝 The clusterId identifies this sensor as belonging to a cluster; later you can query all the sensors in the same cluster. This field is optional.
📝 The total is how many events you want. In our example, we wanted it running for 10 minutes, so: 120 × 5 seconds = 10 minutes.
📝 The heartBeat is the interval, in seconds, between each event.
You can also send more than one sensor at a time:
```shell
curl --location --request POST 'http://localhost:8080/producer-api/events' \
--header 'Content-Type: application/json' \
--data-raw '[
    {
        "total": 120,
        "type": "TEMPERATURE",
        "heartBeat": 5,
        "id": 1,
        "name": "Living Room Temp",
        "clusterId": "1"
    },
    {
        "total": 10,
        "type": "HUMIDITY",
        "heartBeat": 10,
        "id": 2,
        "name": "Living Room Humidity",
        "clusterId": "1"
    }
]'
```

For more info about the endpoint, see openapi.yml
The sensor cluster flow simulates several sensors sending events at the same time.
Taking the same example as before, let's say we now added 100 more sensors for temperature, and we still want to simulate the same thing: events for TEMPERATURE every 5 seconds for 10 minutes.
The request would look like:
```shell
curl --location --request POST 'http://localhost:8080/producer-api/clusters' \
--header 'Content-Type: application/json' \
--data-raw '[
    {
        "total": 120,
        "type": "TEMPERATURE",
        "heartBeat": 5,
        "clusterSize": 100,
        "clusterId": 1
    }
]'
```

📝 clusterSize is how many sensors you want running in parallel.
📝 Note that now we don't provide the id; the server is going to randomly generate one for each sensor. In this case it is easier to query the sensor events by clusterId.
Here we can also send more than one type:
```shell
curl --location --request POST 'http://localhost:8080/producer-api/clusters' \
--header 'Content-Type: application/json' \
--data-raw '[
    {
        "total": 200,
        "type": "HUMIDITY",
        "heartBeat": 3,
        "clusterSize": 10,
        "clusterId": 1
    },
    {
        "total": 120,
        "type": "TEMPERATURE",
        "heartBeat": 5,
        "clusterSize": 100,
        "clusterId": 1
    }
]'
```

For more info about the endpoint, see openapi.yml
To distribute the sensor data we are using Kafka streams. When using the docker-compose.yml, it starts 3 Kafka brokers:

- kafka_0
- kafka_1
- kafka_2

When connecting from outside the Docker network, use localhost plus the instance port: 9092, 9094 and 9095.
When connecting from within the Docker network, use the instance name plus port: kafka_0:29092, kafka_1:29093 and kafka_2:29094. More info: Kafka Listeners - Explained
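For example, assuming you have the Kafka CLI scripts available locally (an assumption; any Kafka client works the same way), you can check that the cluster is reachable from outside the Docker network with:

```shell
# Lists the topics on the cluster, connecting from outside the Docker network.
kafka-topics.sh --list --bootstrap-server localhost:9092,localhost:9094,localhost:9095
```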
All the sensor events are forwarded to the Kafka topic iot-data and they have the following structure:
| Field | Type | Description | Mandatory |
|---|---|---|---|
| id | Long | Sensor ID | Y |
| value | BigDecimal | Sensor measured value | Y |
| timestamp | OffsetDateTime | Event timestamp | Y |
| type | String | Sensor type | Y |
| name | String | Sensor name | Y |
| clusterId | Long | Cluster to which this sensor belongs | N |
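As a quick way to inspect these events, and again assuming the Kafka CLI scripts are available locally (an assumption), you could tail the topic from outside the Docker network:

```shell
# Reads the iot-data topic from the beginning, connecting from outside Docker.
kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic iot-data \
  --from-beginning
```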
