
Producer Simulator API

Project to simulate IoT sensors sending events to a Kafka cluster.

Technologies

  • Java 11
  • Gradle
  • Apache Kafka
  • Docker / Docker Compose
  • Kafdrop
  • OpenAPI

Getting Started

Requirements

  • Docker
    • Needed if you want to run Kafka with Docker. If you use the Kafka scripts instead, you don't need it.
  • Java 11
    • JAVA_HOME must be set if you want to execute via Gradle (see the sketch after this list).
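
A minimal sketch of setting JAVA_HOME before building; the JDK path below is a placeholder, so point it at your local Java 11 install:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk   # placeholder path; adjust for your system
./gradlew build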

Building

At the project root folder, execute:

./gradlew build

Only tests

At the project root folder, execute:

./gradlew test

Jacoco

To generate the test coverage report, execute:

./gradlew test jacocoTestReport

Path: ./build/reports/jacoco/jacocoRootReport/html/index.html

Running

  1. Install Docker: https://docs.docker.com/engine/install/

  2. Run the Docker Compose file:

    docker-compose -f "docker-compose.yml" up --build -d

    It will start:

      • 3 Kafka brokers (kafka_0, kafka_1, kafka_2)
      • Kafdrop
      • The IoT Producer API

  3. The IoT Producer starts on localhost:8080, both with Docker and with the default profile.

    1. Check out the OpenAPI spec to learn about the endpoints and how to execute them.

    📝 It also comes with Kafdrop, a web UI for viewing Kafka topics and browsing consumer groups. You can access it via browser: http://localhost:9000.
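
To check that everything is up, the standard Docker Compose CLI is enough. A quick sketch (the kafka_0 service name is taken from the Kafka section below):

docker-compose ps                # all services should report "Up"
docker-compose logs -f kafka_0   # follow one broker's logs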

Documentation

OpenAPI

The endpoints are documented using OpenAPI; the spec can be found at docs/api/openapi.yml

More about OpenAPI: https://www.openapis.org/.
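
If you prefer browsing the spec in a UI rather than as raw YAML, one option is the public swaggerapi/swagger-ui Docker image; this is just a sketch (the image and the 8081 port mapping are not part of this project):

docker run --rm -p 8081:8080 \
  -e SWAGGER_JSON=/spec/openapi.yml \
  -v "$(pwd)/docs/api:/spec" \
  swaggerapi/swagger-ui

Then open http://localhost:8081 in your browser.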

Flows

To generate sensor events there are two flows: via a single sensor or via a sensor cluster.

Single sensor

The Single sensor endpoint provides the capability to set up a single sensor (with all the information provided by you) and then send as many requests as you want.

For example, let's say we have the sensor Living Room Temp and we want to simulate TEMPERATURE events every 5 seconds for 10 minutes. This is what the request would look like:

curl --location --request POST 'http://localhost:8080/producer-api/events' \
--header 'Content-Type: application/json' \
--data-raw '[
    {
        "total": 120,
        "type": "TEMPERATURE",
        "heartBeat": 5,
        "id": 1,
        "name": "Living Room Temp",
        "clusterId": "1"
    }
]'

📝 The id is a unique identifier of this sensor and is mandatory.

📝 The clusterId identifies this sensor as belonging to a cluster. Later you can query all the sensors in the same cluster. This field is optional.

📝 The total is how many events you want. In our example, we wanted it running for 10 minutes, so: 120 × 5 seconds = 10 minutes

📝 The heartBeat is the interval, in seconds, between each event.

You can also send more than one sensor at a time:

curl --location --request POST 'http://localhost:8080/producer-api/events' \
--header 'Content-Type: application/json' \
--data-raw '[
    {
        "total": 120,
        "type": "TEMPERATURE",
        "heartBeat": 5,
        "id": 1,
        "name": "Living Room Temp",
        "clusterId": "1"
    },
    {
        "total": 10,
        "type": "HUMIDITY",
        "heartBeat": 10,
        "id": 2,
        "name": "Living Room Humidity",
        "clusterId": "1"
    }
]'

For more info about the endpoint, see openapi.yml

Sensor cluster

The sensor cluster flow simulates several sensors sending events at the same time.

Taking the same example as before, let's say we now have 100 temperature sensors, and we still want to simulate the same thing: TEMPERATURE events every 5 seconds for 10 minutes.

The request would look like:

curl --location --request POST 'http://localhost:8080/producer-api/clusters' \
--header 'Content-Type: application/json' \
--data-raw '[
    {
        "total": 120,
        "type": "TEMPERATURE",
        "heartBeat": 5,
        "clusterSize": 100,
        "clusterId": 1
    }
]'

📝 clusterSize is how many sensors you want running in parallel.

📝 Note that we don't provide the id here; the server will randomly generate one for each sensor. In this case it is easier to query the sensor events by clusterId.

Here we can also send more than one type:

curl --location --request POST 'http://localhost:8080/producer-api/clusters' \
--header 'Content-Type: application/json' \
--data-raw '[
    {
        "total": 200,
        "type": "HUMIDITY",
        "heartBeat": 3,
        "clusterSize": 10,
        "clusterId": 1
    },
    {
        "total": 120,
        "type": "TEMPERATURE",
        "heartBeat": 5,
        "clusterSize": 100,
        "clusterId": 1
    }
]'

For more info about the endpoint, see openapi.yml

Kafka

To distribute the sensor data we use Kafka streams. When using docker-compose.yml, 3 Kafka brokers are started:

  • kafka_0
  • kafka_1
  • kafka_2

When connecting from outside the Docker network, use localhost plus the instance port: 9092, 9094, or 9095.

When connecting from within the Docker network, use the instance name plus port: kafka_0:29092, kafka_1:29093, or kafka_2:29094. More info: Kafka Listeners - Explained
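
For example, a quick way to inspect the iot-data topic (described below) from outside Docker is Kafka's console consumer; the script name and location depend on your Kafka installation, so treat this as a sketch:

kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic iot-data \
  --from-beginning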

All the sensor events are forwarded to the Kafka topic iot-data and they have the following structure:

Field      Type            Description                           Mandatory
id         Long            Sensor ID                             Y
value      BigDecimal      Sensor measured value                 Y
timestamp  OffsetDateTime  Event timestamp                       Y
type       String          Sensor type                           Y
name       String          Sensor name                           Y
clusterId  Long            Cluster to which this sensor belongs  N
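
For illustration, an event matching this structure could look like the following (the values are invented; the actual payload is produced by the API):

{
    "id": 1,
    "value": 21.7,
    "timestamp": "2024-05-01T12:00:00Z",
    "type": "TEMPERATURE",
    "name": "Living Room Temp",
    "clusterId": 1
}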
