
Commit 4991dd3 ("datastreaming"), 1 parent: 5ea52ee

File tree: 2 files changed (+41, -8 lines)

doc/specific_iocs/dae/Datastreaming.md

Lines changed: 36 additions & 8 deletions
````diff
@@ -23,30 +23,58 @@ Part of our in-kind contribution to datastreaming is to test the system in produ
 
 ![](ISISDSLayout.png)
 
-## The Kafka Clusters
-
-There are two Kafka clusters, production (`livedata.isis.cclrc.ac.uk:9092`) and development (`tenten.isis.cclrc.ac.uk:9092`, `sakura.isis.cclrc.ac.uk:9092` or `hinata.isis.cclrc.ac.uk:9092`). The development cluster is set up to auto-create topics, so when new developer machines are run up all the required topics will be created. However, the production server does not auto-create topics; this means that when a new real instrument comes online the corresponding topics must be created on this cluster, which is done as part of the install script. Credentials for both clusters can be found in the Keeper shared folder.
-
-### Grafana dashboard
-
-A Grafana dashboard for the production cluster can be found at `madara.isis.cclrc.ac.uk:3000`. This shows the topic data rate and other useful information. Admin credentials can also be found in the SharePoint.
+## The Kafka Cluster
+
+There is a Kafka cluster at `livedata.isis.cclrc.ac.uk`. Port 9092 is used for the primary Kafka broker. A web interface is available on port 8080.
+
+The production server does not auto-create topics; this means that when a new real instrument comes online the corresponding topics must be created on this cluster, which is done as part of the install script.
+
+Credentials for the cluster can be found in Keeper, under `ds streaming container user`. The machine is reachable by SSH with these credentials.
 
 ### Deployment
 
-Deployment involves the use of Ansible playbooks; the playbooks and instructions for using them can be found [here](https://github.com/ISISComputingGroup/ansible-kafka-centos).
+Deployment is currently onto a machine running in the SCD cloud. Deployment instructions can be found [in the `ds-containers` repository](https://github.com/isiscomputinggroup/ds-containers).
 
 ## Neutron Data
 
-The ICP on any instrument that is running in full event mode and with a DAE3 is streaming neutron events into Kafka.
+The ICP on any instrument that is running in full event mode and with a DAE3 may stream neutron events into Kafka.
+
+This is controlled using flags in the `isisicp.properties` file:
+
+```
+isisicp.kafkastream = true
+# if not specified, topicprefix will default to instrument name in code
+isisicp.kafkastream.topicprefix =
+isisicp.kafkastream.broker = livedata.isis.cclrc.ac.uk:9092
+isisicp.kafkastream.topic.suffix.runinfo = _runInfo
+isisicp.kafkastream.topic.suffix.sampleenv = _sampleEnv
+isisicp.kafkastream.topic.suffix.alarms = _alarms
+```
 
 ## SE Data
 
 See [Forwarding Sample Environment](datastreaming/Datastreaming---Sample-Environment)
 
 ## Filewriting
 
 See [File writing](datastreaming/Datastreaming---File-writing)
 
 ## System Tests
 
 Currently system tests are being run to confirm that the start/stop run and event data messages are being sent into Kafka and that a NeXus file is being written with these events. The Kafka cluster and filewriter are run in Docker containers for these tests and so they must be run on a Windows 10 machine. To run these tests you will need to [install Docker for Windows and add yourself as a docker-user](https://docs.docker.com/docker-for-windows/install/#install-docker-desktop-on-windows).
 
 ## The future of streaming at ISIS
 
 After the in-kind work finishes and during the handover, there are some proposed changes that affect the layout and integration of data streaming at ISIS. This diagram is subject to change, but shows a brief overview of what the future system might look like:
 
 ![](FUTUREISISDSLayout.png)
````
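The `isisicp.kafkastream` settings added above derive each instrument's topic names from a prefix (defaulting to the instrument name when `topicprefix` is left empty) plus fixed suffixes. A minimal sketch of that naming scheme in Python; the helper function and the instrument name are illustrative, not part of the ICP:

```python
def kafka_topics(instrument, topicprefix=None,
                 suffixes=("_runInfo", "_sampleEnv", "_alarms")):
    """Derive an instrument's Kafka topic names.

    Mirrors the isisicp.properties scheme: when
    isisicp.kafkastream.topicprefix is empty, the instrument
    name itself is used as the prefix.
    """
    prefix = topicprefix or instrument
    return [prefix + suffix for suffix in suffixes]


# "LARMOR" is a hypothetical example instrument name:
print(kafka_topics("LARMOR"))
# → ['LARMOR_runInfo', 'LARMOR_sampleEnv', 'LARMOR_alarms']
```

Since the production cluster does not auto-create topics, each of these names would need creating by the install script before the ICP can stream to them.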

doc/specific_iocs/dae/datastreaming/Datastreaming-How-To.md

Lines changed: 5 additions & 0 deletions
```diff
@@ -4,6 +4,11 @@ This is a guide for basic operations using either the development or production
 
 Note that there are many ways to do the following; what is written here is the way commonly done at ISIS on our development and production clusters. Something like `kafka-tool` is a nice GUI that will list topics, brokers, etc. and create or delete topics. You may have more luck running things like `kafkacat`, `kafkacow` or any of the official Kafka scripts under the [Windows Subsystem for Linux](https://docs.microsoft.com/en-gb/windows/wsl/install-win10).
 
+## Configure a new datastreaming cluster
+
+There are instructions for how to set up a new data streaming Kafka/Redpanda cluster [in the `ds-containers` repository](https://github.com/isiscomputinggroup/ds-containers).
+
 ## Topic operations
 
 Pushing to one broker does not necessarily mean that the other brokers in the cluster receive the data and replicate it, so use with caution. If you need to create a topic that is replicated across all of the brokers you should probably follow [this guide](https://coralogix.com/blog/create-kafka-topics-in-3-easy-steps/) by `ssh`-ing onto the actual server machines themselves.
```
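Topic creation itself is done with the standard Kafka tooling once on the server. A sketch of assembling the usual `kafka-topics.sh` invocation; the partition and replication values are illustrative assumptions, not ISIS policy:

```python
def create_topic_cmd(topic, partitions=1, replication_factor=1,
                     bootstrap="livedata.isis.cclrc.ac.uk:9092"):
    """Build a standard kafka-topics.sh command to create a topic.

    The replication factor controls how many brokers hold a copy
    of each partition; the defaults here are illustrative only.
    """
    return ["kafka-topics.sh", "--create",
            "--bootstrap-server", bootstrap,
            "--topic", topic,
            "--partitions", str(partitions),
            "--replication-factor", str(replication_factor)]


# e.g. for a hypothetical instrument's run info topic:
print(" ".join(create_topic_cmd("LARMOR_runInfo")))
```

On the server the list could be passed straight to `subprocess.run`, or the printed command run directly in a shell.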
