`doc/specific_iocs/dae/Datastreaming.md`:

Part of our in-kind contribution to datastreaming is to test the system in production.

![](ISISDSLayout.png)

## The Kafka Cluster

There is a Kafka cluster at `livedata.isis.cclrc.ac.uk`. Port 9092 is used for the primary Kafka broker. A web interface
is available on port 8080.

The production server does not auto-create topics; this means that when a new real instrument comes online, the
corresponding topics must be created on this cluster. This is done as part of the install script.

Credentials for the cluster can be found in Keeper, under `ds streaming container user`. The machine is reachable by
SSH with these credentials.

### Deployment

The system is currently deployed on a machine running in the SCD cloud. Deployment instructions can be found
[in the `ds-containers` repository](https://github.com/isiscomputinggroup/ds-containers).

## Neutron Data

The ICP on any instrument that is running in full event mode and with a DAE3 may stream neutron events into Kafka.

This is controlled using flags in the `isisicp.properties` file:

```
isisicp.kafkastream = true
# if not specified, topicprefix will default to instrument name in code
isisicp.kafkastream.topicprefix =
isisicp.kafkastream.broker = livedata.isis.cclrc.ac.uk:9092
isisicp.kafkastream.topic.suffix.runinfo = _runInfo
isisicp.kafkastream.topic.suffix.sampleenv = _sampleEnv
isisicp.kafkastream.topic.suffix.alarms = _alarms
```
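As a hedged illustration of how these settings combine (this sketch is not part of the real ICP), the topic names implied by the properties above can be derived as follows. The `parse_properties` helper and the example instrument name `LARMOR` are assumptions made for the sketch:

```python
# Sketch: derive the Kafka topic names implied by isisicp.properties-style
# settings. The parsing helper and the example instrument name ("LARMOR")
# are illustrative assumptions, not part of the real ICP.

def parse_properties(text):
    """Parse simple 'key = value' lines, ignoring blanks and # comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

def kafka_topics(props, instrument):
    """Return the topic names the stream would use for an instrument."""
    # An empty topicprefix falls back to the instrument name.
    prefix = props.get("isisicp.kafkastream.topicprefix") or instrument
    suffix_keys = [k for k in props if k.startswith("isisicp.kafkastream.topic.suffix.")]
    return sorted(prefix + props[k] for k in suffix_keys)

example = """
isisicp.kafkastream = true
# if not specified, topicprefix will default to instrument name in code
isisicp.kafkastream.topicprefix =
isisicp.kafkastream.broker = livedata.isis.cclrc.ac.uk:9092
isisicp.kafkastream.topic.suffix.runinfo = _runInfo
isisicp.kafkastream.topic.suffix.sampleenv = _sampleEnv
isisicp.kafkastream.topic.suffix.alarms = _alarms
"""

print(kafka_topics(parse_properties(example), "LARMOR"))
# -> ['LARMOR_alarms', 'LARMOR_runInfo', 'LARMOR_sampleEnv']
```

Setting `isisicp.kafkastream.topicprefix` to a non-empty value would override the instrument-name default for all three topics.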

## SE Data

See [Forwarding Sample Environment](datastreaming/Datastreaming---Sample-Environment)

## Filewriting

See [File writing](datastreaming/Datastreaming---File-writing)

## System Tests

Currently, system tests are run to confirm that the start/stop run and event data messages are sent into Kafka and
that a NeXus file is written containing these events. The Kafka cluster and filewriter run in Docker containers for
these tests, so the tests must be run on a Windows 10 machine. To run these tests you will need to
install [Docker for Windows and add yourself as a docker-user](https://docs.docker.com/docker-for-windows/install/#install-docker-desktop-on-windows).

## The future of streaming at ISIS

After the in-kind work finishes and during the handover, there are some proposed changes that affect the layout and
integration of data streaming at ISIS. This diagram is subject to change, but shows a brief overview of what the future
system might look like:

![](FUTUREISISDSLayout.png)
`doc/specific_iocs/dae/datastreaming/Datastreaming-How-To.md`:

This is a guide for basic operations using either the development or production cluster.

Note that there are many ways to do the following; what is written here is the way it is commonly done at ISIS on our development and production clusters. Something like `kafka-tool` is a nice GUI that will list topics, brokers, etc. and can create or delete topics. You may have more luck running tools such as `kafkacat`, `kafkacow` or any of the official Kafka scripts under the [Windows Subsystem for Linux](https://docs.microsoft.com/en-gb/windows/wsl/install-win10).

## Configure a new datastreaming cluster

There are instructions for how to set up a new data streaming Kafka/Redpanda cluster
[in the `ds-containers` repository](https://github.com/isiscomputinggroup/ds-containers).

## Topic operations

Pushing a topic to one broker does not necessarily mean that the other brokers in the cluster receive and replicate the data, so use this with caution. If you need to create a topic that is replicated across all of the brokers, you should probably follow [this guide](https://coralogix.com/blog/create-kafka-topics-in-3-easy-steps/) over `ssh` on the actual server machines themselves.
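For scripted topic creation, the same operation can be sketched with the `kafka-python` admin API. This is a sketch under stated assumptions, not our standard procedure: it assumes the `kafka-python` package is available, and the broker address, topic name, and replication numbers shown are placeholders rather than real cluster configuration.

```python
# Sketch: creating a replicated topic with the kafka-python admin API.
# Assumptions for illustration: the kafka-python package is installed, and
# the broker address / topic name / replication factor are placeholders.

def topic_spec(name, partitions=1, replication=3):
    """Build the arguments for a new topic replicated across brokers."""
    if replication < 1:
        raise ValueError("replication factor must be at least 1")
    return {"name": name, "num_partitions": partitions,
            "replication_factor": replication}

def create_topic(bootstrap, spec):
    """Create the topic on the cluster (requires network access to a broker)."""
    from kafka.admin import KafkaAdminClient, NewTopic
    admin = KafkaAdminClient(bootstrap_servers=bootstrap)
    try:
        admin.create_topics([NewTopic(**spec)])
    finally:
        admin.close()

# Example invocation (placeholders; needs network access to the cluster):
# create_topic("livedata.isis.cclrc.ac.uk:9092",
#              topic_spec("DEMO_runInfo", partitions=1, replication=3))
```

Setting the replication factor to the number of brokers is what makes the topic tolerant of a single broker being down; a replication factor of 1 leaves the topic on one broker only.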
`doc/spelling_wordlist.txt`:

recsim
recurse
redistributable
redistributables
Redpanda
reflectometer
reflectometers
reflectometry