# Webhook job queue

The webhook job queue is designed to receive notifications from services and store the JSON payload as a dictionary in a python-rq redis queue. It is designed to work with a Webhook Relay that validates and relays webhooks from known services such as Docker Hub, Docker registries, GitHub, and GitLab. The relay is not required, however; the receiver may also listen directly to these services.
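Conceptually, the receiver parses each notification body and enqueues it for a worker. A minimal sketch of that hand-off with python-rq might look like this (the queue name and redis connection details are assumptions, not taken from this project's configuration):

```python
import json

from redis import Redis
from rq import Queue

def enqueue_webhook(raw_body: bytes) -> None:
    """Parse a webhook body and hand it off as a dict to webhook.job."""
    payload = json.loads(raw_body)        # the JSON notification, as a dict
    queue = Queue(connection=Redis())     # assumed: default queue, local redis
    # rq resolves the dotted string to the worker's webhook.job function.
    queue.enqueue("webhook.job", payload)
```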

A worker must be spawned separately to read from the queue and perform tasks in response to each event. The worker must provide a function named `webhook.job`, i.e. a function `job` defined in a module named `webhook`.

## Running

The job queue requires docker-compose and, in its simplest form, can be invoked with `docker-compose up`. By default, it binds to `localhost:8080` but allows clients from all IP addresses. This may appear odd, but on macOS and Windows, traffic to the containers appears to come from the gateway of the network created by Docker's Linux virtualization.
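Once the containers are up, you can check that the receiver is listening by POSTing a JSON body to it. A rough smoke test in Python follows; the endpoint path `/` and the payload shape are assumptions, so consult the receiver's routes for the real endpoint:

```python
import json
import urllib.request

# Assumed endpoint path; the receiver binds to localhost:8080 by default.
req = urllib.request.Request(
    "http://localhost:8080/",
    data=json.dumps({"event": "test"}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)
```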

In a production environment without the networking restrictions imposed by macOS/Windows, you might elect to provide different defaults through the shell environment, e.g.

```
ALLOWED_IPS=A.B.C.D LISTEN_IP=0.0.0.0 docker-compose up
```

where `A.B.C.D` is an IP address (or CIDR range) from which your webhooks will be sent.

A worker must be spawned to perform tasks, consuming the notification data from the redis queue. The redis keystore is configured to accept connections only from localhost.

## Example worker

To run jobs using the webhooks as input:

1. Create a file named `webhook.py`
2. Define a function in it named `job` that takes a `dict` as its lone argument (see the sketch after this list)
3. Install python-rq, e.g. `pip install rq`
4. Run `rq worker` from within that directory
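Step 2 amounts to a module like the following. This is only a minimal sketch; the body of `job` is an assumption about what a handler might do, since the actual task is up to you:

```python
# webhook.py -- minimal worker module; rq resolves "webhook.job" to this
# function and calls it once per notification pulled from the queue.

def job(payload: dict) -> None:
    """Handle one webhook notification (the parsed JSON body)."""
    # Illustrative only: key names vary by service (GitHub, GitLab, ...).
    repo = payload.get("repository", {})
    print(f"received webhook for: {repo.get('name', 'unknown')}")
```

With this file in place, running `rq worker` from the same directory will pick up and execute jobs enqueued as `webhook.job`.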

See the CVMFS-to-Docker converter for a real-world example.