Topographical modeling of the Helium Network with applications.
This repository contains a variety of scripts in various stages of development. Overall, I am working toward refactoring the code away from ArangoDB and toward a setup where all data is drawn from an instance of `helium-transaction-etl`, a lightweight block follower and database. This should simplify deployments of the various scripts and applications. I'll try to keep this README up to date with the latest status of the various components.
| Script | Description | Status |
|---|---|---|
| `api_batch.py` | A REST API that serves topographic and witnessing metrics. | Stable. Used in crowdspot. |
| `app.py` | A Streamlit app that displays trilateration and topography results using ArangoDB as a backend. | Functional, but no longer supported. |
| `app_relational.py` | Same as `app.py`, but backed by `helium-transaction-etl`, a SQL database. | Stable (use this version of the app). |
| `batch_processing.py` | Generates topographic predictions en masse and inserts results into the database populated by `helium-transaction-etl`. | Stable. |
| `train.py` | Trains the topographic ML models used by the above tools. Refactoring in progress. | Functional, no longer supported. |
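As a rough illustration of how the REST API served by `api_batch.py` might be queried once it is running, here is a sketch of a request. The port and the route are assumptions for illustration only; check the route definitions in `api_batch.py` for the actual endpoints.

```
# Hypothetical request -- the port and path are assumptions, not documented endpoints;
# see the route definitions in api_batch.py for the real ones.
curl "http://localhost:8000/topography/<hotspot_address>"
```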
These tools use a Postgres database that is populated with witness data via a Helium block follower, `blockchain-node`.
- Follow the setup instructions for `helium-transaction-etl` to run that client alongside `blockchain-node`. Allow the service some time to ingest blocks.
- Make a copy of `.env.template` called `.env` and populate the environment variables with your Postgres connection details and Mapbox API token (a sketch of what that might look like follows this list).
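The variable names below are illustrative assumptions only; defer to the names actually defined in `.env.template`. The point is simply that you will need a reachable Postgres instance and a Mapbox token on hand.

```
# Hypothetical .env contents -- names are assumptions; use the names in .env.template.
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=etl
POSTGRES_USER=etl
POSTGRES_PASSWORD=changeme
MAPBOX_API_KEY=pk.your-mapbox-token
```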
The tools also draw from open-source topographic datasets courtesy of the Space Shuttle Endeavour. You'll need to download the entire map (~18 GB) to your server. Further, we host pre-trained models that you can download if you want to skip the training step.
- Create the following folders/subfolders within this repository directory:

  ```
  mkdir -p static/gis-data/SRTM_GL3
  mkdir -p static/trained_models/svm
  mkdir -p static/trained_models/gaussian_process
  mkdir -p static/trained_models/isolation_forest
  ```
- Install the latest version of the AWS CLI (see the official AWS CLI installation instructions).
- From this directory, download the SRTM dataset:

  ```
  aws s3 cp s3://raster/SRTM_GL3/ static/gis-data/SRTM_GL3 --recursive --endpoint-url https://opentopography.s3.sdsc.edu --no-sign-request
  ```
- Download the trained models:

  ```
  wget -O static/trained_models/svm/2022-02-06T16_23_54.mdl https://helium-topography.s3.amazonaws.com/trained_models/svm/2022-02-04T16_31_09.mdl
  wget -O static/trained_models/gaussian_process/2022-02-04T16_28_14.mdl https://helium-topography.s3.amazonaws.com/trained_models/gaussian_process/2022-02-04T16_28_14.mdl
  wget -O static/trained_models/isolation_forest/2022-02-04T16_31_09.mdl https://helium-topography.s3.amazonaws.com/trained_models/isolation_forest/2022-02-04T16_31_09.mdl
  ```
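As an optional sanity check (illustrative only; the expected file sizes are not documented here), you can list the downloaded models to confirm the files landed and are non-empty:

```
ls -lh static/trained_models/*/*.mdl
```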
- Initialize and activate a Python 3.7+ virtual environment, then install `requirements.txt`:

  ```
  virtualenv venv
  source venv/bin/activate
  pip install -r requirements.txt
  ```
You should now be able to run the scripts and apps mentioned above, e.g.

```
streamlit run app_relational.py   # launches the webapp on port 8501 by default
python batch_processing.py
```
To run the REST API in a Docker container, build the image with:

```
make docker-build-api
```

Then start it with:

```
make docker-start-api
```
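To confirm the container came up, you can check it with a standard Docker command; the container name `api` matches the one used for the logs later in this README.

```
# The container should appear with an "Up" status.
docker ps --filter name=api
```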
To update the API container:

- Navigate to your copy of the `helium-topography` repository: `cd /path/to/helium-topography`
- Stop the container and remove it: `make docker-clean-api`
- Update the repository: `git pull`
- Rebuild the image: `make docker-build-api`
- Start the updated Docker container: `make docker-start-api`
- See logs: `docker logs api`
Similarly, to run the batch-processing job in a Docker container, build the image with:

```
make docker-build-processing
```

Then start it with:

```
make docker-start-processing
```
To update the batch-processing container:

- Navigate to your copy of the `helium-topography` repository: `cd /path/to/helium-topography`
- Stop the container and remove it: `make docker-clean-processing`
- Update the repository: `git pull`
- Rebuild the image: `make docker-build-processing`
- Start the updated Docker container: `make docker-start-processing`
- See logs: `docker logs batch-processing`