Django application for the DDP platform's management backend. Exposes API endpoints for the management frontend to communicate with, for the purposes of
- Onboarding an NGO client
- Adding users from the client-organization
- Creating a client's workspace in our Airbyte installation
- Configuring that workspace i.e. setting up sources, destinations and connections
- Configuring data ingest jobs in our Prefect setup
- Connecting to the client's dbt GitHub repository
- Configuring dbt run jobs in our Prefect setup
- REST conventions are followed.
- CRUD endpoints for a User resource look like:
  - GET /api/users/
  - GET /api/users/:user_id
  - POST /api/users/
  - PUT /api/users/:user_id
  - DELETE /api/users/:user_id
- Route parameters should be named in snake_case as shown above.
- All API docs are at http://localhost:8002/api/docs
PEP 8 is used to standardize variable, class, and module names; Pylint is the linting tool used to analyze the code against the PEP 8 style, and Black is used as the code formatter.

- The recommended IDE is VS Code.
- Install the Pylint extension in VS Code and enable it.
- Set the default format provider in VS Code to Black.
- Update the VS Code settings.json as follows:

```json
{
    "editor.defaultFormatter": null,
    "python.linting.enabled": true,
    "python.formatting.provider": "black",
    "editor.formatOnSave": true
}
```
The project uses uv as its package manager; you will need to install it on your machine.

uv can be installed system-wide using cURL on macOS and Linux:

```shell
curl -LsSf https://astral.sh/uv/install.sh | sudo sh
```

And with PowerShell on Windows (make sure you run PowerShell with administrator privileges):

```shell
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```

uv is also available via Homebrew:

```shell
brew install uv
```
Install the project's dependencies:

```shell
uv sync
```

- Run `pre-commit install` after activating the virtual env created in the step above
- Run `pre-commit run --all-files` to run the formatter
- Create `.env` from `.env.template`
- Create a SQL database and populate its credentials into `.env`
- You can use a PostgreSQL Docker image for local development:

```shell
docker run --name postgres-db -e POSTGRES_PASSWORD=<password> -e POSTGRES_DB=<db name> -p 5432:5432 -d postgres
```
- Add the environment variables to `.env`:

```shell
DBNAME=<db name>
DBHOST=localhost
DBPORT=5432
DBUSER=postgres
DBPASSWORD=<password>
DBADMINUSER=postgres
DBADMINPASSWORD=<password>
```
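As a quick sanity check, these variables can be assembled into a PostgreSQL connection URL. The sketch below is illustrative only (the Django settings read these variables individually; `database_url` is not a function from this codebase):

```python
import os

def database_url() -> str:
    """Build a PostgreSQL connection URL from the .env-style variables above."""
    user = os.environ["DBUSER"]
    password = os.environ["DBPASSWORD"]
    host = os.environ["DBHOST"]
    port = os.environ["DBPORT"]
    name = os.environ["DBNAME"]
    return f"postgresql://{user}:{password}@{host}:{port}/{name}"

# Example using the defaults from this README (password and db name are placeholders)
os.environ.update({
    "DBUSER": "postgres", "DBPASSWORD": "secret",
    "DBHOST": "localhost", "DBPORT": "5432", "DBNAME": "ddp",
})
print(database_url())  # postgresql://postgres:secret@localhost:5432/ddp
```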
- Open a new terminal
- Download `run-ab-platform.sh` for Airbyte 0.58.0
- Run `./run-ab-platform.sh` to start Airbyte. This is a self-contained application which includes the configuration database
- Populate the Airbyte connection credentials in the `.env` from Step 2:

```shell
AIRBYTE_SERVER_HOST=localhost
AIRBYTE_SERVER_PORT=8000
AIRBYTE_SERVER_APIVER=v1
AIRBYTE_API_TOKEN=<token> # base64 encoding of username:password. The default username and password are airbyte:password, so the token is YWlyYnl0ZTpwYXNzd29yZA==
AIRBYTE_DESTINATION_TYPES="Postgres,BigQuery"
```
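The API token is simply the Base64 encoding of `username:password`, so you can generate one for non-default credentials with a few lines of Python (`airbyte_api_token` is just a helper name for this example, not part of the codebase):

```python
import base64

def airbyte_api_token(username: str, password: str) -> str:
    """Base64-encode username:password for the AIRBYTE_API_TOKEN variable."""
    return base64.b64encode(f"{username}:{password}".encode()).decode()

# The default credentials produce the token shown above
print(airbyte_api_token("airbyte", "password"))  # YWlyYnl0ZTpwYXNzd29yZA==
```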
- Start the Prefect Proxy and populate its connection info in `.env`:

```shell
PREFECT_PROXY_API_URL=
```

- Set `DEV_SECRETS_DIR` in `.env` unless you want to use Amazon's Secrets Manager
- Open a new terminal
- Create a local venv, install dbt, and put its location into `DBT_VENV` in `.env`:

```shell
pyenv local 3.10
pyenv exec python -m venv <env-name>
source <env-name>/bin/activate
python -m pip install \
    dbt-core \
    dbt-postgres \
    dbt-bigquery
```

- Create empty directories for `CLIENTDBT_ROOT`:

```shell
CLIENTDBT_ROOT=
DBT_VENV=<env-name>/bin/activate
```
- The `SIGNUPCODE` in `.env` is for signing up using the frontend. If you are running the frontend, set its URL in `FRONTEND_URL`

```shell
DJANGOSECRET=
```
- Create a `logs` folder in `ddpui`
- Create `whitelist.py` from `.whitelist.template.py` in the `ddpui` > `assets` folder
- Run DB migrations: `python manage.py migrate`
- Seed the DB: `python manage.py loaddata seed/*.json`
- Create the system user: `python manage.py create-system-orguser`
- Start the server: `uvicorn ddpui.asgi:application --port <PORT_TO_LISTEN_ON>`
- Run `python manage.py createorganduser <Org Name> <Email address> --role super-admin`
- The above command creates a user with the super admin role. If no role is provided, the default role is account manager.
- In your virtual environment run:

```shell
celery -A ddpui worker -n ddpui
```

- On Windows, run:

```shell
celery -A ddpui worker -n ddpui -P solo
```

- To start celery beat, run:

```shell
celery -A ddpui beat
```
Follow the steps below:

- Install Docker
- Install docker-compose
- Create `.env.docker` from `.env.template` inside the Docker folder
- Copy the file in `ddpui/assets/` to `Docker/mount`

If using an M1-based MacBook, run this before building the image:

```shell
export DOCKER_DEFAULT_PLATFORM=linux/amd64
```

Build the main image:

```shell
docker build -f Docker/Dockerfile.main --build-arg BUILD_DATE=$(date -u +'%Y-%m-%dT%H:%M:%SZ') -t dalgo_backend_main_image:0.1 .
```

Then build the deploy image:

```shell
docker build -f Docker/Dockerfile.dev.deploy --build-arg BUILD_DATE=$(date -u +'%Y-%m-%dT%H:%M:%SZ') -t dalgo_backend:0.1 .
```

Start the backend:

```shell
docker compose -p dalgo_backend -f Docker/docker-compose.yml --env-file Docker/.env.docker up -d
```

Stop the backend:

```shell
docker compose -p dalgo_backend -f Docker/docker-compose.yml --env-file Docker/.env.docker down
```
The platform supports feature flags to control the availability of features at both global and organization-specific levels. Organization-specific flags override global flags.

- `DATA_QUALITY` - Elementary data quality reports
- `USAGE_DASHBOARD` - Superset usage dashboard for an org
- `EMBED_SUPERSET` - Embed Superset dashboards
- `LOG_SUMMARIZATION` - Summarize logs using AI
- `AI_DATA_ANALYSIS` - Enable data analysis using AI
- `DATA_STATISTICS` - Enable detailed data statistics in explore
Global flags apply to all organizations by default:

```python
from ddpui.utils.feature_flags import enable_feature_flag, disable_feature_flag

# Enable a global flag
enable_feature_flag("DATA_QUALITY")  # org=None for global

# Disable a global flag
disable_feature_flag("DATA_QUALITY")
```
Organization-specific flags override global flags for that particular org:

```python
from ddpui.models.org import Org
from ddpui.utils.feature_flags import enable_feature_flag, disable_feature_flag

# Get the organization
org = Org.objects.get(slug="org-slug")

# Enable for a specific org (overrides the global setting)
enable_feature_flag("DATA_QUALITY", org=org)

# Disable for a specific org
disable_feature_flag("DATA_QUALITY", org=org)
```
Use the Django management command to manage flags via the CLI:

```shell
# Enable all available global feature flags
python manage.py manage_feature_flags --enable-all-global

# Enable a specific flag globally
python manage.py manage_feature_flags --enable DATA_QUALITY

# Disable a specific flag globally
python manage.py manage_feature_flags --disable DATA_QUALITY

# Enable for a specific organization
python manage.py manage_feature_flags --enable DATA_QUALITY --org-slug org-slug

# Disable for a specific organization
python manage.py manage_feature_flags --disable DATA_QUALITY --org-slug org-slug
```
Frontend applications can fetch the feature flags for the current organization:
`GET /api/organizations/flags`
This endpoint returns a JSON object with all feature flags and their status for the authenticated user's organization. The response includes both global flags and any organization-specific overrides.
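Illustratively, the merged payload a frontend receives might look like the dict below. The exact field names and shape of the real response are not specified here and may differ; this only demonstrates the merge of global flags with org overrides:

```python
# Hypothetical example of the merged flags payload; not the actual API schema.
global_flags = {"DATA_QUALITY": True, "EMBED_SUPERSET": False}
org_overrides = {"EMBED_SUPERSET": True}  # this org's overrides

# Org-specific overrides win over global values when merging
response = {**global_flags, **org_overrides}
print(response)  # {'DATA_QUALITY': True, 'EMBED_SUPERSET': True}
```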