Open Data Cube Core


Overview

The Open Data Cube Core provides an integrated gridded data analysis environment for decades of analysis-ready Earth observation satellite data and related data from multiple satellite and other acquisition systems.

Documentation

See the user guide for installation and usage of the datacube, and for documentation of the API.

Join our Discord if you need help setting up or using the Open Data Cube.

Please help us to keep the Open Data Cube community open and inclusive by reading and following our Code of Conduct.

This is a 1.9.x series release of the Open Data Cube. If you are migrating from a 1.8.x series release, please refer to the 1.8.x to 1.9.x Migration Notes.

Requirements

System

  • PostgreSQL 15+
  • Python 3.10+

Developer setup

  1. Clone:
    • git clone https://github.com/opendatacube/datacube-core.git
  2. Create a Python environment for using the ODC. We recommend Mambaforge as the easiest way to handle Python dependencies.
mamba env create -f conda-environment.yml
conda activate cubeenv
  3. Install a development (editable) version of datacube-core.
cd datacube-core
pip install --upgrade -e .
  4. Install the pre-commit hooks to help follow ODC coding conventions when committing with git.
pre-commit install
  5. Run unit tests + PyLint

Install test dependencies using:

pip install --upgrade -e '.[test]'

If installing any of these dependencies fails, please lodge an issue.

Run unit tests with:

./check-code.sh

(This script approximates what is run by GitHub Actions; you can alternatively run pytest yourself.)

  6. (Optional) Run all tests, including integration tests.

    ./check-code.sh integration_tests

    • Assumes the existence of two password-less Postgres databases running on localhost, called pgintegration and pgisintegration.
    • Otherwise copy integration_tests/integration.conf to ~/.datacube_integration.conf and edit to customise.
    • For instructions on setting up a password-less Postgres database, see
      the developer setup instructions.
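
For reference, ~/.datacube_integration.conf is a standard INI file. The sketch below shows the general shape only; the section and key names here are illustrative assumptions, so copy the shipped integration_tests/integration.conf and edit it rather than writing a file from scratch:

```ini
; Illustrative sketch only -- start from integration_tests/integration.conf.
; One section per test database; key names are assumptions.
[datacube]
db_hostname: localhost
db_database: pgintegration

[experimental]
db_hostname: localhost
db_database: pgisintegration
```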

Alternatively, you can use the opendatacube/datacube-tests Docker image to run the tests. This image includes a database server pre-configured for running the integration tests. Add the --with-docker command-line option as the first argument to the ./check-code.sh script.

./check-code.sh --with-docker integration_tests

To run individual tests in a Docker container:

docker build --tag=opendatacube/datacube-tests-local --no-cache --progress plain -f docker/Dockerfile .

docker run -ti -v $(pwd):/code opendatacube/datacube-tests-local:latest pytest integration_tests/test_filename.py::test_function_name

Developer setup on Ubuntu

These instructions build a Python virtual environment on Ubuntu suitable for development work.

Install dependencies:

sudo apt-get update
sudo apt-get install -y \
    autoconf automake build-essential make cmake \
    graphviz \
    python3-venv \
    python3-dev \
    libpq-dev \
    libyaml-dev \
    libnetcdf-dev \
    libudunits2-dev

Build the Python virtual environment:

pyenv="${HOME}/.envs/odc"  # Change to suit your needs
mkdir -p "${pyenv}"
python3 -m venv "${pyenv}"
source "${pyenv}/bin/activate"
pip install -U pip wheel cython numpy
pip install -e '.[dev]'
pip install flake8 mypy pylint autoflake black