
Open Data Cube Core


Overview

The Open Data Cube Core provides an integrated gridded data analysis environment for decades of analysis-ready Earth observation satellite data, and related data, from multiple satellite and other acquisition systems.
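
For example, once a Data Cube has been indexed with data, a typical analysis session loads a gridded, time-stacked dataset through the Python API. The snippet below is a minimal sketch only; the product name, extents, CRS and resolution are hypothetical placeholders for whatever is indexed in your own Data Cube.

  # A minimal sketch of the Python API (hypothetical product and extents).
  import datacube

  dc = datacube.Datacube(app="overview-example")

  data = dc.load(
      product="ls8_nbar_albers",          # hypothetical product name
      x=(149.0, 149.2),                   # longitude range
      y=(-35.4, -35.2),                   # latitude range
      time=("2018-01-01", "2018-12-31"),
      output_crs="EPSG:3577",             # hypothetical output grid
      resolution=(-25, 25),
  )

  print(data)  # an xarray.Dataset with one variable per measurement band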

Documentation

See the user guide for installation and usage of the datacube, and for documentation of the API.

Join our Slack if you need help setting up or using the Open Data Cube.

Please help us to keep the Open Data Cube community open and inclusive by reading and following our Code of Conduct.

Requirements

System

  • PostgreSQL 9.5+
  • Python 3.6+

Developer setup

  1. Clone the repository:

     git clone https://github.com/opendatacube/datacube-core.git

  2. Create a Python environment to use ODC within. We recommend conda as the easiest way to handle Python dependencies.

     conda create -n odc -c conda-forge python=3.6 datacube pre_commit
     conda activate odc

  3. Install a develop version of datacube-core.

     cd datacube-core
     pip install --upgrade -e .

  4. Install the pre-commit hooks to help follow ODC coding conventions when committing with git.

     pre-commit install

  5. Run unit tests and PyLint:

     ./check-code.sh

     (This script approximates what is run by Travis; you can alternatively run pytest yourself.) Some test dependencies may need to be installed first:

     pip install --upgrade -e '.[test]'

     If installing these fails, please lodge an issue.

  6. (Or) run all tests, including integration tests:

     ./check-code.sh integration_tests

     • This assumes a password-less Postgres database called agdcintegration running on localhost.

     • Otherwise, copy integration_tests/agdcintegration.conf to ~/.datacube_integration.conf and edit it to customise (a sketch of what this file might contain follows below).
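
For reference, a minimal sketch of what ~/.datacube_integration.conf might contain, assuming the standard ODC configuration keys and a password-less Postgres database on localhost (adjust the values to match your own setup):

  [datacube]
  # hypothetical example values; keys follow the standard ODC config format
  db_hostname: localhost
  db_database: agdcintegration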

Alternatively, you can use the opendatacube/datacube-tests Docker image to run the tests. This image includes a database server pre-configured for running the integration tests. Add the --with-docker option as the first argument to the ./check-code.sh script.

./check-code.sh --with-docker integration_tests

Developer setup on Ubuntu

These instructions build a Python virtual environment on Ubuntu suitable for development work.

Install dependencies:

sudo apt-get update
sudo apt-get install -y \
  autoconf automake build-essential make cmake \
  graphviz \
  python3-venv \
  python3-dev \
  libpq-dev \
  libyaml-dev \
  libnetcdf-dev \
  libudunits2-dev

Build the Python virtual environment:

pyenv="${HOME}/.envs/odc"  # Change to suit your needs
mkdir -p "${pyenv}"
python3 -m venv "${pyenv}"
source "${pyenv}/bin/activate"
pip install -U pip wheel cython numpy
pip install -e '.[dev]'
pip install flake8 mypy pylint autoflake black
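
As an optional smoke test of the new environment, the snippet below assumes the develop install succeeded and that rasterio and netCDF4 were pulled in as dependencies; it simply confirms the package and its native bindings import cleanly.

  # A minimal sketch: confirm datacube and its native dependencies
  # (GDAL via rasterio, libnetcdf via netCDF4) import and report versions.
  import datacube
  import rasterio
  import netCDF4

  print("datacube", datacube.__version__)
  print("rasterio", rasterio.__version__)
  print("netCDF4", netCDF4.__version__)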