diff --git a/smallbaselineApp_hyp3.ipynb b/smallbaselineApp_hyp3.ipynb index c99b653..c833770 100644 --- a/smallbaselineApp_hyp3.ipynb +++ b/smallbaselineApp_hyp3.ipynb @@ -4,12 +4,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# InSAR time series analysis with HyP3 and MintPy\n", + "# Time series analysis of HyP3 InSAR products with MintPy\n", "\n", - "This notebook shows how to do time-series analysis using HyP3 product with MintPy. It requires `hyp3_sdk` and `MintPy`:\n", - "\n", - "+ run `conda install --yes -c conda-forge hyp3_sdk ipywidgets` to install `hyp3_sdk`\n", - "+ check the [installation page](https://github.com/insarlab/MintPy/blob/main/docs/installation.md) to install `MintPy`" + "This notebook shows how to do time-series analysis of HyP3 InSAR products with MintPy. We assume you have already obtained the HyP3 InSAR products. The steps for the analysis are: clip the HyP3 InSAR products, define the config file, run the time-series analysis, and display the results. We use the 2019 Ridgecrest Earthquake, CA (https://earthquake.usgs.gov/storymap/index-ridgecrest.html) as the example. The example HyP3 InSAR data are at https://jzhu-hyp3-dev.s3.us-west-2.amazonaws.com/hyp3-mintpy-example/2019_ridgecrest.zip. For the detailed steps to produce the HyP3 InSAR products, see the tutorial (https://github.com/ASFHyP3/hyp3-docs/blob/develop/docs/tutorials/hyp3_insar_stack_for_ts_analysis.ipynb). \n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 0. Initial setup of the notebook\n", "\n", - "The cell below performs the intial setup of the notebook and must be **run every time the notebook (re)starts**. It imports necessary modules and defines the processing location." + "To run this notebook, you'll need a conda environment with the required dependencies. 
You can set up a new environment (recommended) and run the jupyter server like:\n", + "\n", + "conda create -n hyp3-mintpy python=3.8 asf_search hyp3_sdk \"mintpy>=1.3.2\" pandas jupyter ipympl\n", + "\n", + "To make your conda env accessible in the jupyter notebook, you need to do:\n", + "\n", + "conda activate hyp3-mintpy\n", + "conda install -c conda-forge tensorflow\n", + "conda install -c anaconda ipykernel\n", + "python -m ipykernel install --user --name=hyp3-mintpy\n", + "\n", + "To run your notebook, just:\n", + "\n", + "conda activate hyp3-mintpy\n", + "jupyter notebook smallbaselineApp_hyp3.ipynb\n", + "\n", + "For your convenience, we provide the ERA5 data at https://jzhu-hyp3-dev.s3.us-west-2.amazonaws.com/hyp3-mintpy-example/2019_ridgecrest_era5_data.zip; you can download and unzip the file to the 'your_weather_dir' directory on your local machine, and set the environment variable, e.g. export WEATHER_DIR='your_weather_dir'.\n" ] }, { @@ -27,53 +40,31 @@ "metadata": {}, "outputs": [], "source": [ - "%matplotlib inline\n", - "import os\n", - "import numpy as np\n", - "import matplotlib.pyplot as plt\n", - "# verify mintpy install\n", - "try:\n", - " #from mintpy.objects.insar_vs_gps import plot_insar_vs_gps_scatter\n", - " #from mintpy.unwrap_error_phase_closure import plot_num_triplet_with_nonzero_integer_ambiguity\n", - " #from mintpy import workflow, view, tsview, plot_network, plot_transection, plot_coherence_matrix\n", - " from mintpy import view, tsview\n", - "except ImportError:\n", - " raise ImportError(\"Can not import mintpy!\")\n", + "from pathlib import Path" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Define the parameters and create directories" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from pathlib import Path\n", "\n", - "# utils function\n", - "def configure_template_file(outName, CONFIG_TXT): \n", - " \"\"\"Write 
configuration files for MintPy to process HyP3 product\"\"\"\n", - " if os.path.isfile(outName):\n", - " with open(outName, \"w\") as fid:\n", - " fid.write(CONFIG_TXT)\n", - " print('write configuration to file: {}'.format(outName))\n", + "project_name = 'Ridgecrest'\n", "\n", - " else:\n", - " with open(outName, \"a\") as fid:\n", - " fid.write(\"\\n\" + CONFIG_TXT)\n", - " print('add the following to file: \\n{}'.format(outName))\n", + "work_dir = Path.cwd() / project_name\n", "\n", - "# define the work directory\n", - "#work_dir = os.path.abspath(os.path.join(os.getcwd(), 'mintpy')) #OpenSARLab at ASF\n", - "proj_name = 'Ridgecrest'\n", - "proj_dir = os.path.join('/media/jzhu4/data/hyp3-mintpy', proj_name) #Local\n", - "hyp3_dir = os.path.join(proj_dir, 'hyp3')\n", - "work_dir = os.path.join(proj_dir, 'mintpy') #Local\n", + "data_dir = work_dir / 'data'\n", "\n", - "if not os.path.isdir(proj_dir):\n", - " os.makedirs(proj_dir)\n", - " print('Create directory: {}'.format(proj_dir))\n", - " \n", - "if not os.path.isdir(hyp3_dir):\n", - " os.makedirs(hyp3_dir)\n", - " print('Create directory: {}'.format(hyp3_dir))\n", - " \n", - "if not os.path.isdir(work_dir):\n", - " os.makedirs(work_dir)\n", - " print('Create directory: {}'.format(work_dir))\n", - " \n", - "os.chdir(work_dir)\n", - "print('Go to work directory: {}'.format(work_dir))" + "data_dir.mkdir(parents=True, exist_ok=True)\n" ] }, { @@ -87,7 +78,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### 1.1 Prepare the template file" + "### 1.1 Load the HyP3 InSAR data" ] }, { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The example dataset is from the 2019 Ridgecrest Earthquake, CA. The dataset can be obtained either by downloading it from the staged server or by producing it with hyp3-sdk. 
Here we provide the sample dataset at https://jzhu-hyp3-dev.s3.us-west-2.amazonaws.com/hyp3-mintpy-example/2019_ridgecrest.zip." ] }, { @@ -96,33 +94,113 @@ "metadata": {}, "outputs": [], "source": [ - "CONFIG_TXT = f'''# vim: set filetype=cfg:\n", - "mintpy.load.processor = hyp3\n", - "##---------interferogram datasets:\n", - "mintpy.load.unwFile = {hyp3_dir}/*/*unw_phase_clip.tif\n", - "mintpy.load.corFile = {hyp3_dir}/*/*corr_clip.tif\n", - "##---------geometry datasets:\n", - "mintpy.load.demFile = {hyp3_dir}/*/*dem_clip.tif\n", - "mintpy.load.incAngleFile = {hyp3_dir}/*/*lv_theta_clip.tif\n", - "mintpy.load.waterMaskFile = {hyp3_dir}/*/*water_mask_clip.tif\n", - "'''\n", - "print(CONFIG_TXT)\n", - "configName = os.path.join(work_dir, \"{}.txt\".format(proj_name))\n", - "configure_template_file(configName, CONFIG_TXT)" + "file = '2019_ridgecrest.zip'\n", + "\n", + "file_url = f'https://jzhu-hyp3-dev.s3.us-west-2.amazonaws.com/hyp3-mintpy-example/{file}'\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "!wget {file_url} -P {data_dir}" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "print(f'downloaded file is {data_dir}/{file}')" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import os\n", "import zipfile\n", "\n", "def unzip_files(zip_file, data_dir):\n", " if os.path.isfile(zip_file):\n", " with zipfile.ZipFile(zip_file, 'r') as fzip:\n", " fzip.extractall(data_dir)\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "unzip_files(f'{data_dir}/{file}', data_dir)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 1.2 Clip the GeoTIFF files for MintPy analysis" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Get the minimum overlap of the files" ] }, { "cell_type": "code", 
"execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from pathlib import Path\n", + "from typing import List, Union\n", + "from osgeo import gdal\n", + "\n", + "\n", + "def get_minimum_overlap(filelist: List[Union[str, Path]]) -> List[float]:\n", + " \"\"\"Get the minimum overlap of the geotiff files in the filelist.\n", + " \n", + " Args:\n", + " filelist: a list of geotiff file names. The file names can be strings or Path objects.\n", + " \n", + " Returns:\n", + " [ulx, uly, lrx, lry], a list which includes the upper-left x, upper-left y, lower-right x, \n", + " and lower-right y.\n", + " \"\"\"\n", + " corners = [gdal.Info(str(dem), format='json')['cornerCoordinates'] for dem in filelist]\n", + "\n", + " ulx = max(corner['upperLeft'][0] for corner in corners)\n", + " uly = min(corner['upperLeft'][1] for corner in corners)\n", + " lrx = min(corner['lowerRight'][0] for corner in corners)\n", + " lry = max(corner['lowerRight'][1] for corner in corners)\n", + " return [ulx, uly, lrx, lry]\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "files = data_dir.glob('*/*_dem.tif')\n", "\n", "overlap = get_minimum_overlap(files)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The example dataset is from 2019 Ridgecrest, CA earthquake. The dataset can be obtained through either downloading from the stagged server or producing with hyp3-sdk. As far as producing data from hyp3-sdk, we provide the prep_ts_hyp3 notebook at the tutorial directory of (https://github.com/ASFHyP3/hyp3-docs/tree/develop/docs )." 
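The min/max logic in `get_minimum_overlap` computes the intersection of the image footprints: the overlap's left edge is the rightmost of the upper-left x values, its top the lowest of the upper-left y values, and so on. A pure-Python sketch of that logic, using made-up corner coordinates shaped like `gdal.Info`'s `cornerCoordinates` output instead of real GeoTIFFs:

```python
# Sketch of the footprint-intersection logic, applied to hypothetical corner
# dictionaries (no gdal.Info call; the coordinates below are invented).
corners = [
    {'upperLeft': [0.0, 10.0], 'lowerRight': [10.0, 0.0]},  # made-up footprint 1
    {'upperLeft': [2.0, 9.0], 'lowerRight': [12.0, 1.0]},   # made-up footprint 2
]

ulx = max(c['upperLeft'][0] for c in corners)   # rightmost left edge
uly = min(c['upperLeft'][1] for c in corners)   # lowest top edge
lrx = min(c['lowerRight'][0] for c in corners)  # leftmost right edge
lry = max(c['lowerRight'][1] for c in corners)  # highest bottom edge

overlap = [ulx, uly, lrx, lry]
print(overlap)  # [2.0, 9.0, 10.0, 1.0]
```

The result is the largest rectangle contained in every footprint, which is why all products clipped to it line up pixel-for-pixel.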
+ "### Clip the files to the overlap" ] }, { @@ -131,53 +209,46 @@ "metadata": {}, "outputs": [], "source": [ - "# verify / prepare input dataset\n", + "from pathlib import Path\n", + "from typing import List, Union\n", "\n", - "os.chdir(hyp3_dir)\n", + "def clip_hyp3_products_to_minimum_overlap(data_dir: Union[str, Path], overlap: List[float]) -> None:\n", + " \"\"\"Clip all geotiff files in the directory with the overlap.\n", + " \n", + " Args:\n", + " data_dir:\n", + " name of a directory which includes the geotiff files.\n", + " overlap:\n", + " a list which includes the upper-left x, upper-left y, lower-right x, and lower-right y.\n", "\n", - "use_staged_data = True\n", + " Returns: None\n", + " \"\"\"\n", "\n", - "zip_file_name ='Ridgecrest.zip'\n", + " files_for_mintpy = ['_water_mask.tif', '_corr.tif', '_unw_phase.tif', '_dem.tif', '_lv_theta.tif', '_lv_phi.tif']\n", "\n", - "if all(os.path.isfile(os.path.join(work_dir, 'inputs', i)) for i in ['ifgramStack.h5', 'geometryGeo.h5']):\n", - " print(\"Required inputs for mintpy already exists.\")\n", + " for extension in files_for_mintpy:\n", "\n", - "else:\n", - " if use_staged_data:\n", - " # Check if a stage file from S3 already exist, if not try and download it\n", - " zip_file = os.path.join(hyp3_dir, zip_file_name)\n", - " if not os.path.isfile(zip_file):\n", - " !wget https://jzhu-hyp3-dev.s3.us-west-2.amazonaws.com/hyp3-mintpy/{zip_file_name}\n", - " #!aws s3 cp s3://jzhu-hyp3-dev/hyp3-mintpy-example/{zip_file_name} {zip_file_name}\n", - " # verify if download was succesfull\n", - " if os.path.isfile(zip_file_name):\n", - " import zipfile, glob\n", - " \n", - " with zipfile.ZipFile(zip_file, 'r') as fzip:\n", - " fzip.extractall(hyp3_dir)\n", - " # unzip zip files extracted from the zip_file\n", - " files = glob.glob(\"./????_*.zip\")\n", - " for file in files:\n", - " with zipfile.ZipFile(file) as f:\n", - " f.extractall(hyp3_dir)\n", - " \n", - " print('S3 pre-staged data retrieval was successfull')\n", 
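The clipping step walks `data_dir` once per product extension and writes each clipped output next to its source, with `_clipped` appended before the extension. A small sketch of that selection and naming pattern, run against a temporary directory of empty placeholder files (the file names below are hypothetical, not real HyP3 products):

```python
# Sketch of the rglob selection and the "_clipped" output naming used by the
# clip function; placeholder files stand in for real GeoTIFFs.
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    data_dir = Path(tmp)
    (data_dir / 'pair1').mkdir()
    (data_dir / 'pair1' / 'pair1_unw_phase.tif').touch()  # placeholder, not a real GeoTIFF
    (data_dir / 'pair1' / 'pair1_corr.tif').touch()

    # Find one extension, then derive the clipped output name from stem + suffix.
    matches = sorted(data_dir.rglob('*_unw_phase.tif'))
    clipped = [f.parent / f'{f.stem}_clipped{f.suffix}' for f in matches]
    print([p.name for p in clipped])  # ['pair1_unw_phase_clipped.tif']
```

Because the clipped files keep a predictable suffix, the MintPy template below can select them with simple `*_clipped.tif` glob patterns.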
+ " for file in data_dir.rglob(f'*{extension}'):\n", "\n", - " else:\n", - " msg = 'No staged data. Setting use_staged_data = False and re-run this cell.'\n", - " print(msg)\n", + " dst_file = file.parent / f'{file.stem}_clipped{file.suffix}'\n", "\n", - " else:\n", - " print(\"Using HyP3-sdk to download and prepare the input data for MintPy\")\n", - " print(\"please refer the notebook\")\n", - " os.chdir(os.path.dirname(work_dir))" + " gdal.Translate(destName=str(dst_file), srcDS=str(file), projWin=overlap)\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "clip_hyp3_products_to_minimum_overlap(data_dir, overlap)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### 1.3 Run Time-series Analysis application" + "### 1.3 Prepare the template file" ] }, { @@ -186,14 +257,44 @@ "metadata": {}, "outputs": [], "source": [ - "! smallbaselineApp.py --work-dir {work_dir} {configName}" + "mintpy_config = work_dir / 'mintpy_config.txt'\n", + "mintpy_config.write_text(\n", + "f\"\"\"\n", + "mintpy.load.processor = hyp3\n", + "##---------interferogram datasets:\n", + "mintpy.load.unwFile = {data_dir}/*/*_unw_phase_clipped.tif\n", + "mintpy.load.corFile = {data_dir}/*/*_corr_clipped.tif\n", + "##---------geometry datasets:\n", + "mintpy.load.demFile = {data_dir}/*/*_dem_clipped.tif\n", + "mintpy.load.incAngleFile = {data_dir}/*/*_lv_theta_clipped.tif\n", + "mintpy.load.azAngleFile = {data_dir}/*/*_lv_phi_clipped.tif\n", + "mintpy.load.waterMaskFile = {data_dir}/*/*_water_mask_clipped.tif\n", + "\"\"\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 1.4 Run Time-series Analysis application" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "! 
smallbaselineApp.py --work-dir {work_dir} {mintpy_config}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### 1.4 Display the analysis results\n", + "## 2. Display the analysis results\n", "\n", "There are a few scripts used to display the analysis results. They are in the MINTPY_HOME/mintpy directory. Here we show two major display scripts." ] }, @@ -204,7 +305,8 @@ "metadata": {}, "outputs": [], "source": [ - "os.chdir(proj_dir)" + "%matplotlib widget\n", + "from mintpy import view, tsview" ] }, { @@ -215,7 +317,7 @@ }, "outputs": [], "source": [ - "view.main(['mintpy/velocity.h5'])" + "view.main([f'{work_dir}/velocity.h5'])" ] }, { @@ -224,7 +326,7 @@ "metadata": {}, "outputs": [], "source": [ - "tsview.main(['mintpy/timeseries.h5'])" + "tsview.main([f'{work_dir}/timeseries.h5'])" ] }, { @@ -237,9 +339,9 @@ ], "metadata": { "kernelspec": { - "display_name": "mintpy", + "display_name": "hyp3-mintpy", "language": "python", - "name": "mintpy" + "name": "hyp3-mintpy" }, "language_info": { "codemirror_mode": { @@ -251,7 +353,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.10" + "version": "3.8.13" } }, "nbformat": 4,
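As noted in the setup section, MintPy's tropospheric correction can use the staged ERA5 archive via the `WEATHER_DIR` environment variable. A minimal sketch of pointing MintPy at a local weather directory from Python (the directory path here is a placeholder; substitute wherever you unzipped the ERA5 data):

```python
# Sketch: create a placeholder weather-data directory and expose it to MintPy
# via the WEATHER_DIR environment variable described in the setup section.
import os
from pathlib import Path

weather_dir = Path.home() / 'weather'  # placeholder location
weather_dir.mkdir(parents=True, exist_ok=True)

os.environ['WEATHER_DIR'] = str(weather_dir)
print(os.environ['WEATHER_DIR'] == str(weather_dir))  # True
```

Setting the variable in the notebook process is equivalent to the `export WEATHER_DIR=...` shell command mentioned earlier, as long as it happens before the correction step runs.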