🔀 Merge branch 'atlxi_dhdt_20201111' (#241)
Closes #241 Recalculate ICESat-2 ATL11 height changes up to 20201111.
weiji14 committed Mar 10, 2021
2 parents 8e21f16 + c81f249 commit 2a536bb
Showing 13 changed files with 754 additions and 636 deletions.
29 changes: 25 additions & 4 deletions README.md
@@ -1,24 +1,25 @@
# DeepIceDrain
# DeepIceDrain [[poster]](https://github.com/weiji14/nzasc2021)

Mapping and monitoring deep subglacial water activity
in Antarctica using remote sensing and machine learning.

[![Zenodo Digital Object Identifier](https://zenodo.org/badge/DOI/10.5281/zenodo.4071235.svg)](https://doi.org/10.5281/zenodo.4071235)
![GitHub top language](https://img.shields.io/github/languages/top/weiji14/deepicedrain.svg)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black)
![Test DeepIceDrain package](https://github.com/weiji14/deepicedrain/workflows/Test%20DeepIceDrain%20package/badge.svg)
[![Test DeepIceDrain package](https://github.com/weiji14/deepicedrain/actions/workflows/python-app.yml/badge.svg)](https://github.com/weiji14/deepicedrain/actions/workflows/python-app.yml)
[![Dependabot Status](https://api.dependabot.com/badges/status?host=github&repo=weiji14/deepicedrain)](https://dependabot.com)
![License](https://img.shields.io/github/license/weiji14/deepicedrain)

| Ice Surface Elevation trends over Antarctica | Active Subglacial Lake filling event |
|---|---|
| ![ICESat-2 ATL11 rate of height change over time in Antarctica 2018-10-14 to 2020-09-30](https://user-images.githubusercontent.com/23487320/100858542-fe69ab00-34f2-11eb-9a0f-87805d00b2ed.png) | ![dsm_whillans_ix_cycles_3-8.gif](https://user-images.githubusercontent.com/23487320/97156701-f1fb7f80-17db-11eb-880c-87df2961e1c3.gif) |
| ![ICESat-2 ATL11 rate of height change over time in Antarctica 2018-10-14 to 2020-11-11](https://user-images.githubusercontent.com/23487320/105754590-220b1800-5faf-11eb-8f4c-b99fb7b7449e.png) | ![dsm_whillans_ix_cycles_3-9.gif](https://user-images.githubusercontent.com/23487320/110536564-7b599000-8186-11eb-9ae2-aca8d76f7313.gif) |

![DeepIceDrain Pipeline Part 1 Exploratory Data Analysis](https://yuml.me/diagram/scruffy;dir:LR/class/[Land-Ice-Elevation|atl06_play.ipynb]->[Convert|atl06_to_atl11.ipynb],[Convert]->[Land-Ice-Height-time-series|atl11_play.ipynb])
![DeepIceDrain Pipeline Part 2 Subglacial Lake Analysis](https://yuml.me/diagram/scruffy;dir:LR/class/[Height-Change-over-Time-(dhdt)|atlxi_dhdt.ipynb],[Height-Change-over-Time-(dhdt)]->[Subglacial-Lake-Finder|atlxi_lake.ipynb],[Subglacial-Lake-Finder]->[Crossover-Analysis|atlxi_xover.ipynb])

| Along track view of an ATL11 Ground Track | Elevation time-series at Crossover Points |
|---|---|
| ![alongtrack_whillans_ix_1080_pt3](https://user-images.githubusercontent.com/23487320/102156291-2210f600-3ee2-11eb-8175-e854b70444df.png) | ![crossover_anomaly_whillans_ix_2018-10-14_2020-09-30](https://user-images.githubusercontent.com/23487320/102610765-9bcf0b00-4192-11eb-803b-247ed960f9bb.png) |
| ![alongtrack_whillans_ix_1080_pt3](https://user-images.githubusercontent.com/23487320/110536370-41888980-8186-11eb-96e6-1ce92aa9966b.png) | ![crossover_anomaly_whillans_ix_2018-10-14_2020-11-11](https://user-images.githubusercontent.com/23487320/110536098-efdfff00-8185-11eb-97d9-065dd59b5727.png) |



@@ -131,3 +132,23 @@ Go check them out if you have time.
- [ATL11](https://github.com/suzanne64/ATL11)
- [ICESAT-2 HackWeek](https://github.com/ICESAT-2HackWeek)
- [icepyx](https://github.com/icesat2py/icepyx)


## Citing

The work in this repository has not been peer-reviewed, but if you do want to
cite it for some reason, use the following BibLaTeX code from this conference
proceedings ([poster presentation](https://github.com/weiji14/nzasc2021)):

@inproceedings{LeongSpatiotemporalvariabilityactive2021,
title = {{Spatiotemporal Variability of Active Subglacial Lakes in Antarctica from 2018-2020 Using ICESat-2 Laser Altimetry}},
author = {Leong, W. J. and Horgan, H. J.},
date = {2021-02-10},
publisher = {{Unpublished}},
location = {{Christchurch, New Zealand}},
doi = {10.13140/RG.2.2.27952.07680},
    eventtitle = {{New Zealand Antarctic Science Conference}},
langid = {english}
}

Python code for the DeepIceDrain package here on GitHub is also mirrored on Zenodo at https://doi.org/10.5281/zenodo.4071235.
343 changes: 194 additions & 149 deletions antarctic_subglacial_lakes_3031.geojson

Large diffs are not rendered by default.

343 changes: 194 additions & 149 deletions antarctic_subglacial_lakes_4326.geojson

Large diffs are not rendered by default.

12 changes: 6 additions & 6 deletions atl11_play.ipynb
@@ -103,15 +103,15 @@
"# Adapted from the intake.open_netcdf._add_path_to_ds function.\n",
"add_path_to_ds = lambda ds: ds.assign_coords(\n",
" coords=intake.source.utils.reverse_format(\n",
" format_string=\"ATL11.001z123/ATL11_{referencegroundtrack:04d}1x_{}_{}_{}.zarr\",\n",
" format_string=\"ATL11.002z123/ATL11_{referencegroundtrack:04d}1x_{}_{}_{}.zarr\",\n",
" resolved_string=ds.encoding[\"source\"],\n",
" )\n",
")\n",
"\n",
"# Load dataset from all Zarr stores\n",
"# Aligning chunks spatially along cycle_number (i.e. time)\n",
"ds: xr.Dataset = xr.open_mfdataset(\n",
" paths=\"ATL11.001z123/ATL11_*_003_01.zarr\",\n",
" paths=\"ATL11.002z123/ATL11_*_002_01.zarr\",\n",
" chunks=\"auto\",\n",
" engine=\"zarr\",\n",
" combine=\"nested\",\n",
@@ -122,7 +122,7 @@
")\n",
"# ds = ds.unify_chunks().compute()\n",
"# TODO use intake, wait for https://github.com/intake/intake-xarray/issues/70\n",
"# source = intake.open_ndzarr(url=\"ATL11.001z123/ATL11_0*.zarr\")"
"# source = intake.open_ndzarr(url=\"ATL11.002z123/ATL11_0*.zarr\")"
]
},
{
@@ -187,8 +187,8 @@
},
"outputs": [],
"source": [
"# ds[\"utc_time\"] = ds.delta_time.rename(new_name_or_name_dict=\"utc_time\")\n",
"ds[\"utc_time\"] = deepicedrain.deltatime_to_utctime(dataarray=ds.delta_time)"
"ds[\"utc_time\"] = ds.delta_time.rename(new_name_or_name_dict=\"utc_time\")\n",
"# ds[\"utc_time\"] = deepicedrain.deltatime_to_utctime(dataarray=ds.delta_time)"
]
},
{
@@ -912,7 +912,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.3"
"version": "3.8.6"
}
},
"nbformat": 4,
12 changes: 6 additions & 6 deletions atl11_play.py
@@ -6,7 +6,7 @@
# extension: .py
# format_name: hydrogen
# format_version: '1.3'
# jupytext_version: 1.5.2
# jupytext_version: 1.9.1
# kernelspec:
# display_name: deepicedrain
# language: python
@@ -57,15 +57,15 @@
# Adapted from the intake.open_netcdf._add_path_to_ds function.
add_path_to_ds = lambda ds: ds.assign_coords(
coords=intake.source.utils.reverse_format(
format_string="ATL11.001z123/ATL11_{referencegroundtrack:04d}1x_{}_{}_{}.zarr",
format_string="ATL11.002z123/ATL11_{referencegroundtrack:04d}1x_{}_{}_{}.zarr",
resolved_string=ds.encoding["source"],
)
)

# Load dataset from all Zarr stores
# Aligning chunks spatially along cycle_number (i.e. time)
ds: xr.Dataset = xr.open_mfdataset(
paths="ATL11.001z123/ATL11_*_003_01.zarr",
paths="ATL11.002z123/ATL11_*_002_01.zarr",
chunks="auto",
engine="zarr",
combine="nested",
@@ -76,7 +76,7 @@
)
# ds = ds.unify_chunks().compute()
# TODO use intake, wait for https://github.com/intake/intake-xarray/issues/70
# source = intake.open_ndzarr(url="ATL11.001z123/ATL11_0*.zarr")
# source = intake.open_ndzarr(url="ATL11.002z123/ATL11_0*.zarr")
# %% [markdown]
# ## Convert geographic lon/lat to x/y
#
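
As an aside, here is a minimal sketch of what the `intake.source.utils.reverse_format` call in the preprocessor does: it parses fields such as the reference ground track back out of a Zarr store's path. The file name below is hypothetical and only illustrates the pattern.

```python
import intake.source.utils

# Hypothetical ATL11 Zarr store path following the pattern used above
resolved_string = "ATL11.002z123/ATL11_05401x_0310_002_01.zarr"

fields = intake.source.utils.reverse_format(
    format_string="ATL11.002z123/ATL11_{referencegroundtrack:04d}1x_{}_{}_{}.zarr",
    resolved_string=resolved_string,
)
# Should yield something like {'referencegroundtrack': 540, ...}, which the
# add_path_to_ds preprocessor then attaches as coordinates via assign_coords
print(fields)
```
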
@@ -110,8 +110,8 @@
# in the future.

# %%
# ds["utc_time"] = ds.delta_time.rename(new_name_or_name_dict="utc_time")
ds["utc_time"] = deepicedrain.deltatime_to_utctime(dataarray=ds.delta_time)
ds["utc_time"] = ds.delta_time.rename(new_name_or_name_dict="utc_time")
# ds["utc_time"] = deepicedrain.deltatime_to_utctime(dataarray=ds.delta_time)


# %% [markdown]
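
For reference, a minimal sketch of the kind of delta_time to UTC conversion a helper like `deepicedrain.deltatime_to_utctime` performs, assuming delta_time holds timedelta64 offsets from the ATLAS Standard Data Product epoch (2018-01-01); the actual helper may differ in detail.

```python
import numpy as np
import xarray as xr

# ATLAS Standard Data Product epoch used by ICESat-2 delta_time values
ATLAS_SDP_EPOCH = np.datetime64("2018-01-01T00:00:00")

# Toy offsets: exactly 1000 days, and 1000 days plus one hour, after the epoch
delta_time = xr.DataArray(
    data=np.array([86400000, 86403600], dtype="timedelta64[s]"),
    dims="ref_pt",
    name="delta_time",
)

utc_time = (ATLAS_SDP_EPOCH + delta_time).rename("utc_time")
print(utc_time.values)  # ['2020-09-27T00:00:00' '2020-09-27T01:00:00']
```
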
82 changes: 41 additions & 41 deletions atlxi_dhdt.ipynb

Large diffs are not rendered by default.

22 changes: 11 additions & 11 deletions atlxi_dhdt.py
@@ -7,7 +7,7 @@
# extension: .py
# format_name: hydrogen
# format_version: '1.3'
# jupytext_version: 1.7.1
# jupytext_version: 1.9.1
# kernelspec:
# display_name: deepicedrain
# language: python
@@ -54,7 +54,7 @@
import deepicedrain

# %%
client = dask.distributed.Client(n_workers=32, threads_per_worker=1)
client = dask.distributed.Client(n_workers=16, threads_per_worker=1)
client

# %% [markdown]
@@ -64,14 +64,14 @@
# Xarray open_dataset preprocessor to add fields based on input filename.
add_path_to_ds = lambda ds: ds.assign_coords(
coords=intake.source.utils.reverse_format(
format_string="ATL11.001z123/ATL11_{referencegroundtrack:04d}1x_{}_{}_{}.zarr",
format_string="ATL11.002z123/ATL11_{referencegroundtrack:04d}1x_{}_{}_{}.zarr",
resolved_string=ds.encoding["source"],
)
)

# Load ATL11 data from Zarr
ds: xr.Dataset = xr.open_mfdataset(
paths="ATL11.001z123/ATL11_*_003_01.zarr",
paths="ATL11.002z123/ATL11_*_002_01.zarr",
chunks="auto",
engine="zarr",
combine="nested",
@@ -158,7 +158,7 @@

# %%
# Get first and last dates to put into our plots
min_date, max_date = ("2018-10-14", "2020-09-30")
min_date, max_date = ("2018-10-14", "2020-11-11")
if min_date is None:
min_delta_time = np.nanmin(ds.delta_time.isel(cycle_number=0).data).compute()
min_utc_time = deepicedrain.deltatime_to_utctime(min_delta_time)
@@ -208,7 +208,7 @@
# ds_ht.to_zarr(store=f"ATLXI/ds_hrange_time_{placename}.zarr", mode="w", consolidated=True)
ds_ht: xr.Dataset = xr.open_dataset(
filename_or_obj=f"ATLXI/ds_hrange_time_{placename}.zarr",
chunks={"cycle_number": 9},
chunks={"cycle_number": 7},
engine="zarr",
backend_kwargs={"consolidated": True},
)
@@ -377,12 +377,11 @@
# Save or load dhdt data from Parquet file
for placename in tqdm.tqdm(
iterable=[
"amundsen_sea_embayment",
"siple_coast",
"slessor_downstream",
"whillans_downstream",
"whillans_upstream",
"Recovery",
"siple_coast",
"slessor_downstream",
"amundsen_sea_embayment",
]
):
# TODO make the region detection code below better
@@ -401,7 +400,7 @@
gdf=regions.loc[placename]
)

if not os.path.exists(f"ATLXI/df_dhdt_{placename}.parquet"):
if not os.path.exists(f"ATLXI/df_dhdt_{placename.lower()}.parquet"):
# Subset dataset to geographic region of interest
ds_subset: xr.Dataset = region.subset(data=ds_dhdt)
# Rename delta_time (timedelta64) to utc_time (datetime64), because that's what it is
@@ -422,6 +421,7 @@
"utc_time",
],
dropnacols=["dhdt_slope"],
startcol=3,
use_deprecated_int96_timestamps=True,
)
# df_dhdt = pd.read_parquet(f"ATLXI/df_dhdt_{placename}.parquet")
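
As a quick sanity check, one of the regional Parquet files written above can be read back with pandas. The file name and column list here are illustrative, based on the placenames and variables selected in this loop.

```python
import pandas as pd

# Read back one regional height-change table written by the loop above
df_dhdt = pd.read_parquet("ATLXI/df_dhdt_whillans_upstream.parquet")

print(df_dhdt.columns.tolist())          # expect columns such as x, y, dhdt_slope, utc_time
print(df_dhdt["dhdt_slope"].describe())  # summary statistics of the height-change rate
```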