Mesh API Merge-Back (#4027)
* add ugrid mesh-api stubs (#4001)

* add additional mesh stubs (#4005)

* Update mesh-data-model branch (#4009) (#4011)

* Add abstract cube summary (#3987)

Co-authored-by: stephen.worsley <[email protected]>

* add nox session conda list (#3990)

* Added text to state the Python version used to build the docs. (#3989)

* Added text to state the Python version used to build the docs.

* Added footer template that includes the Python version used to build.

* added new line

* Review actions

* added whatsnew

* Iris py38 (#3976)

* support for py38

* update CI and noxfile

* enforce alphabetical xml element attribute order

* full tests for py38 + fix docs-tests

* add whatsnew entry

* update doc-strings + review actions

* Alternate xml handling routine (#29)

* all xml tests pass for nox tests-3.8

* restored docstrings

* move sort_xml_attrs

* make sort_xml_attrs a classmethod

* update sort_xml_attr doc-string

Co-authored-by: Bill Little <[email protected]>

* add jamesp to whatsnew + minor tweak

Co-authored-by: James Penn <[email protected]>

* normalise version to implicit development release number (#3991)

* Gallery: update COP maps example  (#3934)

* update cop maps example

* comment tweaks

* minor comment tweak + whatsnew

* reinstate whatsnew addition

* remove duplicate whatsnew

* don't support mpl v1.2 (#3941)

* Cubesummary tidy (#3988)

* Extra tests; fix for array attributes.

* Docstring for CubeSummary, and remove some unused parts.

* Fix section name capitalisation, in line with existing cube summary.

* Handle array differences; quote strings in extras and if 'awkward'-printing.

* Ensure scalar string coord 'content' prints on one line.

* update intersphinx mapping and matplotlib urls (#4003)

* update intersphinx mapping and matplotlib urls

* use matplotlib intersphinx where possible

* review actions

* review actions

* update readme badges (#4004)

* update readme badges

* pimp twitter badge

* update readme logo img src and href (#4006)

* update setuptools description (#4008)

Co-authored-by: Patrick Peglar <[email protected]>
Co-authored-by: stephen.worsley <[email protected]>
Co-authored-by: tkknight <[email protected]>
Co-authored-by: James Penn <[email protected]>
Co-authored-by: Ruth Comer <[email protected]>


* MeshMetadata class. (#4002)

* MeshMetadata class.

* MeshMetadata extra members for dim names.

* Comment for BaseMetadata refactoring.

* add meshmetadata services (#4012)

* Mesh api coord manager (#4015)

* add mesh coordinate manager

* wip

* make shape methods private + reorganise method order

* review actions

* partial mesh

* wip

* Mesh data model to ng vat mesh api (#4023)

* Update mesh-data-model branch (#4009)

* Master to mesh data model (#4022)

* cirrus-ci compute credits (#4007)

* update release process (#4010)

* Stop using deprecated aliases of builtin types (#3997)

* Stopped using deprecated aliases of builtin types.
This is required to avoid warnings starting with NumPy 1.20.0.

* Update lib/iris/tests/test_cell.py

Co-authored-by: Bill Little <[email protected]>

* Update lib/iris/tests/test_cell.py

Co-authored-by: Bill Little <[email protected]>

* Updated whatsnew.

Co-authored-by: Bill Little <[email protected]>

* celebrate first time iris contributors (#4013)

* Docs unreleased banner (#3999)

* baseline

* removed debug comments

* reverted

* remove line

* Testing

* testing extensions

* testing rtd_version

* fixed if

* removed line

* tidy up

* tidy comments

* debug of pre-existing rtd variables

* added reminder

* testing

* testing still

* updated comments

* added whatsnew

* expanded the if condition

* review actions

* Update layout.html

Remove alternative banner that used the RestructuredText notation.

* review actions

* drop __unicode__ method usage (#4018)

* cirrus-ci conditional tasks (#4019)

* cirrus-ci conditional tasks

* use bc for bash arithmetic

* revert back to sed

* use expr

* reword

* minor documentation changes

* review actions

* make iris.common.metadata._hexdigest public (#4020)

Co-authored-by: Patrick Peglar <[email protected]>
Co-authored-by: stephen.worsley <[email protected]>
Co-authored-by: tkknight <[email protected]>
Co-authored-by: James Penn <[email protected]>
Co-authored-by: Ruth Comer <[email protected]>
Co-authored-by: Alexander Kuhn-Regnier <[email protected]>


* Connectivity manager (#4017)

* ConnectivityManager first pass.

* ConnectivityManager align with proposed CoordManager.

* Connectivity Manager review actions.

* Connectivity Manager more review changes.

* Use metadata_manager for Mesh location dimension.

* Mesh dimension name abstraction.

* Align Coord and Connectivity Manager filter methods.

* Completed Mesh class.

* filter_cf improvements.

* Moved filter_cf.

* Mesh connectivity manager namedtuples comment.

* Mesh removed trailing underscores.

* Mesh _set_dimension_names improvements.

* Mesh import rationalisation.

* Mesh connectivity manager remove NDIM.

* Connectivity manager use lazy indices_by_src().

* Connectivity manager clearer removal syntax.

* Connectivity manager don't override __init__.

* Connectivity manager correct base class syntax.

* Metadata filter hexdigest reference fix.

* test_MeshMetadata fix.

* Rename filter to metadata_filter.

* minor fixes (#4025)

* minor fixes

* wip

* add mesh pickle support (#4026)

Co-authored-by: Bill Little <[email protected]>
Co-authored-by: Patrick Peglar <[email protected]>
Co-authored-by: stephen.worsley <[email protected]>
Co-authored-by: tkknight <[email protected]>
Co-authored-by: James Penn <[email protected]>
Co-authored-by: Ruth Comer <[email protected]>
Co-authored-by: Alexander Kuhn-Regnier <[email protected]>
8 people authored Feb 23, 2021
1 parent 59fa3f1 commit 520d135
Showing 7 changed files with 2,525 additions and 135 deletions.
76 changes: 38 additions & 38 deletions docs/src/userguide/cube_statistics.rst
@@ -23,9 +23,9 @@ Collapsing Entire Data Dimensions

In the :doc:`subsetting_a_cube` section we saw how to extract a subset of a
cube in order to reduce either its dimensionality or its resolution.
Instead of simply extracting a sub-region of the data,
we can produce statistical functions of the data values
across a particular dimension,
such as a 'mean over time' or 'minimum over latitude'.

.. _cube-statistics_forecast_printout:
@@ -57,9 +57,9 @@ For instance, suppose we have a cube:
um_version: 7.3


In this case we have a 4-dimensional cube; to collapse the vertical (z)
dimension down to a single value by taking the mean,
we can pass the coordinate name and the aggregation definition to the
:meth:`Cube.collapsed() <iris.cube.Cube.collapsed>` method:

>>> import iris.analysis
@@ -88,8 +88,8 @@ we can pass the coordinate name and the aggregation definition to the
mean: model_level_number


Similarly, other analysis operators such as ``MAX``, ``MIN`` and ``STD_DEV``
can be used instead of ``MEAN``; see :mod:`iris.analysis` for a full list
of currently supported operators.
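Conceptually, collapsing applies an aggregator along one dimension of the data. As a sketch only (plain Python lists stand in for a cube here; ``collapse_z`` and the sample values are invented for illustration, not iris API):

```python
# Conceptual sketch: plain Python lists stand in for a cube, and
# collapse_z mimics what collapsed('model_level_number', MEAN) does.
from statistics import mean

# data[z][x]: 3 vertical levels, 4 horizontal points (made-up values)
data = [
    [280.0, 281.0, 282.0, 283.0],
    [270.0, 271.0, 272.0, 273.0],
    [260.0, 261.0, 262.0, 263.0],
]

def collapse_z(grid, aggregator):
    # Reduce the first (z) dimension with the given aggregator.
    return [aggregator(column) for column in zip(*grid)]

means = collapse_z(data, mean)  # analogous to the MEAN aggregator
maxima = collapse_z(data, max)  # analogous to the MAX aggregator
print(means)   # [270.0, 271.0, 272.0, 273.0]
print(maxima)  # [280.0, 281.0, 282.0, 283.0]
```

Swapping the aggregator function is exactly the role the ``MEAN``/``MAX``/``MIN``/``STD_DEV`` objects play for a real cube.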

For an example of using this functionality, the
@@ -103,14 +103,14 @@ in the gallery takes a zonal mean of an ``XYT`` cube by using the
Area Averaging
^^^^^^^^^^^^^^

Some operators support additional keywords to the ``cube.collapsed`` method.
For example, :func:`iris.analysis.MEAN <iris.analysis.MEAN>` supports
a ``weights`` keyword, which can be combined with
:func:`iris.analysis.cartography.area_weights` to calculate an area average.

Let's use the same data as was loaded in the previous example.
Since ``grid_latitude`` and ``grid_longitude`` were both point coordinates
we must guess bound positions for them
in order to calculate the area of the grid boxes::

import iris.analysis.cartography
@@ -155,24 +155,24 @@ including an example on taking a :ref:`global area-weighted mean
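An area-weighted mean reduces, numerically, to a weighted average of cell values by cell area. A minimal sketch under that assumption (the flat lists and helper below are hypothetical, not the iris API):

```python
# Minimal sketch of a weighted mean, assuming flat lists of cell values
# and cell areas; iris does the equivalent internally when MEAN is
# given the `weights` keyword.
def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

temps = [300.0, 290.0, 280.0]   # hypothetical grid-box temperatures (K)
areas = [2.0, 1.0, 1.0]         # hypothetical grid-box areas
print(weighted_mean(temps, areas))  # 292.5
```

Larger grid boxes pull the result toward their values, which is why guessing bounds (and hence areas) matters before averaging.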
Partially Reducing Data Dimensions
----------------------------------

Instead of completely collapsing a dimension, other methods can be applied
to reduce or filter the number of data points of a particular dimension.


Aggregation of Grouped Data
^^^^^^^^^^^^^^^^^^^^^^^^^^^

The :meth:`Cube.aggregated_by <iris.cube.Cube.aggregated_by>` operation
combines data for all points with the same value of a given coordinate.
To do this, you need a coordinate whose points take on only a limited set
of different values -- the *number* of these then determines the size of the
reduced dimension.
The :mod:`iris.coord_categorisation` module can be used to make such
'categorical' coordinates out of ordinary ones: The most common use is
to aggregate data over regular *time intervals*,
such as by calendar month or day of the week.

For example, let's create two new coordinates on the cube
to represent the climatological seasons and the season year respectively::

import iris
@@ -188,8 +188,8 @@

.. note::

The 'season year' is not the same as year number, because (e.g.) the months
Dec11, Jan12 + Feb12 all belong to 'DJF-12'.
See :meth:`iris.coord_categorisation.add_season_year`.


@@ -206,10 +206,10 @@
iris.coord_categorisation.add_season_year(cube, 'time', name='season_year')

annual_seasonal_mean = cube.aggregated_by(
['clim_season', 'season_year'],
iris.analysis.MEAN)


Printing this cube now shows that two extra coordinates exist on the cube:

.. doctest:: aggregation
@@ -238,20 +238,20 @@ These two coordinates can now be used to aggregate by season and climate-year:
.. doctest:: aggregation

>>> annual_seasonal_mean = cube.aggregated_by(
... ['clim_season', 'season_year'],
... iris.analysis.MEAN)
>>> print(repr(annual_seasonal_mean))
<iris 'Cube' of surface_temperature / (K) (time: 19; latitude: 18; longitude: 432)>

The primary change in the cube is that the cube's data has been
reduced in the 'time' dimension by aggregation (taking means, in this case).
This has collected together all data points with the same values of season and
season-year.
The results are now indexed by the 19 different possible values of season and
season-year in a new, reduced 'time' dimension.

We can see this by printing the first 10 values of season+year
from the original cube: These points are individual months,
so adjacent ones are often in the same season:

.. doctest:: aggregation
@@ -271,7 +271,7 @@
djf 2007
djf 2007

Compare this with the first 10 values of the new cube's coordinates:
All the points now have distinct season+year values:

.. doctest:: aggregation
@@ -294,7 +294,7 @@

Because the original data started in April 2006 we have some incomplete seasons
(e.g. there were only two months worth of data for 'mam-2006').
In this case we can fix this by removing all of the resultant 'times' which
In this case we can fix this by removing all of the resultant 'times' which
do not cover a three month period (note: judged here as > 3*28 days):

.. doctest:: aggregation
Expand All @@ -306,7 +306,7 @@ do not cover a three month period (note: judged here as > 3*28 days):
>>> full_season_means
<iris 'Cube' of surface_temperature / (K) (time: 17; latitude: 18; longitude: 432)>
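The "full season" test described above (keep only aggregated cells whose time bounds span more than 3*28 days) can be sketched with plain numbers; the day-number bounds and helper below are invented for illustration, not taken from iris:

```python
# Hedged sketch of the duration test used to drop incomplete seasons:
# keep only cells whose time bounds span more than 3 * 28 days.
def spans_three_months(bounds):
    # bounds is a (start, end) pair in days (hypothetical units).
    return (bounds[1] - bounds[0]) > 3 * 28

# Made-up bounds: 'mam-2006' only has two months of data.
seasons = {"mam-2006": (0, 61), "jja-2006": (61, 153)}
full = [name for name, b in seasons.items() if spans_three_months(b)]
print(full)  # ['jja-2006']
```

In iris this same predicate would typically be expressed as a constraint on the bounds of the aggregated ``time`` coordinate.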

The final result now represents the seasonal mean temperature for 17 seasons
from jja-2006 to jja-2010:

.. doctest:: aggregation
165 changes: 159 additions & 6 deletions lib/iris/common/metadata.py
@@ -27,22 +27,26 @@


__all__ = [
    "AncillaryVariableMetadata",
    "BaseMetadata",
    "CellMeasureMetadata",
    "CoordMetadata",
    "CubeMetadata",
    "DimCoordMetadata",
    "hexdigest",
    "metadata_filter",
    "metadata_manager_factory",
    "SERVICES",
    "SERVICES_COMBINE",
    "SERVICES_DIFFERENCE",
    "SERVICES_EQUAL",
]


# https://www.unidata.ucar.edu/software/netcdf/docs/netcdf_data_set_components.html#object_name

from ..util import guess_coord_axis

_TOKEN_PARSE = re.compile(r"""^[a-zA-Z0-9][\w\.\+\-@]*$""")

# Configure the logger.
@@ -194,9 +198,18 @@ def func(field):
return result

        # Note that, for strict we use "_fields" not "_members".
        # Certain members never participate in strict equivalence, so
        # are filtered out.
        fields = filter(
            lambda field: field
            not in (
                "circular",
                "src_dim",
                "node_dimension",
                "edge_dimension",
                "face_dimension",
            ),
            self._fields,
        )
        result = all([func(field) for field in fields])
@@ -1330,6 +1343,146 @@ def equal(self, other, lenient=None):
        return super().equal(other, lenient=lenient)


def metadata_filter(
    instances,
    item=None,
    standard_name=None,
    long_name=None,
    var_name=None,
    attributes=None,
    axis=None,
):
    """
    Filter a collection of objects by their metadata to fit the given
    metadata criteria. Criteria can be one or both of: specific properties /
    other objects carrying metadata to be matched.

    Args:

    * instances
        One or more objects to be filtered.

    Kwargs:

    * item
        Either

        (a) a :attr:`standard_name`, :attr:`long_name`, or
        :attr:`var_name`. Defaults to value of `default`
        (which itself defaults to `unknown`) as defined in
        :class:`~iris.common.CFVariableMixin`.

        (b) a 'coordinate' instance with metadata equal to that of
        the desired coordinates. Accepts either a
        :class:`~iris.coords.DimCoord`, :class:`~iris.coords.AuxCoord`,
        :class:`~iris.aux_factory.AuxCoordFactory`,
        :class:`~iris.common.CoordMetadata` or
        :class:`~iris.common.DimCoordMetadata` or
        :class:`~iris.experimental.ugrid.ConnectivityMetadata`.

    * standard_name
        The CF standard name of the desired coordinate. If None, does not
        check for standard name.

    * long_name
        An unconstrained description of the coordinate. If None, does not
        check for long_name.

    * var_name
        The netCDF variable name of the desired coordinate. If None, does
        not check for var_name.

    * attributes
        A dictionary of attributes desired on the coordinates. If None,
        does not check for attributes.

    * axis
        The desired coordinate axis, see
        :func:`~iris.util.guess_coord_axis`. If None, does not check for
        axis. Accepts the values 'X', 'Y', 'Z' and 'T' (case-insensitive).

    Returns:
        A list of the objects supplied in the ``instances`` argument, limited
        to only those that matched the given criteria.

    """
    name = None
    obj = None

    if isinstance(item, str):
        name = item
    else:
        obj = item

    # apply de morgan's law for one less logical operation
    if not (isinstance(instances, str) or isinstance(instances, Iterable)):
        instances = [instances]

    result = instances

    if name is not None:
        result = [instance for instance in result if instance.name() == name]

    if standard_name is not None:
        result = [
            instance
            for instance in result
            if instance.standard_name == standard_name
        ]

    if long_name is not None:
        result = [
            instance for instance in result if instance.long_name == long_name
        ]

    if var_name is not None:
        result = [
            instance for instance in result if instance.var_name == var_name
        ]

    if attributes is not None:
        if not isinstance(attributes, Mapping):
            msg = (
                "The attributes keyword was expecting a dictionary "
                "type, but got a %s instead." % type(attributes)
            )
            raise ValueError(msg)

        def attr_filter(instance):
            return all(
                k in instance.attributes
                and hexdigest(instance.attributes[k]) == hexdigest(v)
                for k, v in attributes.items()
            )

        result = [instance for instance in result if attr_filter(instance)]

    if axis is not None:
        axis = axis.upper()

        def get_axis(instance):
            if hasattr(instance, "axis"):
                axis = instance.axis.upper()
            else:
                axis = guess_coord_axis(instance)
            return axis

        result = [
            instance for instance in result if get_axis(instance) == axis
        ]

    if obj is not None:
        if hasattr(obj, "__class__") and issubclass(
            obj.__class__, BaseMetadata
        ):
            target_metadata = obj
        else:
            target_metadata = obj.metadata

        result = [
            instance
            for instance in result
            if instance.metadata == target_metadata
        ]

    return result
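To illustrate how the cascade of checks in ``metadata_filter`` narrows a collection, here is a hedged sketch using stand-in objects (the ``FakeCoord`` class is invented for illustration; real callers pass iris coordinates or connectivities):

```python
# Stand-in objects mimicking the interface metadata_filter relies on:
# a name() method plus standard_name / long_name / var_name / attributes.
class FakeCoord:
    def __init__(self, standard_name, var_name):
        self.standard_name = standard_name
        self.long_name = None
        self.var_name = var_name
        self.attributes = {}

    def name(self):
        return self.standard_name

coords = [FakeCoord("latitude", "lat"), FakeCoord("longitude", "lon")]

# Each criterion filters the survivors of the previous one, exactly as
# the successive list comprehensions in metadata_filter do.
by_name = [c for c in coords if c.name() == "latitude"]
by_var = [c for c in coords if c.var_name == "lon"]
print([c.standard_name for c in by_name])  # ['latitude']
print([c.standard_name for c in by_var])   # ['longitude']
```

Because criteria combine by successive filtering, supplying several at once is an AND match, not an OR.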


def metadata_manager_factory(cls, **kwargs):
"""
A class instance factory function responsible for manufacturing
