Commit d592623 (2 parents: efbfa88 + ed05f98)

Merge remote-tracking branch 'upstream/master' into coarsen-reduce

* upstream/master: (22 commits)
  - Resolve the version issues on RTD (pydata#3589)
  - Add bottleneck & rasterio git tip to upstream-dev CI (pydata#3585)
  - update whats-new.rst (pydata#3581)
  - Examples for quantile (pydata#3576)
  - add cftime intersphinx entries (pydata#3577)
  - Add pyXpcm to Related Projects doc page (pydata#3578)
  - Reimplement quantile with apply_ufunc (pydata#3559)
  - add environment file for binderized examples (pydata#3568)
  - Add drop to api.rst under pending deprecations (pydata#3561)
  - replace duplicate method _from_vars_and_coord_names (pydata#3565)
  - propagate indexes in to_dataset, from_dataset (pydata#3519)
  - Switch examples to notebooks + scipy19 docs improvements (pydata#3557)
  - fix whats-new.rst (pydata#3554)
  - Tweaks to release instructions (pydata#3555)
  - Clarify conda environments for new contributors (pydata#3551)
  - Revert to dev version 0.14.1 whatsnew (pydata#3547)
  - sparse option to reindex and unstack (pydata#3542)
  - Silence sphinx warnings (pydata#3516)
  - Numpy 1.18 support (pydata#3537)
  - ...


53 files changed: +2149 -1541 lines

.binder/environment.yml

Lines changed: 39 additions & 0 deletions
```yaml
name: xarray-examples
channels:
  - conda-forge
dependencies:
  - python=3.7
  - boto3
  - bottleneck
  - cartopy
  - cdms2
  - cfgrib
  - cftime
  - coveralls
  - dask
  - distributed
  - dask_labextension
  - h5netcdf
  - h5py
  - hdf5
  - iris
  - lxml  # Optional dep of pydap
  - matplotlib
  - nc-time-axis
  - netcdf4
  - numba
  - numpy
  - pandas
  - pint
  - pip
  - pydap
  - pynio
  - rasterio
  - scipy
  - seaborn
  - sparse
  - toolz
  - xarray
  - zarr
  - pip:
      - numbagg
```

HOW_TO_RELEASE renamed to HOW_TO_RELEASE.md

Lines changed: 40 additions & 11 deletions
````diff
@@ -1,9 +1,11 @@
-How to issue an xarray release in 15 easy steps
+How to issue an xarray release in 14 easy steps
 
 Time required: about an hour.
 
 1. Ensure your master branch is synced to upstream:
-     git pull upstream master
+   ```
+   git pull upstream master
+   ```
 2. Look over whats-new.rst and the docs. Make sure "What's New" is complete
    (check the date!) and consider adding a brief summary note describing the
    release at the top.
@@ -12,37 +14,53 @@ Time required: about an hour.
    - Function/method references should include links to the API docs.
    - Sometimes notes get added in the wrong section of whats-new, typically
      due to a bad merge. Check for these before a release by using git diff,
-     e.g., ``git diff v0.X.Y whats-new.rst`` where 0.X.Y is the previous
+     e.g., `git diff v0.X.Y whats-new.rst` where 0.X.Y is the previous
      release.
 3. If you have any doubts, run the full test suite one final time!
-     py.test
+   ```
+   pytest
+   ```
 4. On the master branch, commit the release in git:
+   ```
    git commit -a -m 'Release v0.X.Y'
+   ```
 5. Tag the release:
+   ```
    git tag -a v0.X.Y -m 'v0.X.Y'
+   ```
 6. Build source and binary wheels for pypi:
+   ```
    git clean -xdf  # this deletes all uncommited changes!
    python setup.py bdist_wheel sdist
+   ```
 7. Use twine to register and upload the release on pypi. Be careful, you can't
    take this back!
+   ```
    twine upload dist/xarray-0.X.Y*
+   ```
    You will need to be listed as a package owner at
    https://pypi.python.org/pypi/xarray for this to work.
 8. Push your changes to master:
+   ```
    git push upstream master
    git push upstream --tags
+   ```
 9. Update the stable branch (used by ReadTheDocs) and switch back to master:
+   ```
    git checkout stable
    git rebase master
    git push upstream stable
    git checkout master
-   It's OK to force push to 'stable' if necessary.
-   We also update the stable branch with `git cherrypick` for documentation
-   only fixes that apply the current released version.
+   ```
+   It's OK to force push to 'stable' if necessary. (We also update the stable
+   branch with `git cherrypick` for documentation only fixes that apply the
+   current released version.)
 10. Add a section for the next release (v.X.(Y+1)) to doc/whats-new.rst.
 11. Commit your changes and push to master again:
-    git commit -a -m 'Revert to dev version'
+    ```
+    git commit -a -m 'New whatsnew section'
     git push upstream master
+    ```
     You're done pushing to master!
 12. Issue the release on GitHub. Click on "Draft a new release" at
     https://github.com/pydata/xarray/releases. Type in the version number, but
@@ -53,11 +71,22 @@ Time required: about an hour.
 14. Issue the release announcement! For bug fix releases, I usually only email
     [email protected]. For major/feature releases, I will email a broader
     list (no more than once every 3-6 months):
-
-
-
+
+
+
+
+
+
     Google search will turn up examples of prior release announcements (look for
     "ANN xarray").
+    You can get a list of contributors with:
+    ```
+    git log "$(git tag --sort="v:refname" | sed -n 'x;$p').." --format="%aN" | sort -u
+    ```
+    or by replacing `v0.X.Y` with the _previous_ release in:
+    ```
+    git log v0.X.Y.. --format="%aN" | sort -u
+    ```
 
 Note on version numbering:
````
6392

ci/azure/install.yml

Lines changed: 4 additions & 2 deletions
```diff
@@ -16,16 +16,18 @@ steps:
         --pre \
         --upgrade \
         matplotlib \
+        numpy \
         pandas \
         scipy
-        # numpy \  # FIXME https://github.com/pydata/xarray/issues/3409
       pip install \
         --no-deps \
         --upgrade \
         git+https://github.com/dask/dask \
         git+https://github.com/dask/distributed \
         git+https://github.com/zarr-developers/zarr \
-        git+https://github.com/Unidata/cftime
+        git+https://github.com/Unidata/cftime \
+        git+https://github.com/mapbox/rasterio \
+        git+https://github.com/pydata/bottleneck
     condition: eq(variables['UPSTREAM_DEV'], 'true')
     displayName: Install upstream dev dependencies
```

ci/requirements/doc.yml

Lines changed: 5 additions & 1 deletion
```diff
@@ -6,16 +6,20 @@ dependencies:
   - python=3.7
   - bottleneck
   - cartopy
+  - cfgrib
   - h5netcdf
+  - ipykernel
   - ipython
   - iris
+  - jupyter_client
+  - nbsphinx
   - netcdf4
   - numpy
   - numpydoc
   - pandas<0.25  # Hack around https://github.com/pydata/xarray/issues/3369
   - rasterio
   - seaborn
   - sphinx
-  - sphinx-gallery
   - sphinx_rtd_theme
+  - xarray
   - zarr
```

ci/requirements/py36.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -25,7 +25,7 @@ dependencies:
   - nc-time-axis
   - netcdf4
   - numba
-  - numpy<1.18  # FIXME https://github.com/pydata/xarray/issues/3409
+  - numpy
   - pandas
   - pint
   - pip
```

ci/requirements/py37.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -25,7 +25,7 @@ dependencies:
   - nc-time-axis
   - netcdf4
   - numba
-  - numpy<1.18  # FIXME https://github.com/pydata/xarray/issues/3409
+  - numpy
   - pandas
   - pint
   - pip
```

doc/README.rst

Lines changed: 2 additions & 0 deletions
```diff
@@ -1,3 +1,5 @@
+:orphan:
+
 xarray
 ------
 
```

doc/api-hidden.rst

Lines changed: 5 additions & 0 deletions
```diff
@@ -2,6 +2,8 @@
 .. This extra page is a work around for sphinx not having any support for
 .. hiding an autosummary table.
 
+:orphan:
+
 .. currentmodule:: xarray
 
 .. autosummary::
@@ -30,9 +32,11 @@
    core.groupby.DatasetGroupBy.first
    core.groupby.DatasetGroupBy.last
    core.groupby.DatasetGroupBy.fillna
+   core.groupby.DatasetGroupBy.quantile
    core.groupby.DatasetGroupBy.where
 
    Dataset.argsort
+   Dataset.astype
    Dataset.clip
    Dataset.conj
    Dataset.conjugate
@@ -71,6 +75,7 @@
    core.groupby.DataArrayGroupBy.first
    core.groupby.DataArrayGroupBy.last
    core.groupby.DataArrayGroupBy.fillna
+   core.groupby.DataArrayGroupBy.quantile
    core.groupby.DataArrayGroupBy.where
 
    DataArray.argsort
```
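The new `quantile` entries above come from the quantile reimplementation with `apply_ufunc` (pydata#3559), which also enabled group-wise quantiles. A minimal sketch of the groupby usage, with toy data and a hypothetical `label` coordinate:

```python
import numpy as np
import xarray as xr

# Toy DataArray with a grouping coordinate (hypothetical data).
da = xr.DataArray(
    [1.0, 2.0, 3.0, 4.0],
    dims="x",
    coords={"label": ("x", ["a", "a", "b", "b"])},
)

# Per-group median: group "a" -> 1.5, group "b" -> 3.5
med = da.groupby("label").quantile(0.5)
print(med.values)
```

The result is indexed by the group labels, one value per group.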

doc/api.rst

Lines changed: 9 additions & 0 deletions
```diff
@@ -675,3 +675,12 @@ arguments for the ``from_store`` and ``dump_to_store`` Dataset methods:
    backends.FileManager
    backends.CachingFileManager
    backends.DummyFileManager
+
+Deprecated / Pending Deprecation
+================================
+
+   Dataset.drop
+   DataArray.drop
+   Dataset.apply
+   core.groupby.DataArrayGroupBy.apply
+   core.groupby.DatasetGroupBy.apply
```
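For context on the `drop` entries: `Dataset.drop` was split into more specific methods. A minimal sketch of the non-deprecated replacement (toy data; assumes an xarray version that already ships `drop_vars`):

```python
import xarray as xr

ds = xr.Dataset({"t": ("x", [1, 2, 3])})

# Dataset.drop is pending deprecation; drop_vars removes a data variable
# by name (drop_sel handles dropping index labels instead).
ds2 = ds.drop_vars("t")
print("t" in ds2.data_vars)
```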

doc/combining.rst

Lines changed: 3 additions & 3 deletions
```diff
@@ -255,11 +255,11 @@ Combining along multiple dimensions
 ``combine_nested``.
 
 For combining many objects along multiple dimensions xarray provides
-:py:func:`~xarray.combine_nested`` and :py:func:`~xarray.combine_by_coords`. These
+:py:func:`~xarray.combine_nested` and :py:func:`~xarray.combine_by_coords`. These
 functions use a combination of ``concat`` and ``merge`` across different
 variables to combine many objects into one.
 
-:py:func:`~xarray.combine_nested`` requires specifying the order in which the
+:py:func:`~xarray.combine_nested` requires specifying the order in which the
 objects should be combined, while :py:func:`~xarray.combine_by_coords` attempts to
 infer this ordering automatically from the coordinates in the data.
 
@@ -310,4 +310,4 @@ These functions can be used by :py:func:`~xarray.open_mfdataset` to open many
 files as one dataset. The particular function used is specified by setting the
 argument ``'combine'`` to ``'by_coords'`` or ``'nested'``. This is useful for
 situations where your data is split across many files in multiple locations,
-which have some known relationship between one another.
+which have some known relationship between one another.
```
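As a small illustration of the behaviour described in the corrected cross-references, `combine_by_coords` infers the concatenation order from coordinate values (toy datasets, a sketch):

```python
import xarray as xr

# Two datasets along "x" whose coordinate values imply their order.
ds0 = xr.Dataset({"t": ("x", [10, 20])}, coords={"x": [0, 1]})
ds1 = xr.Dataset({"t": ("x", [30, 40])}, coords={"x": [2, 3]})

# Order is inferred from the "x" coordinate, even with the inputs reversed;
# combine_nested would instead need the order given explicitly.
combined = xr.combine_by_coords([ds1, ds0])
print(combined.t.values)
```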

doc/computation.rst

Lines changed: 3 additions & 3 deletions
```diff
@@ -325,8 +325,8 @@ Broadcasting by dimension name
 ``DataArray`` objects are automatically align themselves ("broadcasting" in
 the numpy parlance) by dimension name instead of axis order. With xarray, you
 do not need to transpose arrays or insert dimensions of length 1 to get array
-operations to work, as commonly done in numpy with :py:func:`np.reshape` or
-:py:const:`np.newaxis`.
+operations to work, as commonly done in numpy with :py:func:`numpy.reshape` or
+:py:data:`numpy.newaxis`.
 
 This is best illustrated by a few examples. Consider two one-dimensional
 arrays with different sizes aligned along different dimensions:
@@ -566,7 +566,7 @@ to set ``axis=-1``. As an example, here is how we would wrap
 
 Because ``apply_ufunc`` follows a standard convention for ufuncs, it plays
 nicely with tools for building vectorized functions, like
-:func:`numpy.broadcast_arrays` and :func:`numpy.vectorize`. For high performance
+:py:func:`numpy.broadcast_arrays` and :py:class:`numpy.vectorize`. For high performance
 needs, consider using Numba's :doc:`vectorize and guvectorize <numba:user/vectorize>`.
 
 In addition to wrapping functions, ``apply_ufunc`` can automatically parallelize
```
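The broadcasting-by-name behaviour the first hunk documents can be sketched with two toy arrays:

```python
import numpy as np
import xarray as xr

a = xr.DataArray(np.arange(3), dims="x")
b = xr.DataArray(np.arange(4), dims="y")

# No reshape or numpy.newaxis needed: alignment is by dimension name,
# and the product gains both dimensions.
c = a * b
print(c.dims, c.shape)
```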

doc/conf.py

Lines changed: 20 additions & 8 deletions
```diff
@@ -15,10 +15,16 @@
 
 import datetime
 import os
+import pathlib
 import subprocess
 import sys
 from contextlib import suppress
 
+# make sure the source version is preferred (#3567)
+root = pathlib.Path(__file__).absolute().parent.parent
+os.environ["PYTHONPATH"] = str(root)
+sys.path.insert(0, str(root))
+
 import xarray
 
 allowed_failures = set()
@@ -76,20 +82,24 @@
     "numpydoc",
     "IPython.sphinxext.ipython_directive",
     "IPython.sphinxext.ipython_console_highlighting",
-    "sphinx_gallery.gen_gallery",
+    "nbsphinx",
 ]
 
 extlinks = {
     "issue": ("https://github.com/pydata/xarray/issues/%s", "GH"),
     "pull": ("https://github.com/pydata/xarray/pull/%s", "PR"),
 }
 
-sphinx_gallery_conf = {
-    "examples_dirs": "gallery",
-    "gallery_dirs": "auto_gallery",
-    "backreferences_dir": False,
-    "expected_failing_examples": list(allowed_failures),
-}
+nbsphinx_timeout = 600
+nbsphinx_execute = "always"
+nbsphinx_prolog = """
+{% set docname = env.doc2path(env.docname, base=None) %}
+
+You can run this notebook in a `live session <https://mybinder.org/v2/gh/pydata/xarray/doc/examples/master?urlpath=lab/tree/doc/{{ docname }}>`_ |Binder| or view it `on Github <https://github.com/pydata/xarray/blob/master/doc/{{ docname }}>`_.
+
+.. |Binder| image:: https://mybinder.org/badge.svg
+   :target: https://mybinder.org/v2/gh/pydata/xarray/master?urlpath=lab/tree/doc/{{ docname }}
+"""
 
 autosummary_generate = True
 autodoc_typehints = "none"
@@ -137,7 +147,7 @@
 
 # List of patterns, relative to source directory, that match files and
 # directories to ignore when looking for source files.
-exclude_patterns = ["_build"]
+exclude_patterns = ["_build", "**.ipynb_checkpoints"]
 
 # The reST default role (used for this markup: `text`) to use for all
 # documents.
@@ -346,4 +356,6 @@
     "scipy": ("https://docs.scipy.org/doc/scipy/reference", None),
     "numba": ("https://numba.pydata.org/numba-doc/latest", None),
     "matplotlib": ("https://matplotlib.org", None),
+    "dask": ("https://docs.dask.org/en/latest", None),
+    "cftime": ("https://unidata.github.io/cftime", None),
 }
```
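The `sys.path` tweak added at the top of `conf.py` is a common "prefer the source tree" pattern. A standalone sketch (the `doc/conf.py` path below is hypothetical; the real file computes the root relative to its own location via `__file__`):

```python
import os
import pathlib
import sys

# Prefer the in-repo package over any installed copy by putting the
# repository root first on sys.path (and on PYTHONPATH so subprocesses
# spawned by the docs build see it too).
conf_file = pathlib.Path("doc/conf.py")  # hypothetical location
root = conf_file.absolute().parent.parent
os.environ["PYTHONPATH"] = str(root)
sys.path.insert(0, str(root))
print(sys.path[0] == str(root))
```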

doc/contributing.rst

Lines changed: 3 additions & 1 deletion
```diff
@@ -151,7 +151,9 @@ We'll now kick off a two-step process:
 .. code-block:: none
 
    # Create and activate the build environment
-   conda env create -f ci/requirements/py36.yml
+   # This is for Linux and MacOS. On Windows, use py37-windows.yml instead.
+   conda env create -f ci/requirements/py37.yml
+
    conda activate xarray-tests
 
    # or with older versions of Anaconda:
```

doc/dask.rst

Lines changed: 1 addition & 1 deletion
```diff
@@ -285,7 +285,7 @@ automate `embarrassingly parallel
 <https://en.wikipedia.org/wiki/Embarrassingly_parallel>`__ "map" type operations
 where a function written for processing NumPy arrays should be repeatedly
 applied to xarray objects containing Dask arrays. It works similarly to
-:py:func:`dask.array.map_blocks` and :py:func:`dask.array.atop`, but without
+:py:func:`dask.array.map_blocks` and :py:func:`dask.array.blockwise`, but without
 requiring an intermediate layer of abstraction.
 
 For the best performance when using Dask's multi-threaded scheduler, wrap a
```
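A minimal sketch of the `apply_ufunc` usage that paragraph describes, applying a plain NumPy function to an xarray object (toy data; the same call works when the underlying array is a Dask array):

```python
import numpy as np
import xarray as xr

def double(arr):
    # A function written for plain NumPy arrays.
    return arr * 2

da = xr.DataArray(np.arange(6).reshape(2, 3), dims=("x", "y"))

# apply_ufunc maps the NumPy function over the xarray object,
# preserving dimensions and coordinates.
out = xr.apply_ufunc(double, da)
print(out.values)
```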
