[docs] add Dataset.assign_coords example (#6336) #6558
Conversation
Thanks @gregbehm! One small comment.
(Also a self-serving shout-out for https://github.com/max-sixty/pytest-accept if you want to replace the values without copying & pasting them...)
xarray/core/common.py (Outdated)

    >>> np.random.seed(0)
    >>> ds = xr.Dataset(
    ...     data_vars=dict(
    ...         temperature=(["x", "y", "time"], 15 + 8 * np.random.randn(2, 2, 3)),
We've generally tried not to use np.random, both because of the seed issue and because it can be difficult for people to track the numbers through the example. What do you think about replacing it with np.arange(12).reshape(2, 2, 3)? Or something similar.
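A minimal sketch of the difference being discussed, using only numpy (the variable names are illustrative, not from the PR):

```python
import numpy as np

# Deterministic data, as suggested: every value is predictable, so a
# reader can follow each number through the rest of the example.
data = np.arange(12).reshape(2, 2, 3)
print(data[0, 1])  # [3 4 5]

# The original docstring's random data, by contrast, needs a seed to be
# reproducible, and the values are still hard to track by eye.
np.random.seed(0)
noisy = 15 + 8 * np.random.randn(2, 2, 3)
```

With arange, a reader can verify any printed value mentally, which is the point of the suggestion.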
Thanks for the suggestions @max-sixty. I borrowed my example from the top-level xarray.Dataset examples, but your idea makes good sense. I'm quite new to xarray and wanted to keep my example similar to others already in the docs. Happy to submit another change if you think that will improve things.
Also, thanks for pointing out https://github.com/max-sixty/pytest-accept. I'll take a look!
Yes, we've been gradually going through the examples and changing them, but doc PRs are rarer than code PRs (and so very much appreciated!), and we still have quite a few examples left that use random.
If you're up for making the change, that would be great. If you don't think you'll get to it, this is still a good improvement. Thanks @gregbehm!
@max-sixty, I think working through the docs and adding examples will be a great way for me to learn xarray, so I'll put the change on my to-do list and try to get to it soon.
One more noob question: how do we connect this PR with the issue that prompted it, #6336, so that it can get closed?
That would be great!
It used to be that saying "closes #6336" worked, but I've seen that not work at times. Often someone will just manually close it after merging...
> It used to be that saying "closes #6336" worked

This does work, but only if you use exactly that spelling in the markdown. Unfortunately, GitHub displays URLs to issues / PRs the same way, hence the confusion.
If it did work, "closes" becomes some kind of help link (not sure what the official name is), and on the right you see a linked issue (section "Development"). In general, I think you just have to make sure the issue is linked in that section for it to work.
* remove np.random uses * fix one copy-paste artifact 'description: Temperature data'
* fix time period typo
Thanks @gregbehm. I see this is your first contribution. Welcome to xarray!
Thanks @gregbehm!
Thanks for the help @max-sixty and @dcherian. I'm hoping to contribute more, starting with the docs.
* main: (24 commits)
  - Fix overflow issue in decode_cf_datetime for dtypes <= np.uint32 (pydata#6598)
  - Enable flox in GroupBy and resample (pydata#5734)
  - Add setuptools as dependency in ASV benchmark CI (pydata#6609)
  - change polyval dim ordering (pydata#6601)
  - re-add timedelta support for polyval (pydata#6599)
  - Minor Dataset.map docstr clarification (pydata#6595)
  - New inline_array kwarg for open_dataset (pydata#6566)
  - Fix polyval overloads (pydata#6593)
  - Restore old MultiIndex dropping behaviour (pydata#6592)
  - [docs] add Dataset.assign_coords example (pydata#6336) (pydata#6558)
  - Fix zarr append dtype checks (pydata#6476)
  - Add missing space in exception message (pydata#6590)
  - Doc Link to accessors list in extending-xarray.rst (pydata#6587)
  - Fix Dataset/DataArray.isel with drop=True and scalar DataArray indexes (pydata#6579)
  - Add some warnings about rechunking to the docs (pydata#6569)
  - [pre-commit.ci] pre-commit autoupdate (pydata#6584)
  - terminology.rst: fix link to Unidata's "netcdf_dataset_components" (pydata#6583)
  - Allow string formatting of scalar DataArrays (pydata#5981)
  - Fix mypy issues & reenable in tests (pydata#6581)
  - polyval: Use Horner's algorithm + support chunked inputs (pydata#6548)
  - ...

commit 398f1b6
Author: dcherian <deepak@cherian.net>
Date: Fri May 20 08:47:56 2022 -0600

    Backward compatibility dask

commit bde40e4 (Merge: 0783df3 4cae8d0)
Author: dcherian <deepak@cherian.net>
Date: Fri May 20 07:54:48 2022 -0600

    Merge branch 'main' into dask-datetime-to-numeric

    * main:
      - concatenate docs style (pydata#6621)
      - Typing for open_dataset/array/mfdataset and to_netcdf/zarr (pydata#6612)
      - {full,zeros,ones}_like typing (pydata#6611)

commit 0783df3 (Merge: 5cff4f1 8de7061)
Author: dcherian <deepak@cherian.net>
Date: Sun May 15 21:03:50 2022 -0600

    Merge branch 'main' into dask-datetime-to-numeric

    * main: (24 commits, same list as above)

commit 5cff4f1 (Merge: dfe200d 6144c61)
Author: Maximilian Roos <5635139+max-sixty@users.noreply.github.com>
Date: Sun May 1 15:16:33 2022 -0700

    Merge branch 'main' into dask-datetime-to-numeric

commit dfe200d
Author: dcherian <deepak@cherian.net>
Date: Sun May 1 11:04:03 2022 -0600

    Minor cleanup

commit 35ed378
Author: dcherian <deepak@cherian.net>
Date: Sun May 1 10:57:36 2022 -0600

    Support dask arrays in datetime_to_numeric
Closes #6336
Adds a Dataset.assign_coords example to https://docs.xarray.dev/en/latest/generated/xarray.Dataset.assign_coords.html
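For reference, a minimal sketch of the kind of docstring example under discussion, using deterministic np.arange data as suggested in the review; the longitude-wrapping step mirrors a pattern used elsewhere in the xarray docs, and the variable names here are illustrative rather than taken from the PR:

```python
import numpy as np
import xarray as xr

# A small Dataset with a plain integer "lon" dimension coordinate.
ds = xr.Dataset(
    data_vars={"temperature": (["lon"], np.arange(4))},
    coords={"lon": [270, 280, 290, 300]},
)

# assign_coords returns a *new* Dataset with updated coordinates;
# here we convert longitudes from the 0..360 to the -180..180 convention.
ds2 = ds.assign_coords(lon=(((ds.lon + 180) % 360) - 180))
print(ds2.lon.values)  # [-90 -80 -70 -60]
```

Because assign_coords does not mutate in place, the original `ds` keeps its 0..360 longitudes after the call.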