Add icephys meta pt 2 (#1349)
Use extlinks extension in docs to simplify linking to common targets

Move advanced data I/O tutorials to their own section and move parallel I/O tutorial to its own file

Update Changelog for tutorials

Updated deprecation warning in SweepTable tutorial

Fix minor spelling error in make_test_files

Several corrections and enhancements for the icephys tutorial

Moved functions to create icephys test file from tests to pynwb

Start for new tutorial to show conversion of icephys tables to pandas dataframes

Test to_hierarchical_dataframe and to_denormalized_dataframe functions

Updated tutorial to convert icephys tables to pandas

Fix minor spelling error in NWBFile docstring

Add TimeSeriesReferenceVectorData.get method to mask missing values on load

Update IntracellularRecordingsTable: default the add_recording start index and count parameters to None, and update the to_dataframe method to handle missing values in TimeSeriesReferenceVectorData
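
These changes make the stimulus (or response) optional on a per-recording basis. A rough sketch of the resulting behavior, assuming the updated API and continuing from an `nwbfile` with `electrode` and `response` objects already created (as in the icephys tutorial):

```python
# Sketch only: assumes the updated add_intracellular_recording behavior in
# which a row may omit the stimulus; `nwbfile`, `electrode`, and `response`
# are placeholders created elsewhere.
row_index = nwbfile.add_intracellular_recording(
    electrode=electrode,
    response=response,  # no stimulus passed; stored as a missing value
)
# to_dataframe() is expected to render the missing stimulus reference as an
# empty/masked value instead of raising.
df = nwbfile.intracellular_recordings.to_dataframe()
```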

Update create_icephys_testfile to create test data with missing stimuli

Update existing tests to match the new behavior

Update icephys tutorial to match new behavior

Update icephys pandas tutorial to demonstrate new behavior

Remove debug print statements

Add table augmentation and queries to the icephys_pandas tutorial
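
A rough sketch of the kind of table augmentation covered, assuming the standard `DynamicTable.add_column` API; the column name, values, and pandas column keys below are illustrative:

```python
# Hypothetical example: augment the intracellular recordings table with a
# custom metadata column (one value per existing recording row).
nwbfile.intracellular_recordings.add_column(
    name='baseline_voltage',
    data=[0.045],  # illustrative; one value per row
    description='Estimated baseline voltage of the recording, in volts',
)
# Query via pandas; the exact MultiIndex column key is an assumption and
# may differ, but this shows the general pattern.
df = nwbfile.intracellular_recordings.to_dataframe()
matches = df[df[('intracellular_recordings', 'baseline_voltage')] > 0.0]
```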

Fix flake8

Update to use keyword args

Clarify text in icephys tutorial

Fix flake8 on tutorial

Fix flake8

Update to use dev branch of nwb-schema

Set default name and default description for TimeSeriesReferenceVectorData

Added icephys_testutils to pynwb.testing.__init__

Added test for IntracellularRecordingsTable.to_dataframe with options

Always use MaskedArray in TimeSeriesReferenceVectorData instead of MaskedConstant

Add tests for TimeSeriesReferenceVectorData.get

Updated change log

Fix minor docstring issue in TimeSeriesReferenceVectorData.get

Clarify docstring for TimeSeriesReferenceVectorData.get

Fix build warnings in docs (#1380)

* Remove unused and outdated convert.rst from the docs. Fix #1378
* Fix extra numbered footnote reference in ophys tutorial
* Change code block highlighting from c to bash to avoid build warning
* Fix duplicate target "here" warning in 3_spec_api.rst
* Fix missing section label for crossreferencing between the extension tutorial and gallery
* Updated Changelog

Update CHANGELOG.md

Update icephys.py

Update docs/gallery/domain/plot_icephys.py

Update plot_icephys.py

Minor text fixes

Update plot_icephys_pandas.py

Minor text edits

Update icephys_testutils.py

Minor docstring edits

Update file.py

Fix typos

Update docs/gallery/domain/plot_icephys.py: replace master with main

Mention use of create_icephys_testfile function

Co-authored-by: Ryan Ly <rly@lbl.gov>

Remove old comment in TimeSeriesReferenceVectorData

Co-authored-by: Ryan Ly <rly@lbl.gov>

Fix typo in changelog

Fix broken link target

Fix flake8 in docs/gallery

Minor text fixes

Minor text and formatting fixes

Update tests/unit/test_icephys_metadata_tables.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Use namedtuple instead of numpy masked structured array to represent values of TimeSeriesReferenceVectorData
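
For reference, a sketch of the namedtuple-based value type these commits converge on (assuming fields `idx_start`, `count`, and `timeseries`; `stimulus` is a TimeSeries created elsewhere):

```python
# Sketch of the namedtuple-based value type (assumed final API).
from pynwb.base import TimeSeriesReference

ref = TimeSeriesReference(idx_start=0, count=5, timeseries=stimulus)
if ref.isvalid():          # missing stimuli/responses report isvalid() == False
    print(ref.data)        # the referenced slice of the underlying data
    print(ref.timestamps)  # the matching timestamps
```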

Update tests/unit/test_icephys_metadata_tables.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Update tests/unit/test_icephys_metadata_tables.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Update tests/unit/test_icephys_metadata_tables.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Update tests/unit/test_icephys_metadata_tables.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Update tests/unit/test_icephys_metadata_tables.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Update tests/unit/test_icephys_metadata_tables.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Update tests/unit/test_icephys_metadata_tables.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Update tests/unit/test_icephys_metadata_tables.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Fix bad indent in test

Update base.py - minor text edits

Use hdmf 3.1.1

Enhance introspection, slicing, and data addition for TimeSeriesReference and TimeSeriesReferenceVectorData

Updated icephys query tutorial to use latest get_linked_tables behavior and improve rendering

Fix flake8 in gallery

Fix spelling in error message

Update src/pynwb/base.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Update src/pynwb/base.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Update src/pynwb/base.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Update src/pynwb/base.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Update src/pynwb/base.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Update src/pynwb/base.py

Co-authored-by: Ryan Ly <rly@lbl.gov>

Fix bug in TimeSeriesReference.timestamps and add comments

Fix gallery tests to handle allensdk pinning pynwb/hdmf

Fix rebase
oruebel authored and rly committed Aug 10, 2021
1 parent 5bc29ec commit edf8565
Showing 40 changed files with 2,010 additions and 894 deletions.
47 changes: 31 additions & 16 deletions CHANGELOG.md
@@ -2,31 +2,46 @@

## PyNWB 2.0.0 (Upcoming)

### New features
-
-
-
- Drop Python 3.6 support, add Python 3.9 support. @rly (#1377)
- Update requirements to allow compatibility with HDMF 3 and h5py 3. @rly (#1377)
### Breaking changes:
- ``SweepTable`` has been deprecated in favor of the new icephys metadata tables. Use of ``SweepTable``
is still possible but no longer recommended. @oruebel (#1349)

### New features:
- Added new intracellular electrophysiology hierarchical table structure from ndx-icephys-meta to NWB core.
This includes the new types ``TimeSeriesReferenceVectorData``, ``IntracellularRecordingsTable``,
``SimultaneousRecordingsTable``, ``SequentialRecordingsTable``, ``RepetitionsTable`` and
``ExperimentalConditionsTable`` as well as corresponding updates to ``NWBFile`` to support interaction
with the new tables. @oruebel (#1349)
- Added support for nwb-schema 2.4.0. See [Release Notes](https://nwb-schema.readthedocs.io/en/latest/format_release_notes.html)
for more details. @oruebel (#1349)
- Dropped Python 3.6 support, added Python 3.9 support. @rly (#1377)
- Updated requirements to allow compatibility with HDMF 3 and h5py 3. @rly (#1377)
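
A minimal sketch of the workflow the new tables and ``NWBFile`` methods enable (illustrative only; the session metadata and data values are placeholders):

```python
# Illustrative sketch of the new icephys tables via NWBFile (PyNWB 2.0 API;
# metadata and data values are placeholders).
from datetime import datetime
from dateutil.tz import tzlocal
from pynwb import NWBFile
from pynwb.icephys import VoltageClampSeries, VoltageClampStimulusSeries

nwbfile = NWBFile(session_description='icephys sketch',
                  identifier='icephys-sketch-001',
                  session_start_time=datetime.now(tzlocal()))
device = nwbfile.create_device(name='amplifier')
electrode = nwbfile.create_icephys_electrode(
    name='elec0', description='example electrode', device=device)
stimulus = VoltageClampStimulusSeries(
    name='stim0', data=[1., 2., 3., 4., 5.],
    starting_time=0., rate=10e3, electrode=electrode, gain=0.02)
response = VoltageClampSeries(
    name='resp0', data=[0.1, 0.2, 0.3, 0.4, 0.5], conversion=1e-12,
    starting_time=0., rate=10e3, electrode=electrode, gain=0.02)

# One row in the IntracellularRecordingsTable per recording ...
rec_index = nwbfile.add_intracellular_recording(
    electrode=electrode, stimulus=stimulus, response=response)
# ... grouped by the higher-level metadata tables:
sim_index = nwbfile.add_icephys_simultaneous_recording(recordings=[rec_index])
seq_index = nwbfile.add_icephys_sequential_recording(
    simultaneous_recordings=[sim_index], stimulus_type='square')
rep_index = nwbfile.add_icephys_repetition(sequential_recordings=[seq_index])
nwbfile.add_icephys_experimental_condition(repetitions=[rep_index])
```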

### Tutorial enhancements:
- Added new tutorial for intracellular electrophysiology to describe the use of the new metadata tables
and declared the previous tutorial using ``SweepTable`` as deprecated. @oruebel (#1349)
- Added new tutorial for querying intracellular electrophysiology metadata
(``docs/gallery/domain/plot_icephys_pandas.py``). @oruebel (#1349, #1383)
- Added thumbnails for tutorials to improve presentation of online docs. @oruebel (#1349)
- Used `sphinx.ext.extlinks` extension in docs to simplify linking to common targets. @oruebel (#1349)
- Created new section for advanced I/O tutorials and moved parallel I/O tutorial to its own file. @oruebel (#1349)
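
The querying tutorial centers on converting the table hierarchy to pandas. A rough sketch, assuming HDMF's hierarchical table helpers and an ``nwbfile`` with the icephys tables populated:

```python
# Sketch: convert the icephys metadata tables to pandas for querying.
# Assumes HDMF >= 3.1 and an `nwbfile` with the icephys tables populated.
from hdmf.common.hierarchicaltable import (drop_id_columns,
                                           flatten_column_index,
                                           to_hierarchical_dataframe)

root_table = nwbfile.get_icephys_meta_parent_table()   # deepest table in use
df = to_hierarchical_dataframe(root_table)             # denormalized hierarchy
df = flatten_column_index(dataframe=df, max_levels=2)  # simplify column MultiIndex
df = drop_id_columns(dataframe=df)                     # drop redundant id columns
```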

### Minor new features:
- Add RRID for citing PyNWB to the docs. @oruebel (#1372)
- Update CI and tests to handle deprecations in libraries. @rly (#1377)
- Add test utilities for icephys (``pynwb.testing.icephys_testutils``) to ease creation of test data
for tests and tutorials. @oruebel (#1349, #1383)
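
For example, the test utilities can stand up a complete example file in one call. A sketch; the keyword arguments shown are assumptions, so check ``pynwb.testing.icephys_testutils`` for the exact signature:

```python
# Sketch of the icephys test utilities; keyword arguments are assumptions.
from pynwb.testing.icephys_testutils import create_icephys_testfile

nwbfile = create_icephys_testfile(      # assumed to return the in-memory NWBFile
    filename='icephys_test.nwb',        # hypothetical output path
    add_custom_columns=True,            # also populate example user columns
    randomize_data=True,                # fill recordings with random data
)
```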

### Bug fixes:
- Enforce electrode ID uniqueness during insertion into table. @CodyCBakerPhD (#1344)
- Fix integration tests with invalid test data that will be caught by a future hdmf validator version.
- Updated behavior of ``make clean`` command for docs to ensure tutorial files are cleaned up. @oruebel (#1349)
- Enforced electrode ID uniqueness during insertion into table. @CodyCBakerPhD (#1344)
- Fixed integration tests with invalid test data that will be caught by a future hdmf validator version.
@dsleiter, @rly (#1366, #1376)

### Bug fixes
-
-
-
-
- Fixed build warnings in docs. @oruebel (#1380)

## PyNWB 1.5.1 (May 24, 2021)

## Bug fix:
### Bug fixes:
- Raise minimum version of pandas from 0.23 to 1.0.5 to be compatible with numpy 1.20, and raise minimum version of
HDMF to use the corresponding change in HDMF. @rly (#1363)
- Update documentation and update structure of requirements files. @rly (#1363)
@@ -69,7 +84,7 @@
- Add capability to add a row to a column after IO.
- Add method `AbstractContainer.get_fields_conf`.
- Add functionality for storing external resource references.
- Add method `hdmf.utils.get_docval_macro` to get a tuple of the current values for a docval_macro, e.g., 'array_data'
and 'scalar_data'.
- `DynamicTable` can be automatically generated using `get_class`. Now the HDMF API can read files with extensions
that contain a DynamicTable without needing to import the extension first.
7 changes: 7 additions & 0 deletions docs/gallery/advanced_io/README.txt
@@ -0,0 +1,7 @@


.. _general-tutorials:


Advanced I/O
------------
@@ -1,6 +1,6 @@
'''
Advanced HDF5 I/O
=====================
Defining HDF5 Dataset I/O Settings (chunking, compression, etc.)
================================================================
The HDF5 storage backend supports a broad range of advanced dataset I/O options, such as
chunking and compression. Here we demonstrate how to use these features
@@ -18,7 +18,7 @@
# Before we get started, let's create an NWBFile for testing so that we can add our data to it.
#

# sphinx_gallery_thumbnail_path = 'figures/gallery_thumbnails_advnaced_hdf5_io.png'
# sphinx_gallery_thumbnail_path = 'figures/gallery_thumbnails_h5dataio.png'
from datetime import datetime
from dateutil.tz import tzlocal
from pynwb import NWBFile
@@ -217,72 +217,6 @@
# will be ignored as the h5py.Dataset will either be linked to or copied on write.
#

####################
# Parallel I/O using MPI
# ----------------------
#
# The HDF5 storage backend supports parallel I/O using the Message Passing Interface (MPI).
# Using this feature requires that you install ``hdf5`` and ``h5py`` against an MPI driver and that
# you install ``mpi4py``. The basic installation of pynwb will not work. Setup can be tricky and
# is outside the scope of this tutorial (for now); the following assumes that you have
# HDF5 installed in an MPI configuration. Here we:
#
# 1. **Instantiate a dataset for parallel write**: We create TimeSeries with 4 timestamps that we
# will write in parallel
#
# 2. **Write to that file in parallel using MPI**: Here we assume 4 MPI ranks while each rank writes
# the data for a different timestamp.
#
# 3. **Read from the file in parallel using MPI**: Here each of the 4 MPI ranks reads one time
# step from the file
#
# .. code-block:: python
#
# from mpi4py import MPI
# import numpy as np
# from dateutil import tz
# from pynwb import NWBHDF5IO, NWBFile, TimeSeries
# from datetime import datetime
# from hdmf.data_utils import DataChunkIterator
#
# start_time = datetime(2018, 4, 25, 2, 30, 3, tzinfo=tz.gettz('US/Pacific'))
# fname = 'test_parallel_pynwb.nwb'
# rank = MPI.COMM_WORLD.rank # The process ID (integer 0-3 for 4-process run)
#
# # Create file on one rank. Here we only instantiate the dataset we want to
# # write in parallel but we do not write any data
# if rank == 0:
# nwbfile = NWBFile('aa', 'aa', start_time)
# data = DataChunkIterator(data=None, maxshape=(4,), dtype=np.dtype('int'))
#
# nwbfile.add_acquisition(TimeSeries('ts_name', description='desc', data=data,
# rate=100., unit='m'))
# with NWBHDF5IO(fname, 'w') as io:
# io.write(nwbfile)
#
# # write to dataset in parallel
# with NWBHDF5IO(fname, 'a', comm=MPI.COMM_WORLD) as io:
# nwbfile = io.read()
# print(rank)
# nwbfile.acquisition['ts_name'].data[rank] = rank
#
# # read from dataset in parallel
# with NWBHDF5IO(fname, 'r', comm=MPI.COMM_WORLD) as io:
# print(io.read().acquisition['ts_name'].data[rank])

####################
# To specify details about chunking, compression and other HDF5-specific I/O options,
# we can wrap data via ``H5DataIO``, e.g.,
#
# .. code-block:: python
#
# data = H5DataIO(DataChunkIterator(data=None, maxshape=(100000, 100),
# dtype=np.dtype('float')),
# chunks=(10, 10), maxshape=(None, None))
#
# would initialize your dataset with a shape of (100000, 100) and maxshape of (None, None)
# and your own custom chunking of (10, 10).

####################
# Disclaimer
# ----------------
File renamed without changes.
File renamed without changes.
81 changes: 81 additions & 0 deletions docs/gallery/advanced_io/parallelio.py
@@ -0,0 +1,81 @@
'''
Parallel I/O using MPI
======================
The HDF5 storage backend supports parallel I/O using the Message Passing Interface (MPI).
Using this feature requires that you install ``hdf5`` and ``h5py`` against an MPI driver and that
you install ``mpi4py``. The basic installation of pynwb will not work. Setup can be tricky and
is outside the scope of this tutorial (for now); the following assumes that you have
HDF5 installed in an MPI configuration.
'''

# sphinx_gallery_thumbnail_path = 'figures/gallery_thumbnails_parallelio.png'

####################
# Here we:
#
# 1. **Instantiate a dataset for parallel write**: We create TimeSeries with 4 timestamps that we
# will write in parallel
#
# 2. **Write to that file in parallel using MPI**: Here we assume 4 MPI ranks while each rank writes
# the data for a different timestamp.
#
# 3. **Read from the file in parallel using MPI**: Here each of the 4 MPI ranks reads one time
# step from the file
#
# .. code-block:: python
#
# from mpi4py import MPI
# import numpy as np
# from dateutil import tz
# from pynwb import NWBHDF5IO, NWBFile, TimeSeries
# from datetime import datetime
# from hdmf.data_utils import DataChunkIterator
#
# start_time = datetime(2018, 4, 25, 2, 30, 3, tzinfo=tz.gettz('US/Pacific'))
# fname = 'test_parallel_pynwb.nwb'
# rank = MPI.COMM_WORLD.rank # The process ID (integer 0-3 for 4-process run)
#
# # Create file on one rank. Here we only instantiate the dataset we want to
# # write in parallel but we do not write any data
# if rank == 0:
# nwbfile = NWBFile('aa', 'aa', start_time)
# data = DataChunkIterator(data=None, maxshape=(4,), dtype=np.dtype('int'))
#
# nwbfile.add_acquisition(TimeSeries('ts_name', description='desc', data=data,
# rate=100., unit='m'))
# with NWBHDF5IO(fname, 'w') as io:
# io.write(nwbfile)
#
# # write to dataset in parallel
# with NWBHDF5IO(fname, 'a', comm=MPI.COMM_WORLD) as io:
# nwbfile = io.read()
# print(rank)
# nwbfile.acquisition['ts_name'].data[rank] = rank
#
# # read from dataset in parallel
# with NWBHDF5IO(fname, 'r', comm=MPI.COMM_WORLD) as io:
# print(io.read().acquisition['ts_name'].data[rank])

####################
# To specify details about chunking, compression and other HDF5-specific I/O options,
# we can wrap data via ``H5DataIO``, e.g.,
#
# .. code-block:: python
#
# data = H5DataIO(DataChunkIterator(data=None, maxshape=(100000, 100),
# dtype=np.dtype('float')),
# chunks=(10, 10), maxshape=(None, None))
#
# would initialize your dataset with a shape of (100000, 100) and maxshape of (None, None)
# and your own custom chunking of (10, 10).

####################
# Disclaimer
# ----------------
#
# External links included in the tutorial are being provided as a convenience and for informational purposes only;
# they do not constitute an endorsement or an approval by the authors of any of the products, services or opinions of
# the corporation or organization or individual. The authors bear no responsibility for the accuracy, legality or
# content of the external site or for that of subsequent links. Contact the external site for answers to questions
# regarding its content.
2 changes: 2 additions & 0 deletions docs/gallery/domain/brain_observatory.py
@@ -10,6 +10,8 @@
# physiology submodule (pynwb.ophys). We will use the allensdk as a read API, while leveraging the pynwb data model and
# write API to transform and write the data back to disk.
#
# .. note:: Using the latest allensdk package requires Python 3.6 or higher.

########################################
# .. raw:: html
# :url: https://gist.githubusercontent.com/nicain/82e6b3d8f9ff5b85ef01a582e41e2389/raw/
7 changes: 3 additions & 4 deletions docs/gallery/domain/icephys.py
@@ -8,8 +8,8 @@
The following tutorial describes storage of intracellular electrophysiology data in NWB using the
SweepTable to manage recordings.
.. note::
The use of SweepTable has been deprecated as of PyNWB >v1.4 in favor of the new hierarchical
.. warning::
The use of SweepTable has been deprecated as of PyNWB >v2.0 in favor of the new hierarchical
intracellular electrophysiology metadata tables to allow for a more complete description of
intracellular electrophysiology experiments. See the :doc:`Intracellular electrophysiology <plot_icephys>`
tutorial for details.
@@ -190,8 +190,7 @@
# PatchClampSeries which belongs to a certain sweep number via
# :py:meth:`~pynwb.icephys.SweepTable.get_series`.
#
# The following call will return the voltage clamp data, of two timeseries
# The following call will return the voltage clamp data of two timeseries
# consisting of acquisition and stimulus, from sweep 1.

series = nwbfile.sweep_table.get_series(1)
print(series)
2 changes: 1 addition & 1 deletion docs/gallery/domain/ophys.py
@@ -120,7 +120,7 @@
# Storing fluorescence measurements
# ---------------------------------
#
# Now that ROIs are stored, you can store fluorescence (or dF/F [#]_) data for these regions of interest.
# Now that ROIs are stored, you can store fluorescence (or dF/F) data for these regions of interest.
# This type of data is stored using the :py:class:`~pynwb.ophys.RoiResponseSeries` class. You will not need
# to instantiate this class directly to create objects of this type, but it is worth noting that this is the
# class you will work with after you read data back in.
