removed the use of encoded single apostrophe #3261

Merged · 8 commits · Jul 20, 2023
Changes from 3 commits
2 changes: 1 addition & 1 deletion doxygen/aliases
Original file line number Diff line number Diff line change
@@ -306,7 +306,7 @@ ALIASES += ref_rfc20120523="<a href=\"https://docs.hdfgroup.org/hdf5/rfc/paged_a
ALIASES += ref_rfc20120501="<a href=\"https://docs.hdfgroup.org/hdf5/rfc/HDF5FileImageOperations.pdf\">HDF5 File Image Operations</a>"
ALIASES += ref_rfc20120305="<a href=\"https://docs.hdfgroup.org/hdf5/rfc/RFC%20PHDF5%20Consistency%20Semantics%20MC%20120328.docx.pdf\">Enabling a Strict Consistency Semantics Model in Parallel HDF5</a>"
ALIASES += ref_rfc20120220="<a href=\"https://docs.hdfgroup.org/hdf5/rfc/h5repack_improve_hyperslab_over_chunked_dataset_v1.pdf\"><tt>h5repack</tt>: Improved Hyperslab selections for Large Chunked Datasets</a>"
-ALIASES += ref_rfc20120120="<a href=\"https://docs.hdfgroup.org/hdf5/rfc/2012-1-25-Maintainers-guide-for-datatype.docx.pdf\">A Maintainers Guide for the Datatype Module in HDF5 Library</a>"
+ALIASES += ref_rfc20120120="<a href=\"https://docs.hdfgroup.org/hdf5/rfc/2012-1-25-Maintainers-guide-for-datatype.docx.pdf\">A Maintainer's Guide for the Datatype Module in HDF5 Library</a>"
ALIASES += ref_rfc20120104="<a href=\"https://docs.hdfgroup.org/hdf5/rfc/RFC_actual_io_v4-1_done.docx.pdf\">Actual I/O Mode</a>"
ALIASES += ref_rfc20111119="<a href=\"https://docs.hdfgroup.org/hdf5/rfc/RFC-H5Ocompare-review_v6.pdf\">New public functions to handle comparison</a>"
ALIASES += ref_rfc20110825="<a href=\"https://docs.hdfgroup.org/hdf5/rfc/2011-08-31-RFC_H5Ocopy_Named_DT_v2.docx.pdf\">Merging Named Datatypes in H5Ocopy()</a>"
18 changes: 9 additions & 9 deletions hl/src/H5DOpublic.h
@@ -29,7 +29,7 @@ extern "C" {
* HDF5 functions described in this section are implemented in the HDF5 High-level
* library as optimized functions. These functions generally require careful setup
* and testing as they enable an application to bypass portions of the HDF5
-* librarys I/O pipeline for performance purposes.
+* library's I/O pipeline for performance purposes.
*
* These functions are distributed in the standard HDF5 distribution and are
* available any time the HDF5 High-level library is available.
@@ -113,7 +113,7 @@ H5_HLDLL herr_t H5DOappend(hid_t dset_id, hid_t dxpl_id, unsigned axis, size_t e
* \param[in] dxpl_id Transfer property list identifier for
* this I/O operation
* \param[in] filters Mask for identifying the filters in use
-* \param[in] offset Logical position of the chunks first element
+* \param[in] offset Logical position of the chunk's first element
* in the dataspace
* \param[in] data_size Size of the actual data to be written in bytes
* \param[in] buf Buffer containing data to be written to the chunk
@@ -131,20 +131,20 @@ H5_HLDLL herr_t H5DOappend(hid_t dset_id, hid_t dxpl_id, unsigned axis, size_t e
* logical \p offset in a chunked dataset \p dset_id from the application
* memory buffer \p buf to the dataset in the file. Typically, the data
* in \p buf is preprocessed in memory by a custom transformation, such as
-* compression. The chunk will bypass the librarys internal data
+* compression. The chunk will bypass the library's internal data
* transfer pipeline, including filters, and will be written directly to the file.
*
* \p dxpl_id is a data transfer property list identifier.
*
* \p filters is a mask providing a record of which filters are used
* with the chunk. The default value of the mask is zero (\c 0),
* indicating that all enabled filters are applied. A filter is skipped
-* if the bit corresponding to the filters position in the pipeline
+* if the bit corresponding to the filter's position in the pipeline
* (<tt>0 ≤ position < 32</tt>) is turned on. This mask is saved
* with the chunk in the file.
*
* \p offset is an array specifying the logical position of the first
-* element of the chunk in the datasets dataspace. The length of the
+* element of the chunk in the dataset's dataspace. The length of the
* offset array must equal the number of dimensions, or rank, of the
* dataspace. The values in \p offset must not exceed the dimension limits
* and must specify a point that falls on a dataset chunk boundary.
@@ -189,7 +189,7 @@ H5_HLDLL herr_t H5DOwrite_chunk(hid_t dset_id, hid_t dxpl_id, uint32_t filters,
* \param[in] dset_id Identifier for the dataset to be read
* \param[in] dxpl_id Transfer property list identifier for
* this I/O operation
-* \param[in] offset Logical position of the chunks first
+* \param[in] offset Logical position of the chunk's first
* element in the dataspace
* \param[in,out] filters Mask for identifying the filters used
* with the chunk
@@ -209,19 +209,19 @@ H5_HLDLL herr_t H5DOwrite_chunk(hid_t dset_id, hid_t dxpl_id, uint32_t filters,
* by its logical \p offset in a chunked dataset \p dset_id
* from the dataset in the file into the application memory
* buffer \p buf. The data in \p buf is read directly from the file
-* bypassing the librarys internal data transfer pipeline,
+* bypassing the library's internal data transfer pipeline,
* including filters.
*
* \p dxpl_id is a data transfer property list identifier.
*
* The mask \p filters indicates which filters are used with the
* chunk when written. A zero value indicates that all enabled filters
* are applied on the chunk. A filter is skipped if the bit corresponding
-* to the filters position in the pipeline
+* to the filter's position in the pipeline
* (<tt>0 ≤ position < 32</tt>) is turned on.
*
* \p offset is an array specifying the logical position of the first
-* element of the chunk in the datasets dataspace. The length of the
+* element of the chunk in the dataset's dataspace. The length of the
* offset array must equal the number of dimensions, or rank, of the
* dataspace. The values in \p offset must not exceed the dimension
* limits and must specify a point that falls on a dataset chunk boundary.
6 changes: 3 additions & 3 deletions hl/src/H5LDpublic.h
@@ -50,18 +50,18 @@ H5_HLDLL herr_t H5LDget_dset_dims(hid_t did, hsize_t *cur_dims);
*-------------------------------------------------------------------------
* \ingroup H5LT
*
-* \brief Returns the size in bytes of the datasets datatype
+* \brief Returns the size in bytes of the dataset's datatype
*
* \param[in] did The dataset identifier
* \param[in] fields The pointer to a comma-separated list of fields for a compound datatype
*
* \return If successful, returns the size in bytes of the
-* datasets datatype. Otherwise, returns 0.
+* dataset's datatype. Otherwise, returns 0.
*
* \details H5LDget_dset_type_size() allows the user to find out the datatype
* size for the dataset associated with \p did. If the
* parameter \p fields is NULL, this routine just returns the size
-* of the datasets datatype. If the dataset has a compound datatype
+* of the dataset's datatype. If the dataset has a compound datatype
* and \p fields is non-NULL, this routine returns the size of the
* datatype(s) for the selected fields specified in \p fields.
* Note that ’,’ is the separator for the fields of a compound
18 changes: 9 additions & 9 deletions hl/src/H5LTpublic.h
@@ -208,7 +208,7 @@ H5_HLDLL herr_t H5LTmake_dataset(hid_t loc_id, const char *dset_name, int rank,
* named \p dset_name attached to the object specified by
* the identifier \p loc_id.
*
-* The datasets datatype will be \e character, #H5T_NATIVE_CHAR.
+* The dataset's datatype will be \e character, #H5T_NATIVE_CHAR.
*
*/
H5_HLDLL herr_t H5LTmake_dataset_char(hid_t loc_id, const char *dset_name, int rank, const hsize_t *dims,
@@ -232,7 +232,7 @@ H5_HLDLL herr_t H5LTmake_dataset_char(hid_t loc_id, const char *dset_name, int r
* named \p dset_name attached to the object specified by
* the identifier \p loc_id.
*
-* The datasets datatype will be <em>short signed integer</em>,
+* The dataset's datatype will be <em>short signed integer</em>,
* #H5T_NATIVE_SHORT.
*
*/
@@ -257,7 +257,7 @@ H5_HLDLL herr_t H5LTmake_dataset_short(hid_t loc_id, const char *dset_name, int
* named \p dset_name attached to the object specified by
* the identifier \p loc_id.
*
-* The datasets datatype will be <em>native signed integer</em>,
+* The dataset's datatype will be <em>native signed integer</em>,
* #H5T_NATIVE_INT.
*
* \version Fortran subroutine modified in this release to accommodate
@@ -285,7 +285,7 @@ H5_HLDLL herr_t H5LTmake_dataset_int(hid_t loc_id, const char *dset_name, int ra
* named \p dset_name attached to the object specified by
* the identifier \p loc_id.
*
-* The datasets datatype will be <em>long signed integer</em>,
+* The dataset's datatype will be <em>long signed integer</em>,
* #H5T_NATIVE_LONG.
*
*/
@@ -310,7 +310,7 @@ H5_HLDLL herr_t H5LTmake_dataset_long(hid_t loc_id, const char *dset_name, int r
* named \p dset_name attached to the object specified by
* the identifier \p loc_id.
*
-* The datasets datatype will be <em>native floating point</em>,
+* The dataset's datatype will be <em>native floating point</em>,
* #H5T_NATIVE_FLOAT.
*
* \version 1.8.7 Fortran subroutine modified in this release to accommodate
@@ -338,7 +338,7 @@ H5_HLDLL herr_t H5LTmake_dataset_float(hid_t loc_id, const char *dset_name, int
* named \p dset_name attached to the object specified by
* the identifier \p loc_id.
*
-* The datasets datatype will be
+* The dataset's datatype will be
* <em>native floating-point double</em>, #H5T_NATIVE_DOUBLE.
*
* \version 1.8.7 Fortran subroutine modified in this release to accommodate
@@ -364,7 +364,7 @@ H5_HLDLL herr_t H5LTmake_dataset_double(hid_t loc_id, const char *dset_name, int
* named \p dset_name attached to the object specified by
* the identifier \p loc_id.
*
-* The datasets datatype will be <em>C string</em>, #H5T_C_S1.
+* The dataset's datatype will be <em>C string</em>, #H5T_C_S1.
*
*/
H5_HLDLL herr_t H5LTmake_dataset_string(hid_t loc_id, const char *dset_name, const char *buf);
@@ -1496,7 +1496,7 @@ H5_HLDLL herr_t H5LTfind_attribute(hid_t loc_id, const char *name);
* final component of \p path resolves to an HDF5 object;
* if not, the final component is a dangling link.
*
-* The meaning of the functions return value depends on the
+* The meaning of the function's return value depends on the
* value of \p check_object_valid:
*
* If \p check_object_valid is set to \c FALSE, H5LTpath_valid()
@@ -1516,7 +1516,7 @@ H5_HLDLL herr_t H5LTfind_attribute(hid_t loc_id, const char *name);
* \p path can be any one of the following:
*
* - An absolute path, which starts with a slash (\c /)
-* indicating the files root group, followed by the members
+* indicating the file's root group, followed by the members
* - A relative path with respect to \p loc_id
* - A dot (\c .), if \p loc_id is the object identifier for
* the object itself.
28 changes: 14 additions & 14 deletions src/H5Amodule.h
@@ -77,7 +77,7 @@
* data, and the attribute creation property list.
*
* The following steps are required to create and write an HDF5 attribute:
-* \li Obtain the object identifier for the attributes primary data object
+* \li Obtain the object identifier for the attribute's primary data object
* \li Define the characteristics of the attribute and specify the attribute creation property list
* <ul> <li> Define the datatype</li>
* <li> Define the dataspace</li>
@@ -88,12 +88,12 @@
* \li Close the primary data object (if appropriate)
*
* The following steps are required to open and read/write an existing attribute. Since HDF5 attributes
-* allow no partial I/O, you need specify only the attribute and the attributes memory datatype to read it:
-* \li Obtain the object identifier for the attributes primary data object
-* \li Obtain the attributes name or index
+* allow no partial I/O, you need specify only the attribute and the attribute's memory datatype to read it:
+* \li Obtain the object identifier for the attribute's primary data object
+* \li Obtain the attribute's name or index
* \li Open the attribute
* \li Get attribute dataspace and datatype (optional)
-* \li Specify the attributes memory type
+* \li Specify the attribute's memory type
* \li Read and/or write the attribute data
* \li Close the attribute
* \li Close the primary data object (if appropriate)
@@ -126,7 +126,7 @@
*
* HDF5 attributes are sometimes discussed as name/value pairs in the form name=value.
*
-* An attributes name is a null-terminated ASCII or UTF-8 character string. Each attribute attached to an
+* An attribute's name is a null-terminated ASCII or UTF-8 character string. Each attribute attached to an
* object has a unique name.
*
* The value portion of the attribute contains one or more data elements of the same datatype.
@@ -148,8 +148,8 @@
* hid_t access_plist)
* \endcode
* loc_id identifies the object (dataset, group, or committed datatype) to which the attribute is to be
-* attached. name, type_id, space_id, and create_plist convey, respectively, the attributes name, datatype,
-* dataspace, and attribute creation property list. The attributes name must be locally unique: it must be
+* attached. name, type_id, space_id, and create_plist convey, respectively, the attribute's name, datatype,
+* dataspace, and attribute creation property list. The attribute's name must be locally unique: it must be
* unique within the context of the object to which it is attached.
*
* \ref H5Acreate creates the attribute in memory. The attribute does not exist in the file until
@@ -175,27 +175,27 @@
*
* To access an attribute by its name, use the \ref H5Aopen_by_name function. \ref H5Aopen_by_name returns an
* attribute identifier that can then be used by any function that must access an attribute such as \ref
-* H5Aread. Use the function \ref H5Aget_name to determine an attributes name.
+* H5Aread. Use the function \ref H5Aget_name to determine an attribute's name.
*
* To access an attribute by its index value, use the \ref H5Aopen_by_idx function. To determine an attribute
* index value when it is not already known, use the H5Oget_info function. \ref H5Aopen_by_idx is generally
* used in the course of opening several attributes for later access. Use \ref H5Aiterate if the intent is to
* perform the same operation on every attribute attached to an object.
*
-* \subsubsection subsubsec_attribute_work_info Obtaining Information Regarding an Objects Attributes
+* \subsubsection subsubsec_attribute_work_info Obtaining Information Regarding an Object's Attributes
*
* In the course of working with HDF5 attributes, one may need to obtain any of several pieces of information:
* \li An attribute name
* \li The dataspace of an attribute
* \li The datatype of an attribute
* \li The number of attributes attached to an object
*
-* To obtain an attributes name, call H5Aget_name with an attribute identifier, attr_id:
+* To obtain an attribute's name, call H5Aget_name with an attribute identifier, attr_id:
* \code
* ssize_t H5Aget_name (hid_t attr_id, size_t buf_size, char *buf)
* \endcode
* As with other attribute functions, attr_id identifies the attribute; buf_size defines the size of the
-* buffer; and buf is the buffer to which the attributes name will be read.
+* buffer; and buf is the buffer to which the attribute's name will be read.
*
* If the length of the attribute name, and hence the value required for buf_size, is unknown, a first call
* to \ref H5Aget_name will return that size. If the value of buf_size used in that first call is too small,
@@ -213,7 +213,7 @@
* step in determining attribute index values. If the call returns N, the attributes attached to the object
* object_id have index values of 0 through N-1.
*
-* \subsubsection subsubsec_attribute_work_iterate Iterating across an Objects Attributes
+* \subsubsection subsubsec_attribute_work_iterate Iterating across an Object's Attributes
*
* It is sometimes useful to be able to perform the identical operation across all of the attributes attached
* to an object. At the simplest level, you might just want to open each attribute. At a higher level, you
@@ -235,7 +235,7 @@
* null pointer, then all attributes have been processed, and the iterative process is complete.
*
* op_func is a user-defined operation that adheres to the \ref H5A_operator_t prototype. This prototype and
-* certain requirements imposed on the operators behavior are described in the \ref H5Aiterate entry in the
+* certain requirements imposed on the operator's behavior are described in the \ref H5Aiterate entry in the
* \ref RM.
*
* op_data is also user-defined to meet the requirements of op_func. Beyond providing a parameter with which
8 changes: 4 additions & 4 deletions src/H5D.c
@@ -791,13 +791,13 @@ H5Dget_create_plist(hid_t dset_id)
*
* The chunk cache parameters in the returned property lists will be
* those used by the dataset. If the properties in the file access
-* property list were used to determine the datasets chunk cache
+* property list were used to determine the dataset's chunk cache
* configuration, then those properties will be present in the
* returned dataset access property list. If the dataset does not
* use a chunked layout, then the chunk cache properties will be set
* to the default. The chunk cache properties in the returned list
* are considered to be “set”, and any use of this list will override
-* the corresponding properties in the files file access property
+* the corresponding properties in the file's file access property
* list.
*
* All link access properties in the returned list will be set to the
@@ -2325,7 +2325,7 @@ H5Dget_num_chunks(hid_t dset_id, hid_t fspace_id, hsize_t *nchunks /*out*/)
* hid_t dset_id; IN: Chunked dataset ID
* hid_t fspace_id; IN: File dataspace ID
* hsize_t index; IN: Index of written chunk
-* hsize_t *offset OUT: Logical position of the chunks
+* hsize_t *offset OUT: Logical position of the chunk's
* first element in the dataspace
* unsigned *filter_mask OUT: Mask for identifying the filters in use
* haddr_t *addr OUT: Address of the chunk
@@ -2395,7 +2395,7 @@ H5Dget_chunk_info(hid_t dset_id, hid_t fspace_id, hsize_t chk_index, hsize_t *of
*
* Parameters:
* hid_t dset_id; IN: Chunked dataset ID
-* hsize_t *offset IN: Logical position of the chunks
+* hsize_t *offset IN: Logical position of the chunk's
* first element in the dataspace
* unsigned *filter_mask OUT: Mask for identifying the filters in use
* haddr_t *addr OUT: Address of the chunk