From 7559b86fbb1c08e302d2b63b0506b3602fc8ce09 Mon Sep 17 00:00:00 2001 From: Ward Fisher Date: Thu, 17 Nov 2022 14:56:03 -0700 Subject: [PATCH 01/20] Restore release date for 4.9.1 RC1 in release notes. --- RELEASE_NOTES.md | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/RELEASE_NOTES.md b/RELEASE_NOTES.md index f810060538..c45d6de6a2 100644 --- a/RELEASE_NOTES.md +++ b/RELEASE_NOTES.md @@ -18,6 +18,10 @@ This file contains a high-level description of this package's evolution. Release * [Bug Fix] Fix a race condition when testing missing filters. See [Github #2557](https://github.com/Unidata/netcdf-c/pull/2557). * [Bug Fix] Fix some race conditions due to use of a common file in multiple shell scripts . See [Github #2552](https://github.com/Unidata/netcdf-c/pull/2552). + + +### 4.9.1 - Release Candidate 1 - October 24, 2022 + * [Enhancement][Documentation] Add Plugins Quick Start Guide. See [GitHub #2524](https://github.com/Unidata/netcdf-c/pull/2524) for more information. * [Enhancement] Add new entries in `netcdf_meta.h`, `NC_HAS_BLOSC` and `NC_HAS_BZ2`. See [Github #2511](https://github.com/Unidata/netcdf-c/issues/2511) and [Github #2512](https://github.com/Unidata/netcdf-c/issues/2512) for more information. * [Enhancement] Add new options to `nc-config`: `--has-multifilters`, `--has-stdfilters`, `--has-quantize`, `--plugindir`. See [Github #2509](https://github.com/Unidata/netcdf-c/pull/2509) for more information. From 19b8ae47bf90cf14c39d0da476609781203ffaa1 Mon Sep 17 00:00:00 2001 From: Ward Fisher Date: Thu, 17 Nov 2022 14:56:54 -0700 Subject: [PATCH 02/20] Added target release date for RC2. --- RELEASE_NOTES.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/RELEASE_NOTES.md b/RELEASE_NOTES.md index c45d6de6a2..5db4a84998 100644 --- a/RELEASE_NOTES.md +++ b/RELEASE_NOTES.md @@ -7,7 +7,7 @@ This file contains a high-level description of this package's evolution. Release ## 4.9.1 - T.B.D. 
-### 4.9.1 - Release Candidate 2 - TBD +### 4.9.1 - Release Candidate 2 - November 18, 2022 #### Known Issues From 087d3b6c37841433977a1408b8ba30f425adf4e4 Mon Sep 17 00:00:00 2001 From: Ward Fisher Date: Fri, 18 Nov 2022 11:34:09 -0700 Subject: [PATCH 03/20] Supported headers for hdf4 are not installed in actions, and there does not appear (currently) to be an easy way to reinstall these. --- .github/workflows/run_tests_ubuntu.yml | 48 +++++--------------------- 1 file changed, 9 insertions(+), 39 deletions(-) diff --git a/.github/workflows/run_tests_ubuntu.yml b/.github/workflows/run_tests_ubuntu.yml index 618c13ed2f..eb01f06b3e 100644 --- a/.github/workflows/run_tests_ubuntu.yml +++ b/.github/workflows/run_tests_ubuntu.yml @@ -1,5 +1,5 @@ ### -# Build hdf4, hdf5 dependencies and cache them in a combined directory. +# Build hdf5 dependencies and cache them in a combined directory. ### name: Run Ubuntu/Linux netCDF Tests @@ -25,7 +25,7 @@ jobs: run: sudo apt update && sudo apt install -y libaec-dev zlib1g-dev automake autoconf libcurl4-openssl-dev libjpeg-dev wget curl bzip2 m4 flex bison cmake libzip-dev doxygen ### - # Installing libhdf4 and libhdf5 + # Installing libhdf5 ### - name: Cache libhdf5-${{ matrix.hdf5 }} id: cache-hdf5 @@ -39,13 +39,6 @@ jobs: if: steps.cache-hdf5.outputs.cache-hit != 'true' run: | set -x - wget https://support.hdfgroup.org/ftp/HDF/releases/HDF4.2.15/src/hdf-4.2.15.tar.bz2 - tar -jxf hdf-4.2.15.tar.bz2 - pushd hdf-4.2.15 - ./configure --prefix=${HOME}/environments/${{ matrix.hdf5 }} --disable-static --enable-shared --disable-fortran --disable-netcdf --with-szlib - make -j - make install -j - popd wget https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-$(echo ${{ matrix.hdf5 }} | cut -d. 
-f 1,2)/hdf5-${{ matrix.hdf5 }}/src/hdf5-${{ matrix.hdf5 }}.tar.bz2 tar -jxf hdf5-${{ matrix.hdf5 }}.tar.bz2 pushd hdf5-${{ matrix.hdf5 }} @@ -72,7 +65,7 @@ jobs: run: sudo apt update && sudo apt install -y libaec-dev zlib1g-dev automake autoconf libcurl4-openssl-dev libjpeg-dev wget curl bzip2 m4 flex bison cmake libzip-dev mpich libmpich-dev ### - # Installing libhdf4 and libhdf5 + # Installing libhdf5 ### - name: Cache libhdf5-parallel-${{ matrix.hdf5 }} id: cache-hdf5 @@ -86,13 +79,6 @@ jobs: if: steps.cache-hdf5.outputs.cache-hit != 'true' run: | set -x - wget https://support.hdfgroup.org/ftp/HDF/releases/HDF4.2.15/src/hdf-4.2.15.tar.bz2 - tar -jxf hdf-4.2.15.tar.bz2 - pushd hdf-4.2.15 - CC=mpicc ./configure --prefix=${HOME}/environments/${{ matrix.hdf5 }} --disable-static --enable-shared --disable-fortran --disable-netcdf --with-szlib --enable-parallel - make -j - make install -j - popd wget https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-$(echo ${{ matrix.hdf5 }} | cut -d. -f 1,2)/hdf5-${{ matrix.hdf5 }}/src/hdf5-${{ matrix.hdf5 }}.tar.bz2 tar -jxf hdf5-${{ matrix.hdf5 }}.tar.bz2 pushd hdf5-${{ matrix.hdf5 }} @@ -164,7 +150,7 @@ jobs: - name: Configure shell: bash -l {0} - run: CFLAGS=${CFLAGS} LDFLAGS=${LDFLAGS} LD_LIBRARY_PATH=${LD_LIBRARY_PATH} ./configure --enable-hdf4 --enable-hdf5 --enable-dap --disable-dap-remote-tests --enable-doxygen --enable-external-server-tests + run: CFLAGS=${CFLAGS} LDFLAGS=${LDFLAGS} LD_LIBRARY_PATH=${LD_LIBRARY_PATH} ./configure --enable-hdf5 --enable-dap --disable-dap-remote-tests --enable-doxygen --enable-external-server-tests if: ${{ success() }} - name: Look at config.log if error @@ -240,7 +226,7 @@ jobs: - name: Configure shell: bash -l {0} - run: CFLAGS=${CFLAGS} LDFLAGS=${LDFLAGS} LD_LIBRARY_PATH=${LD_LIBRARY_PATH} CC=mpicc ./configure --enable-hdf4 --enable-hdf5 --enable-dap --disable-dap-remote-tests --enable-parallel-tests --enable-pnetcdf + run: CFLAGS=${CFLAGS} LDFLAGS=${LDFLAGS} 
LD_LIBRARY_PATH=${LD_LIBRARY_PATH} CC=mpicc ./configure --enable-hdf5 --enable-dap --disable-dap-remote-tests --enable-parallel-tests --enable-pnetcdf if: ${{ success() }} - name: Look at config.log if error @@ -322,7 +308,7 @@ jobs: run: | mkdir build cd build - LD_LIBRARY_PATH=${LD_LIBRARY_PATH} cmake .. -DENABLE_HDF4=TRUE -DCMAKE_PREFIX_PATH=${CMAKE_PREFIX_PATH} -DENABLE_DAP=TRUE -DENABLE_HDF5=TRUE -DENABLE_NCZARR=TRUE -D ENABLE_DAP_LONG_TESTS=TRUE + LD_LIBRARY_PATH=${LD_LIBRARY_PATH} cmake .. -DCMAKE_PREFIX_PATH=${CMAKE_PREFIX_PATH} -DENABLE_DAP=TRUE -DENABLE_HDF5=TRUE -DENABLE_NCZARR=TRUE -D ENABLE_DAP_LONG_TESTS=TRUE - name: Print Summary shell: bash -l {0} @@ -402,7 +388,7 @@ jobs: run: | mkdir build cd build - LD_LIBRARY_PATH=${LD_LIBRARY_PATH} cmake .. -DCMAKE_C_COMPILER=mpicc -DENABLE_HDF4=TRUE -DCMAKE_PREFIX_PATH=${CMAKE_PREFIX_PATH} -DENABLE_DAP=TRUE -DENABLE_HDF5=TRUE -DENABLE_NCZARR=TRUE -D ENABLE_DAP_LONG_TESTS=TRUE -DENABLE_PNETCDF=TRUE + LD_LIBRARY_PATH=${LD_LIBRARY_PATH} cmake .. 
-DCMAKE_C_COMPILER=mpicc -DCMAKE_PREFIX_PATH=${CMAKE_PREFIX_PATH} -DENABLE_DAP=TRUE -DENABLE_HDF5=TRUE -DENABLE_NCZARR=TRUE -D ENABLE_DAP_LONG_TESTS=TRUE -DENABLE_PNETCDF=TRUE - name: Print Summary shell: bash -l {0} @@ -458,11 +444,9 @@ jobs: - run: echo "LDFLAGS=-L${HOME}/environments/${{ matrix.hdf5 }}/lib" >> $GITHUB_ENV - run: echo "LD_LIBRARY_PATH=${HOME}/environments/${{ matrix.hdf5 }}/lib" >> $GITHUB_ENV - run: | - echo "ENABLE_HDF4=--disable-hdf4" >> $GITHUB_ENV echo "ENABLE_HDF5=--disable-hdf5" >> $GITHUB_ENV if: matrix.use_nc4 == 'nc3' - run: | - echo "ENABLE_HDF4=--enable-hdf4" >> $GITHUB_ENV echo "ENABLE_HDF5=--enable-hdf5" >> $GITHUB_ENV if: matrix.use_nc4 == 'nc4' - run: echo "ENABLE_DAP=--disable-dap" >> $GITHUB_ENV @@ -499,7 +483,7 @@ jobs: - name: Configure shell: bash -l {0} - run: CFLAGS=${CFLAGS} LDFLAGS=${LDFLAGS} LD_LIBRARY_PATH=${LD_LIBRARY_PATH} ./configure ${ENABLE_HDF4} ${ENABLE_HDF5} ${ENABLE_DAP} ${ENABLE_NCZARR} + run: CFLAGS=${CFLAGS} LDFLAGS=${LDFLAGS} LD_LIBRARY_PATH=${LD_LIBRARY_PATH} ./configure ${ENABLE_HDF5} ${ENABLE_DAP} ${ENABLE_NCZARR} if: ${{ success() }} - name: Look at config.log if error @@ -526,18 +510,6 @@ jobs: run: CFLAGS=${CFLAGS} LDFLAGS=${LDFLAGS} LD_LIBRARY_PATH=${LD_LIBRARY_PATH} make check -j if: ${{ success() }} - # - name: Make Distcheck - # shell: bash -l {0} - # run: CFLAGS=${CFLAGS} LDFLAGS=${LDFLAGS} LD_LIBRARY_PATH=${LD_LIBRARY_PATH} DISTCHECK_CONFIGURE_FLAGS="${ENABLE_HDF4} ${ENABLE_HDF5} ${ENABLE_DAP} ${ENABLE_NCZARR}" make distcheck - # if: ${{ success() }} - - #- name: Start SSH Debug - # uses: luchihoratiu/debug-via-ssh@main - # with: - # NGROK_AUTH_TOKEN: ${{ secrets.NGROK_AUTH_TOKEN }} - # SSH_PASS: ${{ secrets.SSH_PASS }} - # if: ${{ failure() }} - nc-cmake: needs: [ nc-cmake-tests-oneoff-serial, nc-ac-tests-oneoff-serial, nc-cmake-tests-oneoff-parallel, nc-ac-tests-oneoff-parallel ] @@ -564,11 +536,9 @@ jobs: - run: echo "CMAKE_PREFIX_PATH=${HOME}/environments/${{ matrix.hdf5 }}/" >> $GITHUB_ENV 
- run: echo "LD_LIBRARY_PATH=${HOME}/environments/${{ matrix.hdf5 }}/lib" >> $GITHUB_ENV - run: | - echo "ENABLE_HDF4=OFF" >> $GITHUB_ENV echo "ENABLE_HDF5=OFF" >> $GITHUB_ENV if: matrix.use_nc4 == 'nc3' - run: | - echo "ENABLE_HDF4=ON" >> $GITHUB_ENV echo "ENABLE_HDF5=ON" >> $GITHUB_ENV if: matrix.use_nc4 == 'nc4' - run: echo "ENABLE_DAP=OFF" >> $GITHUB_ENV @@ -605,7 +575,7 @@ jobs: run: | mkdir build cd build - LD_LIBRARY_PATH=${LD_LIBRARY_PATH} cmake .. -DENABLE_HDF4=${ENABLE_HDF4} -DCMAKE_PREFIX_PATH=${CMAKE_PREFIX_PATH} -DENABLE_DAP=${ENABLE_DAP} -DENABLE_HDF5=${ENABLE_HDF5} -DENABLE_NCZARR=${ENABLE_NCZARR} + LD_LIBRARY_PATH=${LD_LIBRARY_PATH} cmake .. -DCMAKE_PREFIX_PATH=${CMAKE_PREFIX_PATH} -DENABLE_DAP=${ENABLE_DAP} -DENABLE_HDF5=${ENABLE_HDF5} -DENABLE_NCZARR=${ENABLE_NCZARR} - name: Print Summary shell: bash -l {0} From 74b4aae6c75d6a77de98194d4b1ed43151c1b3b4 Mon Sep 17 00:00:00 2001 From: Ward Fisher Date: Fri, 18 Nov 2022 15:08:24 -0700 Subject: [PATCH 04/20] Update release date. --- RELEASE_NOTES.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/RELEASE_NOTES.md b/RELEASE_NOTES.md index 5db4a84998..662e800b4a 100644 --- a/RELEASE_NOTES.md +++ b/RELEASE_NOTES.md @@ -7,7 +7,7 @@ This file contains a high-level description of this package's evolution. Release ## 4.9.1 - T.B.D. -### 4.9.1 - Release Candidate 2 - November 18, 2022 +### 4.9.1 - Release Candidate 2 - November 21, 2022 #### Known Issues From 573e8924e1d842dcbce7f12ed8a6bda7354bb65b Mon Sep 17 00:00:00 2001 From: Ward Fisher Date: Mon, 21 Nov 2022 16:24:16 -0700 Subject: [PATCH 05/20] DAP4 is back on for cmake-based builds. 
--- CMakeLists.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/CMakeLists.txt b/CMakeLists.txt index 400a54dc22..55140a78cf 100644 --- a/CMakeLists.txt +++ b/CMakeLists.txt @@ -1062,7 +1062,7 @@ ENDIF() IF(ENABLE_DAP) SET(USE_DAP ON CACHE BOOL "") SET(ENABLE_DAP2 ON CACHE BOOL "") - SET(ENABLE_DAP4 OFF CACHE BOOL "") + SET(ENABLE_DAP4 ON CACHE BOOL "") IF(NOT ENABLE_HDF5) SET(ENABLE_DAP4 OFF CACHE BOOL "") From a03bb5e60165b11be11f8c1e8e492a274a742011 Mon Sep 17 00:00:00 2001 From: Dennis Heimbigner Date: Sun, 18 Dec 2022 13:18:00 -0700 Subject: [PATCH 06/20] Fix infinite loop in file inferencing re: Issue https://github.com/Unidata/netcdf-c/issues/2573 The file type inferencer in libdispatch/dinfermodel.c has a simple forward inference mechanism: the occurrence of certain mode values in a URL fragment implies the inclusion of additional mode values. This kind of inference is notorious for leading to cycles if one is not careful, and unfortunately that occurred in the inferencer in dinfermodel.c. This was fixed by providing a more complicated but more reliable inference mechanism. ## Misc. Other Changes * Found and fixed a couple of memory leaks. * There is a recent problem building HDF4 support on GitHub Actions. Fixed by using the internal HDF4 xdr capability. * Some filter-related code was not being properly ifdef'd with ENABLE_NCZARR_FILTERS.
--- .github/workflows/run_tests_osx.yml | 2 +- .github/workflows/run_tests_ubuntu.yml | 6 +- .github/workflows/run_tests_win_cygwin.yml | 2 +- .github/workflows/run_tests_win_mingw.yml | 2 +- libdispatch/dfile.c | 23 +-- libdispatch/dinfermodel.c | 205 +++++++++++++++------ libdispatch/ncbytes.c | 4 +- libdispatch/nclist.c | 4 +- libnczarr/zsync.c | 11 +- libnczarr/zvar.c | 6 + 10 files changed, 178 insertions(+), 87 deletions(-) diff --git a/.github/workflows/run_tests_osx.yml b/.github/workflows/run_tests_osx.yml index fc73521c85..11864a8012 100644 --- a/.github/workflows/run_tests_osx.yml +++ b/.github/workflows/run_tests_osx.yml @@ -7,7 +7,7 @@ name: Run macOS-based netCDF Tests -on: [pull_request, workflow_dispatch] +on: [pull_request,workflow_dispatch] jobs: diff --git a/.github/workflows/run_tests_ubuntu.yml b/.github/workflows/run_tests_ubuntu.yml index 618c13ed2f..550faed042 100644 --- a/.github/workflows/run_tests_ubuntu.yml +++ b/.github/workflows/run_tests_ubuntu.yml @@ -4,7 +4,7 @@ name: Run Ubuntu/Linux netCDF Tests -on: [pull_request, workflow_dispatch] +on: [pull_request,workflow_dispatch] jobs: @@ -42,7 +42,7 @@ jobs: wget https://support.hdfgroup.org/ftp/HDF/releases/HDF4.2.15/src/hdf-4.2.15.tar.bz2 tar -jxf hdf-4.2.15.tar.bz2 pushd hdf-4.2.15 - ./configure --prefix=${HOME}/environments/${{ matrix.hdf5 }} --disable-static --enable-shared --disable-fortran --disable-netcdf --with-szlib + ./configure --prefix=${HOME}/environments/${{ matrix.hdf5 }} --disable-static --enable-shared --disable-fortran --disable-netcdf --with-szlib --enable-hdf4-xdr make -j make install -j popd @@ -89,7 +89,7 @@ jobs: wget https://support.hdfgroup.org/ftp/HDF/releases/HDF4.2.15/src/hdf-4.2.15.tar.bz2 tar -jxf hdf-4.2.15.tar.bz2 pushd hdf-4.2.15 - CC=mpicc ./configure --prefix=${HOME}/environments/${{ matrix.hdf5 }} --disable-static --enable-shared --disable-fortran --disable-netcdf --with-szlib --enable-parallel + CC=mpicc ./configure --prefix=${HOME}/environments/${{ 
matrix.hdf5 }} --disable-static --enable-shared --disable-fortran --disable-netcdf --with-szlib --enable-parallel --enable-hdf4-xdr make -j make install -j popd diff --git a/.github/workflows/run_tests_win_cygwin.yml b/.github/workflows/run_tests_win_cygwin.yml index 361e6265b7..147216e768 100644 --- a/.github/workflows/run_tests_win_cygwin.yml +++ b/.github/workflows/run_tests_win_cygwin.yml @@ -1,6 +1,6 @@ name: Run Cygwin-based tests -on: [pull_request, workflow_dispatch] +on: [pull_request,workflow_dispatch] env: SHELLOPTS: igncr diff --git a/.github/workflows/run_tests_win_mingw.yml b/.github/workflows/run_tests_win_mingw.yml index 5ff67e1a16..d872128597 100644 --- a/.github/workflows/run_tests_win_mingw.yml +++ b/.github/workflows/run_tests_win_mingw.yml @@ -9,7 +9,7 @@ name: Run MSYS2, MinGW64-based Tests env: CPPFLAGS: "-D_BSD_SOURCE" -on: [pull_request, workflow_dispatch] +on: [pull_request,workflow_dispatch] jobs: diff --git a/libdispatch/dfile.c b/libdispatch/dfile.c index ae756a8f27..a53be47893 100644 --- a/libdispatch/dfile.c +++ b/libdispatch/dfile.c @@ -1839,19 +1839,16 @@ NC_create(const char *path0, int cmode, size_t initialsz, TRACE(nc_create); if(path0 == NULL) - return NC_EINVAL; + {stat = NC_EINVAL; goto done;} /* Check mode flag for sanity. */ - if ((stat = check_create_mode(cmode))) - return stat; + if ((stat = check_create_mode(cmode))) goto done; /* Initialize the library. The available dispatch tables * will depend on how netCDF was built * (with/without netCDF-4, DAP, CDMREMOTE). 
*/ - if(!NC_initialized) - { - if ((stat = nc_initialize())) - return stat; + if(!NC_initialized) { + if ((stat = nc_initialize())) goto done; } { @@ -1863,10 +1860,7 @@ NC_create(const char *path0, int cmode, size_t initialsz, memset(&model,0,sizeof(model)); newpath = NULL; - if((stat = NC_infermodel(path,&cmode,1,useparallel,NULL,&model,&newpath))) { - nullfree(newpath); - goto done; - } + if((stat = NC_infermodel(path,&cmode,1,useparallel,NULL,&model,&newpath))) goto done; if(newpath) { nullfree(path); path = newpath; @@ -1918,7 +1912,7 @@ NC_create(const char *path0, int cmode, size_t initialsz, dispatcher = NC3_dispatch_table; break; default: - return NC_ENOTNC; + {stat = NC_ENOTNC; goto done;} } /* Create the NC* instance and insert its dispatcher and model */ @@ -1937,6 +1931,7 @@ NC_create(const char *path0, int cmode, size_t initialsz, } done: nullfree(path); + nullfree(newpath); return stat; } @@ -1980,12 +1975,12 @@ NC_open(const char *path0, int omode, int basepe, size_t *chunksizehintp, TRACE(nc_open); if(!NC_initialized) { stat = nc_initialize(); - if(stat) return stat; + if(stat) goto done; } /* Check inputs. */ if (!path0) - return NC_EINVAL; + {stat = NC_EINVAL; goto done;} /* Capture the inmemory related flags */ mmap = ((omode & NC_MMAP) == NC_MMAP); diff --git a/libdispatch/dinfermodel.c b/libdispatch/dinfermodel.c index 74fd55a4fc..ff3e8e9770 100644 --- a/libdispatch/dinfermodel.c +++ b/libdispatch/dinfermodel.c @@ -143,7 +143,15 @@ static const struct MACRODEF { {NULL,NULL,{NULL}} }; -/* Mode inferences: if mode contains key, then add the inference and infer again */ +/* +Mode inferences: if mode contains key value, then add the inferred value; +Warning: be careful how this list is constructed to avoid infinite inferences. +In order to (mostly) avoid that consequence, any attempt to +infer a value that is already present will be ignored. +This effectively means that the inference graph +must be a DAG and may not have cycles. 
+You have been warned. +*/ static const struct MODEINFER { char* key; char* inference; @@ -151,6 +159,7 @@ static const struct MODEINFER { {"zarr","nczarr"}, {"xarray","zarr"}, {"noxarray","nczarr"}, +{"noxarray","zarr"}, {NULL,NULL} }; @@ -202,6 +211,7 @@ static int processmacros(NClist** fraglistp); static char* envvlist2string(NClist* pairs, const char*); static void set_default_mode(int* cmodep); static int parseonchar(const char* s, int ch, NClist* segments); +static int mergelist(NClist** valuesp); static int openmagic(struct MagicFile* file); static int readmagic(struct MagicFile* file, long pos, char* magic); @@ -217,8 +227,9 @@ static int parsepair(const char* pair, char** keyp, char** valuep); static NClist* parsemode(const char* modeval); static const char* getmodekey(const NClist* envv); static int replacemode(NClist* envv, const char* newval); -static int inferone(const char* mode, NClist* newmodes); +static void infernext(NClist* current, NClist* next); static int negateone(const char* mode, NClist* modes); +static void cleanstringlist(NClist* strs, int caseinsensitive); /* If the path looks like a URL, then parse it, reformat it. 
@@ -416,28 +427,6 @@ envvlist2string(NClist* envv, const char* delim) return result; } -/* Convert a list into a comma'd string */ -static char* -list2string(NClist* list) -{ - int i; - NCbytes* buf = NULL; - char* result = NULL; - - if(list == NULL || nclistlength(list)==0) return strdup(""); - buf = ncbytesnew(); - for(i=0;i 0) ncbytescat(buf,","); - ncbytescat(buf,m); - } - result = ncbytesextract(buf); - ncbytesfree(buf); - if(result == NULL) result = strdup(""); - return result; -} - /* Given a mode= argument, fill in the impl */ static int processmodearg(const char* arg, NCmodel* model) @@ -504,9 +493,10 @@ processinferences(NClist* fraglenv) { int stat = NC_NOERR; const char* modeval = NULL; - NClist* modes = NULL; NClist* newmodes = nclistnew(); - int i,inferred = 0; + NClist* currentmodes = NULL; + NClist* nextmodes = nclistnew(); + int i; char* newmodeval = NULL; if(fraglenv == NULL || nclistlength(fraglenv) == 0) goto done; @@ -515,22 +505,53 @@ processinferences(NClist* fraglenv) if((modeval = getmodekey(fraglenv))==NULL) goto done; /* Get the mode as list */ - modes = parsemode(modeval); - - /* Repeatedly walk the mode list until no more new positive inferences */ - do { - for(i=0;ikey;tests++) { - if(strcasecmp(tests->key,mode)==0) { - /* Append the inferred mode; dups removed later */ - nclistpush(newmodes,strdup(tests->inference)); - changed = 1; + int i; + for(i=0;ikey;tests++) { + if(strcasecmp(tests->key,cur)==0) { + /* Append the inferred mode unless dup */ + if(!nclistmatch(next,tests->inference,1)) + nclistpush(next,strdup(tests->inference)); + } } } - return changed; } +/* +Given a list of strings, remove nulls and duplicates +*/ static int -mergekey(NClist** valuesp) +mergelist(NClist** valuesp) { int i,j; int stat = NC_NOERR; @@ -686,12 +714,12 @@ cleanfragments(NClist** fraglenvp) /* collect all unique keys */ collectallkeys(fraglenv,allkeys); - /* Collect all values for same key across all fragments */ + /* Collect all values for same key 
across all fragment pairs */ for(i=0;i 0) ncbytescat(buf,","); + ncbytescat(buf,m); + } + result = ncbytesextract(buf); + ncbytesfree(buf); + if(result == NULL) result = strdup(""); + return result; +} + +#if 0 +/* Given a comma separated string, remove duplicates; mostly used to cleanup mode list */ +static char* +cleancommalist(const char* commalist, int caseinsensitive) +{ + NClist* tmp = nclistnew(); + char* newlist = NULL; + if(commalist == NULL || strlen(commalist)==0) return nulldup(commalist); + (void)parseonchar(commalist,',',tmp);/* split on commas */ + cleanstringlist(tmp,caseinsensitive); + newlist = list2string(tmp); + nclistfreeall(tmp); + return newlist; +} +#endif + +/* Given a list of strings, remove nulls and duplicated */ +static void +cleanstringlist(NClist* strs, int caseinsensitive) +{ + int i,j; + if(nclistlength(strs) == 0) return; + /* Remove nulls */ + for(i=nclistlength(strs)-1;i>=0;i--) { + if(nclistget(strs,i)==NULL) nclistremove(strs,i); + } + /* Remove duplicates*/ + for(i=0;ii;j--) { + int match; + const char* candidate = nclistget(strs,j); + if(caseinsensitive) + match = (strcasecmp(value,candidate) == 0); + else + match = (strcmp(value,candidate) == 0); + if(match) {char* dup = nclistremove(strs,j); nullfree(dup);} + } + } +} + + /**************************************************/ /** * @internal Given an existing file, figure out its format and return @@ -1502,8 +1595,10 @@ printlist(NClist* list, const char* tag) { int i; fprintf(stderr,"%s:",tag); - for(i=0;ilength == 0) return ncbytesfail(); diff --git a/libdispatch/nclist.c b/libdispatch/nclist.c index 49f0dded45..b5f815864c 100644 --- a/libdispatch/nclist.c +++ b/libdispatch/nclist.c @@ -183,6 +183,7 @@ nclistremove(NClist* l, size_t i) return elem; } +/* Match on == */ int nclistcontains(NClist* l, void* elem) { @@ -193,7 +194,7 @@ nclistcontains(NClist* l, void* elem) return 0; } -/* Return 1/0 */ +/* Match on str(case)cmp */ int nclistmatch(NClist* l, const char* elem, 
int casesensitive) { @@ -230,7 +231,6 @@ nclistelemremove(NClist* l, void* elem) return found; } - /* Extends nclist to include a unique operator which remove duplicate values; NULL values removed return value is always 1. diff --git a/libnczarr/zsync.c b/libnczarr/zsync.c index b3a93ee767..f237cd2b61 100644 --- a/libnczarr/zsync.c +++ b/libnczarr/zsync.c @@ -1429,6 +1429,7 @@ define_vars(NC_FILE_INFO_T* file, NC_GRP_INFO_T* grp, NClist* varnames) char* varpath = NULL; char* key = NULL; NCZ_FILE_INFO_T* zinfo = NULL; + NC_VAR_INFO_T* var = NULL; NCZ_VAR_INFO_T* zvar = NULL; NCZMAP* map = NULL; NCjson* jvar = NULL; @@ -1460,7 +1461,6 @@ define_vars(NC_FILE_INFO_T* file, NC_GRP_INFO_T* grp, NClist* varnames) /* Load each var in turn */ for(i = 0; i < nclistlength(varnames); i++) { - NC_VAR_INFO_T* var; const char* varname = nclistget(varnames,i); if((stat = nc4_var_list_add2(grp, varname, &var))) goto done; @@ -1477,10 +1477,6 @@ define_vars(NC_FILE_INFO_T* file, NC_GRP_INFO_T* grp, NClist* varnames) /* Indicate we do not have quantizer yet */ var->quantize_mode = -1; - /* Set filter list */ - assert(var->filters == NULL); - var->filters = (void*)nclistnew(); - /* Construct var path */ if((stat = NCZ_varkey(var,&varpath))) goto done; @@ -1697,9 +1693,9 @@ define_vars(NC_FILE_INFO_T* file, NC_GRP_INFO_T* grp, NClist* varnames) object MUST contain a "id" key identifying the codec to be used. 
*/ /* Do filters key before compressor key so final filter chain is in correct order */ { +#ifdef ENABLE_NCZARR_FILTERS if(var->filters == NULL) var->filters = (void*)nclistnew(); if(zvar->incompletefilters == NULL) zvar->incompletefilters = (void*)nclistnew(); -#ifdef ENABLE_NCZARR_FILTERS { int k; chainindex = 0; /* track location of filter in the chain */ if((stat = NCZ_filter_initialize())) goto done; @@ -1722,8 +1718,8 @@ define_vars(NC_FILE_INFO_T* file, NC_GRP_INFO_T* grp, NClist* varnames) /* From V2 Spec: A JSON object identifying the primary compression codec and providing configuration parameters, or ``null`` if no compressor is to be used. */ { - if(var->filters == NULL) var->filters = (void*)nclistnew(); #ifdef ENABLE_NCZARR_FILTERS + if(var->filters == NULL) var->filters = (void*)nclistnew(); if((stat = NCZ_filter_initialize())) goto done; if((stat = NCJdictget(jvar,"compressor",&jfilter))) goto done; if(jfilter != NULL && NCJsort(jfilter) != NCJ_NULL) { @@ -1752,6 +1748,7 @@ define_vars(NC_FILE_INFO_T* file, NC_GRP_INFO_T* grp, NClist* varnames) nullfree(shapes); shapes = NULL; if(formatv1) {NCJreclaim(jncvar); jncvar = NULL;} NCJreclaim(jvar); jvar = NULL; + var = NULL; } done: diff --git a/libnczarr/zvar.c b/libnczarr/zvar.c index 12014bae87..2a98646e36 100644 --- a/libnczarr/zvar.c +++ b/libnczarr/zvar.c @@ -391,9 +391,11 @@ NCZ_def_var(int ncid, const char *name, nc_type xtype, int ndims, var->meta_read = NC_TRUE; var->atts_read = NC_TRUE; +#ifdef ENABLE_NCZARR_FILTERS /* Set the filter list */ assert(var->filters == NULL); var->filters = (void*)nclistnew(); +#endif /* Point to the type, and increment its ref. count */ var->type_info = type; @@ -558,10 +560,12 @@ ncz_def_var_extra(int ncid, int varid, int *shuffle, int *unused1, /* Can't turn on parallel and deflate/fletcher32/szip/shuffle * before HDF5 1.10.3. 
*/ +#ifdef ENABLE_NCZARR_FILTERS #ifndef HDF5_SUPPORTS_PAR_FILTERS if (h5->parallel == NC_TRUE) if (nclistlength(((NClist*)var->filters)) > 0 || fletcher32 || shuffle) {retval = NC_EINVAL; goto done;} +#endif #endif /* If the HDF5 dataset has already been created, then it is too @@ -628,8 +632,10 @@ ncz_def_var_extra(int ncid, int varid, int *shuffle, int *unused1, * no filters in use for this data. */ if (storage != NC_CHUNKED) { +#ifdef NCZARR_FILTERS if (nclistlength(((NClist*)var->filters)) > 0) {retval = NC_EINVAL; goto done;} +#endif for (d = 0; d < var->ndims; d++) if (var->dim[d]->unlimited) {retval = NC_EINVAL; goto done;} From 9226b52ca5b16ca7ba62a5080b2da0367c3cd1d8 Mon Sep 17 00:00:00 2001 From: Ward Fisher Date: Mon, 19 Dec 2022 10:23:10 -0700 Subject: [PATCH 07/20] Add an old static file. --- docs/static-pages/orgs.html | 417 ++++++++++++++++++++++++++++++++++++ 1 file changed, 417 insertions(+) create mode 100644 docs/static-pages/orgs.html diff --git a/docs/static-pages/orgs.html b/docs/static-pages/orgs.html new file mode 100644 index 0000000000..2e5c84df66 --- /dev/null +++ b/docs/static-pages/orgs.html @@ -0,0 +1,417 @@ + + + + +Organizations in which NetCDF is Used + + + + + + + + + + + + + +

Organizations in which NetCDF is Used

The following list of organizations was created by sorting the organizational
affiliations of authors of questions or comments about netCDF sent to support@unidata.ucar.edu.

  • Accu-Weather
  • Advanced Visual Systems, Inc.
  • Aerometrics, Inc.
  • Aerospace and Mechanical Engineering, University of Notre Dame
  • Alfred-Wegener-Institute for Polar and Marine Research
  • American Cyanamid Company
  • Analytical Innovations, Inc.
  • Analytical Services & Materials, Inc.
  • Applied Research Associates
  • Applied Research Corp., Goddard Space Flight Center
  • Armstrong Labs, Tyndall AFB
  • Astro Space Center, Moscow
  • Astronomical Institute, Czech Academy of Sciences
  • Atmospheric Environment Service, CANADA
  • Atmospheric Release Advisory Capability, Lawrence Livermore National Laboratory
  • Atmospheric Research Laboratory, Scripps Institution of Oceanography
  • Atmospheric Sciences, Yonsei University
  • Atmospheric, Oceanic and Planetary Physics, Clarendon Laboratory
  • Auburn
  • Aurora Simulation, Inc.
  • Australian Geological Survey
  • BB&N
  • BMRC
  • Battelle / Pacific Northwest Laboratories
  • Battelle Marine Sciences Laboratory
  • Bay Area Air Quality Management District
  • Bio-Rad Semiconductor Division, CD Systems
  • Biophysics Lab., University of Nijmegen, The Netherlands
  • Branch of Atlantic Marine Geology, US Geological Survey
  • Bristol-Myers Squibb
  • Brookhaven National Labs
  • CARS, University of Chicago
  • CEA/CESTA, France
  • CIBNOR
  • CICESE/Depto. Oceanografia Fisica, Mexico
  • CIRES, University of Colorado
  • CIRES/Center for the Study of Earth from Space
  • CMU
  • CSIRO
  • CSIRO Division of Atmospheric Research
  • CSIRO Division of Oceanography
  • CSIRO Mathematical and Information Sciences
  • California Space Institute, Scripps Institution of Oceanography
  • California State University at Chico
  • Canadian Climate Centre
  • Celestin Company
  • Center for Analysis and Prediction of Storms, University of Oklahoma
  • Center for Digital Systems Engineering, Research Triangle Institute
  • Center for Global Atmospheric Modelling, University of Reading
  • Center for Nondestructive Evaluation, Iowa State University
  • Centre d'Oceanologie de Marseille
  • Centro Nacional de Datos Oceanograficos de Chile
  • Centro de Neurociencias de Cuba
  • CERFACS (European Center for Research and Advanced Training in Scientific Computation), France
  • Checkmate Engineering, Inc.
  • ChemSoft, Inc.
  • ChemWare, Inc.
  • City University of Hong Kong
  • Climate Diagnostics Center
  • Climate Research Division, Scripps Institution of Oceanography
  • Climate and Radiation Branch, NASA Goddard Space Flight Center
  • College of Oceanography, Oregon State
  • Columbia University
  • Commonwealth Bureau of Meteorology, Australia
  • Complutense University (MADRID- SPAIN)
  • Cornell
  • Cray Computer Corporation
  • Cray Research Australia
  • Cray Research, Inc.
  • DCI Systems & User Support Group, RAL
  • DLR Institute of Fluid Mechanics, Gottingen, Germany
  • Dalhousie University, Halifax
  • Danish Meteorological Institute
  • Defence Research Establishment Pacific
  • Delft University of Technology, Netherlands
  • Department of Applied Mathematics, University of Washington
  • Department of Atmospheric Science, Colorado State University
  • Department of Atmospheric Sciences, UCLA
  • Department of Atmospheric Sciences, University of Washington
  • Department of Chemistry, Rutgers University
  • Department of Chemistry, University of Western Ontario
  • Department of Computer Science, Western Washington University
  • Department of Earth Sciences, University of Wales College of Cardiff
  • Department of Geology, Istanbul Technical University, Turkey
  • Department of Geology, University of Illinois
  • Department of Geophysics and Planetary Sciences, Tel-Aviv University
  • Department of Hydrology and Water Resources, University of Arizona
  • Department of Meteorology, Texas A&M University
  • Department of Oceanography, Dalhousie University
  • Department of Rangeland Ecology and Management, Texas A&M University
  • Department of Structural Biology, Biomolecular Engineering Research Institute
  • Dept of Atmospheric and Oceanic Sciences, McGill University
  • Deutsche Forschungsanstalt für Luft- und Raumfahrt e.V.
  • Dickens Data Systems, Inc.
  • Digital Equipment Corporation
  • Division of Ocean and Atmospheric Science, Hokkaido University
  • Dow Chemical
  • Earth Sciences Division, Raytheon ITSS at NASA Ames Research
  • ENTERPRISE Products
  • ETH Zurich
  • Earth System Science Laboratory, University of Alabama in Huntsville
  • Electricite de France
  • Energy & System Engineering Group, Japan NUS Co., Ltd.
  • Ensign Geophysics Ltd.
  • Environment Waikato, New Zealand
  • Federal Geographic Data Committee
  • Fieldview Product Manager, Intelligent Light
  • Finnigan-MAT
  • Florida State University
  • Forschungszentrum Juelich Gmbh (KFA)
  • Fortner Research LLC
  • Fraunhofer Institute for Atmospheric Environmental Research
  • Fundacion Centro de Estudios Ambientales del Mediterraneo
  • GNU
  • General Atomics
  • General Motors R&D Center
  • General Science Corporation
  • GeoForschungsZentrum Potsdam
  • Geophysical Department, Utrecht University
  • Geophysical Institute, University of Alaska, Fairbanks
  • Geoterrex-Dighem Pty Limited, Australia
  • German Aerospace Research Establishment (DLR)
  • German Climate Compute Center
  • German Remote Sensing Data Center (DFD)
  • Glaciology Laboratory, Grenoble, France
  • Global Climate Research Division, LLNL
  • Goddard Space Flight Center
  • Grupo CLIMA - IMFIA, Uruguay
  • Harris ISD
  • Harvard Seismology
  • Hatfield Marine Science Center, Newport, Oregon
  • Hewlett-Packard
  • Hughes Aircraft Company
  • IBM
  • IDRIS/Support Visualisation & Video, France
  • IFREMER (The French Institute of Research and Exploitation of the Sea)
  • IKU Petroleum Research, Trondheim, Norway
  • IRPEM-CNR
  • Illinois State Water Survey
  • Imperial College of Science, Technology, and Medicine, London
  • Infometrix
  • Institut d'Astronomie et de Geophysique, Belgium
  • Institut für Flugmechanik
  • +
  • Institut für Geophysik, Universität Göttingen
  • +
  • Institut für Meteorologie und Klimaforschung
  • +
  • Institut für Stratosphaerische Chemie
  • +
  • Institute for Atmospheric Science, ETH, Zurich
  • +
  • Institute for Stratospheric Chemistry(ICG-1), Institute for the Chemistry + and Dynamics of the Geosphere
  • +
  • Institute for Tropospheric Research
  • +
  • Institute of Applied Computer Science (IAI), KfK Research Centre - Karlsruhe
  • +
  • Instituto Andaluz de Ciencias de la Tierra, Granada, Spain
  • +
  • Instituto Oceanografico da USP
  • +
  • Instituto de Oceanografia, Universidade de Lisboa
  • +
  • Iowa State
  • +
  • JASCO Corporation, Hachioji Tokyo 192 JAPAN
  • +
  • Jaime I University
  • +
  • Joint Institute for the Study of the Atmosphere and Ocean (JISAO)
  • +
  • KEO Consultants
  • +
  • KODAK
  • +
  • Kaiser Aluminum
  • +
  • Koninklijk Nederlands Meteorologisch Instituut (KNMI)
  • +
  • Koninklijke/Shell-Laboratorium Amsterdam
  • +
  • LABTECH
  • +
  • Laboratoire de Dynamique Moleculaire, Institut de Biologie Structurale
  • +
  • Laboratoire de Météorologie Dynamique du CNRS, France
  • +
  • Laboratory for Plasma Studies, Cornell University
  • +
  • Laboratory of Molecular Biophysics
  • +
  • Lamont-Doherty Earth Observatory of Columbia University
  • +
  • Lawrence Berkeley Laboratory (LBL)
  • +
  • Lawrence Livermore National Laboratory (LLNL)
  • +
  • Litton TASC
  • +
  • Lockheed Martin Technical Services
  • +
  • Lockheed Martin/GES
  • +
  • Los Alamos National Laboratory (LANL)
  • +
  • Louisiana State University
  • +
  • M.D. Anderson Cancer Center
  • +
  • MAPS geosystems
  • +
  • MIT Lincoln Laboratory
  • +
  • MIT Plasma Fusion Center
  • +
  • MUMM (CAMME)
  • +
  • Marine Biological Laboratory, Woods Hole
  • +
  • Massachusetts Institute of Technology
  • +
  • Maurice-Lamontagne Institute, Department of Fisheries and Oceans Canada
  • +
  • Memorial Sloan-Kettering Cancer Center (MSKCC)
  • +
  • Mesonet, University of Oklahoma
  • +
  • Meteorological Systems and Technology (METSYS) South Africa
  • +
  • Michigan State University, Physics Department
  • +
  • Michigan State University, Geography/Fisheries and Wildlife Department
  • +
  • Microelectronics Center of North Carolina (MCNC)
  • +
  • Minnesota Supercomputer Center
  • +
  • Mote Marine Laboratory
  • +
  • Multimedia Lab, University of Zurich, Switzerland
  • +
  • NASA / GSFC
  • +
  • NASA / Goddard Institute for Space Studies
  • +
  • NASA / JPL
  • +
  • NASA Ames Research Center
  • +
  • NASA Dryden FRC
  • +
  • NCAR / ACD
  • +
  • NCAR / ATD
  • +
  • NCAR / CGD
  • +
  • NCAR / HAO
  • +
  • NCAR / MMM
  • +
  • NCAR / RAF
  • +
  • NCAR / RAP
  • +
  • NCAR / SCD
  • +
  • NCSA-University of Illinois at Urbana-Champaign
  • +
  • NIST
  • +
  • NMFS
  • +
  • NOAA / AOML / CIMAS, Hurricane Research Division
  • +
  • NOAA / Arkansas-Red Basin River Forecast Center
  • +
  • NOAA / CDC
  • +
  • NOAA / CRD
  • +
  • NOAA / ERL / FSL
  • +
  • NOAA / ETL
  • +
  • NOAA / FSL
  • +
  • NOAA / Geophysical Fluid Dynamics Laboratory
  • +
  • NOAA / NGDC
  • +
  • NOAA / NGDC / Paleoclimatology Group
  • +
  • NOAA / PMEL
  • +
  • NOAA / PMEL / OCRD
  • +
  • Nansen Environmental and Remote Sensing Centre (NERSC), Norway
  • +
  • National Center for Atmospheric Research (NCAR)
  • +
  • National Energy Research Supercomputer Center (NERSC)
  • +
  • National Fisheries, University of Pusan
  • +
  • National Institute of Health
  • +
  • National Research Council of Canada
  • +
  • National Severe Storms Laboratory
  • +
  • National Weather Service
  • +
  • National Weather Service, Camp Springs, MD
  • +
  • National Weather Service, Juneau, Alaska
  • +
  • Natural Resources Conservation Service, U.S. Department of Agriculture
  • +
  • Naval Postgraduate School
  • +
  • Naval Research Laboratory
  • +
  • North Carolina State University
  • +
  • North Carolina Supercomputing Center/MCNC Environmental Programs
  • +
  • Northwest Research Associates, Inc.
  • +
  • Nova University
  • +
  • Numerical Algorithms Group (NAG)
  • +
  • Oak Ridge National Laboratory
  • +
  • Observation Center for Prediction of Earthquakes
  • +
  • Ocean Science & Technology
  • +
  • Oceanography, University College/Australian Defence Force Academy
  • +
  • Office of Fusion Energy Sciences, DOE
  • +
  • Oklahoma Climate Survey
  • +
  • Oklahoma Mesonet
  • +
  • Old Dominion University
  • +
  • Oregon Graduate Institute
  • +
  • Oregon State University
  • +
  • Orkustofnun (National Energy Authority), Reykjavik, Iceland
  • +
  • PE Nelson Systems, Inc.
  • +
  • PNNL
  • +
  • POSTECH
  • +
  • PPPL
  • +
  • Pacific Fisheries Environmental Group
  • +
  • Pacific Tsunami Warning Center
  • +
  • Parallel Computing Group, University of Geneva CUI
  • +
  • Pennsylvania State University/Applied Research Laboratory
  • +
  • Phillips Laboratory/GPIA, Hanscom AFB
  • +
  • Physics Department, Lawrence Livermore National Laboratory
  • +
  • Pittsburgh Supercomputing Center
  • +
  • Plymouth State College, Plymouth NH
  • +
  • Positron Imaging Laboratories, Montreal Neurological Institute
  • +
  • Princeton
  • +
  • Project Centre for Ecosystem Research at the University of Kiel
  • +
  • Pure Atria
  • +
  • Queensland Insitute of Natural Science
  • +
  • RMIT Applied Physics
  • +
  • RSI
  • +
  • Raytheon Co.
  • +
  • Research Centre Karlsruhe, Institute of Applied Computer Science
  • +
  • Research Systems
  • +
  • River Forecast Center, TULSA
  • +
  • Rosenstiel School of Marine and Atmospheric Science (RSMAS), University + of Miami
  • +
  • Royal Observatory, Hong Kong
  • +
  • Rutgers University
  • +
  • SAIC
  • +
  • SCIEX
  • +
  • SSEC, University of Wisconsin
  • +
  • SSESCO
  • +
  • SUNY Albany
  • +
  • SYSECA
  • +
  • San Diego Supercomputer Center
  • +
  • Sandia National Laboratories
  • +
  • Scripps Institution of Oceanography
  • +
  • Semichem Technical Support
  • +
  • Shimadzu Corporation
  • +
  • Siemens Power Corp
  • +
  • Silicon Graphics Inc.
  • +
  • SoftShell International, Ltd.
  • +
  • Software Development Centre, Delft Hydraulics
  • +
  • Software Engineering Research Group, Michigan State University
  • +
  • Soil Conservation Service, U.S. Department of Agriculture
  • +
  • Southeastern Regional Climate Center
  • +
  • Southern Regional Climate Center, Louisiana State University
  • +
  • Southwest Research Institute
  • +
  • Space Research Institute, Moscow
  • +
  • Stanford University
  • +
  • StatSci
  • +
  • Stratospheric Research group, Free University Berlin
  • +
  • Supercomputer Computations Research Institute
  • +
  • Synap Corporation
  • +
  • Technical University of Madrid/Computer Science School
  • +
  • Tera Research, Inc.
  • +
  • Texas A&M Ranching Systems Group
  • +
  • Texas A&M University
  • +
  • Texas A&M University at Tallahassee
  • +
  • Texas Instruments, Inc.
  • +
  • The Auroral Observatory, University of Troms, Norway
  • +
  • Theoretical Physics, Fermilab
  • +
  • Thomson-CSF / SYSECA
  • +
  • Tokyo Metropolitan University
  • +
  • Tulsa District, U.S. Army Corps of Engineers
  • +
  • U.S. Air Force
  • +
  • U.S. Army Corps of Engineers
  • +
  • U.S. Department of Agriculture / ARS
  • +
  • U.S. Department of Energy
  • +
  • U.S. Enviromental Protection Agency
  • +
  • U.S. Geological Survey, Woods Hole
  • +
  • U.S. Navy
  • +
  • U.S. Patent Office
  • +
  • UCAR / GPS/MET
  • +
  • UMD
  • +
  • UPRC
  • +
  • University of Alaska
  • +
  • University of Alberta
  • +
  • University of Arizona
  • +
  • University of Bergen, Norway
  • +
  • University of Bern
  • +
  • University of British Columbia
  • +
  • University of Caen, France
  • +
  • University of California / LLNL
  • +
  • University of California, Davis
  • +
  • University of California, Irvine
  • +
  • University of California, Los Angeles
  • +
  • University of California, San Diego
  • +
  • University of California, Santa Barbara / Institute for Computational Earth + System Science
  • +
  • University of California, Santa Cruz
  • +
  • University of Cambridge, UK
  • +
  • University of Chicago
  • +
  • University of Colorado
  • +
  • University of Cyprus
  • +
  • University of Delaware
  • +
  • University of Denver
  • +
  • University of Florida
  • +
  • University of Hawaii
  • +
  • University of Illinois
  • +
  • University of Kansas
  • +
  • University of Manitoba
  • +
  • University of Maryland
  • +
  • University of Massachussetts
  • +
  • University of Miami / RSMAS
  • +
  • University of Michigan
  • +
  • University of Minnesota / Department of Geology and Geophysics
  • +
  • University of Minnesota Supercomputer Institute
  • +
  • University of Montana
  • +
  • University of Nebraska, Lincoln
  • +
  • University of New Hampshire
  • +
  • University of North Dakota
  • +
  • University of Oklahoma
  • +
  • University of Rhode Island, Graduate School of Oceanography
  • +
  • University of South Florida
  • +
  • University of Sydney / School of Mathematics
  • +
  • University of Texas, Austin
  • +
  • University of Texas, Houston
  • +
  • University of Tokyo / Earthquake Research Institute
  • +
  • University of Toronto
  • +
  • University of Utrecht (The Netherlands)
  • +
  • University of Victoria / School of Earth and Ocean Sciences
  • +
  • University of Virginia
  • +
  • University of Washington
  • +
  • University of Western Ontario
  • +
  • University of Wisconsin
  • +
  • University of Zurich, Switzerland
  • +
  • University of the Witwatersrand / Climatology Research Group
  • +
  • Universität Goettingen, Institut für Geophysik
  • +
  • Utah Water Research Laboratory
  • +
  • Vanderbilt
  • +
  • Varian Chromatography Systems
  • +
  • Victoria University of Wellington / Institute of Geophysics
  • +
  • Virginia Tech Department of Computer Science
  • +
  • Visualization and Imaging Team, Idaho National Engineering Lab
  • +
  • Wadia Institute of Himalayan Geology
  • +
  • Woods Hole Oceanographic Institution
  • +
  • Wyle Laboratories
  • +
  • Yale University
  • +
\ No newline at end of file

From dd99d60294663c1f7d2d4291d98a8ad8469bca19 Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Mon, 19 Dec 2022 14:55:31 -0700
Subject: [PATCH 08/20] Added another old static html page in preparation for
 updating.

---
 docs/static-pages/standards.html | 72 ++++++++++++++++++++++++++++++++
 1 file changed, 72 insertions(+)
 create mode 100644 docs/static-pages/standards.html

diff --git a/docs/static-pages/standards.html b/docs/static-pages/standards.html
new file mode 100644
index 0000000000..b0ea4edfa9
--- /dev/null
+++ b/docs/static-pages/standards.html
@@ -0,0 +1,72 @@

NetCDF Standards

Status of standards body endorsements of netCDF and related conventions

The netCDF format has been endorsed by several standards bodies:

  • On 2009-02-05, the NASA Earth Science Data Systems (ESDS) Standards Process Group officially endorsed the document ESDS-RFC-011, NetCDF Classic and 64-bit Offset File Formats, as an appropriate standard for NASA Earth Science data.
  • On 2010-03-12, the Integrated Ocean Observing System (IOOS) Data Management and Communications (DMAC) Subsystem endorsed netCDF with Climate and Forecast (CF) conventions as a preferred data format.
  • On 2010-09-27, the Steering Committee of the Federal Geographic Data Committee (FGDC) officially endorsed netCDF as a "Common Encoding Standard (CES)".
  • On 2010-10-18, the ESDS-RFC-021 Technical Working Group issued a final report concluding that the NASA ESDS Standards Process Group should recommend ESDS-RFC-021, CF Metadata Conventions, for endorsement as a NASA Recommended Standard.
  • On 2011-04-19, the Open Geospatial Consortium (OGC) approved the OGC Network Common Data Form (netCDF) Core Encoding Standard, and NetCDF Binary Encoding Extension Standard - netCDF Classic and 64-bit Offset Format as official OGC standards. These standards are available for free download at http://www.opengeospatial.org/standards/netcdf.
  • On 2011-11-03, the ESDS-RFC-022 Technical Working Group issued a final report recommending ESDS-RFC-022, NetCDF-4/HDF-5 File Format, for endorsement as an EOSDIS Approved Standard.
  • On 2012-11-16, the OGC adopted the netCDF Enhanced Data Model Extension to the OGC Network Common Data Form Core Encoding Standard, making netCDF-4 an official OGC standard. This standard is available for free download at http://www.opengeospatial.org/standards/netcdf.
  • On 2013-02-14, the OGC approved the Climate and Forecast (CF) extension to the NetCDF Core data model standard, making the CF metadata conventions for netCDF an official OGC standard.
\ No newline at end of file

From c228426c07b5343dcba61563ae77b70aa7c3915f Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Mon, 19 Dec 2022 16:57:36 -0700
Subject: [PATCH 09/20] Fix a logic error re: DAP4 tests, when DAP4 is
 specified, but hdf5/netcdf4 support is disabled.

---
 CMakeLists.txt | 8 +++++---
 1 file changed, 5 insertions(+), 3 deletions(-)

diff --git a/CMakeLists.txt b/CMakeLists.txt
index 55140a78cf..28cf65bcd5 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -1062,11 +1062,13 @@ ENDIF()
 IF(ENABLE_DAP)
   SET(USE_DAP ON CACHE BOOL "")
   SET(ENABLE_DAP2 ON CACHE BOOL "")
-  SET(ENABLE_DAP4 ON CACHE BOOL "")

-  IF(NOT ENABLE_HDF5)
+  IF(NOT ENABLE_HDF5 OR NOT ENABLE_NETCDF4)
+    MESSAGE(STATUS "Disabling ENABLE_DAP4")
     SET(ENABLE_DAP4 OFF CACHE BOOL "")
-  ENDIF(NOT ENABLE_HDF5)
+  ELSE()
+    SET(ENABLE_DAP4 ON CACHE BOOL "")
+  ENDIF(NOT ENABLE_HDF5 OR NOT ENABLE_NETCDF4)
 ELSE()
   SET(ENABLE_DAP2 OFF CACHE BOOL "")

From 9ea273961b69ed3f6cba128e4fc4008c5df24bdc Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Mon, 9 Jan 2023 12:11:16 -0800
Subject: [PATCH 10/20] Added S3 status to libnetcdf.settings, turned byterange
 on by default.
---
 CMakeLists.txt        | 15 +++++++--------
 configure.ac          | 13 +++++++------
 libnetcdf.settings.in | 12 +++++++++---
 3 files changed, 23 insertions(+), 17 deletions(-)

diff --git a/CMakeLists.txt b/CMakeLists.txt
index 28cf65bcd5..5a951bd0d3 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -31,8 +31,8 @@ set(PACKAGE "netCDF" CACHE STRING "")

 SET(NC_VERSION_MAJOR 4)
 SET(NC_VERSION_MINOR 9)
-SET(NC_VERSION_PATCH 2)
-SET(NC_VERSION_NOTE "-development")
+SET(NC_VERSION_PATCH 1)
+SET(NC_VERSION_NOTE "-rc2")
 SET(netCDF_VERSION ${NC_VERSION_MAJOR}.${NC_VERSION_MINOR}.${NC_VERSION_PATCH}${NC_VERSION_NOTE})
 SET(VERSION ${netCDF_VERSION})
 SET(NC_VERSION ${netCDF_VERSION})
@@ -1062,13 +1062,11 @@ ENDIF()
 IF(ENABLE_DAP)
   SET(USE_DAP ON CACHE BOOL "")
   SET(ENABLE_DAP2 ON CACHE BOOL "")
+  SET(ENABLE_DAP4 ON CACHE BOOL "")

-  IF(NOT ENABLE_HDF5 OR NOT ENABLE_NETCDF4)
-    MESSAGE(STATUS "Disabling ENABLE_DAP4")
+  IF(NOT ENABLE_HDF5)
     SET(ENABLE_DAP4 OFF CACHE BOOL "")
-  ELSE()
-    SET(ENABLE_DAP4 ON CACHE BOOL "")
-  ENDIF(NOT ENABLE_HDF5 OR NOT ENABLE_NETCDF4)
+  ENDIF(NOT ENABLE_HDF5)
 ELSE()
   SET(ENABLE_DAP2 OFF CACHE BOOL "")
@@ -1076,7 +1074,7 @@ ELSE()
 ENDIF()

 # Option to support byte-range reading of remote datasets
-OPTION(ENABLE_BYTERANGE "Enable byte-range access to remote datasets.." OFF)
+OPTION(ENABLE_BYTERANGE "Enable byte-range access to remote datasets.." ON)

 # Check for the math library so it can be explicitly linked.
 IF(NOT WIN32)
@@ -2537,6 +2535,7 @@ is_enabled(ENABLE_ZERO_LENGTH_COORD_BOUND RELAX_COORD_BOUND)
 is_enabled(USE_CDF5 HAS_CDF5)
 is_enabled(ENABLE_ERANGE_FILL HAS_ERANGE_FILL)
 is_enabled(HDF5_HAS_PAR_FILTERS HAS_PAR_FILTERS)
+is_enabled(ENABLE_NCZARR_S3 HAS_NCZARR_S3)
 is_enabled(ENABLE_NCZARR HAS_NCZARR)
 is_enabled(ENABLE_NCZARR_S3_TESTS DO_NCZARR_S3_TESTS)
 is_enabled(ENABLE_MULTIFILTERS HAS_MULTIFILTERS)

diff --git a/configure.ac b/configure.ac
index 3a38069449..6a0f5c3b28 100644
--- a/configure.ac
+++ b/configure.ac
@@ -10,7 +10,7 @@ AC_PREREQ([2.59])

 # Initialize with name, version, and support email address.
-AC_INIT([netCDF],[4.9.2-development],[support-netcdf@unidata.ucar.edu],[netcdf-c])
+AC_INIT([netCDF],[4.9.1-rc2],[support-netcdf@unidata.ucar.edu],[netcdf-c])

 ##
 # Prefer an empty CFLAGS variable instead of the default -g -O2.
@@ -21,8 +21,8 @@ AC_INIT([netCDF],[4.9.1-rc2],[support-netcdf@unidata.ucar.edu],[netcdf-c])
 AC_SUBST([NC_VERSION_MAJOR]) NC_VERSION_MAJOR=4
 AC_SUBST([NC_VERSION_MINOR]) NC_VERSION_MINOR=9
-AC_SUBST([NC_VERSION_PATCH]) NC_VERSION_PATCH=2
-AC_SUBST([NC_VERSION_NOTE]) NC_VERSION_NOTE="-development"
+AC_SUBST([NC_VERSION_PATCH]) NC_VERSION_PATCH=1
+AC_SUBST([NC_VERSION_NOTE]) NC_VERSION_NOTE="-rc2"

 ##
 # These linker flags specify libtool version info.
@@ -97,7 +97,7 @@ AC_CONFIG_LINKS([nc_test4/ref_hdf5_compat3.nc:nc_test4/ref_hdf5_compat3.nc])
 AC_CONFIG_LINKS([hdf4_test/ref_chunked.hdf4:hdf4_test/ref_chunked.hdf4])
 AC_CONFIG_LINKS([hdf4_test/ref_contiguous.hdf4:hdf4_test/ref_contiguous.hdf4])
 AM_INIT_AUTOMAKE([foreign dist-zip subdir-objects])
-
+AM_MAINTAINER_MODE()

 # Check for the existence of this file before proceeding.
 AC_CONFIG_SRCDIR([include/netcdf.h])
@@ -1282,9 +1282,9 @@ fi

 # Does the user want to allow reading of remote data via range headers?
 AC_MSG_CHECKING([whether byte range support is enabled])
 AC_ARG_ENABLE([byterange],
-              [AS_HELP_STRING([--enable-byterange],
+              [AS_HELP_STRING([--disable-byterange],
                               [allow byte-range I/O])])
-test "x$enable_byterange" = xyes || enable_byterange=no
+test "x$disable_byterange" = xyes || enable_byterange=no
 AC_MSG_RESULT($enable_byterange)
 # Need curl for byte ranges
 if test "x$found_curl" = xno && test "x$enable_byterange" = xyes ; then
@@ -1929,6 +1929,7 @@ AC_SUBST(HAS_ERANGE_FILL,[$enable_erange_fill])
 AC_SUBST(HAS_BYTERANGE,[$enable_byterange])
 AC_SUBST(RELAX_COORD_BOUND,[yes])
 AC_SUBST([HAS_PAR_FILTERS], [$hdf5_supports_par_filters])
+AC_SUBST(HAS_NCZARR_S3,[$enable_nczarr_s3])
 AC_SUBST(HAS_NCZARR,[$enable_nczarr])
 AC_SUBST(DO_NCZARR_S3_TESTS,[$enable_nczarr_s3_tests])
 AC_SUBST(HAS_MULTIFILTERS,[$has_multifilters])

diff --git a/libnetcdf.settings.in b/libnetcdf.settings.in
index 6dea12d5c7..b25d3b603a 100644
--- a/libnetcdf.settings.in
+++ b/libnetcdf.settings.in
@@ -27,22 +27,27 @@ XML Parser: @XMLPARSER@
 # Features
 --------
+Benchmarks: @HAS_BENCHMARKS@
 NetCDF-2 API: @HAS_NC2@
 HDF4 Support: @HAS_HDF4@
 HDF5 Support: @HAS_HDF5@
 NetCDF-4 API: @HAS_NC4@
+CDF5 Support: @HAS_CDF5@
 NC-4 Parallel Support: @HAS_PARALLEL4@
 PnetCDF Support: @HAS_PNETCDF@
+
 DAP2 Support: @HAS_DAP2@
 DAP4 Support: @HAS_DAP4@
 Byte-Range Support: @HAS_BYTERANGE@
+NCZarr Support: @HAS_NCZARR@
+NCZarr S3 Support: @HAS_NCZARR_S3@
+
 Diskless Support: @HAS_DISKLESS@
 MMap Support: @HAS_MMAP@
 JNA Support: @HAS_JNA@
-CDF5 Support: @HAS_CDF5@
 ERANGE Fill Support: @HAS_ERANGE_FILL@
 Relaxed Boundary Check: @RELAX_COORD_BOUND@
-Parallel Filters: @HAS_PAR_FILTERS@
+
 NCZarr Support: @HAS_NCZARR@
 Multi-Filter Support: @HAS_MULTIFILTERS@
 Quantization: @HAS_QUANTIZE@
@@ -50,4 +55,5 @@ Logging: @HAS_LOGGING@
 SZIP Write Support: @HAS_SZLIB_WRITE@
 Standard Filters: @STD_FILTERS@
 ZSTD Support: @HAS_ZSTD@
-Benchmarks: @HAS_BENCHMARKS@
+Parallel Filters: @HAS_PAR_FILTERS@
+

From ecd48ae14e7cbe1efc210163e0abac154344d1cd Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Mon, 9 Jan 2023 13:48:54 -0800
Subject: [PATCH 11/20] Cleaning up NCZARR_S3 summary, turning on byterange by
 default.

---
 CMakeLists.txt | 2 +-
 configure.ac   | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/CMakeLists.txt b/CMakeLists.txt
index 5a951bd0d3..7b48f9d07c 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -1309,7 +1309,7 @@ ENDIF()

 IF(NOT ENABLE_S3_SDK)
   IF(ENABLE_NCZARR_S3 OR ENABLE_NCZARR_S3_TESTS)
-    message(FATAL_ERROR "S3 support library not found; please specify option DENABLE_NCZARR_S3=NO")
+    message(FATAL_ERROR "S3 support library not found; please specify option -DENABLE_NCZARR_S3=NO")
     SET(ENABLE_NCZARR_S3 OFF CACHE BOOL "NCZARR S3 support" FORCE)
     SET(ENABLE_NCZARR_S3_TESTS OFF CACHE BOOL "S3 tests" FORCE)
   ENDIF()

diff --git a/configure.ac b/configure.ac
index 6a0f5c3b28..0d19993806 100644
--- a/configure.ac
+++ b/configure.ac
@@ -1284,7 +1284,7 @@ AC_MSG_CHECKING([whether byte range support is enabled])
 AC_ARG_ENABLE([byterange],
               [AS_HELP_STRING([--disable-byterange],
                               [allow byte-range I/O])])
-test "x$disable_byterange" = xyes || enable_byterange=no
+test "x$enable_byterange" = xno || enable_byterange=yes
 AC_MSG_RESULT($enable_byterange)
 # Need curl for byte ranges

From 19a1f9ec29b4e1ef3bf01d2816294f0983de31e6 Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Mon, 9 Jan 2023 14:43:50 -0800
Subject: [PATCH 12/20] Add libcurl-dev to cygwin github actions

---
 .github/workflows/run_tests_win_cygwin.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/run_tests_win_cygwin.yml b/.github/workflows/run_tests_win_cygwin.yml
index 147216e768..b971667633 100644
--- a/.github/workflows/run_tests_win_cygwin.yml
+++ b/.github/workflows/run_tests_win_cygwin.yml
@@ -29,7 +29,7 @@ jobs:
           git automake libtool autoconf2.5 make libhdf5-devel
           libhdf4-devel zipinfo libxml2-devel perl zlib-devel
           libzstd-devel libbz2-devel libaec-devel libzip-devel
-          libdeflate-devel gcc-core
+          libdeflate-devel gcc-core libcurl-dev

       - name: (Autotools) Run autoconf and friends
         run: |

From e02f6781688339cf81ed72995370e895c066b70d Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Mon, 9 Jan 2023 14:45:02 -0800
Subject: [PATCH 13/20] Correct libcurl development package.

---
 .github/workflows/run_tests_win_cygwin.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/run_tests_win_cygwin.yml b/.github/workflows/run_tests_win_cygwin.yml
index b971667633..6a05139305 100644
--- a/.github/workflows/run_tests_win_cygwin.yml
+++ b/.github/workflows/run_tests_win_cygwin.yml
@@ -29,7 +29,7 @@ jobs:
           git automake libtool autoconf2.5 make libhdf5-devel
           libhdf4-devel zipinfo libxml2-devel perl zlib-devel
           libzstd-devel libbz2-devel libaec-devel libzip-devel
-          libdeflate-devel gcc-core libcurl-dev
+          libdeflate-devel gcc-core libcurl-devel

       - name: (Autotools) Run autoconf and friends
         run: |

From bd0341256bea7060680deb1b975b613ef01d378e Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Mon, 9 Jan 2023 14:55:30 -0800
Subject: [PATCH 14/20] Add libiconv-devel to cygwin CI

---
 .github/workflows/run_tests_win_cygwin.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/run_tests_win_cygwin.yml b/.github/workflows/run_tests_win_cygwin.yml
index 6a05139305..d3e4f5c880 100644
--- a/.github/workflows/run_tests_win_cygwin.yml
+++ b/.github/workflows/run_tests_win_cygwin.yml
@@ -29,7 +29,7 @@ jobs:
           git automake libtool autoconf2.5 make libhdf5-devel
           libhdf4-devel zipinfo libxml2-devel perl zlib-devel
           libzstd-devel libbz2-devel libaec-devel libzip-devel
-          libdeflate-devel gcc-core libcurl-devel
+          libdeflate-devel gcc-core libcurl-devel libiconv-devel

       - name: (Autotools) Run autoconf and friends
         run: |

From 394cf6466e474d3937d47a2c34e90ee9127456f9 Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Mon, 9 Jan 2023 15:07:46 -0800
Subject: [PATCH 15/20] Correct version string change that should not have
 ended up in this branch.

---
 CMakeLists.txt | 4 ++--
 configure.ac   | 6 +++---
 2 files changed, 5 insertions(+), 5 deletions(-)

diff --git a/CMakeLists.txt b/CMakeLists.txt
index 7b48f9d07c..60293c39be 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -31,8 +31,8 @@ set(PACKAGE "netCDF" CACHE STRING "")

 SET(NC_VERSION_MAJOR 4)
 SET(NC_VERSION_MINOR 9)
-SET(NC_VERSION_PATCH 1)
-SET(NC_VERSION_NOTE "-rc2")
+SET(NC_VERSION_PATCH 2)
+SET(NC_VERSION_NOTE "-development")
 SET(netCDF_VERSION ${NC_VERSION_MAJOR}.${NC_VERSION_MINOR}.${NC_VERSION_PATCH}${NC_VERSION_NOTE})
 SET(VERSION ${netCDF_VERSION})
 SET(NC_VERSION ${netCDF_VERSION})

diff --git a/configure.ac b/configure.ac
index 0d19993806..a759a5a4e7 100644
--- a/configure.ac
+++ b/configure.ac
@@ -10,7 +10,7 @@ AC_PREREQ([2.59])

 # Initialize with name, version, and support email address.
-AC_INIT([netCDF],[4.9.1-rc2],[support-netcdf@unidata.ucar.edu],[netcdf-c])
+AC_INIT([netCDF],[4.9.2-development],[support-netcdf@unidata.ucar.edu],[netcdf-c])

 ##
 # Prefer an empty CFLAGS variable instead of the default -g -O2.
@@ -21,8 +21,8 @@ AC_INIT([netCDF],[4.9.1-rc2],[support-netcdf@unidata.ucar.edu],[netcdf-c])
 AC_SUBST([NC_VERSION_MAJOR]) NC_VERSION_MAJOR=4
 AC_SUBST([NC_VERSION_MINOR]) NC_VERSION_MINOR=9
-AC_SUBST([NC_VERSION_PATCH]) NC_VERSION_PATCH=1
-AC_SUBST([NC_VERSION_NOTE]) NC_VERSION_NOTE="-rc2"
+AC_SUBST([NC_VERSION_PATCH]) NC_VERSION_PATCH=2
+AC_SUBST([NC_VERSION_NOTE]) NC_VERSION_NOTE="-development"

 ##
 # These linker flags specify libtool version info.

From 3e35a10aa5cf7c057035399877241a08f5eeaccb Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Mon, 9 Jan 2023 16:40:51 -0800
Subject: [PATCH 16/20] Correct logic for running DAP4 tests when HDF5 is not
 enabled.
---
 CMakeLists.txt | 15 +++++++++------
 1 file changed, 9 insertions(+), 6 deletions(-)

diff --git a/CMakeLists.txt b/CMakeLists.txt
index 60293c39be..9b0573111b 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -519,8 +519,8 @@ OPTION(ENABLE_CDF5 "Enable CDF5 support" ON)
 # Netcdf-4 support (i.e. libsrc4) is required by more than just HDF5 (e.g. NCZarr)
 # So depending on what above formats are enabled, enable netcdf-4
 if(ENABLE_HDF5 OR ENABLE_HDF4 OR ENABLE_NCZARR)
-SET(ENABLE_NETCDF_4 ON CACHE BOOL "Enable netCDF-4 API" FORCE)
-SET(ENABLE_NETCDF4 ON CACHE BOOL "Enable netCDF4 Alias" FORCE)
+  SET(ENABLE_NETCDF_4 ON CACHE BOOL "Enable netCDF-4 API" FORCE)
+  SET(ENABLE_NETCDF4 ON CACHE BOOL "Enable netCDF4 Alias" FORCE)
 endif()

 IF(ENABLE_HDF4)
@@ -1062,11 +1062,14 @@ ENDIF()
 IF(ENABLE_DAP)
   SET(USE_DAP ON CACHE BOOL "")
   SET(ENABLE_DAP2 ON CACHE BOOL "")
-  SET(ENABLE_DAP4 ON CACHE BOOL "")
-
-  IF(NOT ENABLE_HDF5)
+
+  IF(ENABLE_HDF5)
+    MESSAGE(STATUS "Enabling DAP4")
+    SET(ENABLE_DAP4 ON CACHE BOOL "")
+  ELSE()
+    MESSAGE(STATUS "Disabling DAP4")
     SET(ENABLE_DAP4 OFF CACHE BOOL "")
-  ENDIF(NOT ENABLE_HDF5)
+  ENDIF(ENABLE_HDF5)
 ELSE()
   SET(ENABLE_DAP2 OFF CACHE BOOL "")

From 4c27c59fea3567ef81a06835c7a0483b2297f13e Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Mon, 9 Jan 2023 20:26:05 -0800
Subject: [PATCH 17/20] Update whitespace.

---
 .github/workflows/run_tests_ubuntu.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/run_tests_ubuntu.yml b/.github/workflows/run_tests_ubuntu.yml
index 43a5cf92c8..367e9bcca6 100644
--- a/.github/workflows/run_tests_ubuntu.yml
+++ b/.github/workflows/run_tests_ubuntu.yml
@@ -4,7 +4,7 @@

 name: Run Ubuntu/Linux netCDF Tests

-on: [pull_request,workflow_dispatch]
+on: [pull_request, workflow_dispatch]

 jobs:

From 341a43b5aa62d7f0595a88039d011c052e1a3267 Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Mon, 9 Jan 2023 20:27:12 -0800
Subject: [PATCH 18/20] Correct lingering merge issue.
---
 .github/workflows/run_tests_ubuntu.yml | 5 ++---
 1 file changed, 2 insertions(+), 3 deletions(-)

diff --git a/.github/workflows/run_tests_ubuntu.yml b/.github/workflows/run_tests_ubuntu.yml
index 367e9bcca6..9df8b7048e 100644
--- a/.github/workflows/run_tests_ubuntu.yml
+++ b/.github/workflows/run_tests_ubuntu.yml
@@ -88,8 +88,7 @@ jobs:
         if: steps.cache-hdf5.outputs.cache-hit != 'true'
         run: |
           set -x
-<<<<<<< HEAD
-=======
+
           wget https://support.hdfgroup.org/ftp/HDF/releases/HDF4.2.15/src/hdf-4.2.15.tar.bz2
           tar -jxf hdf-4.2.15.tar.bz2
           pushd hdf-4.2.15
@@ -97,7 +96,7 @@ jobs:
           make -j
           make install -j
           popd
->>>>>>> a03bb5e60165b11be11f8c1e8e492a274a742011
+
           wget https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-$(echo ${{ matrix.hdf5 }} | cut -d. -f 1,2)/hdf5-${{ matrix.hdf5 }}/src/hdf5-${{ matrix.hdf5 }}.tar.bz2
           tar -jxf hdf5-${{ matrix.hdf5 }}.tar.bz2
           pushd hdf5-${{ matrix.hdf5 }}

From 00065451fc002cc5e1d0e71014af8b3d7c4232c8 Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Tue, 10 Jan 2023 13:42:41 -0800
Subject: [PATCH 19/20] Merging https://github.com/Unidata/netcdf-c/pull/2583
 manually into the v4.9.1 wellspring branch.

---
 configure.ac     |  2 +-
 docs/dispatch.md | 26 ++++++++++++++++++++++++++
 2 files changed, 27 insertions(+), 1 deletion(-)

diff --git a/configure.ac b/configure.ac
index a759a5a4e7..81a75ea319 100644
--- a/configure.ac
+++ b/configure.ac
@@ -2065,12 +2065,12 @@ AX_SET_META([NC_HAS_SZIP],[$enable_hdf5_szip],[yes])
 AX_SET_META([NC_HAS_ZSTD],[$have_zstd],[yes])
 AX_SET_META([NC_HAS_BLOSC],[$have_blosc],[yes])
 AX_SET_META([NC_HAS_BZ2],[$have_bz2],[yes])
+
 # This is the version of the dispatch table. If the dispatch table is
 # changed, this should be incremented, so that user-defined format
 # applications like PIO can determine whether they have an appropriate
 # dispatch table to submit. If this is changed, make sure the value in
 # CMakeLists.txt also changes to match.
-
 AC_SUBST([NC_DISPATCH_VERSION], [5])
 AC_DEFINE_UNQUOTED([NC_DISPATCH_VERSION], [${NC_DISPATCH_VERSION}], [Dispatch table version.])

diff --git a/docs/dispatch.md b/docs/dispatch.md
index 55346503f8..3ccd3a6365 100644
--- a/docs/dispatch.md
+++ b/docs/dispatch.md
@@ -499,6 +499,32 @@
 The code in *hdf4var.c* does an *nc_get_vara()* on the HDF4 SD dataset.
 This is all that is needed for all the nc_get_* functions to work.

+# Appendix A. Changing NC_DISPATCH_VERSION
+
+When new entries are added to the *struct NC_Dispatch* type
+-- located in *include/netcdf_dispatch.h.in* -- it is necessary to
+do two things.
+1. Bump the NC_DISPATCH_VERSION number
+2. Modify the existing dispatch tables to include the new entries.
+It is often the case that the new entries do not mean anything for
+a given dispatch table. In that case, the new entries may be set to
+some variant of *NC_RO_XXX*, *NC_NOTNC4_XXX*, or *NC_NOTNC3_XXX*.
+
+Modifying the dispatch version requires two steps:
+1. Modify the version number in *netcdf-c/configure.ac*, and
+2. Modify the version number in *netcdf-c/CMakeLists.txt*.
+
+The two should agree in value.
+
+### NC_DISPATCH_VERSION Incompatibility
+
+When dynamically adding a dispatch table
+-- in nc_def_user_format (see libdispatch/dfile.c) --
+the version of the new table is compared with that of the built-in
+NC_DISPATCH_VERSION; if they differ, then an error is returned from
+that function.
+
+
 # Point of Contact {#dispatch_poc}

 *Author*: Dennis Heimbigner
From b67583f0b13c17059d8b026150206f723b16fece Mon Sep 17 00:00:00 2001
From: Ward Fisher
Date: Tue, 10 Jan 2023 13:51:34 -0800
Subject: [PATCH 20/20] Fix a doxygen warning-treated-as-error

---
 docs/dispatch.md | 5 ++---
 1 file changed, 2 insertions(+), 3 deletions(-)

diff --git a/docs/dispatch.md b/docs/dispatch.md
index 3ccd3a6365..16689c629b 100644
--- a/docs/dispatch.md
+++ b/docs/dispatch.md
@@ -501,9 +501,8 @@ work.

 # Appendix A. Changing NC_DISPATCH_VERSION

-When new entries are added to the *struct NC_Dispatch* type
--- located in *include/netcdf_dispatch.h.in -- it is necessary to
-do two things.
+When new entries are added to the *struct NC_Dispatch* type `located in include/netcdf_dispatch.h.in` it is necessary to do two things.
+
 1. Bump the NC_DISPATCH_VERSION number
 2. Modify the existing dispatch tables to include the new entries.
 It if often the case that the new entries do not mean anything for