make distcheck fails on master... #2338
OK, I've fixed that problem, but here's the next one:
OK, fixed that one. Now I get:
OK, I've got it working. PR shortly...
Also:
OK, I have fixed all the missing file problems in the Makefile.am files. However, now I'm left with this more difficult issue:
I am going to put up a PR with all the Makefile.am changes while I work on this issue...
@DennisHeimbigner I think the issue here is the way you are installing the plugins with your own script. In the CCR project, I install them without a script with some automake magic. Here's the example from the zstd plugin:
This builds and installs the plugin in the HDF5_PLUGIN_DIR, which is set by configure, but defaults to the HDF5 default. No install script needed.
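The zstd-plugin Makefile.am itself is elided above, but the automake idiom being described can be sketched roughly as follows (a sketch only; the library and source file names here are assumptions, not the actual CCR code):

```make
# Sketch: install the plugin shared library into the directory chosen
# by configure (HDF5_PLUGIN_DIR), with no install script needed.
plugindir = @HDF5_PLUGIN_DIR@
plugin_LTLIBRARIES = libh5zzstd.la
libh5zzstd_la_SOURCES = H5Zzstd.c H5Zzstd.h
```

With this, `make install` copies the libtool library into `$(plugindir)` automatically, because automake treats `plugin_LTLIBRARIES` as an installable primary rooted at `plugindir`.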
I am surprised; I thought we ran a distcheck under our GitHub Actions. Guess not.
Hmmm? Why the reluctance to install the HDF5 plugins? It doesn't auto-install - the user has to do a "make install". Without that, zstandard will not work. ;-)
For example, I just tried adding this test code, and it fails with current master:
It fails. It does not benefit netCDF users if they can't turn on the feature. If we demand a second step, where they have to manually install the plugins, we will generate a lot of support questions and user confusion. What is the benefit of that? The plugin directory is there so that plugins can be installed into it. That's how HDF5 intends us to use plugins. So why the reluctance?
I took your PR #2342 and made some corrections to it.
OK, shall I take down #2342? If I read your response correctly, we are not going to install the plugins in the HDF5_PLUGIN_PATH?
I had not planned to default to HDF5_PLUGIN_PATH, but since it seems important to you,
Do I install into dir1, dir2, or dir3?
dir1. Here is how I check for it in configure.ac:
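The actual configure.ac fragment is elided above; a hedged sketch of such a check (not the real code) could look like this, taking the first directory of the colon-separated HDF5_PLUGIN_PATH and falling back to HDF5's documented default install location:

```
# Sketch only: choose where to install HDF5 filter plugins.
AC_MSG_CHECKING([where to install HDF5 plugins])
if test "x$HDF5_PLUGIN_PATH" = x ; then
  # HDF5's default plugin directory on Unix-like systems.
  HDF5_PLUGIN_DIR="/usr/local/hdf5/lib/plugin"
else
  # Use dir1: the first entry of the (colon-separated) search path.
  HDF5_PLUGIN_DIR=`echo "$HDF5_PLUGIN_PATH" | cut -d: -f1`
fi
AC_SUBST([HDF5_PLUGIN_DIR])
AC_MSG_RESULT([$HDF5_PLUGIN_DIR])
```

Note that on Windows the path separator is ';' rather than ':', so a real check would need to handle that case as well.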
In addition to using HDF5_PLUGIN_PATH, you need to default to /usr/local/hdf5/lib/plugin when it is not set. (I don't know what the Windows equivalent is.) Most users will not have HDF5_PLUGIN_PATH set, so we need to use HDF5's default location, because that will allow zstandard to work for netcdf-c out of the box.
This won't work if NCZarr is enabled but HDF5 is not.
Yes, the HDF5_PLUGIN_PATH can be completely ignored in that case.
Why? NCZarr uses it too.
OK, sorry, I didn't know. In any case, when HDF5 is not built, there is no need to worry about the HDF5 default path for plugins.

When asking you to support the HDF5 default, I am thinking of NOAA and other HPC sites, which are going to be producing large netCDF/HDF5 datasets and will want to use zstandard (as they want to do at NOAA). Having zstandard and HDF5 work together out of the box is very important for scientists who are good at science but not very skilled at installing software. Even advanced users are probably not using any HDF5 filters yet, so they will have no concept of where filters should be installed; they should not be expected to know anything about plugin paths.

When NOAA starts releasing data compressed with zstandard, we want the community to be able to read those files easily, without having to submit a netcdf support issue or figure out how to use HDF5 plugins. Instead, it should just quietly work in the default case. The vast majority of users will not care about plugin paths, and will assume that netcdf is going to install itself in the correct place(s) for everything to work.
Side note. Apparently in HDF5 version 1.13, it will be possible to programmatically |
That will be handy. For testing, what we will have to do is run each test from a wrapper script that sets HDF5_PLUGIN_PATH and then runs the test; that way, the test can be run before make install. Here's an example:
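The example script itself is elided above; a minimal sketch of such a wrapper (the plugin directory and test-binary names are assumptions, not netcdf-c's actual test scripts) might look like:

```shell
#!/bin/sh
# Sketch of a test wrapper: point HDF5 at the just-built, not-yet-installed
# plugin directory so filter tests can run before 'make install'.
HDF5_PLUGIN_PATH="${HDF5_PLUGIN_PATH:-$(pwd)/plugins/.libs}"
export HDF5_PLUGIN_PATH
echo "Using HDF5_PLUGIN_PATH=$HDF5_PLUGIN_PATH"
# exec ./tst_zstd   # hypothetical filter test binary
```

Because the variable is exported, the test binary (and the HDF5 library it links against) inherits the plugin search path for the duration of the run, without touching the user's environment or the install tree.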
Yes, although some filter tests are a bit more complex. There is already code |
…ript re: Unidata#2338 re: Unidata#2294

In issue Unidata#2338, Ed Hartnett suggested a better way to install filters to a user-defined location -- for Automake, anyway. This PR implements that suggestion. It turns out to be more complicated than it appears, so there are a fair number of changes, mostly to shell scripts. Most of the change is in plugins/Makefile.am.

NOTE: this PR still does NOT address the use of HDF5_PLUGIN_PATH as the default; this turns out to be complex when dealing with NCZarr, so it will be addressed in a subsequent post-4.9.0 PR.

## Misc. Changes
1. Record the occurrences of incomplete codecs in libnczarr so that they can be included in the _Codecs attribute correctly. This allows users to see which missing filters are referenced in a Zarr file. Primarily affects libnczarr/zfilter.[ch]. Also required creating a new no-effect filter: H5Zunknown.c.
2. Move the unknown-filter test to a separate test file.
3. Incorporate PR Unidata#2343.
make distcheck is still failing on the current main branch:
I'm not seeing a failure on my local macOS machine. I'll take a closer look in the morning.
It's because of this section in Makefile.am:
@DennisHeimbigner what is being attempted here? Removing the BZIP2_LICENSE file is causing trouble. |
The comment preceding this block says:
So you should never be invoking this code; it is only to record how bzip2 was built. |
Also, BZIP2_LICENSE is part of the EXTRA_DIST. Did you accidentally delete it? |
I did a make check then a make distcheck. |
Is BZIP2_LICENSE anywhere in your build directory? |
OK everything works with a clean clone. I will close this issue. |
I will take a look...