Describe the bug
We ran into an issue during the release process. We were able to release to PyPI, pull in the latest version, and confirm that it works. However, when we were testing the Docker Hub staging image, we came across the following error.
benchmark@c5aca43430ba:~$ opensearch-benchmark --help
Traceback (most recent call last):
File "/usr/local/bin/opensearch-benchmark", line 5, in <module>
from osbenchmark.benchmark import main
File "/usr/local/lib/python3.11/site-packages/osbenchmark/benchmark.py", line 37, in <module>
from osbenchmark import version, actor, config, paths, \
File "/usr/local/lib/python3.11/site-packages/osbenchmark/test_execution_orchestrator.py", line 33, in <module>
from osbenchmark import actor, config, doc_link, \
File "/usr/local/lib/python3.11/site-packages/osbenchmark/worker_coordinator/__init__.py", line 26, in <module>
from .worker_coordinator import (
File "/usr/local/lib/python3.11/site-packages/osbenchmark/worker_coordinator/worker_coordinator.py", line 44, in <module>
from osbenchmark import actor, config, exceptions, metrics, workload, client, paths, PROGRAM_NAME, telemetry
File "/usr/local/lib/python3.11/site-packages/osbenchmark/workload/__init__.py", line 25, in <module>
from .loader import (
File "/usr/local/lib/python3.11/site-packages/osbenchmark/workload/loader.py", line 41, in <module>
from osbenchmark.workload import params, workload
File "/usr/local/lib/python3.11/site-packages/osbenchmark/workload/params.py", line 42, in <module>
from osbenchmark.utils.dataset import DataSet, get_data_set, Context
File "/usr/local/lib/python3.11/site-packages/osbenchmark/utils/dataset.py", line 13, in <module>
import h5py
File "/usr/local/lib/python3.11/site-packages/h5py/__init__.py", line 25, in <module>
from . import _errors
File "h5py/_errors.pyx", line 1, in init h5py._errors
ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject
We discovered that the root cause was the recent numpy 2.0.0 release, which conflicts with h5py 3.10.0. Our local desktops did not encounter this issue, but the Docker container only had Python 3.11 installed and needed to pull in numpy from scratch, which is why it picked up the latest version.
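As a quick check inside the container, the versions pip actually resolved can be inspected like this (the grep pattern is only illustrative):

# Illustrative check of which numpy/h5py versions pip resolved in the image
python3 -m pip list | grep -E '^(numpy|h5py)'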
We can solve this either by restricting setup.py to numpy versions below 2.0.0, or by removing the 3.10.0 pin for h5py in this line of the Dockerfile. The image works when numpy 2.0.0 and h5py 3.11.0 are installed.
RUN python3 -m pip install h5py==3.10.0; if [ -z "$VERSION" ] ; then python3 -m pip install opensearch-benchmark ; else python3 -m pip install opensearch-benchmark==$VERSION ; fi
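For the second option, a minimal sketch of that line with the pin dropped, assuming opensearch-benchmark itself declares h5py as a dependency so pip is free to resolve a release (3.11.0, per the testing above) that works with numpy 2.0.0:

# Sketch only: remove the explicit h5py==3.10.0 pin and let pip pick a compatible h5py;
# the VERSION handling stays the same as before.
RUN if [ -z "$VERSION" ] ; then python3 -m pip install opensearch-benchmark ; else python3 -m pip install opensearch-benchmark==$VERSION ; fi

The first option would instead pin numpy below 2.0.0 in setup.py, which keeps h5py 3.10.0 usable but blocks numpy 2.x for every install, not just the Docker image.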
To reproduce
Install OSB 1.7.0
Force install numpy 2.0.0
Run opensearch-benchmark --help (see the example commands below)
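For example, in a clean Python 3.11 environment the steps could look like this (the 1.7.0 and 2.0.0 pins come from this report; the exact pip invocation is only illustrative):

python3 -m pip install opensearch-benchmark==1.7.0
python3 -m pip install --force-reinstall numpy==2.0.0
opensearch-benchmark --help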
The error should be present:
benchmark@c5aca43430ba:~$ opensearch-benchmark --help
Traceback (most recent call last):
File "/usr/local/bin/opensearch-benchmark", line 5, in <module>
from osbenchmark.benchmark import main
File "/usr/local/lib/python3.11/site-packages/osbenchmark/benchmark.py", line 37, in <module>
from osbenchmark import version, actor, config, paths, \
File "/usr/local/lib/python3.11/site-packages/osbenchmark/test_execution_orchestrator.py", line 33, in <module>
from osbenchmark import actor, config, doc_link, \
File "/usr/local/lib/python3.11/site-packages/osbenchmark/worker_coordinator/__init__.py", line 26, in <module>
from .worker_coordinator import (
File "/usr/local/lib/python3.11/site-packages/osbenchmark/worker_coordinator/worker_coordinator.py", line 44, in <module>
from osbenchmark import actor, config, exceptions, metrics, workload, client, paths, PROGRAM_NAME, telemetry
File "/usr/local/lib/python3.11/site-packages/osbenchmark/workload/__init__.py", line 25, in <module>
from .loader import (
File "/usr/local/lib/python3.11/site-packages/osbenchmark/workload/loader.py", line 41, in <module>
from osbenchmark.workload import params, workload
File "/usr/local/lib/python3.11/site-packages/osbenchmark/workload/params.py", line 42, in <module>
from osbenchmark.utils.dataset import DataSet, get_data_set, Context
File "/usr/local/lib/python3.11/site-packages/osbenchmark/utils/dataset.py", line 13, in <module>
import h5py
File "/usr/local/lib/python3.11/site-packages/h5py/__init__.py", line 25, in <module>
from . import _errors
File "h5py/_errors.pyx", line 1, in init h5py._errors
ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject
Expected behavior
There should be no dependency conflicts.
Screenshots
If applicable, add screenshots to help explain your problem.
Host / Environment
No response
Additional context
No response
Relevant log output
No response