Report

Hi Nimbus-Team,
when trying to run Nimbus on a float32 TIFF image, I get the following JSON TypeError while running nimbus.prepare_normalization_dict():
File ~/anaconda3/envs/nimbus-env/lib/python3.9/site-packages/nimbus_inference/utils.py:563, in prepare_normalization_dict(dataset, output_dir, quantile, n_subset, n_jobs, output_name)
    561 # save normalization dict
    562 with open(os.path.join(output_dir, output_name), 'w') as f:
--> 563     json.dump(normalization_dict, f)
    564 return normalization_dict

File ~/anaconda3/envs/nimbus-env/lib/python3.9/json/__init__.py:179, in dump(obj, fp, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)
    173 iterable = cls(skipkeys=skipkeys, ensure_ascii=ensure_ascii,
    174     check_circular=check_circular, allow_nan=allow_nan, indent=indent,
    175     separators=separators,
    176     default=default, sort_keys=sort_keys, **kw).iterencode(obj)
    177 # could accelerate with writelines in some versions of Python, at
    178 # a debuggability cost
--> 179 for chunk in iterable:
    180     fp.write(chunk)

File ~/anaconda3/envs/nimbus-env/lib/python3.9/json/encoder.py:431, in _make_iterencode.<locals>._iterencode(o, _current_indent_level)
    429     yield from _iterencode_list(o, _current_indent_level)
    430 elif isinstance(o, dict):
--> 431     yield from _iterencode_dict(o, _current_indent_level)
    432 else:
    433     if markers is not None:

File ~/anaconda3/envs/nimbus-env/lib/python3.9/json/encoder.py:405, in _make_iterencode.<locals>._iterencode_dict(dct, _current_indent_level)
    403 else:
    404     chunks = _iterencode(value, _current_indent_level)
--> 405 yield from chunks
    406 if newline_indent is not None:
    407     _current_indent_level -= 1

File ~/anaconda3/envs/nimbus-env/lib/python3.9/json/encoder.py:438, in _make_iterencode.<locals>._iterencode(o, _current_indent_level)
    436     raise ValueError("Circular reference detected")
    437     markers[markerid] = o
--> 438 o = _default(o)
    439 yield from _iterencode(o, _current_indent_level)
    440 if markers is not None:

File ~/anaconda3/envs/nimbus-env/lib/python3.9/json/encoder.py:179, in JSONEncoder.default(self, o)
    160 def default(self, o):
    161     """Implement this method in a subclass such that it returns
    162     a serializable object for o, or calls the base implementation
    163     (to raise a TypeError).
    (...)
    177
    178     """
--> 179     raise TypeError(f'Object of type {o.__class__.__name__} '
    180                     f'is not JSON serializable')

TypeError: Object of type float32 is not JSON serializable
Nimbus runs smoothly on unsigned 16-bit images. However, when I convert a previously working 16-bit image to float32, it reproduces the same error.
Thank you for your help!!
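
For reference, the error can be reproduced outside of Nimbus: Python's json module rejects numpy float32 scalars. The snippet below is only a minimal sketch (not taken from the Nimbus code base) showing the failure and one possible workaround of casting to built-in floats before dumping.

# Minimal sketch (not Nimbus code): numpy float32 scalars are rejected by json.
import json
import numpy as np

value = np.float32(0.5)

try:
    json.dumps({"quantile": value})
except TypeError as err:
    print(err)  # Object of type float32 is not JSON serializable

# Casting to a built-in float (or using a custom JSONEncoder) avoids the error.
print(json.dumps({"quantile": float(value)}))  # {"quantile": 0.5}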
Version information
alpineer NA
nimbus_inference 0.0.2
session_info 1.0.0
PIL 11.0.0
appnope 0.1.4
asciitree NA
asttokens NA
backcall 0.2.0
backports NA
certifi 2024.08.30
charset_normalizer 2.1.1
colorama 0.4.6
comm 0.2.2
cv2 4.10.0
cycler 0.12.1
cython_runtime NA
dateutil 2.9.0
debugpy 1.6.7
decorator 5.1.1
dill 0.3.8
entrypoints 0.4
executing 2.1.0
filelock 3.16.1
huggingface_hub 0.25.2
idna 3.10
imagecodecs 2024.9.22
imageio 2.35.1
importlib_resources NA
ipykernel 6.29.5
ipywidgets 8.1.5
jaraco NA
jedi 0.19.1
joblib 1.4.2
kiwisolver 1.4.7
lazy_loader 0.4
lxml 5.3.0
matplotlib 3.9.2
more_itertools 10.3.0
mpl_toolkits NA
natsort 8.4.0
numcodecs 0.12.1
numpy 2.0.2
packaging 24.1
pandas 2.2.3
parso 0.8.4
pexpect 4.9.0
pickleshare 0.7.5
pkg_resources NA
platformdirs 4.3.6
prompt_toolkit 3.0.48
psutil 5.9.0
ptyprocess 0.7.0
pure_eval 0.2.3
pyarrow 17.0.0
pydev_ipython NA
pydevconsole NA
pydevd 2.9.5
pydevd_file_utils NA
pydevd_plugins NA
pydevd_tracing NA
pygments 2.18.0
pyometiff 1.0.1
pyparsing 3.2.0
pytz 2024.1
requests 2.32.3
scipy 1.13.1
six 1.16.0
skimage 0.24.0
stack_data 0.6.2
tifffile 2024.8.30
torch 2.4.1
torchgen NA
tornado 6.1
tqdm 4.66.5
traitlets 5.14.3
typing_extensions NA
unicodedata2 NA
urllib3 2.2.3
vscode NA
wcwidth 0.2.13
yaml 6.0.2
zarr 2.18.2
zmq 25.1.2
zoneinfo NA
IPython 8.12.0
jupyter_client 7.3.4
jupyter_core 5.7.2
Python 3.9.20 | packaged by conda-forge | (main, Sep 30 2024, 17:48:00) [Clang 17.0.6 ]
macOS-14.6.1-arm64-arm-64bit
Session information updated at 2024-10-30 14:41

I found the problem and it was on my end, I'm very sorry! It seems that I had numpy 2.0.2 installed in my environment instead of 1.26.4, which caused the JSON error. With the proper numpy version, Nimbus runs on float32 images without problems.
Sorry for wasting your time and thanks for the help anyway!
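
For anyone hitting the same thing, my guess (an assumption on my side, not verified against the Nimbus source) is that this comes from NumPy 2.x adopting the NEP 50 promotion rules: float32 scalars now stay float32 when combined with Python floats instead of being promoted to float64, and only float64 (which subclasses Python's float) is accepted by json.dump. A small sketch of the difference:

# Hedged sketch: scalar promotion differs between NumPy 1.x and 2.x (NEP 50).
import json
import numpy as np

x = np.float32(0.5) * 2.0                    # NumPy 1.x -> numpy.float64, NumPy 2.x -> numpy.float32
print(type(x))

print(isinstance(np.float64(1.0), float))    # True  -> json.dump accepts it
print(isinstance(np.float32(1.0), float))    # False -> "not JSON serializable"

print(json.dumps({"norm_value": float(x)}))  # an explicit cast works on either version

Pinning NumPy below 2.0 (e.g. pip install "numpy<2"), as I did, or explicitly casting the normalization values to float before writing the JSON should both avoid the error.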