Hyperspectral Denoising algorithm toolbox in Python


General User Installation

This project requires the PyTorch-wavelets package, which does not have a PyPI release. To install HyDe as a pip package, first install pytorch_wavelets directly from GitHub and then install HyDe itself. Developers should use the Development Installation section further down this page.

pip install git+https://github.com/fbcotter/pytorch_wavelets
pip install hyde-images
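
After both commands finish, a quick import check confirms the installation. This is a minimal sketch; the top-level module names are assumed to be hyde and pytorch_wavelets, as exposed by the two packages above.

# sanity check: both packages should import without errors
import pytorch_wavelets  # installed from GitHub above
import hyde              # installed from PyPI as hyde-images

print("hyde and pytorch_wavelets imported successfully")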

Description

Image denoising is the task of recovering the true, unknown image from a degraded observation. It plays an important role in a variety of applications, for example in remote sensing imaging systems used for lithological mapping. Hyperspectral Denoising is a Python toolbox that provides, as the name suggests, denoising algorithms for hyperspectral image data. In particular, we provide:

  • A wide variety of hyperspectral denoising algorithms (see High Level Methods below for details)
  • GPU acceleration for all algorithms
  • An intuitive, pythonic API design
  • PyTorch compatibility

High Level Methods

Conventional Methods (see src/hyde)

Neural Methods (see src/hyde/nn)

  • QRNN3D
  • QRNN2D
  • MemNet
  • MemNet3D
  • HSID-CNN (DeNet-like)
  • HSID-CNN-3D (DeNet-like)
  • MemNet + trainable HyRes step

Pretrained Models

Pretrained models are available in the GitHub repository but NOT in the pip release. To use one of these models, download the model file and load it with NNInference:

method = hyde.NNInference(arch="qrnn3d", pretrained_file="/path/to/downloaded/model.pth")
output = method(noisy, band_dim=-1, permute=True)

Please check whether you need to set the band_dim and permute flags for your data layout.
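
As a rough guide, band_dim indicates which axis of the input tensor holds the spectral bands and permute whether the axes should be reordered before the network sees the data; verify the exact semantics against NNInference in src/hyde/nn. A minimal sketch for a bands-last cube, following the call above:

import torch
import hyde

method = hyde.NNInference(arch="qrnn3d", pretrained_file="/path/to/downloaded/model.pth")

# synthetic cube stored as (rows, cols, bands): the spectral axis is last,
# hence band_dim=-1; permute=True lets the wrapper reorder the axes for the network
noisy = torch.rand(256, 256, 31)
denoised = method(noisy, band_dim=-1, permute=True)

# a cube already in the layout the network expects would need different flag
# values (e.g. permute=False) -- check the NNInference signature for your case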

High Level Function Usage

The high level functions (see High Level Methods above) are built on torch.nn.Module. This means that they are classes which must be instantiated before they can be used. An example of using HyRes with the default parameters is shown below.

import hyde
import torch

# loaded_image: your hyperspectral cube, e.g. a NumPy array of shape (rows, cols, bands)
input_tens = torch.tensor(loaded_image, dtype=torch.float32, device="cuda")  # or device="cpu"
hyres = hyde.HyRes()
output = hyres(input_tens)
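
Because the result is itself a PyTorch tensor, it can be passed on to further torch operations or converted back to NumPy. Below is a self-contained sketch on a synthetic cube; the (rows, cols, bands) layout and the plain-tensor return type are assumptions, so check the HyRes documentation for the expected orientation.

import torch
import hyde

# synthetic (rows, cols, bands) cube with values in [0, 1]
noisy = torch.rand(128, 128, 31, dtype=torch.float32)
hyres = hyde.HyRes()
denoised = hyres(noisy)

# move the result to the CPU and hand it off to NumPy-based tooling
denoised_np = denoised.detach().cpu().numpy()
print(denoised_np.shape)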

Citation

If you use this repository, please cite the following paper:

Bibtex:

@inproceedings{coquelin2022hyde,
  author={Coquelin, Daniel and Rasti, Behnood and Götz, Markus and Ghamisi, Pedram and Gloaguen, Richard and Streit, Achim},
  booktitle={2022 12th Workshop on Hyperspectral Imaging and Signal Processing: Evolution in Remote Sensing (WHISPERS)}, 
  title={Hyde: The First Open-Source, Python-Based, Gpu-Accelerated Hyperspectral Denoising Package}, 
  year={2022},
  volume={},
  number={},
  pages={1-5},
  doi={10.1109/WHISPERS56178.2022.9955088}
}

Plain text:

[1] D. Coquelin, B. Rasti, M. Götz, P. Ghamisi, R. Gloaguen and A. Streit, "Hyde: The First Open-Source, Python-Based, Gpu-Accelerated Hyperspectral Denoising Package," 2022 12th Workshop on Hyperspectral Imaging and Signal Processing: Evolution in Remote Sensing (WHISPERS), 2022, pp. 1-5, doi: 10.1109/WHISPERS56178.2022.9955088.

Development Installation

In order to set up the necessary environment:

  1. create a virtual environment, here named hyde_venv (alternatively, review and uncomment what you need in environment.yml and create a conda environment):
    python -m venv hyde_venv
    
  2. activate the new environment with:
    source hyde_venv/bin/activate
    
  3. install the requirements and the package itself in editable mode:
    pip install -r requirements.txt -e .
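
    Afterwards, a quick Python check (a sketch, run inside the activated hyde_venv) confirms that the editable install resolves to your working copy:

    # the reported path should point into src/hyde/ of the checkout
    import hyde
    print(hyde.__file__)

    The unit tests under tests/ can then be run with py.test.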
    

Optional and needed only once after git clone:

  1. install several pre-commit git hooks with:

    pre-commit install
    # You might also want to run `pre-commit autoupdate`

    and check out the configuration under .pre-commit-config.yaml. The -n, --no-verify flag of git commit can be used to deactivate pre-commit hooks temporarily.

  2. install nbstripout git hooks to remove the output cells of committed notebooks with:

    nbstripout --install --attributes notebooks/.gitattributes

    This is useful to avoid large diffs due to plots in your notebooks. A simple nbstripout --uninstall will revert these changes.

Then take a look into the scripts and notebooks folders.

Project Organization

├── AUTHORS.md              <- List of developers and maintainers.
├── CHANGELOG.md            <- Changelog to keep track of new features and fixes.
├── LICENSE.txt             <- License file (BSD-3).
├── README.md               <- The top-level README for developers.
├── configs                 <- Directory for configurations of model & application.
├── docs                    <- Directory for Sphinx documentation in rst or md.
├── environment.yml         <- The conda environment file for reproducibility.
├── notebooks               <- Jupyter notebooks. Naming convention is a number (for
│                              ordering), the creator's initials and a description,
│                              e.g. `1.0-fw-initial-data-exploration`.
├── pyproject.toml          <- Build system configuration. Do not change!
├── references              <- Data dictionaries, manuals, and all other materials.
├── scripts                 <- Analysis and production scripts which import the
│                              actual Python package, e.g. train_model.py.
├── setup.cfg               <- Declarative configuration of your project.
├── setup.py                <- Use `pip install -e .` to install for development
│                              or create a distribution with `tox -e build`.
├── src
│   └── hyde                <- Actual Python package where the main functionality goes.
├── tests                   <- Unit tests which can be run with `py.test`.
├── .coveragerc             <- Configuration for coverage reports of unit tests.
├── .isort.cfg              <- Configuration for git hook that sorts imports.
└── .pre-commit-config.yaml <- Configuration of pre-commit git hooks.

License

Hyperspectral Denoising is distributed under the BSD-3 license; see the LICENSE.txt file.

Note

This project has been set up using PyScaffold 4.0.1 and the dsproject extension 0.6.1.

Acknowledgements

This work is supported by the Helmholtz Association Initiative and Networking Fund under the Helmholtz AI platform grant.