
OpenRetina


An open-source repository containing neural network models of the retina. The models in this repository are inspired by, and partially contain adapted code from, sinzlab/neuralpredictors.

Installation

For normal usage:

pip install openretina

For development:

git clone git@github.com:open-retina/open-retina.git
cd open-retina
pip install -e .

Before raising a PR, please run:

# Fix formatting of python files
make fix-formatting
# Run type checks and unit tests
make test-all

Design decisions and structure

With this repository we provide pre-trained retina models that can be used for inference and interpretability out of the box, as well as dataloaders and model architectures for training new models. For training new models, we rely on PyTorch Lightning in combination with Hydra to manage the configurations for training and dataloading.
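As a rough illustration of this setup, the sketch below shows how a Hydra config can drive a PyTorch Lightning training run. The config layout (`cfg.dataloader`, `cfg.model`, `cfg.max_epochs`) and the config names are assumptions made for this example and do not necessarily match the configs shipped with openretina.

```python
# Minimal sketch of a Hydra + PyTorch Lightning training entry point.
# The config fields (cfg.dataloader, cfg.model, cfg.max_epochs) and the
# config names are illustrative assumptions, not openretina's actual layout.
import hydra
import lightning.pytorch as pl
from omegaconf import DictConfig


@hydra.main(config_path="configs", config_name="train", version_base=None)
def main(cfg: DictConfig) -> None:
    # Hydra instantiates the dataloader and model classes named in the config.
    dataloader = hydra.utils.instantiate(cfg.dataloader)
    model = hydra.utils.instantiate(cfg.model)

    # PyTorch Lightning handles the training loop, checkpointing and logging.
    trainer = pl.Trainer(max_epochs=cfg.max_epochs)
    trainer.fit(model, train_dataloaders=dataloader)


if __name__ == "__main__":
    main()
```

One benefit of this combination is that Hydra lets you override any config value from the command line (e.g. appending `max_epochs=50` to the training command), so experiments can be varied without editing config files.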

The openretina package is structured as follows:

  • modules: PyTorch modules that define layers and losses
  • models: PyTorch Lightning models that can be trained and evaluated (i.e. models from specific papers)
  • data_io: dataloaders that manage access to the data used for training
  • insilico: methods to perform in silico experiments with the above models
    • stimulus_optimization: optimize inputs for neurons of the above models according to interpretable objectives (e.g. most exciting inputs); see the sketch after this list
    • future options: gradient analysis, data analysis
  • utils: utility functions used across the above submodules
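To make the stimulus_optimization idea concrete, here is a minimal gradient-ascent sketch in plain PyTorch for finding a most exciting input of a single model neuron. The function name, the stimulus shape handling, and the indexing into the model output are illustrative assumptions and do not reflect the actual openretina API.

```python
import torch


def optimize_most_exciting_input(model: torch.nn.Module, neuron_idx: int,
                                 input_shape: tuple, steps: int = 200,
                                 lr: float = 1e-2) -> torch.Tensor:
    # Start from a small random stimulus and optimize it by gradient ascent
    # so that the response of one model neuron is maximized.
    # Assumes the model maps a stimulus of shape (1, *input_shape) to a
    # (batch, neurons) response tensor; real models may differ.
    stimulus = torch.randn(1, *input_shape, requires_grad=True)
    optimizer = torch.optim.Adam([stimulus], lr=lr)

    model.eval()
    for _ in range(steps):
        optimizer.zero_grad()
        response = model(stimulus)[0, neuron_idx]
        # Gradient ascent on the response = gradient descent on its negative.
        (-response).backward()
        optimizer.step()
    return stimulus.detach()
```

In practice, such optimization is usually combined with constraints on the stimulus (e.g. norm or contrast budgets) so that the resulting input stays within the range the model was trained on.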

Related papers

The model in openretina/hoefling_2024 was developed in the paper A chromatic feature detector in the retina signals visual context changes and can be cited as:

@article {10.7554/eLife.86860,
article_type = {journal},
title = {A chromatic feature detector in the retina signals visual context changes},
author = {Höfling, Larissa and Szatko, Klaudia P and Behrens, Christian and Deng, Yuyao and Qiu, Yongrong and Klindt, David Alexander and Jessen, Zachary and Schwartz, Gregory W and Bethge, Matthias and Berens, Philipp and Franke, Katrin and Ecker, Alexander S and Euler, Thomas},
editor = {Rieke, Fred and Smith, Lois EH and Rieke, Fred and Baccus, Stephen A and Wei, Wei},
volume = 13,
year = 2024,
month = {oct},
pub_date = {2024-10-04},
pages = {e86860},
citation = {eLife 2024;13:e86860},
doi = {10.7554/eLife.86860},
url = {https://doi.org/10.7554/eLife.86860},
keywords = {retina, computational modelling, visual ecology, convolutional neural networks, 2P imaging, natural stimuli},
journal = {eLife},
issn = {2050-084X},
publisher = {eLife Sciences Publications, Ltd},
}

The paper Most discriminative stimuli for functional cell type clustering describes a method to automatically cluster and interpret the modeled neurons and was also applied to the above model (for code see ecker-lab/most-discriminative-stimuli).
