
[REVIEW]: Osier: A Python package for multi-objective energy system optimization #6919

Open
editorialbot opened this issue Jun 22, 2024 · 51 comments
Labels: Python, recommend-accept (Papers recommended for acceptance in JOSS), review, TeX, Track: 3 (PE) Physics and Engineering

Comments

@editorialbot
Collaborator

editorialbot commented Jun 22, 2024

Submitting author: @samgdotson (Samuel Dotson)
Repository: https://github.com/arfc/osier
Branch with paper.md (empty if default branch): joss
Version: v0.3.1
Editor: @prashjha
Reviewers: @fredshone, @victoraalves
Archive: 10.5281/zenodo.14216170

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/183a04edba2d4952fa1e30c419a844b3"><img src="https://joss.theoj.org/papers/183a04edba2d4952fa1e30c419a844b3/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/183a04edba2d4952fa1e30c419a844b3/status.svg)](https://joss.theoj.org/papers/183a04edba2d4952fa1e30c419a844b3)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@fredshone & @victoraalves, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. For any questions or concerns, please let @prashjha know.

Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest.

Checklists

📝 Checklist for @fredshone

📝 Checklist for @victoraalves

@editorialbot
Collaborator Author

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot
Collaborator Author

Software report:

github.com/AlDanial/cloc v 1.90  T=0.04 s (953.6 files/s, 197346.6 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          15            575            872           2155
Markdown                         9            113              0            420
Jupyter Notebook                 3              0           2332            316
TeX                              1             12              0            200
YAML                             3             12              8             84
DOS Batch                        1              8              1             26
make                             1              4              7              9
reStructuredText                 2             16             64              9
-------------------------------------------------------------------------------
SUM:                            35            740           3284           3219
-------------------------------------------------------------------------------

Commit count by author:

   286	Sam Dotson
     7	samgdotson
     2	Samuel Dotson
     1	yardasol

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1007/s10479-015-1791-y is OK
- 10.1109/ACCESS.2020.2990567 is OK
- 10.1088/1748-9326/ab875d is OK
- 10.1007/s12532-011-0026-8 is OK
- 10.1016/j.erss.2021.101908 is OK
- 10.1287/inte.6.4.102 is OK
- 10.1016/j.ejor.2018.01.036 is OK
- 10.1016/j.rser.2014.02.003 is OK
- 10.1016/j.apenergy.2020.114728 is OK
- 10.1016/j.erss.2022.102913 is OK
- 10.3390/en12163046 is OK

MISSING DOIs

- No DOI given, and none found for title: 2023 Annual Technology Baseline (ATB)
- No DOI given, and none found for title: Openmod - Open Energy Modelling Initiative

INVALID DOIs

- None

@editorialbot
Collaborator Author

Paper file info:

📄 Wordcount for paper.md is 1154

✅ The paper includes a Statement of need section

@editorialbot
Collaborator Author

License info:

✅ License found: BSD 3-Clause "New" or "Revised" License (Valid open source OSI approved license)

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@fredshone

fredshone commented Jun 25, 2024

Review checklist for @fredshone

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/arfc/osier?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@samgdotson) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data/research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@fredshone

Really nice work, well presented.

I require some changes, detailed below, with optional actions mixed in. A lot of these are based on compatibility with Windows. Although not required, I suggest both adding Windows to your CI and including env control in the docs. Mamba is very good. There are example projects here and here.


Installation and Installation Instructions:

On a Windows machine I hit two blockers: (i) Python version (issue here), and (ii) solver installation documentation (issue here).

I strongly recommend adding a Windows machine to the CI. Otherwise this is likely to recur.
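For reference, adding a Windows runner is usually a one-line change to a GitHub Actions test matrix. A minimal sketch, assuming the repository already uses Actions (the job name, step layout, and Python versions here are illustrative, not taken from the osier workflow):

```yaml
jobs:
  test:
    strategy:
      matrix:
        # windows-latest is the addition; the other entries mirror a
        # typical cross-platform matrix.
        os: [ubuntu-latest, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      - run: pip install -e .
      - run: pytest
```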

I have also suggested that the default Windows solver be glpk. This is easy to install and will smooth CI. It is also more in keeping with JOSS (in my opinion). If you stick with a less open solver, I think this needs to be explicitly mentioned in the paper.
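One way to make the default solver robust across platforms is to probe the PATH for a known solver executable and fall back accordingly. The helper below is hypothetical (not part of osier's API); the returned name is the kind of string Pyomo's `SolverFactory` accepts, with glpk tried first per the suggestion above:

```python
import shutil

# Hypothetical mapping (NOT from osier): solver name understood by
# Pyomo's SolverFactory -> the executable it needs on the PATH.
PREFERRED_SOLVERS = {
    "glpk": "glpsol",  # GNU GLPK ships `glpsol`; easy to install on Windows
    "cbc": "cbc",      # COIN-OR CBC
}

def pick_solver(preferences=None, which=shutil.which):
    """Return the first solver whose executable is available on the PATH."""
    for name, executable in (preferences or PREFERRED_SOLVERS).items():
        if which(executable):
            return name
    raise RuntimeError(
        "No supported solver found; install GLPK or CBC and make sure "
        "its executable is on your PATH."
    )
```

A caller could then do something like `SolverFactory(pick_solver())` so that the same code works whether CBC or GLPK happens to be installed.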

OPTIONAL: I also encourage use of mamba in the install instructions, as per this issue.


Example usage: OPTIONAL - there is only a single example of multi-criteria, and it is a little hidden at the end of a notebook. Please consider making this more obvious; at the moment multi-criteria appears as an afterthought rather than a core feature. OPTIONAL - please consider making the example notebooks more accessible in the project rather than just the docs. See this issue.


Automated tests: Please add a code coverage report (e.g., pytest-cov).
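Coverage reporting with pytest-cov is typically a small change to the test command. The workflow step below is a sketch, not the repository's actual CI configuration:

```yaml
# Hypothetical addition to a GitHub Actions workflow: run the suite
# under pytest-cov and print per-file line coverage, flagging missed lines.
- name: Run tests with line coverage
  run: |
    pip install pytest-cov
    pytest --cov=osier --cov-report=term-missing
```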


State of the field and References: Please provide some more specific description of, and references to, similar tools. I think your focus on multi-criteria is very important; as you state, there are many other open energy system modelling tools. You would make the case better with more explicit comparison to similar projects that don't do multi-criteria. For example, Calliope is very similar in many ways but does not do multi-criteria optimization without hacking the solver.

@samgdotson

Hi @fredshone, I appreciate the review and the kind words. Addressing these issues is on my radar but I've hit a busy moment with other work -- I haven't forgotten about this, it's just on the back burner for now!

@fredshone

Hi @fredshone, I appreciate the review and the kind words. Addressing these issues is on my radar but I've hit a busy moment with other work -- I haven't forgotten about this, it's just on the back burner for now!

No problem. You don't have to address everything at once - also happy to review bits and bobs when you have time. Similarly the review process is supposed to be open and two-way, some of my requests are quite specific, but you might be able to address them in better ways. Hope that makes sense.

@prashjha

prashjha commented Aug 5, 2024

Hi @victoraalves, I hope you are doing well. I see you have not created a reviewer checklist yet. Could you please do so soon? If I can help you with anything, just let me know.

@prashjha

Hi @victoraalves, could you please start the review? It has been two months since it started. I understand you are busy with other commitments. If you could provide some timeline, that would be helpful.

Best wishes, Prashant

@victoraalves

Hi @prashjha firstly, sorry for the late reply. I will finalize my review by no later than next week. Again, my apologies.

@victoraalves

victoraalves commented Aug 21, 2024

Review checklist for @victoraalves

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/arfc/osier?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@samgdotson) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data/research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@prashjha

Hi @prashjha firstly, sorry for the late reply. I will finalize my review by no later than next week. Again, my apologies.

Glad to hear from you and thank you for the update.

@victoraalves

victoraalves commented Aug 29, 2024

I have performed my initial review.

The tool is interesting and a good contribution! Congratulations, nice work.

I had some issues regarding Python version compatibility (already mentioned here by the other reviewer), as well as problems related to NumPy 2.0, which breaks some of the functionality of the tool.

Installation

Installation on macOS was a bit problematic, due to the new Python versions and the NumPy version installed by default when doing "pip install". This needs to be clarified in the documentation and/or fixed in the project as well.

Testing

Some tests failed due to the lack of the "cbc" solver, despite the fact that I followed the installation instructions as recommended.

EDIT: I have now found that this solver (and others) needs to be installed separately. My suggestion, then, is to provide a conda-based installation of your package, which would help automate the whole installation process.
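One way to realize this suggestion is a conda environment file that installs the solver binaries alongside the Python dependencies. A sketch, assuming the listed packages are available on conda-forge (the NumPy pin reflects the 2.0 breakage noted above; names and versions here are illustrative):

```yaml
name: osier-env
channels:
  - conda-forge
dependencies:
  - python=3.10
  - numpy<2     # NumPy 2.0 reportedly breaks parts of the tool
  - coincbc     # provides the cbc solver executable
  - glpk        # provides glpsol as an alternative solver
  - pip
  - pip:
      - osier
```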

Status of the field

As mentioned, I believe it's a good idea to highlight other tools that have the same (or similar) objectives and explain how Osier differentiates itself.

@samgdotson

samgdotson commented Sep 23, 2024

@fredshone @victoraalves

Thank you both for your reviews! @fredshone, thank you for making a pull request, as well.

Regarding the status of the field, how much elaboration are you both looking for? The paper is already ~1000 words. I have an open pull request that adds a direct reference to two open source packages. Is more elaboration necessary or will this suffice?

Two well known open-source ESOMs, Calliope [@pfenninger:2018] and Python for Power Systems Analysis (PyPSA)
[@brown:2018], partially address equity issues by implementing MGA, but this does not resolve the limitation of
mono-objective optimization.

Please let me know!

@fredshone

fredshone commented Sep 23, 2024

@samgdotson short is good, I agree. (From memory) I'd like to see a specific mention of how your work builds on the above by adding multi-objective functionality. This could be a one-liner.

FYI, I spoke to the Calliope authors and they agreed MO was not a proper or intended functionality. So I suggest you lean on it hard.

More pressing for me personally: when I went through the examples, the (single) multi-criteria example was at the end of a long notebook. I'd like to see it (perhaps with an even more minimal example) brought more front and center, so that a new user (me, in this case) can more quickly use the examples to understand the project scope.

@fredshone

Nice job with the CI and operating systems btw 🥇

@victoraalves

@samgdotson Thank you for addressing this. I agree that you don't have to craft a long discussion on this. To me, your current PR is sufficient.

@samgdotson

@fredshone @victoraalves I pushed a final PR that should address the remaining comments.

This PR

  • adds a notebook with a "minimum working example" (which I suppose is only "minimal" in the energy modeling space 😄) called "getting started."
  • adds an environment.yml file and instructions to use this file to install osier -- this should cover the env control request from @fredshone
  • addresses the notebook access by adding a binder badge to each example notebook in the repository (I couldn't get nblink to work and I don't want two versions of the examples floating around; perhaps this can be added in the future).
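For anyone replicating this setup, a Binder badge of the kind described is usually a single line of Markdown per notebook. The snippet below uses the generic mybinder.org URL pattern and assumes the default branch; the exact notebook path would need to be appended for a per-notebook link:

```markdown
[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/arfc/osier/HEAD)
```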

If you both agree and are satisfied, what are the next steps? Thanks for your reviews!

@fredshone

Looks good, I will make a final skim through everything once merged.

@samgdotson

@fredshone, all merged!

@fredshone

@editorialbot generate pdf

@fredshone

@editorialbot commands

@editorialbot
Collaborator Author

Hello @fredshone, here are the things you can ask me to do:


# List all available commands
@editorialbot commands

# Get a list of all editors' GitHub handles
@editorialbot list editors

# Adds a checklist for the reviewer using this command
@editorialbot generate my checklist

# Set a value for branch
@editorialbot set joss-paper as branch

# Run checks and provide information on the repository and the paper file
@editorialbot check repository

# Check the references of the paper for missing DOIs
@editorialbot check references

# Generates the pdf paper
@editorialbot generate pdf

# Generates a LaTeX preprint file
@editorialbot generate preprint

# Get a link to the complete list of reviewers
@editorialbot list reviewers

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@fredshone

@editorialbot check references

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1007/s10479-015-1791-y is OK
- 10.1109/ACCESS.2020.2990567 is OK
- 10.1088/1748-9326/ab875d is OK
- 10.1007/s12532-011-0026-8 is OK
- 10.1016/j.erss.2021.101908 is OK
- 10.1287/inte.6.4.102 is OK
- 10.1016/j.ejor.2018.01.036 is OK
- 10.1016/j.rser.2014.02.003 is OK
- 10.1016/j.apenergy.2020.114728 is OK
- 10.1016/j.erss.2022.102913 is OK
- 10.3390/en12163046 is OK
- 10.5334/jors.188 is OK
- 10.21105/joss.00825 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: 2023 Annual Technology Baseline (ATB)
- No DOI given, and none found for title: Openmod - Open Energy Modelling Initiative

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

@fredshone

@samgdotson @prashjha - Looks ready to me, nice job 🙇🏻 .

@prashjha

@fredshone, thank you for letting us know.

@prashjha

Hi @victoraalves, I'm just checking to see if you are close to finishing the reviews! Thank you!

@victoraalves

Hi @prashjha, LGTM, nice work!

@samgdotson

@prashjha what are the next steps?

@prashjha

Hi @fredshone and @victoraalves, thank you for your efforts in reviewing this submission!

@prashjha

Hi @samgdotson, sorry for the delay.

I am reading the draft and will let you know if I have any suggestions. The next step would be to recommend acceptance, and one of the EiCs will make the final call. It should be fairly quick.

Meanwhile, could you please archive the release (if not done already) using Zenodo and provide the archive reference so that I can associate it with your JOSS submission? Also, please ensure that the Zenodo archive's title matches the title of this JOSS article.

If you have released an updated version of your code, let me know, and I can update it here.

@prashjha

@editorialbot generate pdf


@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈


@prashjha

Hello @samgdotson, I read the article and it looks good to me. Please read my comments above and respond.

@samgdotson

Hi @prashjha, the Zenodo publication is available here: 10.5281/zenodo.14216170

The current version of the code is v0.3.1, thanks!

Let me know if I need to do anything else.

@prashjha

@editorialbot set 10.5281/zenodo.14216170 as archive

@editorialbot
Collaborator Author

Done! archive is now 10.5281/zenodo.14216170

@prashjha

@editorialbot set v0.3.1 as version

@editorialbot
Collaborator Author

Done! version is now v0.3.1

@prashjha

Thanks, @samgdotson!

@prashjha

@editorialbot recommend-accept

@editorialbot
Collaborator Author

Attempting dry run of processing paper acceptance...

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1007/s10479-015-1791-y is OK
- 10.1109/ACCESS.2020.2990567 is OK
- 10.1088/1748-9326/ab875d is OK
- 10.1007/s12532-011-0026-8 is OK
- 10.1016/j.erss.2021.101908 is OK
- 10.1287/inte.6.4.102 is OK
- 10.1016/j.ejor.2018.01.036 is OK
- 10.1016/j.rser.2014.02.003 is OK
- 10.1016/j.apenergy.2020.114728 is OK
- 10.1016/j.erss.2022.102913 is OK
- 10.3390/en12163046 is OK
- 10.5334/jors.188 is OK
- 10.21105/joss.00825 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: 2023 Annual Technology Baseline (ATB)
- No DOI given, and none found for title: Openmod - Open Energy Modelling Initiative

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

@editorialbot
Collaborator Author

👋 @openjournals/pe-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/joss-papers#6185, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

editorialbot added the recommend-accept label (Papers recommended for acceptance in JOSS) on Nov 25, 2024