
[REVIEW]: PANINIpy: Package of Algorithms for Nonparametric Inference with Networks in Python #7312

Closed
editorialbot opened this issue Oct 3, 2024 · 45 comments
Labels: accepted · published (Papers published in JOSS) · recommend-accept (Papers recommended for acceptance in JOSS) · review · Track: 5 (DSAIS) Data Science, Artificial Intelligence, and Machine Learning

Comments

@editorialbot

editorialbot commented Oct 3, 2024

Submitting author: @baiyueh (Baiyue He)
Repository: https://github.com/baiyueh/PANINIpy
Branch with paper.md (empty if default branch):
Version: v1.0.1
Editor: @vissarion
Reviewers: @ankurankan, @gchure
Archive: 10.5281/zenodo.14100356

Status

[status badge]

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/2bb89bc0c034338bcdff3c342dd7c1ff"><img src="https://joss.theoj.org/papers/2bb89bc0c034338bcdff3c342dd7c1ff/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/2bb89bc0c034338bcdff3c342dd7c1ff/status.svg)](https://joss.theoj.org/papers/2bb89bc0c034338bcdff3c342dd7c1ff)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@ankurankan & @gchure, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all, you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. For any questions or concerns, please let @vissarion know.

Please start on your review when you are able, and be sure to complete your review within the next six weeks at the very latest.

Checklists

📝 Checklist for @gchure

📝 Checklist for @ankurankan

@editorialbot

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot

Software report:

github.com/AlDanial/cloc v 1.90  T=0.90 s (59.0 files/s, 568272.5 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
CSV                              7              0              0         503219
Python                          14            298            342           1169
reStructuredText                13           1221           2882            586
Jupyter Notebook                 4              0            348            211
Markdown                         8             89              0            164
TeX                              1             14              0            157
CSS                              1             12              0             55
YAML                             2              6             22             38
DOS Batch                        1              8              1             26
make                             1              4              7              9
HTML                             1              1              0              5
-------------------------------------------------------------------------------
SUM:                            53           1653           3602         505639
-------------------------------------------------------------------------------

Commit count by author:

    58	Baiyue
    21	Alec Kirkley
    10	baiyueh

@editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1103/physreve.109.054306 is OK
- 10.1038/s42005-023-01270-5 is OK
- 10.1038/s42005-022-01029-4 is OK
- 10.1103/physreve.109.034310 is OK
- 10.1103/physrevresearch.6.033307 is OK
- 10.48550/arXiv.2409.06417 is OK
- 10.6084/m9.figshare.1164194 is OK
- 10.1093/oso/9780198805090.001.0001 is OK
- 10.1016/j.physrep.2009.11.002 is OK
- 10.1038/s41467-022-34267-9 is OK
- 10.1103/physreve.105.014312 is OK
- 10.25080/tcwv9851 is OK
- 10.1038/s41567-021-01371-4 is OK
- 10.1103/physrevx.12.011010 is OK
- 10.1126/science.298.5594.824 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: The igraph software package for complex network re...
- No DOI given, and none found for title: Network Science

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

@editorialbot

Paper file info:

📄 Wordcount for paper.md is 1061

✅ The paper includes a Statement of need section

@editorialbot

License info:

✅ License found: MIT License (Valid open source OSI approved license)

@editorialbot

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@gchure

gchure commented Oct 4, 2024

Review checklist for @gchure

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/baiyueh/PANINIpy?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@baiyueh) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data or research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) contribute to the software, 2) report issues or problems with the software, and 3) seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@ankurankan

ankurankan commented Oct 12, 2024

Review checklist for @ankurankan

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/baiyueh/PANINIpy?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@baiyueh) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data or research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) contribute to the software, 2) report issues or problems with the software, and 3) seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@ankurankan

The package appears well-developed and includes a range of examples, making it accessible. My primary feedback focuses on the paper, which, in my opinion, could benefit from some refinement to ensure that key points come across more clearly.

Specific Feedback:

Installation: Currently, the package does not automatically install its dependencies, nor could I locate a requirements file. Running the example code in the documentation led to errors, and I had to manually install numpy, pandas, and scipy. I think these dependencies should be installed automatically (a minimal packaging sketch illustrating this is appended at the end of this comment).

Automated Tests: I couldn't find any automated tests for the package’s functionality. If I missed these, could you please point them out?

Statement of Need: I think this section could be refined to improve readability and clarity. For example, the last sentences of the first paragraph appear to hint at the need for non-parametric methods, but this point is not clearly articulated since "non-parametric" isn't mentioned explicitly. The next paragraph then introduces non-parametric methods, but because the motivation isn't made clear in the preceding paragraph, it isn't obvious why they are needed. I also think that, in the current text, there isn't a clear distinction between the need for non-parametric methods and the need for the software package.

Related Software Packages: The arguments here could be clarified to more effectively illustrate the specific advantages of PANINIpy. For instance, in the comparison with Graph-Tool, it's not fully clear in which cases PANINIpy should be preferred over Graph-Tool. Based on the text, there seems to be some overlap, but is the difference just a matter of ease of use, or are there distinct methods? Clarifying these distinctions would help reinforce the contributions of PANINIpy.
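
For reference, one way to have numpy, pandas, and scipy installed automatically is to declare them in the package's build metadata. The sketch below is purely illustrative and assumes setuptools; the package name, version, and dependency list here are assumptions taken from this comment, not the authors' actual metadata (an equivalent declaration in pyproject.toml would work just as well):

# setup.py -- illustrative sketch only; name and version are assumptions,
# not PANINIpy's actual packaging metadata.
from setuptools import setup, find_packages

setup(
    name="paninipy",
    version="1.0.1",
    packages=find_packages(),
    # Declaring runtime dependencies here lets `pip install .` (or installing
    # from PyPI) resolve them automatically instead of requiring manual installs.
    install_requires=[
        "numpy",
        "pandas",
        "scipy",
    ],
)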

@gchure

gchure commented Nov 4, 2024

Hi @vissarion, I've made it as far as I can right now in my review. I've opened issues regarding licensing, documentation, and testing, and I view those as blocking. Like @ankurankan, I had trouble installing this project due to the lack of clearly stated requirements. Additionally, the package appears to lack automated tests altogether, which should be addressed before I continue assessing the functionality claimed in the project.

@vissarion

Hi @gchure and @ankurankan, thanks for your reviews so far!

@baiyueh could you please reply to and address the issues opened by the reviewers?

@baiyueh

baiyueh commented Nov 5, 2024

Hi @ankurankan and @gchure, thank you for your helpful suggestions on the Installation, Automated Tests, Statement of Need, and Related Software Packages sections, and for taking the time to review the paper. We've addressed the paper-related comments as follows:

(1) We've added CI for automated testing, which is discussed here under the section Testing and Continuous Integration. In addition, all dependencies have been tested in a clean test environment and through a series of workflows (a minimal sketch of this kind of test follows at the end of this list).

(2) We've clarified the motivating first paragraph in question by explicitly mentioning the nonparametric requirement that is central to the package, which avoids ad hoc parameter choices and provides the robustness to noise that we note is important.

(3) We've edited our discussion of existing methods to focus on the novelty and importance of having a separate PANINIpy package. In particular, we've emphasized that the existing packages only provide methods for community detection and/or network reconstruction (two network inference tasks that we do not address with PANINIpy), and that PANINIpy offers a broad range of different unsupervised inference methods (e.g. hub identification, network population clustering, etc.) whose results can be compared on a universal scale (data compression in bits) to find parsimonious summaries of networks from multiple perspectives. We have tried to emphasize in the revision that there is actually little overlap with these other packages in terms of the inference tasks being addressed, which we hope strengthens the justification for the package.

(4) We have combined the "Statement of Need" and "Related Software Packages" sections as we feel the argument for the package's novelty and importance can be made more concise this way. But we are happy to separate them if this is desired.
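
To make point (1) concrete, below is a minimal, hypothetical sketch of the kind of smoke test a CI job can run with pytest; the import name paninipy is an assumption for illustration, and this is not the repository's actual test suite:

# test_smoke.py -- hypothetical smoke test, for illustration only.
import importlib


def test_package_and_dependencies_import():
    # In a clean CI environment this fails if the package or any of its
    # declared dependencies is missing, catching packaging problems early.
    for module in ("paninipy", "numpy", "pandas", "scipy"):
        assert importlib.import_module(module) is not None

A GitHub Actions workflow can then install the package into a fresh environment and invoke pytest on every push and pull request.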

Let us know if you have further concerns regarding the software, paper or documentation. Happy to help.

@gchure

gchure commented Nov 8, 2024

Hi @vissarion, I closed my last remaining issue with PANINIpy and have checked the last box on my review checklist. I think it's ready to go! Thanks @baiyueh for the rapid responses on my issues, and sorry this review took longer than I anticipated.

@ankurankan

@editorialbot generate pdf

@editorialbot

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@ankurankan

@baiyueh Thanks for the changes, I think the paper reads much better now. A very minor suggestion about the current modules section: the citations are currently within brackets, which looks a bit odd mid-sentence. I don't know if there is a way to add citations without brackets? Otherwise, you could rewrite these sentences in a more compact form, e.g.: "Methods (A. Kirkley, 2024b) for identifying MDL-optimal temporally contiguous partitions ...". This should also make the paper fit in two pages.

@vissarion I have updated my checklist and I also think this is ready to go.

@aleckirkley

@vissarion @ankurankan @gchure Thank you all for engaging us in the JOSS review process and helping us to improve the package! We hope it can serve as a useful tool for people across disciplines using networks in their research.

@baiyueh

baiyueh commented Nov 12, 2024

Hi @vissarion @gchure @ankurankan, thanks very much for taking the time to review our software and for providing valuable suggestions for improvement. We will make sure to uphold that standard for future modules and to maintain a positive and collaborative atmosphere within the community!

@vissarion

Thanks @baiyueh, I have a minor comment; see baiyueh/PANINIpy#5.

@vissarion

When a submission is ready to be accepted, we ask that the authors issue a new tagged release of the software (if changed) and archive it (see this guide). Please do this and post the version number and archive DOI here.
Please make sure that the author names and affiliations, as well as the title of the archive (e.g. Zenodo), are exactly the same as in the submission.

@baiyueh

baiyueh commented Nov 12, 2024

Hi @vissarion, we have addressed the minor issue with the citation.

We have published the release PANINIpy v1.0.1 with DOI: 10.5281/zenodo.14100356. Do let us know if anything is missing.

@vissarion

@editorialbot set 10.5281/zenodo.14100356 as archive

@editorialbot

Done! archive is now 10.5281/zenodo.14100356

@vissarion

@editorialbot set v1.0.1 as version

@editorialbot

Done! version is now v1.0.1

@vissarion

@editorialbot generate pdf

@editorialbot

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@vissarion

@editorialbot recommend-accept

@editorialbot

Attempting dry run of processing paper acceptance...

@editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1103/physreve.109.054306 is OK
- 10.1038/s42005-023-01270-5 is OK
- 10.1038/s42005-022-01029-4 is OK
- 10.1103/physreve.109.034310 is OK
- 10.1103/physrevresearch.6.033307 is OK
- 10.48550/arXiv.2409.06417 is OK
- 10.6084/m9.figshare.1164194 is OK
- 10.1093/oso/9780198805090.001.0001 is OK
- 10.1016/j.physrep.2009.11.002 is OK
- 10.1038/s41467-022-34267-9 is OK
- 10.1103/physreve.105.014312 is OK
- 10.25080/tcwv9851 is OK
- 10.1038/s41567-021-01371-4 is OK
- 10.1103/physrevx.12.011010 is OK
- 10.1126/science.298.5594.824 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: The igraph software package for complex network re...
- No DOI given, and none found for title: Network Science

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

@editorialbot

👋 @openjournals/dsais-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/joss-papers#6128, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

editorialbot added the recommend-accept (Papers recommended for acceptance in JOSS) label on Nov 13, 2024
@crvernon

crvernon commented Nov 13, 2024

@editorialbot generate pdf

🔍 checking out the following:

  • reviewer checklists are completed or addressed
  • version set
  • archive set
  • archive names (including order) and title in archive matches those specified in the paper
  • archive uses the same license as the repo and is OSI approved as open source
  • archive DOI and version match or redirect to those set by editor in review thread
  • paper is error free - grammar and typos
  • paper is error free - test links in the paper and bib
  • paper is error free - refs preserve capitalization where necessary
  • paper is error free - no invalid refs without justification

@editorialbot

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@crvernon

crvernon commented Nov 14, 2024

👋 @baiyueh - I just need you to address the following before I accept this one for publication:

In the archive:

  • The license specified in your Zenodo archive should be the same as the one in your code repository. Please edit the metadata in your Zenodo archive to fix this.

In the paper:

  • Please remove the formatting (the bold designations) in the title of your paper. Otherwise it will not show up correctly in JOSS.

That's all. This paper was very clean. Thank you for that! Let me know when you have made these changes.

@baiyueh

baiyueh commented Nov 14, 2024

@editorialbot generate pdf

@editorialbot

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@baiyueh

baiyueh commented Nov 14, 2024

Hi @crvernon, thanks for the pre-publication review. Both of the issues mentioned, in the archive and in the paper, have been addressed.

@crvernon

@editorialbot accept

@editorialbot

Doing it live! Attempting automated processing of paper acceptance...

@editorialbot

Ensure proper citation by uploading a plain text CITATION.cff file to the default branch of your repository.

If using GitHub, a Cite this repository menu will appear in the About section, containing both APA and BibTeX formats. When exported to Zotero using a browser plugin, Zotero will automatically create an entry using the information contained in the .cff file.

You can copy the contents for your CITATION.cff file here:

CITATION.cff

cff-version: "1.2.0"
authors:
- family-names: Kirkley
  given-names: Alec
  orcid: "https://orcid.org/0000-0001-9966-0807"
- family-names: He
  given-names: Baiyue
  orcid: "https://orcid.org/0009-0007-9787-9726"
doi: 10.5281/zenodo.14100356
message: If you use this software, please cite our article in the
  Journal of Open Source Software.
preferred-citation:
  authors:
  - family-names: Kirkley
    given-names: Alec
    orcid: "https://orcid.org/0000-0001-9966-0807"
  - family-names: He
    given-names: Baiyue
    orcid: "https://orcid.org/0009-0007-9787-9726"
  date-published: 2024-11-14
  doi: 10.21105/joss.07312
  issn: 2475-9066
  issue: 103
  journal: Journal of Open Source Software
  publisher:
    name: Open Journals
  start: 7312
  title: "PANINIpy: Package of Algorithms for Nonparametric Inference
    with Networks In Python"
  type: article
  url: "https://joss.theoj.org/papers/10.21105/joss.07312"
  volume: 9
title: "PANINIpy: Package of Algorithms for Nonparametric Inference with
  Networks In Python"

If the repository is not hosted on GitHub, a .cff file can still be uploaded to set your preferred citation. Users will be able to manually copy and paste the citation.

Find more information on .cff files here and here.

@editorialbot

🐘🐘🐘 👉 Toot for this paper 👈 🐘🐘🐘

@editorialbot

🦋🦋🦋 👉 Bluesky post for this paper 👈 🦋🦋🦋

@editorialbot

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.07312 joss-papers#6137
  2. Wait five minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.07312
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

editorialbot added the accepted and published (Papers published in JOSS) labels on Nov 14, 2024
@crvernon

🥳 Congratulations on your new publication @baiyueh! Many thanks to @vissarion for editing and to @ankurankan and @gchure for your time, hard work, and expertise!! JOSS wouldn't be able to function or succeed without your efforts.

Please consider becoming a reviewer for JOSS if you are not already: https://reviewers.joss.theoj.org/join

@editorialbot

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README, use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.07312/status.svg)](https://doi.org/10.21105/joss.07312)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.07312">
  <img src="https://joss.theoj.org/papers/10.21105/joss.07312/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.07312/status.svg
   :target: https://doi.org/10.21105/joss.07312

This is how it will look in your documentation:

[DOI badge]

We need your help!

The Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:
