
[REVIEW]: Science Capsule - Capturing the Data Life Cycle #2484

Closed
54 of 60 tasks
whedon opened this issue Jul 17, 2020 · 117 comments
Assignees
Labels: accepted, Groovy, published (Papers published in JOSS), Python, recommend-accept (Papers recommended for acceptance in JOSS), review, Shell

Comments

@whedon

whedon commented Jul 17, 2020

Submitting author: @dghoshal-lbl (Devarshi Ghoshal)
Repository: https://bitbucket.org/sciencecapsule/sciencecapsule
Version: v0.1.1
Editor: @pibion
Reviewers: @colbrydi, @gflofst, @atrisovic
Archive: 10.5281/zenodo.4968576

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status

(status badge image)

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/e234e62f92841ac2b6b74d53964b4d78"><img src="https://joss.theoj.org/papers/e234e62f92841ac2b6b74d53964b4d78/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/e234e62f92841ac2b6b74d53964b4d78/status.svg)](https://joss.theoj.org/papers/e234e62f92841ac2b6b74d53964b4d78)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@gflofst, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @pibion know.

Please start on your review when you are able, and be sure to complete your review within the next six weeks at the very latest.

Review checklist for @atrisovic

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@dghoshal-lbl) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @gflofst

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@dghoshal-lbl) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @colbrydi

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@dghoshal-lbl) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
@whedon
Author

whedon commented Jul 17, 2020

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @cmbiwer it looks like you're currently assigned to review this paper 🎉.

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' https://github.com/openjournals/joss-reviews:

(screenshot: repository watch settings)

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

(screenshot: notification settings)

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

@whedon
Author

whedon commented Jul 17, 2020

Reference check summary:

OK DOIs

- None

MISSING DOIs

- None

INVALID DOIs

- None

@whedon
Author

whedon commented Jul 17, 2020

@pibion

pibion commented Jul 17, 2020

👋🏼 @dghoshal-lbl @colbrydi @cmbiwer this is the review thread for the paper. All of our communications will happen here from now on.

Both reviewers have checklists at the top of this thread with the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.

The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention openjournals/joss-reviews#2484 so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.

We aim for reviews to be completed within about 6 weeks. Please let me know if any of you require some more time. We can also use Whedon (our bot) to set automatic reminders if you know you'll be away for a known period of time.

Please feel free to ping me (@pibion) if you have any questions/concerns.

@pibion

pibion commented Jul 17, 2020

@whedon add @colbrydi as reviewer

@whedon
Author

whedon commented Jul 17, 2020

OK, @colbrydi is now a reviewer

@colbrydi colbrydi self-assigned this Jul 24, 2020
@pibion

pibion commented Aug 5, 2020

👋🏼 @dghoshal-lbl @colbrydi @cmbiwer, I'm just checking in on when you expect to be able to begin taking a look at this repository and the reviewer checklist.

If you know you won't be able to start until a particular date, we can set a timer to remind you to take a look at the review.

@cmbiwer

cmbiwer commented Aug 6, 2020

I'll begin looking at it within a week.

@pibion

pibion commented Aug 6, 2020

@cmbiwer thanks for the update!

@cmbiwer

cmbiwer commented Aug 12, 2020

Hi, I think I need a new invitation. When I try to accept the invitation I get:

Sorry, we couldn't find that repository invitation. It is possible that the invitation was revoked or that you are not logged into the invited account.

EDIT: I am logged in.

@pibion

pibion commented Aug 13, 2020

@cmbiwer the only thing you need to be able to do for the review is post on this thread.

Some reviewers do contribute to the code in the actual software repository, but until you want to do that, non-write public access should be all you need.

Does this help?

@colbrydi

Here are some notes from my review:

Science Capsule allows the capture of command-line activities inside a user-specified working directory. These activities include file-system events (captured via inotify through the Python watchdog module) and process events (captured using strace). The results can be monitored through a live web interface. The captured information is intended to help identify artifacts of a scientific workflow that are not always captured by traditional computational workflow tools, and thereby to improve documentation of research data methodology and curation (i.e., data provenance) that might otherwise be lost.
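For readers unfamiliar with this style of monitoring, the file-system side of such capture can be illustrated with a minimal, stdlib-only sketch that diffs directory snapshots. This is a hypothetical illustration only, not Science Capsule's actual implementation, which uses inotify/watchdog to receive events asynchronously rather than polling:

```python
import os

def snapshot(path):
    """Map every file under path to its last-modified time."""
    state = {}
    for root, _dirs, files in os.walk(path):
        for name in files:
            full = os.path.join(root, name)
            state[full] = os.stat(full).st_mtime
    return state

def diff_events(before, after):
    """Return (created, deleted, modified) lists, mimicking inotify-style events."""
    created = sorted(set(after) - set(before))
    deleted = sorted(set(before) - set(after))
    modified = sorted(f for f in before.keys() & after.keys()
                      if before[f] != after[f])
    return created, deleted, modified
```

A real event monitor such as the watchdog library delivers these events as they happen instead of comparing snapshots, which is what makes live capture of a workflow's artifacts practical.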

Overall I am very pleased with the software; it performed nicely. It did not seem to take up too much of my local resources, and the web interface was fairly straightforward. The code seems to be well written, and I have ideas on how I may use this tool in my own projects. Great job!

That being said, the documentation was probably the weakest aspect of the submission. Neither the repository README file nor the JOSS paper does a good enough job of explaining what the software is doing or why it is needed. The paper uses an overly general statement, "Science Capsule captures, organizes, and manages the end-to-end scientific process to facilitate capture of provenance." This sounds good but is extremely vague, and I feel it overstates the role of the software. I think these documents would be significantly improved if a few concrete example use cases for the software were included. Without these examples, many readers would not know why they would want to install and use the software. I worry that system administrators who are not researchers will not see the benefit of the software, and that many scientists who could use the software will not understand what it can (and can't) do (i.e., it's not a magic wand, but it is a fairly useful tool). Should the software be run locally or on a large system? What does it do well, and where might it need work?

I was able to download and install sciencecapsule using a conda environment on macOS and with a user account on our local high-performance computing system running RHEL7. Overall, installation was very simple and easy to get up and running, especially if you are already familiar with the tools. It may be difficult for some end users to install locally, as there are many steps and requirements. However, the instructions are complete even given the complex environment.

The Python testing suite was straightforward, and all tests ran successfully. There were some additional "testing" procedures to verify that the software is working; mostly these involve making a directory and reviewing the captured results. This seems reasonable and seems to work.
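The manual check described above — create a directory, then review what was captured — is roughly the kind of behaviour a pytest-style test can assert automatically. A hypothetical sketch (illustrative only, not the project's actual test suite):

```python
import os

def test_new_file_is_detected(tmp_path):
    """Hypothetical check: a file created in a watched directory shows up
    in a before/after listing -- the essence of file-system event capture."""
    before = set(os.listdir(tmp_path))
    (tmp_path / "result.csv").write_text("x,y\n1,2\n")
    after = set(os.listdir(tmp_path))
    assert after - before == {"result.csv"}
```

Under pytest, `tmp_path` is a built-in fixture supplying a fresh `pathlib.Path` per test, which keeps such checks isolated and repeatable.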

Below are a few more of my notes using the JOSS review checklist.

Review checklist

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

Functionality

  • Installation: Installation instructions worked quite well both on macOS and as an unprivileged user on a Linux (RHEL7) machine (using a bare-metal install). I was able to get the software up and running in a few minutes, although I am very familiar with the tools being used, which may have helped.
  • Functionality: The core functionality of the software has been confirmed. There seems to be a mechanism to extend the functionality in a variety of ways. I did not test this API mechanism, but after reviewing the code it seems fairly straightforward.
  • Performance: The software performed nicely, as expected. It did not seem to take up too much of my local resources, and the web interface seemed to work.

Documentation

  • A statement of need: This was a weak point of the submission. Neither the README nor the paper does a good job of explaining what the software is actually tracking or why it is needed. (See below.)
  • Installation instructions: Very simple and easy to get up and running, especially if you are already familiar with the tools. The instructions are complete but have multiple steps, so it may be hard for an average scientist to install locally. Although, as I said, they are complete, and some will be able to.
  • Example usage: The examples to get the system up and running are clear and work.
  • Functionality documentation: Despite the difficulty with motivation, the technical functionality of the software is well documented.
  • Automated tests: There is both a pytest suite and a set of built-in testing instructions. The pytest suite works as expected; the internal testing instructions require multiple steps and extensive user interpretation, but overall seem very reasonable.
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support? There was only a one-line note about checking and submitting issues. Kind of weak, but probably a reasonable workflow.

Software paper

  • Summary: Again, this is where I think there can be some improvement. The paper uses an overly general statement, "Science Capsule captures, organizes, and manages the end-to-end scientific process to facilitate capture of provenance." This sounds good but is really vague, and I think it overstates the role of the software.
  • A statement of need: This is where the documentation and/or paper misses an opportunity. Although valuable, this is not a "magical" tool that does all the work for you; instead, it is a fairly useful tool with a steep learning curve to interpret the results effectively. Installation on a large-scale shared system makes the most sense to me. Originally I was not sure I saw the value of installing the software locally (except for testing). However, now that I have it running on my system, I can think of a few possibly useful applications. I really think some example use cases would be helpful.
  • State of the field: I did not notice a comparison to other software. Data provenance is a big deal, so I am surprised that there is no comparison.
  • Quality of writing: Writing is very good.
  • References: There is only one reference, to inotify. This seems weak. The software builds on a number of other tools that may be appropriate to reference. Also, perhaps a paper about data provenance would be helpful?

@cmbiwer

cmbiwer commented Aug 17, 2020

@pibion Okay, thanks. I wasn't sure if I needed to fill out the checklist in the first post. Though I can post it all here, as in the previous post.

@cmbiwer

cmbiwer commented Aug 19, 2020

Here are my notes below.

Science Capsule is a tool to capture the command line and the state of directories, recording information such as new or deleted files. There is an HTML viewer that organizes this information for the user. These capabilities have been designed to facilitate the reproduction of analyses and the sharing of results.

The strengths of this submission are its walkthrough documentation and its implementation. I believe the software's functionality does what it intends. Its README instructions were well written to get the user started quickly. There are tests which seem to capture the core functionality and were easy to execute. The full instructions do require some tools such as Docker, so some familiarity with those would be needed for those sections of the instructions. However, the bare-metal instructions are easy to follow for a novice user, and therefore the tool should be widely accessible to new users who do not have much experience with these types of tools.

I think the one major weakness is the summary of the software itself in the paper and documentation. Citations and the state of the field are omitted. I can see the usefulness of capturing this information, but I do not think the introduction of the paper clearly lays out the state of the field, how Science Capsule is unique, and why the user would want the information that Science Capsule captures. It would have been nice to see a description of what exactly Science Capsule is doing under the hood, either in the documentation or in the paper. The documentation is more of a walkthrough and not really documentation of the code itself. There are several key features listed at the end that really highlight what the user wants to know. I would have liked to see the explanation of these features expanded and emphasized in the text. A real-world example of a problem Science Capsule helps solve would really improve its motivation as well. There is also a bold claim at the start that DOE is at risk of unusable data. How is it unusable due to the complexity and processing of the data? What is meant by complexity? And how does Science Capsule make it usable? This statement was not very clear to me.

Overall, I think the authors have provided excellent instructions and designed a tool that matches their key features they list in the paper. I see nothing wrong with Science Capsule itself. It has a unique functionality that I can see how it would be useful. However, the documentation/paper could be strengthened since the motivation and summary in the paper are vague and perhaps too high-level.

Review checklist
Conflict of interest
• Yes. I confirm that I have read the JOSS conflict of interest policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.
Code of Conduct
• Yes. I confirm that I read and will adhere to the JOSS code of conduct.
General checks
• Repository: Yes. The code is publicly available at: https://bitbucket.org/sciencecapsule/sciencecapsule.git
• License: Yes. There is a plain-text LICENSE file.
• Contribution and authorship: Yes. The submitting author has made contributions to the initial design, as logged in the commit history. The author list seems appropriate for the contributors.
Functionality
• Installation: Yes. The installation was fairly easy from their instructions. I installed as a user on macOS. For the container instructions, the installation does require some familiarity with the tools being used, such as Docker. For example, a user behind a proxy who simply downloaded Docker and Science Capsule would run into issues not addressed here. I don't think these are Science Capsule's problems to solve or spell out for the user, but it is a difficulty a novice user could encounter. The bare-metal instructions seem accessible to a novice user, which overcomes this.
• Functionality: Yes. I tested the functionality that the authors present and it appears to operate as intended.
• Performance: Yes. I did not encounter any performance issues.
Documentation
• A statement of need: No. The README essentially just dives into installation and a walkthrough.
• Installation instructions: Yes. The installation instructions were straightforward and easy to follow.
• Example usage: Yes. The examples were straightforward as well, and I did not encounter any issues. However, I did not notice any references to any specific real-world problems the authors are attempting to solve, or have solved, with Science Capsule.
• Functionality documentation: Yes. The functionality is documented in the paper and demonstrated in the documentation examples.
• Automated tests: Yes. They use pytest for automated tests which appear to cover some of the core functionality.
• Community guidelines: No. There is only a mention at the bottom of the README to create an issue in the issue tracker, which satisfies two of the three criteria. However, there are no instructions that I noticed on how to contribute to the project. A contributing statement would help. A stretch goal would be some kind of description of what Science Capsule does under the hood.
Software paper
• Summary: No. There is a high-level introduction at the beginning of the paper, but I am not sure an inexperienced user would leave feeling they have a concise understanding of what Science Capsule does. For example, the authors state that Science Capsule manages the end-to-end scientific process, but it appears instead to capture and organize state information, not necessarily what happens in the processes themselves or any parent-child relationships within the workflow, as a workflow management system would. The key features at the end of the paper really hit on what Science Capsule does. There is also a significant portion of the paper devoted to the CLI, which is already well documented in the README; this space could probably have been used instead for a better motivating introduction that highlights the uniqueness of Science Capsule. The graphic promises scientific knowledge access, but I see no example of this. In the first sentence, the authors refer to data not being usable due to its complexity and processing. It was not clear what they meant: Science Capsule captures the workflow, so how does it take unusable data and make it usable? There have already been decades of work on tools for reproducible analyses and workflows, so I was surprised there were no citations to reinforce the motivation. Overall, the summary and statement of need seem too vague.
• A statement of need: No. They do reference that it's for reproducibility and hint at DOE facilities. However, there is no mention of real-world applications to which Science Capsule can be applied. Storing this information is important for reproducibility, but there is not much discussion of how the information can be used. The paper lists what the software does in the key features, but not why it's needed.
• State of the field: No. There are a large number of reproducible-workflow and state-monitoring tools, yet there is no reference to the state of the field and no citations. The authors mention that they capture information not captured by traditional computational workflows, a claim which could use a citation.
• Quality of writing: Yes. It was written well.
• References: No. There should have been more references to the state of the field. Science Capsule is built on several other packages, and there are no citations for those either.

@pibion

pibion commented Aug 20, 2020

@cmbiwer thanks for these comments!

@pibion

pibion commented Aug 20, 2020

@dghoshal-lbl, it sounds like most of the technical aspects of your software (installation, tests, usage) already meet JOSS standards for publication.

The most important thing to address is the "statement of need" as that's critical in determining whether software is within the scope of JOSS. I'll let the reviewers comment for themselves; @colbrydi says that the package might be useful in his scientific work, which seems like a point in favor.

@dghoshal-lbl

Thank you @colbrydi and @cmbiwer for your detailed review.
@pibion I will update the paper including the statement of need and the relevant references and will get back to you once that is done.

@pibion

pibion commented Aug 25, 2020

Thanks @dghoshal-lbl !

@pibion

pibion commented Sep 24, 2020

@dghoshal-lbl just wanted to check in on the status of this. No pressure, just saying hello.

@dghoshal-lbl

@pibion Sorry for the delay. Had too many deadlines in the last few weeks. We are working on fixing the documentation and paper. I will let you know once everything has been updated.

@whedon
Author

whedon commented Jun 17, 2021

To recommend a paper to be accepted use @whedon recommend-accept

@pibion

pibion commented Jun 17, 2021

@whedon recommend-accept

@whedon
Author

whedon commented Jun 17, 2021

Attempting dry run of processing paper acceptance...

@whedon
Author

whedon commented Jun 17, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1145/2882903.2899401 is OK
- 10.1016/j.future.2017.12.029 is OK
- 10.1051/epjconf/201921406034 is OK
- 10.1093/bioinformatics/bth361 is OK
- 10.1016/j.future.2014.10.008 is OK
- 10.1007/11890850_14 is OK
- 10.1007/11890850_1 is OK
- 10.1088/1742-6596/898/8/082021 is OK
- 10.1093/nar/gkq429 is OK
- 10.1109/ICPC.2011.23 is OK
- 10.1109/eScience.2017.51 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@danielskatz

Sorry - I forgot that had changed...

@whedon
Author

whedon commented Jun 17, 2021

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#2397

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#2397, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true

@pibion

pibion commented Jun 17, 2021

@openjournals/joss-eics this paper is ready for its final review!

@danielskatz

@dghoshal-lbl - Do you want to update the title metadata in the Zenodo archive to match the paper title? It's not required, but it's usual for JOSS submissions and archives

@dghoshal-lbl

@danielskatz This time I intentionally kept the title metadata in Zenodo as the name of the software since the archive is for the software :). But I think it would be good to match it with the JOSS paper title. I will change that.

@dghoshal-lbl

Done!

@danielskatz

@dghoshal-lbl - I've also found a few things in the paper while proofreading it - see https://bitbucket.org/sciencecapsule/sciencecapsule/pull-requests/54

@danielskatz

Let me know if this doesn't look right - I'm not really used to bitbucket PRs, so it's possible I've done something wrong

@dghoshal-lbl

@danielskatz Thanks for the proofread and the fixes! I have merged your changes and also fixed a missing citation. Let me know if everything looks good and I will update the Zenodo archive.

@danielskatz

@whedon check references

@danielskatz

@whedon generate pdf

@whedon

whedon commented Jun 17, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1145/2882903.2899401 is OK
- 10.1016/j.future.2017.12.029 is OK
- 10.1051/epjconf/201921406034 is OK
- 10.1093/bioinformatics/bth361 is OK
- 10.1016/j.future.2014.10.008 is OK
- 10.1007/11890850_14 is OK
- 10.1007/11890850_1 is OK
- 10.1088/1742-6596/898/8/082021 is OK
- 10.1093/nar/gkq429 is OK
- 10.1109/ICPC.2011.23 is OK
- 10.1109/eScience.2017.51 is OK
- 10.5334/jors.ag is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon

whedon commented Jun 17, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@danielskatz

This looks good to me - if you want to update the archive, that's fine, though it's not required. Let me know either way, and I'll optionally update the archive address, and then finish the publishing of the paper

@dghoshal-lbl

If it's not required then it should be good to go. Thanks!

@danielskatz

@whedon accept deposit=true

@whedon

whedon commented Jun 17, 2021

Doing it live! Attempting automated processing of paper acceptance...

@whedon added the accepted and published (Papers published in JOSS) labels on Jun 17, 2021
@whedon

whedon commented Jun 17, 2021

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦

@whedon

whedon commented Jun 17, 2021

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.02484 joss-papers#2398
  2. Wait a couple of minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.02484
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@danielskatz

Congratulations to @dghoshal-lbl (Devarshi Ghoshal) and co-authors!!

And thanks to @colbrydi, @gflofst, and @atrisovic for reviewing, and @pibion for editing!

@whedon

whedon commented Jun 17, 2021

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.02484/status.svg)](https://doi.org/10.21105/joss.02484)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.02484">
  <img src="https://joss.theoj.org/papers/10.21105/joss.02484/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.02484/status.svg
   :target: https://doi.org/10.21105/joss.02484

This is how it will look in your documentation:

[DOI badge image]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us please consider doing either one (or both) of the following:
