
JOSS Submission Review (Reviewer 1): Documentation #147

Closed · 29 tasks done
damar-wicaksono opened this issue Mar 4, 2024 · 6 comments

damar-wicaksono commented Mar 4, 2024

This issue is related to the JOSS submission review.

Please find below some remarks, questions, and suggestions regarding the documentation and testing of the package; just let me know if clarifications are needed.

As a general remark, please be consistent with the choice of English spelling, either American or British, both in the paper and the docs. I see a couple of mix-ups, e.g., "behavior", "vapour".

Introduction

  • Add a statement of need in README and in the documentation (either in the landing page or Introduction):
    • This statement may be a summary of the one that appears in the paper; give a one- or two-line summary of the purpose of the package, its target audience, and its intended (common) use.
    • Currently the Introduction section only answers "what is lbh15".

Installation

  • (Suggestion) add the additional requirements for building the docs and running the tests (for prospective contributors) to the toml/setup.py file.

Basic Usage:

  • The summary information of the liquid metal printed with the pprint package looks the same as the output of __repr__(); is that by design? (See the sketch below.)
    • as already mentioned in the remarks on Functionality
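For reference, a minimal sketch (with a hypothetical Metal class, not lbh15 code) of why the two outputs may coincide: pprint falls back to an object's __repr__() when it has no special formatting for that type.

    from pprint import pprint

    class Metal:
        def __repr__(self):
            return "Metal(T=600.00 K, rho=10500.00 kg/m^3)"

    m = Metal()
    pprint(m)       # prints: Metal(T=600.00 K, rho=10500.00 kg/m^3)
    print(repr(m))  # identical output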

Initialization from Properties (experimental)

  • "Such function, which constitutes a physical correlation, must be injective in the range considered for the root.": Perhaps it would be more apt to say "physical property correlation" instead of "physical correlation"?

Advanced Usage:

  • The whole section should be reorganized into two subsections: Adding a new correlation of a given property and Adding a new property (with its own correlation).

  • The first bullet point can be reformulated focusing instead on introducing the function to query the available correlations either for all physical properties or selected properties. Then, the next bullet point can focus on introducing a new property implemented as a Python module.

  • "Let implement the new property in...": I believe it should be "Let's" instead of "Let".

  • As noted in the Functionality review, either add a note on the path separator in different OSs (particularly Windows) or make the internal implementation agnostic regarding the separator (a sketch follows after this list).

  • Instead of showing all the available correlations after adding a new one, perhaps focus on listing the relevant correlation, i.e., rho.

    >>> from lbh15 import Lead
    >>> import os
    >>> Lead.set_custom_properties_path(os.getcwd() + '/custom_properties/lead_properties.py')
    >>> Lead.correlations_available()
    ...
  • Is it Lead.available_correlations() or Lead.correlations_available()?

  • "If the density correlation is not specified for a new object instantiation, the last one in the list will be selected as default:": The example below should also shows the default behavior as noted here if the function Lead.set_correlation_to_use() is not used; should it automatically use custom2022 as it is the last one in the list?

  • "For instance, let’s implement in <execution_dir>/custom_property/double_T.py": Note that in the preceeding discussion the custom properties are stored in <execution_dir>/custom_properties.

Learn More

  • "...describes a tutorial application coming together with lbh15,...": Consider rephrasing "coming together with" with "delivered with".
  • Is there any particular reason to capitalize "Oxygen" in most places? As a common noun, I'm not sure that would be necessary.

Oxygen Control

  • "Oxygen is the most important chemical element, which results from start-up operations, maintenance services and, possibily, incidental contaminations...": Typo "possibily", should have been "possibly".

Oxygen Concentration Lower Limit:

  • Point 4, "Ongoing researches are in progress, but currently the exact values for the chemical activities of the dissolved metal and of the Oxygen are not known.": The plural form "researches" is uncommon; simply use "research". Furthermore, "ongoing research is in progress" is redundant; it's enough to say that the relevant research is "ongoing" or "in progress".

Ranges of Validity:

  • "The temperature range of validity specified in the lbh15 package for each correlation is the most restrictive one.": In what sense the range is the "most restrictive one"?
  • Consistent use of temperature unit. For instance, "673;1000 K range" while in the previous section unit is always written as $[K]$.
  • Are multiple additional sources in this section, e.g., tecdoc (2002) also from the Handbook? If not, I think they should be listed in the bibliography of the documentation.
  • "the correlations are preferred whose validity ranges are related to the lowest available temperature values and whose extension is the widest available.": What does it mean here "extension"? Does it mean "extrapolation" or perhaps something else?

Tutorials

  • "...like an Heaviside function...": "like a heaviside function".
  • "Let suppose that the lead...": As a conjunction, the sentence can start with "Suppose the lead...".
  • "...an Iron...": I don't think "Iron" should be capitalized.
  • I think the tutorial script also requires matplotlib, which, unlike simple-pid, is not listed.

API Guide

Community Guidelines

There is no section in the docs regarding contribution guidelines. I believe this is a requirement of JOSS, so please consider adding one following common best practices. According to the reviewer guidelines, it should explain how to:

  • Contribute to the software
  • Report issues or problems with the software
  • Seek support

Please also include the protocol for testing the package. In particular, which framework does it employ (pytest? unittest?)?

Regarding the tests:

I tried to run the tests of the package from the repo. I'm not sure if I did everything correctly, but I got these errors:

FAILED test_bismuth_fromX_spanT.py::BismuthTester::test_init_fromX - AssertionError: 548.15 != 1102.673571428572 within 8 places (554.5235714285719 difference) : rho FAILED
FAILED test_lead_basics.py::LeadComparisonTester::test_vs_data - AssertionError: 10652.22 != 10643.67958 within 4 places (8.540419999999358 difference)
FAILED test_lead_fromX_spanT.py::LeadTester::test_init_fromX - AssertionError: 1573.15 != 1794.2094619326822 within 8 places (221.0594619326821 difference) : cp FAILED

Perhaps you could double-check them? Thanks a lot! PS: This is no longer an issue; see below.

lelaus added a commit that referenced this issue Mar 26, 2024
lelaus added a commit that referenced this issue Mar 27, 2024
…ed to Initialization from Properties section
lelaus added a commit that referenced this issue Mar 27, 2024
lelaus added a commit that referenced this issue Mar 27, 2024
…ons suggested for the Ranges of Validity section
lelaus added a commit that referenced this issue Mar 27, 2024
lelaus added a commit that referenced this issue Mar 27, 2024
…ons related to lead, bismuth and lbe modules
lelaus (Collaborator) commented Mar 27, 2024

Dear @damar-wicaksono, we are working on applying the modifications you suggested.

About your remarks in the last section, "Regarding the tests": we are not able to reproduce the errors you got. Could you please describe the procedure you followed before getting the reported errors?

Thanks!

damar-wicaksono (Author) commented Mar 28, 2024

Hi @lelaus,

It seems there's an interaction between test functions when I execute all of them with pytest. I later realized that you use unittest, and from the GitHub Actions I see that you execute the files one after the other. I would consider this a non-issue now, but let me just demonstrate what may have happened.

Consider running the test modules test_bismuth_fromX_spanT.py and test_custom_properties.py with pytest:

  • If I run the modules separately:
$ pytest test_bismuth_fromX_spanT.py
...
test_bismuth_fromX_spanT.py ....                                                                                                                                                                                                         [100%]

======================================================================================================= 4 passed, 126 warnings in 1.15s ========================================================================================================
$ pytest test_custom_properties.py 
...
test_custom_properties.py ....                                                                                                                                                                                                           [100%]

============================================================================================================== 4 passed in 0.42s ===============================================================================================================
  • If I specify the module test_custom_properties.py before test_bismuth_fromX_spanT.py:
$ pytest test_custom_properties.py test_bismuth_fromX_spanT.py
...
======================================================================================================= 8 passed, 134 warnings in 1.19s ========================================================================================================
  • If I specify the module test_custom_properties.py after test_bismuth_fromX_spanT.py:
$ pytest test_bismuth_fromX_spanT.py test_custom_properties.py
...
=================================================================================================================== FAILURES ===================================================================================================================
________________________________________________________________________________________________________ BismuthTester.test_init_fromX _________________________________________________________________________________________________________

self = <test_bismuth_fromX_spanT.BismuthTester testMethod=test_init_fromX>

    def test_init_fromX(self):
        for bismuthP in bismuthPs:
            properties = load_prop('lbh15.properties.bismuth_properties')
            for prop in properties:
                name = prop.name
                if name in Bismuth.roots_to_use().keys():
                    continue
                val = getattr(bismuthP, name)
                init_dict = {name: val}
                fromX = Bismuth(**init_dict)
>               self.assertAlmostEqual(bismuthP.T, fromX.T, tol, name+" FAILED")
E               AssertionError: 548.15 != 1102.673571428572 within 8 places (554.5235714285719 difference) : rho FAILED

test_bismuth_fromX_spanT.py:48: AssertionError
=========================================================================================================== short test summary info ============================================================================================================
FAILED test_bismuth_fromX_spanT.py::BismuthTester::test_init_fromX - AssertionError: 548.15 != 1102.673571428572 within 8 places (554.5235714285719 difference) : rho FAILED
=================================================================================================== 1 failed, 7 passed, 78 warnings in 0.93s ===================================================================================================

After some inspection, it seems init_dict = {name: val} for rho has different values depending on which test module is executed first. The value is either 'rho': 10056.257 or 'rho': 10832.59 for T = 548.15. I suspect the latter value comes from the custom correlation 11600 - 1.4 T provided in the test directory. The line Bismuth.set_custom_properties_path(bismuth_custom_property_path) at the top level of test_custom_properties.py may have altered which correlation is used across test modules, and that caused some tests to fail depending on the order of execution.

Again, because you're not using pytest and execute the test modules one by one, this is a non-issue, albeit a curious one.
If, in the future, you'd like to use pytest for, perhaps, better scalability, you might want to consider this finding again.
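For what it's worth, a sketch of one way such module-level setup could be scoped under pytest (hypothetical path and fixture name; it only reuses the set_custom_properties_path call from test_custom_properties.py):

    import pytest
    from lbh15 import Bismuth

    # hypothetical path, standing in for the one used in test_custom_properties.py
    bismuth_custom_property_path = 'custom_properties/bismuth_properties.py'

    @pytest.fixture(scope="module", autouse=True)
    def custom_bismuth_properties():
        # deferred until this module's tests actually run, instead of at import
        # (i.e., pytest collection) time; full isolation would additionally need
        # a way to restore the previously configured path afterwards
        Bismuth.set_custom_properties_path(bismuth_custom_property_path)
        yield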

I'm removing that task from the above.

lelaus (Collaborator) commented Mar 29, 2024

Dear @damar-wicaksono ,

the authors agree with your comments and suggestions and have tried to address all of them. The details of the corresponding actions are as follows:

General remark: British English vs American English

  • Adopted American English;
  • Corrected JOSS paper in issue Paper editing for JOSS #133 by converting the British spellings vapourization and vectorisation into the American ones vaporization and vectorization;
  • Corrected documentation by homogenizing all terms to the US versions;

Introduction

Installation

Basic Usage

Initialization from Properties

  • done

Advanced Usage

  • Two new subsections have been created, that is, How to Add New Correlations and How to Add New Properties
  • The description of how to retrieve the available correlations has been moved out of the bullet points (which have become two subsections); two examples have been provided, one retrieving all the correlations, the other retrieving the correlations of a few specified properties;
  • done
  • task managed in issue Checking separator when calling set_custom_properties_path() method #153
  • done
  • done: it was a typo
  • an example has been added to show the default behavior
  • done

Learn More

  • done
  • done
  • done

Oxygen Concentration Lower Limit

  • sentence rephrased into In spite of the ongoing research, the exact ...

Ranges of Validity

  • sentence rephrased into Such a validity range is defined as the intersection of the validity ranges of the correlations on which the corresponding lower limit depends (see the sketch after this list)
  • done
  • any additional source cited in the documentation is already cited in the Handbook, thus we removed this citation
  • the authors agree with the reviewer: the term extension is used erroneously, leading to misunderstandings. The authors would like to refer to the amplitude of the validity range. Thus the sentence has been rephrased into: preference is given to those correlations whose range of validity is based on the lowest available temperature value and is the widest available
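As a small illustration of the rephrased definition (with hypothetical ranges, not the actual handbook values), the intersection is the tightest interval covered by all contributing correlations:

    # hypothetical validity ranges in K of the correlations a limit depends on
    ranges = [(600.0, 1900.0), (673.0, 1000.0), (700.0, 1800.0)]
    lower = max(low for low, _ in ranges)
    upper = min(high for _, high in ranges)
    print((lower, upper))  # (700.0, 1000.0), i.e., the most restrictive range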

Tutorials

  • done
  • done
  • done
  • done

API Guide

  • removed the second of the two identified pages, the one without the title

lead Module, bismuth Module, and lbe Module

Community Guidelines

damar-wicaksono (Author)

Hi @lelaus,

Thank you very much for considering and handling the issues I raised.

I went through the documentation once more and confirm that most issues are fixed, including the one related to the community guideline document.

There are, however, a couple of minor issues that you said are revised (meaning, I assume, you have no objections to) that I cannot confirm in either the online documentation or the source (see unchecked boxes above). Perhaps I was looking at the wrong docs/sources?

lelaus (Collaborator) commented Apr 2, 2024

Hi @damar-wicaksono ,

first of all, we would like to thank you very much for the accurate and thorough work you have done in reviewing our project. We are very happy as your analysis has contributed a lot to improving the quality of the work.

As for the "missing" tasks, I think you did not find the corresponding actions on the master branch because the changes implemented for this issue had not yet been merged. Now we have done it!

Please, let me know if there are any more problems.

Thanks!

damar-wicaksono (Author)

Hi @lelaus,

Thanks a lot! Now I can see them and can confirm that everything has been fixed. Thank you again!

I'm closing this issue.
