
[tolvanen] Effort Used to Create DSMLs #9

Open
grammarware opened this issue Jul 13, 2018 · 9 comments
Labels: accepted (Artefact accepted)

@grammarware
Collaborator

Submitted by @mccjpt to https://github.com/modelsconf2018/artifact-evaluation/tree/master/tolvanen

@grammarware grammarware added the submitted Received for review label Jul 13, 2018
@grammarware grammarware added under review Artefact being reviewed and removed submitted Received for review labels Jul 13, 2018
@hernanponcedeleon

@mccjpt could you please provide a version of the paper?

@mccjpt
Collaborator

mccjpt commented Jul 17, 2018 via email

@mherzberg
Collaborator

Artefact summary

The submitted artefact accompanies the paper "Effort Used to Create
Domain-Specific Modeling Languages", which presents a survey of 10 case studies
analysing the amount of effort spent on creating domain-specific models and
languages. The artefact consists of an Excel spreadsheet that provides details
on the time spent per activity for two of the case studies.

Consistency with the paper

The Excel sheet contains some of the graphs from the paper along with the formulas
used to generate them. The artefact is therefore consistent with the paper.

Completeness of artefact

The artefact only contains data and graphs for the two big case studies.
The data for the 8 smaller studies, which was used for Figure 6, seems to be
missing; however, each of those case studies is recorded as only a single
data point (the number of days invested), which is already contained in the paper.

Artefact documentation

The artefact is undocumented. Some information can be inferred from the log
event names from the tool, which are listed with time stamps and serve as the
basis for the graphs.

Ease of reuse

The case studies, as described in the paper, seem reproducible.

The data, in the form present in the artefact, is difficult to interpret and
therefore not reusable.

@mccjpt
Collaborator

mccjpt commented Jul 19, 2018 via email

@mccjpt
Collaborator

mccjpt commented Jul 19, 2018 via email

@hernanponcedeleon

@mccjpt yes ... I found it immediately after writing the comment. Thanks!

@hernanponcedeleon

Summary

This artifact accompanies a paper presenting a study of the investment needed to create DSLs, generators, and related tooling.
The study is based on ten cases: two implemented by the authors and eight coming from different companies and based on industrial reports.
The study focuses on languages created using the MetaEdit+ tool.

The artifact consists of an Excel sheet presenting the data used in the experiments. The data is also shown as graphs.

Assessment

Packaging: Met expectations

The artifact is packaged as a single Excel file. For this kind of artifact, the packaging is fine.

Reproducibility and Consistency: (partially) Met expectations

The data is consistent with the results displayed in the paper.
For the two use cases performed by the authors, I cannot really think of a way to reproduce the results. However, for the eight cases coming from different industrial reports, the authors could have included in the artifact the reports backing up the data.

Documentation: Fell below expectations

There is no documentation at all. An Excel sheet is normally self-explanatory, but this sheet is quite messy, with a lot of information.

@TheoLeCalvar
Collaborator

Summary

In the paper, the authors assess the time and effort needed to build ten different fully functional DSLs/DSMs.
Eight of them come from industrial reports and two from the authors themselves.
This artefact consists of a spreadsheet containing detailed data about the effort invested in the creation of these two complete DSMs.

Is the artefact consistent with the paper?

The artefact is consistent with the data presented in the paper.
It contains the formulas used to generate the graphs in the article.

Is the artefact as complete as possible?

Yes, the artefact contains the durations of each task of both case studies developed by the authors.
However, a sheet could have been added with the durations of the industrial cases.
Even if not detailed, it would have been interesting to gather all the data into a single shareable document.

Is the artefact well-documented?

The spreadsheet is not documented; however, its content is self-explanatory.

Is the artefact easy to (re)use?

Given the type of artefact, I doubt it is easily reusable as-is.
However, since descriptions of the case studies are given, they can be reproduced.

@grammarware
Collaborator Author

Dear @mccjpt,

Based on all the comments and the reviews provided by the members of the Artifact Evaluation Committee of MoDELS 2018, we have reached the conclusion that this artifact conforms to the expectations and is hereby approved. Please use the badge instructions page to add the badge of approval to your article, and add the link to the artifact to the camera ready version of the paper.

Thank you very much for putting extra effort into the preparation and finalising of the artifact. If any of the comments above are still not addressed, please try to accommodate them before the conference.

In particular, we would recommend that you focus on two major points:

  • Submit a snapshot of your artifact to ReMoDD, FigShare, Zenodo, or any similar service, for archival purposes. Both GitHub and MetaCase are mature and stable, but even they are not immune to URI changes and similar trouble. It is also more reuse-friendly to package the entire artifact in one place (currently there are three).
  • If some documentation, even the most basic kind, can be added, it would improve the usability of the artifact. Our reviewers did not think this is a major problem, but if you have the means, please consider accommodating them.

@grammarware grammarware added accepted Artefact accepted and removed under review Artefact being reviewed labels Jul 20, 2018