Recording which tests are in progress, or which configurations have already been tested, would help everyone.
I thought the request was to test with different configurations, external software, and especially interactive code, for which beginners like me wouldn't even know what or how to report. Some guidelines would help, even just a table with headings and a short description of each column. Maybe this could be expanded into a chapter of the R Dev Guide?
Maybe this could have a gamification side, where reports covering new external programs, operating systems, or configuration combinations earn some "points". That could be attractive to "younger" generations.
One way we ask community members to contribute is by testing pre-releases of R (https://developer.r-project.org/Blog/public/2021/04/28/r-can-use-your-help-testing-r-before-release/). However, we only ask them to report if something goes wrong. So people could be contributing without any recognition, and perhaps a lot of time is wasted with people running the same tests.
To encourage testing, we might set up a way to report test results. This would need to be done in a fairly standardized way, perhaps adding a line to a table, like the CRAN test results tables, possibly with one column for a free text comment.
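As a sketch of what a standardized report row might look like, the snippet below formats one test report as a CSV line. The column names here are hypothetical, not an agreed schema; the example values (date, platform, compiler) are illustrative only:

```python
import csv
import io

# Hypothetical column set for a test-results table; the actual schema
# would need to be agreed upon (cf. the CRAN check results tables).
# The final column holds the free-text comment suggested above.
FIELDS = ["date", "r_version", "os", "compiler", "check_result", "comment"]

def format_report_row(report):
    """Render one test report dict as a CSV line matching FIELDS."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writerow(report)
    return buf.getvalue().strip()

# Example: a tester records a pre-release check on their configuration.
row = format_report_row({
    "date": "2021-04-30",
    "r_version": "4.1.0 beta",
    "os": "Ubuntu 20.04",
    "compiler": "gcc 9.3.0",
    "check_result": "OK",
    "comment": "all checks passed",
})
print(row)
```

Keeping each report to one flat line like this would make the table easy to aggregate and to diff against earlier pre-release rounds, while the comment column still leaves room for free-form notes.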