
Wrong score from assess_reverse_dependencies #352

Open
albertopessia opened this issue Aug 26, 2024 · 0 comments
Assignees
Labels
Bug Something isn't working Difficulty: Low No foreseable design or implementation challenges

Comments

@albertopessia

The documentation of assess_reverse_dependencies states:

The more packages that depend on a package the more chance for errors/bugs to be found

A package with no reverse dependencies should then get a maximum score of 1 while a package with many reverse dependencies should get a score close to zero.
At the moment assess_reverse_dependencies does the opposite, and it does not produce values in the range [0, 1] but rather in [0.2463761, 1].

First of all, the scoring function should be decreasing; therefore the logistic function implemented by metric_score.pkg_metric_reverse_dependencies should have a growth rate of the opposite sign.

Also, the logistic function is defined on the whole real line, while the number-of-packages variable is non-negative. If the number of packages is not log-transformed, the asymptote at x -> -Inf will never be reached (hence the minimum score of 0.2463761).

I suggest using the log-logistic function instead, because it operates on the same scale as the number of packages and is well defined even when the number of packages is zero:

```r
metric_score.pkg_metric_reverse_dependencies <- function(x, ...) {
  # Log-logistic survival function with scale 5 and shape 2.5:
  # equivalent to 1 / (1 + (length(x) / 5)^2.5), decreasing from 1 towards 0.
  1 - 1 / (1 + (5 / length(x))^2.5)
}
```

The growth rate of 2.5 is just an example and it should be tuned to the desired score behavior.
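To make the boundary behaviour concrete, here is a quick sanity check of the proposed function, rewritten over the dependency count n directly (a sketch; the scale 5 and shape 2.5 are just the example values above, not tuned constants):

```r
# Proposed log-logistic score as a function of n, the number of
# reverse dependencies. Equivalent to 1 / (1 + (n / 5)^2.5).
score <- function(n) {
  1 - 1 / (1 + (5 / n)^2.5)
}

score(0)     # 1: 5/0 is Inf in R, so a package with no reverse deps scores exactly 1
score(5)     # 0.5: the scale parameter marks the half-score point
score(1000)  # ~1.8e-06: the score approaches 0 as n grows
```

Note that R's arithmetic with Inf makes the n = 0 case well defined without any special-casing, which is the main practical advantage over the plain logistic on a log-transformed count.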

@emilliman5 emilliman5 added the Difficulty: Low No foreseable design or implementation challenges label Sep 17, 2024
@emilliman5 emilliman5 self-assigned this Sep 17, 2024
@emilliman5 emilliman5 added the Bug Something isn't working label Sep 17, 2024