sashimi not accepting volume #23
It seems like we're leaving this for post-MVP. I'll remove it from our current report, then. Thanks, Kristian.
That's right. We can discuss it here, actually.
@kjgarza hi! @sfisher I can confirm that the DataCite hub still does not accept volume. I just opened CDLUC3/counter-processor#8. I also spoke with three other Dataverse developers about volume and country counts. This might be more of a CoP question, but we'll follow up in a bit with more thoughts on volume. Thanks!
Volume thoughts: We were poking around at the volume metrics in Make Data Count and noticed that the spec for volume is not broken down by country the way the "hits" metrics are.
This information would be much more useful if there were also an entry like this:
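The original entry isn't reproduced here, but a hypothetical sketch of what a country-faceted volume entry could look like, mirroring the `country-counts` breakdown that the hit metrics already carry (field names are modeled on the COUNTER Code of Practice report JSON; the exact `total-dataset-volume` metric-type string is an assumption):

```python
import json

# Hypothetical "instance" entry: volume with a per-country breakdown,
# shaped like the existing count metrics. Metric-type name is assumed.
volume_instance = {
    "metric-type": "total-dataset-volume",  # assumed name, not confirmed by the spec
    "count": 123456789,                     # total bytes transferred
    "country-counts": {                     # proposed: same faceting as the hit metrics
        "us": 100000000,
        "de": 23456789,
    },
}

print(json.dumps(volume_instance, indent=2))
```

With this shape, the per-country values sum to the overall volume, just as country counts do for hits.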
We are aware that right now any SUSHI reports with volume sent to Make Data Count are rejected, but figured we'd throw our two cents in about what would be useful. We are logging the data and would love to provide it when it's available!
@matthew-a-dunlap and @pdurbin That makes a lot of sense to me too. We haven't started on the volume reports yet, but it seems clear that we would want to facet them along the same lines as the counts.
I'm not sure if this is a bug or just a feature we're not implementing until later. I can leave it out of our reports for now if it's not something being accepted yet. Related: I also wondered if DataCite wanted country-volume, just like we're sending country-counts.
When I include volume in the "instance" statistics like in the snippet below, the submission is rejected.
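The snippet itself isn't reproduced here; as a rough illustration only, an "instance" list that mixes the accepted count metrics with a volume metric (the `total-dataset-volume` metric-type name is an assumption) would look something like:

```python
# Illustrative sketch of a performance "instance" array for one dataset.
# The count metric-types below are from the COUNTER CoP for Research Data;
# the volume entry is the part the hub rejects.
instances = [
    {"metric-type": "total-dataset-requests", "count": 5},
    {"metric-type": "unique-dataset-requests", "count": 3},
    {"metric-type": "total-dataset-volume", "count": 9876543},  # assumed name; triggers rejection
]

# Split the list into what the hub accepts vs. the volume entry it rejects.
accepted = [i for i in instances if "volume" not in i["metric-type"]]
rejected = [i for i in instances if "volume" in i["metric-type"]]
```

Dropping the volume entry (i.e. sending only `accepted`) is the current workaround until the hub starts accepting it.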