p2p peer review #139
@lukeburns thanks for opening this! Yes that sounds very close to what I've been thinking!
Late here, but I'll read your repo tomorrow and write up my thoughts.
@lukeburns what do you think about extending your model beyond pre-publication?
I see two approaches: linking to external sources or, better, replicating the external publication on the network as a static hyperdrive. The latter option is nice because it doesn't require the author to publish on the network for the paper to be reviewed, and it helps ensure the availability of content by distributing it across peers.
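A minimal sketch of the second approach, assuming the callback-style hyperdrive and hyperdiscovery APIs; the file name and fetch step are illustrative:

```js
var fs = require('fs')
var hyperdrive = require('hyperdrive')
var swarm = require('hyperdiscovery')

// mirror an externally published paper into a static hyperdrive
var archive = hyperdrive('./mirror')

archive.on('ready', function () {
  // assume the PDF was already fetched from the external publisher
  var pdf = fs.readFileSync('./paper.pdf')
  archive.writeFile('/paper.pdf', pdf, function (err) {
    if (err) throw err
    // anyone holding this key can replicate (and co-host) the paper,
    // which keeps it available even if the original source disappears
    console.log('mirrored as', archive.key.toString('hex'))
    swarm(archive)
  })
})
```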
@lukeburns, do I understand correctly that you propose a workflow where the author first publishes, then forwards the publication to selected peers for review, who can in turn forward it further?
@lukeburns the P2P process is very interesting compared to the traditional process! @igRo as I understand @lukeburns's proposal, it is more or less like you say, but it is a more open and transparent process. In step 2 the author forwards to selected peers he/she knows and who should be involved. In this way, scientific work would become available to a larger group sooner, which in turn might lead to quicker validation and better feedback.
One other thing to consider: I don't have much experience with scientific review processes in particular, but I've worked a lot with CMSes in a SaaS environment where you have e.g. content review processes. There we never had just one review type; we had many.
We used state machines (for the simple cases) and workflow engines (for the complex ones) to implement this. Now, in no way am I recommending that you include a workflow engine. I am just saying you should think carefully about which processes you are going to support, now and in the future, and design accordingly, so that adaptation and extension do not introduce too many breaking changes.
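For illustration only, a minimal state-machine sketch of the kind I mean; the states and transitions are made up, not a proposal for this project:

```js
// illustrative review states; a real process would define its own
var transitions = {
  draft: ['submitted'],
  submitted: ['in_review', 'withdrawn'],
  in_review: ['accepted', 'changes_requested', 'rejected'],
  changes_requested: ['submitted', 'withdrawn']
}

function transition (state, next) {
  // reject any state change the process does not explicitly allow
  if ((transitions[state] || []).indexOf(next) === -1) {
    throw new Error('illegal transition: ' + state + ' -> ' + next)
  }
  return next
}

var state = 'draft'
state = transition(state, 'submitted')  // ok
state = transition(state, 'in_review')  // ok
// transition(state, 'draft')           // would throw
```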
@aschrijver that 2a step you are mentioning: would you envision that as a restart of the whole process, starting again at 1 with a replication of the publication? I am sceptical about whether and how it would be feasible to let non-owners extend the visibility of an encrypted hyperdrive based on dat-pki. (Thanks all for demonstrating that a lower-case L is a really bad choice for my username. Also sorry @igRo for the confusion.)
First of all, this was my interpretation of how things work; you'll have to ask @lukeburns to be sure. But I am not sure it can stay that simple, because when the review is allowed to spread out over an organically growing network of peers, it becomes unclear who will eventually see it.
Some more analysis based on my previous observation. A review could be addressed to different audiences, and a review process could stop under different conditions. This raises the follow-up question: how do you stop a review process and avoid people wasting time? All these choices have (potentially significant) design impact and lead to further questions.
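One hedged possibility for signalling the end of a round (not something proposed in this thread; the entry format is hypothetical): the author appends an explicit close message to their own feed, so reviewers watching that feed know to stop.

```js
var hypercore = require('hypercore')

// hypothetical close message appended to the author's own feed;
// reviewers watching the feed stop work once they see it
var feed = hypercore('./my-feed', { valueEncoding: 'json' })

feed.append({
  type: 'review-closed',
  publication: '<hex key of the publication>',
  reason: 'enough reviews received'
})
```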
@LGro a minimal and fairly generic implementation might be a hypercore-archiver plus a feed of the keys in the archive.
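a rough sketch of that combination, assuming the hypercore and hypercore-archiver APIs of the time; the json entry format is an illustrative guess:

```js
var hypercore = require('hypercore')
var archiver = require('hypercore-archiver')

// a public feed announcing which keys this peer endorses, tagged by kind
var publishing = hypercore('./publishing-feed', { valueEncoding: 'json' })

// the archiver replicates the full content behind each endorsed key
var ar = archiver('./archive')

function endorse (type, key) {
  publishing.append({ type: type, key: key.toString('hex') }, function (err) {
    if (err) throw err
    ar.add(key) // start co-hosting the endorsed feed or drive
  })
}

// endorse('publication', somePublicationKey)
// endorse('review', someReviewKey)
```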
this says nothing about the structure of reviews (maybe comments on a hypercore feed, or a hyperdrive with latex files). it also says nothing about how peers find each other, so it could work for open or closed review networks.

there are a couple of issues with this proposal. the propagation of reviews through the network might be too slow if it has to go through the same filtering process that publications do (through steps 1, 2, 3). one way around this might be to have all peers auto-replicate reviews. they could even auto-replicate publications if filtering is not necessary for the size of the network.

additionally, while it's important that peers be able to have filtration control, peers with no followers are unheard. one could implement a process by which to "push" messages to new peers (e.g. jayrbolton/dat-wot#7), so that a reviewer can push a review onto the network whether or not they are followed by other peers, or to send "follow requests." otherwise, a peer needs to find a "champion" who is already connected to other peers and convince them to replicate their publication / review, which might be all one needs in a small review network (e.g. an undergrad student researcher, on a network consisting of collaborators on a research program, has a review replicated by their advisor).

a beefier implementation might use dat-pki and allow for selective sharing with groups or individual peers. i think this is a good first step that works with minimal dependencies.
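a sketch of the auto-replication idea, assuming the publishing-feed entry format and the `ar` archiver from the sketch above (both assumptions):

```js
// assume `followed` is the publishing feed of a peer we follow and
// `ar` is our hypercore-archiver. entries tagged as reviews skip the
// manual filtering steps and are archived directly.
followed.createReadStream({ live: true })
  .on('data', function (entry) {
    if (entry.type === 'review') {
      ar.add(Buffer.from(entry.key, 'hex'))
    }
  })
```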
@LGro so to actually answer your question, the key differences between what you said and what i'm imagining are how publications are shared and filtered, and the structure of reviews (which i'm not settled on yet).
@lukeburns I think it depends on who you see as your audience.
@step21 the above proposal isn't free-for-all review, although you could do that. it works for arbitrary networks of peers, so you could easily put together a closed group of reviewers / editors for a journal using this. you could even do more interesting things, e.g. build a trusted network of reviewers underlying a consortium of overlay journals, so that the overlay journals have a consistent and reliable source of reviewers to tap into.
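a toy sketch of the closed-group case, where an editor-maintained whitelist decides whose announcements get replicated (the whitelist mechanism is an assumption, not part of the proposal):

```js
// hypothetical editorial whitelist: only act on announcements from
// feeds whose public keys are on the journal's reviewer list
var reviewers = [
  '<hex key of reviewer feed 1>',
  '<hex key of reviewer feed 2>'
]

function fromTrustedReviewer (feed) {
  return reviewers.indexOf(feed.key.toString('hex')) !== -1
}
```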
Sure. @lukeburns thanks for the clarification. That sounds really great. In general I just think that a lot of what I hear from #openscience or the 'against publishers' movement is very much 'free everything' without a replacement, or with a very disorganized one. If projects like sciencefair actually want to include academics and give them a platform, I think it is important to consider these things, like how best to include them. Also because if I were just a random tech guy with an opinion, I would just use a blog ;)
An issue to discuss implementation details of p2p peer review. I've documented some of my thoughts on what a p2p review process might look like (see https://github.com/lukeburns/peer-review) fwiw.
My initial thoughts on an implementation are to create modules in the hyper* ecosystem with mechanisms for publishing feeds under a "publishing feed" that includes identity metadata and a collection of feeds of publications and forwarded publications that would benefit from review (would multifeed or dat-pki be helpful?), plus linking between feeds so that one can find the reviews of a given feed (hyperdb?).
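A sketch of one possible layout for such a publishing feed, assuming a single hypercore with JSON entries; all field names are illustrative guesses, not a settled format:

```js
var hypercore = require('hypercore')

// hypothetical "publishing feed": one entry of identity metadata,
// followed by pointers to the peer's other feeds
var feed = hypercore('./publishing-feed', { valueEncoding: 'json' })

feed.append({
  type: 'identity',
  name: 'A. Researcher',              // identity metadata
  publications: '<hex key of feed>',  // own publications
  forwards: '<hex key of feed>',      // forwarded publications
  reviews: '<hex key of feed>'        // reviews, linked to their targets
})
```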
Is this at all like what you've been thinking?