Helping reader find most relevant content #504

Open

dmitriz opened this issue Jan 25, 2016 · 6 comments

Comments

@dmitriz

dmitriz commented Jan 25, 2016

First of all, let me thank @sindresorhus and everyone else for your titanic efforts to make these lists so awesome - it is greatly appreciated!!!

The problem I'd like to address, though, is the challenge of keeping the lists relevant: things move very fast, and projects become deprecated, obsolete, outdated, or unmaintained. As an example, take the list of AngularJS seed projects, many of which are sadly quite old and full of what are nowadays considered bad practices.

Showing some of these old projects to a newcomer without a proper warning can actually be considered harmful, as people can unknowingly pick up bad practices from them. Especially with AngularJS this feels like a problem: people misjudge the framework based on outdated information, when modern best practices actually solve many of those old problems. Given the current fast pace of the JavaScript ecosystem and the overwhelming, growing amount of information, this problem will likely only get worse.

On the opposite side, the most modern, recent and relevant projects can sadly blend in and get lost amidst older ones, making it hard for the reader, especially a beginner, to find them.

So I'd like to suggest discussing possible ways to mitigate this challenge and help the reader by adding various metadata beside the links that would help decide which links to check first.

I have previously proposed timestamping the list entries as one piece of metadata that should be easy to automate, and @sindresorhus rightfully suggested the discussion belongs here. The idea is based on the assumption that the resource was considered "awesome" at the time the link was added, so the reader would quickly see when that was.

I think that would really help the reader with the type of resources whose awesomeness can change quickly over time. I'm sure the author would not add many of those links today but wants to keep them for historical or other reasons.

It is not a perfect solution and might not be as helpful for other types of resources, where other clues may be needed. However, it is simple information that keeps things more transparent and may be used differently by different people. It would also help maintainers feel less guilty and less responsible for the content by clearly stating how old it is.
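As a side note on automation: such per-link timestamps would not even have to be written by hand; they could in principle be derived from the list's own git history. Below is a minimal sketch of that idea (not an existing tool), assuming the list is a plain Markdown README tracked in git; the file name and output format are just placeholders.

```python
import re
import subprocess

def first_added_dates(readme="README.md"):
    """Print the date each link in the list was first added, per git history.

    A minimal sketch, not an existing tool: it assumes the list is a plain
    Markdown README tracked in git, and it shells out to the `git` CLI.
    """
    link = re.compile(r"\[([^\]]+)\]\((https?://[^)\s]+)\)")
    with open(readme, encoding="utf-8") as f:
        for line in f:
            match = link.search(line)
            if not match:
                continue
            title, url = match.groups()
            # `git log -S <url>` lists commits that added or removed the URL;
            # with --reverse, the first line is the earliest such commit.
            dates = subprocess.run(
                ["git", "log", "--reverse", "--date=short", "--format=%ad",
                 "-S", url, "--", readme],
                capture_output=True, text=True, check=False,
            ).stdout.splitlines()
            print(f"{dates[0] if dates else 'unknown'}  {title}  {url}")

if __name__ == "__main__":
    first_added_dates()
```

Something like this could run in CI and regenerate the timestamps, so the maintainer never has to touch them.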

As @sindresorhus pointed out, this metric can be flawed, e.g. for npm packages, where other information could be more useful, such as:

  • the number of downloads;
  • the number of stars;
  • the number of open issues;
  • the date of the last commit;
  • the size of the package with all dependencies.

The package size with all dependencies, and how long it takes to download, is especially something I have found very hard to estimate in advance.

Of course, any metadata generation should be fully automatic and should not add any burden to the author. The more that can be done to save the author's time, the better!
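For illustration only, here is a rough sketch of what fully automatic metadata collection could look like for an npm package entry. It relies on my assumptions about today's public npm and GitHub APIs (the endpoints and field names below are not part of any awesome-list tooling), "express" / "expressjs/express" are just sample inputs, and unauthenticated GitHub requests are heavily rate limited.

```python
import json
import urllib.request

def fetch_json(url):
    """Fetch and decode a JSON document (no auth, so rate limits apply)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def package_metadata(npm_name, github_repo):
    """Collect a few quality signals for one list entry.

    npm_name     e.g. "express"            (sample input, not a list entry)
    github_repo  e.g. "expressjs/express"
    """
    downloads = fetch_json(
        f"https://api.npmjs.org/downloads/point/last-month/{npm_name}"
    )["downloads"]
    gh = fetch_json(f"https://api.github.com/repos/{github_repo}")
    return {
        "downloads_last_month": downloads,
        "stars": gh["stargazers_count"],
        "open_issues": gh["open_issues_count"],
        "last_push": gh["pushed_at"],
    }

if __name__ == "__main__":
    print(package_metadata("express", "expressjs/express"))
```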

@davisonio
Contributor

I agree with you here; it's sometimes quite a struggle to check through projects listed in awesome lists for something you may want to use or learn from. You don't want to be using outdated projects which don't work with recent versions, and you don't want to be learning things which are out of date.

I don't think that timestamping is the best possible option though, as @sindresorhus rightly pointed out. Some items on awesome lists are simply "done".

My suggestion here is for awesome list maintainers, or those who contribute to lists, to check through the items manually every once in a while and assess the quality of each resource. Maybe we should have some guidelines for maintainers on ensuring that items in awesome lists stay awesome? 😎 Essentially, I don't think "stars", "downloads" or the last commit date can show anything about how awesome a resource is - therefore we'd need awesome list maintainers to check the awesomeness of items in their lists reasonably regularly and semi-manually.

An example of this in action is over at awesome-irc. Every once in a while I review the items in Awesome IRC. If a project is no longer deemed awesome I move it into the archive.md file with an explanation of why it was removed.

There are some scripts which can help us, of course! awesome_bot is very helpful. If the link to an item is broken then it's obviously no longer suitable for the list. This is exactly the problem awesome_bot aims to solve.
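For context, awesome_bot is a Ruby command-line tool that reports dead links and redirects in a list. Purely as an illustration of the underlying idea - this is not awesome_bot's actual implementation, and the file name and user agent below are placeholders - a naive link checker might look like this:

```python
import re
import sys
import urllib.error
import urllib.request

def check_links(readme="README.md", timeout=10):
    """Flag links in a Markdown list that no longer respond.

    A deliberately naive sketch: no retries, no allow-list, no rate limiting,
    so a site that is only temporarily down will also be reported as broken.
    """
    text = open(readme, encoding="utf-8").read()
    urls = sorted(set(re.findall(r"\((https?://[^)\s]+)\)", text)))
    broken = []
    for url in urls:
        request = urllib.request.Request(
            url, method="HEAD", headers={"User-Agent": "link-check-sketch"})
        try:
            with urllib.request.urlopen(request, timeout=timeout):
                pass  # any 2xx/3xx response counts as alive here
        except (urllib.error.URLError, OSError) as exc:
            broken.append((url, exc))
    for url, reason in broken:
        print(f"BROKEN  {url}  ({reason})")
    sys.exit(1 if broken else 0)

if __name__ == "__main__":
    check_links()
```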

Another main problem we have is when awesome list maintainers simply disappear or are too busy to review pull requests. Here's a crazy idea: how about we recommend that awesome lists have at least 2 people with commit access?

@dmitriz
Author

dmitriz commented Jan 26, 2016

Thanks @davisonio for the extensive comments!

I don't think that timestamping is the best possible option though, as @sindresorhus rightly pointed out.

Just so it doesn't get lost in cyberspace, let me quote exactly what @sindresorhus said:

I agree with the sentiment in theory, but for packages, using timestamp as a quality metric is flawed, especially in the Node.js world. ...
...
I do see this being useful for lists for projects with high churn, like JS frameworks, or where the author isn't able to ensure high quality. Can you open this issue on https://github.com/sindresorhus/awesome instead with your above comment as issue description? ...

Hence this issue.

Some items on awesome lists are simply "done".

"Done" as in "I did it and now I don't care"? Then clearly this content cannot be considered of comparable quality with ones where the author makes efforts to keep the content up-to-date. And as reader I need a clue to tell the difference. Very valuable given the current information overload.

My suggestion here is for awesome list maintainers, or those who contribute to lists, to check through the items manually every once in a while and assess the quality of each resource.

It is hard for me to imagine how that would work at scale in practice. Manually going through hundreds of links requires a lot of time. It is honourable when people find the time to do it to benefit the community; however, it is not something the community should automatically expect or feel entitled to. And even if maintainers want to commit to it, it can put a lot of pressure on them, burn them out, and make them drop everything else if the overload gets too high. You don't even need to search hard for an example.

I am actually suggesting that we look for ways to make it easier, less of a burden, and less time-consuming for maintainers, which at the same time would benefit the community the most.

Maybe we should have some guidelines for maintainers on ensuring that items in awesome lists stay awesome?

It is always good to have guidelines - the more the better, no question about it. However, guidelines have an advisory role; they cannot replace automatic procedures that guarantee the outcome.

Essentially, I don't think "stars", "downloads" or the last commit date can show anything about how awesome a resource is - therefore we'd need awesome list maintainers to check the awesomeness of items in their lists reasonably regularly and semi-manually.

Here I have to respectfully disagree. Stars and downloads tell a lot about the quality and, even more importantly, about the future viability of a project. People download and give stars for reasons. A popular project is more likely to stay more actively maintained even if the original leader drops out. In contrast, a less popular project can die for random reasons, with no one else willing to commit. If I am a reader trying to decide which project I want to use and invest my time in, this makes a big difference.

I find it honourable that you as a maintainer take your personal time to manage your links and move some of them into an "archive". However, I imagine this is a difficult decision, as some people might not be happy seeing their projects boxed into an archive.

Imagine there is a new hot tool A and everybody is writing about how much better it is than B, which was oh so hot last year. :-) In reality it might be great for some use cases but not so for others. The author of B may rightfully think this is unfair, and even get irritated seeing their project boxed into an archive and declared "not so awesome". And if you don't want to offend anyone, you will wait until the project's quality really drops far enough to be removed. And again we are back to the same problem of many links of varying quality sitting next to each other.

That is why I think it is better for both sides to take the subjectivity out of the equation as much as possible and make it objective, transparent and automated. The timestamp, for instance, is one such metric. Even if you think it is "flawed", it is up to the reader to decide how much it is - or isn't. You as a maintainer don't even need to worry about it. It is just information for everyone to use (or not to use).

You don't even need to think about how useful this or that metric is. Your readers are grown up enough to figure this out for themselves. Give it to them, and they will use it the way they want and they will thank you for the choice. ;-)

There are some scripts which can help us, of course! awesome_bot is very helpful. If the link to an item is broken then it's obviously no longer suitable for the list. This is exactly the problem awesome_bot aims to solve.

That is certainly a useful idea, but it raises a few questions:

  • How many maintainers are actually using it, and how does the reader see this? Is there any kind of badge for it?
  • Once the link is found broken, what are the actions?
  • Are the links automatically removed from the list?
  • Or perhaps the project has moved elsewhere and a simple search can uncover the new link? Is this process automated?
  • What if the link is temporarily down for whatever reason?
  • What if the link is intact but the content is not so awesome any more?

Another main problem we have is when awesome list maintainers simply disappear or are too busy to review pull requests. Here's a crazy idea: how about we recommend that awesome lists have at least 2 people with commit access?

Indeed, this is a very real problem. Not so much with pull requests, which can be merged quickly, but with manual quality checks, which take real time. It may indeed help to give access to several people, but even then it fundamentally remains the same human problem.

It is a real problem that is worth giving real thought to, with real potential benefits, IMO.

@davisonio
Contributor

People download and give stars for reasons. A popular project is more likely to stay more actively maintained even if the original leader drops out.

Fair point here. I'd agree that stars can definitely give an idea of the project, but we can't sort/organise lists by this statistic alone.

I find it honourable that you as a maintainer take your personal time to manage your links and move some of them into an "archive". However, I imagine this is a difficult decision, as some people might not be happy seeing their projects boxed into an archive.

I'm glad you've brought this up. I'd rather not offend anyone or their project by putting it into an archive. Personally, I'll think of something different - such as showing that these projects are still awesome but just need a little more work to get onto the list. If I was organising a project and people were giving me valuable feedback and advice, I'd really appreciate this, even if it comes in the form of a short description in an archive-like file. Something similar to an "archive" could even bring positives - previously unmaintained projects could find new maintainers.

You don't even need to think about how useful this or that metric is. Your readers are grown up enough to figure this out for themselves. Give it to them, and they will use it the way they want and they will thank you for the choice. ;-)

👍

I see what you mean by the questions you mentioned. Most of the answers, from my point of view, are unfortunately "check/do it manually". But let's not forget that these are "curated lists of awesome"; we shouldn't leave robots to completely run the lists and ensure the validity of the links 😄

@egeerardyn
Contributor

@dmitriz
You don't even need to think about how useful this or that metric is. Your readers are grown up enough to figure this out for themselves. Give it to them, and they will use it the way they want and they will thank you for the choice. ;-)

I don't think ANY metric can be used universally to solve the problem of lists gathering dust. Different communities just have different dynamics. And even within a community, you often need multiple metrics to find what you are looking for.
I myself work in academia, where tons of metrics are also used to gauge the performance of researchers, journals, institutes, etc. The values can differ by orders of magnitude across fields, and what counts as a good or a poor value is something you only learn by being in the field.
Also, a metric gives a very one-dimensional view of "popularity", but often you rather want to know "theoretical" vs "practical" or "easy" vs "difficult" - things that are hard or impossible to capture in a metric.

Just to say: I think there is little merit in picking a metric at the top level and suggesting it to other repositories. Your readers may eventually learn what a "good" or a "poor" value for a metric is in their own field, but that takes quite a bit of experience - something you only get by reading the actual items. So metrics do not help novices find good content; rather, they help more seasoned users.

@dmitriz
It is hard for me to imagine how that would work at scale in practice. Manually going through hundreds of links requires a lot of time. It is honourable when people find the time to do it to benefit the community; however, it is not something the community should automatically expect or feel entitled to. And even if maintainers want to commit to it, it can put a lot of pressure on them, burn them out, and make them drop everything else if the overload gets too high. You don't even need to search hard for an example.

You say that maintainers already have a lot to do to keep track of their items. I would even say that keeping metrics is not going to make the maintainer's situation any better, but rather worse: once you introduce metrics, you also need to keep the metrics updated.

Personally, I think the current approach of manual checking is fine. Moreover, if there is an involved community, this is something that the users will notice; e.g. if a link goes stale, I would expect the readers of that list to voice their concerns. The detection of basic problems (broken links) can be automated, but I think that is also where it stops.

@dmitriz
That is certainly a useful idea, but it raises a few questions:

  • How many maintainers are actually using it, and how does the reader see this? Is there any kind of badge for it?
  • Once the link is found broken, what are the actions?
  • Are the links automatically removed from the list?
  • Or perhaps the project has moved elsewhere and a simple search can uncover the new link? Is this process automated?
  • What if the link is temporarily down for whatever reason?
  • What if the link is intact but the content is not so awesome any more?

Some people use a CI server to run that script; this can be seen e.g. from a .travis.yml file or something similar.
While your questions concern awesome_bot (which only checks dead links and redirects), similar issues also exist for updating timestamps or other metrics:

  • How big a change is needed to update the timestamp? 10 characters? 10% of the page?
  • Where would you store the cached version of the page (to check whether the new version has changed)? How about copyright/IP restrictions?
  • How to deal with pages that are redesigned (same content, different stylesheet)?
  • How to deal with sites that are replaced (e.g. hijacked, removed and eventually replaced by spam sites, ...)? Those will differ a lot from the previous version.
  • What if the link is temporarily down for whatever reason?
  • What if the new content is not awesome?
  • How often would you check for updates? Every day? Every week? Won't that clutter the history of the project with lots of automatic updates?

It cannot be avoided that there is manual work involved. I just have the feeling that keeping track of more things (even assisted by an automatic procedure) will just cause a lot more work for limited benefit.

@dmitriz
You don't even need to think about how useful this or that metric is. Your readers are grown up enough to figure this out for themselves. Give it to them, and they will use it the way they want and they will thank you for the choice. ;-)

@davisonio
👍

I see what you mean by the questions you mentioned. Most of the answers, from my point of view, are unfortunately "check/do it manually". But let's not forget that these are "curated lists of awesome"; we shouldn't leave robots to completely run the lists and ensure the validity of the links 😄

I agree with this: let bots/scripts help with the work, but it remains essentially manual work, since there is no bot that can check "is X awesome?". Awesomeness will always remain a subjective matter.

@dmitriz
Author

dmitriz commented Jan 28, 2016

Here is a reddit thread illustrating how much people care about timestamps :)

https://www.reddit.com/r/reactjs/comments/431oat/webpack_for_react_the_complete_and_uptodate_guide/

@perryprog

perryprog commented Dec 29, 2016

I know I'm late to the party, but maybe a simple change would be to highly recommend ordering each section in a list by something important, like stars or downloads. Most lists I've seen are alphabetically sorted. It's a huge pain to look through each item, checking the stars and last commit along with compatibility.

It would really save time to do this sort of thing. Maybe awesome_bot could automate adding the GitHub stars badge, etc. While there is no way for a bot to tell if something is awesome, we can still sort a list.
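As a rough sketch of how such sorting could be scripted (this is not an existing awesome_bot feature; it assumes each entry is a one-line Markdown bullet linking to a GitHub repository, the sample entries "Tool A" / "Tool B" below are made up, and unauthenticated GitHub API calls are rate limited):

```python
import json
import re
import urllib.request

GITHUB_LINK = re.compile(r"https://github\.com/([\w.-]+/[\w.-]+)")

def stars(repo):
    """Star count from the GitHub API (unauthenticated, so rate limited)."""
    with urllib.request.urlopen(f"https://api.github.com/repos/{repo}") as resp:
        return json.load(resp)["stargazers_count"]

def sort_section(lines):
    """Return the bullet lines of one list section, most-starred first.

    Lines without a GitHub link get a star count of -1 and sink to the end.
    """
    def key(line):
        match = GITHUB_LINK.search(line)
        return stars(match.group(1)) if match else -1
    return sorted(lines, key=key, reverse=True)

if __name__ == "__main__":
    section = [
        "- [Tool A](https://github.com/sindresorhus/awesome) - Hypothetical entry.",
        "- [Tool B](https://example.com/not-github) - Hypothetical non-GitHub entry.",
    ]
    print("\n".join(sort_section(section)))
```

Whether the result should overwrite the alphabetical order or just feed a badge next to each entry would still be the maintainer's call.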
