
Possible rewarding of sites using excessive ads #1966

Open
vertigo220 opened this issue Nov 14, 2021 · 7 comments

Comments

@vertigo220

I understand and like the idea behind this, but I do have one concern about it. Regardless of the setting used, whether clicking all ads on a page or only 10%, if those clicks are generating revenue for the website, it's rewarding sites with lots of ads. A site that's responsible and cares about user experience and therefore only displays a small handful of ads will receive very few clicks, whereas one that plasters ads all over the site (you know the type I mean, where ads are surrounding the content and even literally on top of each other), they will receive a large number of clicks and earn more money, rewarding this bad behavior. This project seems focused on the ads and advertising companies, who are certainly a big part of the problem, but seems to ignore the site owners, who are also largely responsible for destroying the web with their practices.

To address this, I suggest an option, preferably enabled by default, to determine the number of ads to click inversely based on the number of ads present. Perhaps for every three ads, the percentage of clicked ads is halved or, even better and perhaps necessarily, reduced to a quarter of the setting. So if it's set to click on 100% of ads, instead of a site with 3 ads receiving 3 clicks and a site with 10 ads receiving 10 clicks, the first would still receive 3 clicks but the second would only receive (10/3)*0.25 ≈ 0.83, thereby rewarding the former for its restraint and punishing the latter.
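To make the proposal concrete, here's a minimal sketch of that inverse scaling. The function name, the three-ad cut-off, and the quarter factor are just the example values from the paragraph above, not anything AdNauseam implements:

```javascript
// Sketch of the proposed inverse scaling: sites at or below a small ad
// count get the full configured click rate; above it, the rate is cut
// to a quarter and spread over the ad count. All names are hypothetical.
function clicksToMake(adCount, clickRate = 1.0, threshold = 3, penalty = 0.25) {
  if (adCount <= threshold) {
    return adCount * clickRate;            // e.g. 3 ads at 100% -> 3 clicks
  }
  return (adCount / threshold) * penalty * clickRate; // e.g. 10 ads -> ~0.83
}
```

With the defaults this reproduces the numbers above: 3 ads yield 3 clicks, while 10 ads yield (10/3)*0.25 ≈ 0.83.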

@vertigo220
Author

Another thought: there should be an option to only click after being on a page for a certain amount of time, and/or once a page is approved. There are a lot of sites that, due to their poor design, would just benefit from this, making money off ad "clicks" despite not offering any real value. For example, many link redirection sites use captchas, and some of them just keep going and going, never letting you complete one and continue on. Those sites shouldn't be rewarded with ad clicks for wasting users' time and forcing them to reload or close the site and access it again from the initial link, which in either case would just result in even more clicks and revenue for them, all due to their extremely user-unfriendly design. With that in mind, another possible option for something like this is to not click on any ads when a captcha is present until after it's solved.

@mneunomne
Collaborator

mneunomne commented Nov 14, 2021

Thank you very much @vertigo220 for the suggestions.

One question: do you think an option to exclude or include specific websites from the clicking mechanism would be enough to address your concerns?

I think your arguments do make a lot of sense, but perhaps we could cover more diverse scenarios by simply allowing the user to choose the websites they would like the Click functionality to work on. That way each user can have their own criteria of what sites "deserve" or don't deserve to be clicked.

@mneunomne
Collaborator

This was suggested here: #1912

@vertigo220
Author

If that's the only feasible way to do it, it would be better than nothing, but I don't think it would be ideal. Adblockers often add additional steps during browsing to unblock certain things to get a page to work properly or to block things that are missed. This is a deterrent to their use for many people, and so they should require as little user interaction as possible. This seems especially true for the audience of AN, which I suspect tends to be the type that wants something that works with minimal interference and interaction at the cost of some loss of functionality, e.g. loading everything in the background which prevents breakage but also allows trackers to function.

So I would think requiring the user to white/blacklist every site would be too cumbersome. This is especially true in the case of link shorteners, as there are dozens, if not hundreds, of different ones, and handling each one individually is painstaking and time-consuming. And that's just one category. Making this a manual thing (there should be a manual white/blacklist ability, I'm just saying that should be for occasional use, not for something like this that would potentially need to be used frequently) would just be too involved on the front-end, requiring too much from the user.

There are three other reasons I would argue for making this its own feature with smart management. One is that doing so would make the "need" (in the context of this extension's mission) apparent. A simple whitelist/blacklist solution would merely cause users to do one of two things. The first would be to allow all sites and only blacklist ones that maybe don't work or that they really don't want to support, but sites that use ridiculous amounts of ads, which would be unseen and therefore unknown to the user, would benefit from doing so, being rewarded for behavior that this extension seems to want to curtail, with the user being completely unaware. The other thing users might do would be to block all ads and only whitelist them on sites they want to support, which would be ideal, but that requires them to actually remember to do that, and it would most likely leave the majority of sites unsupported by not receiving ad clicks, which also doesn't really fit into the idea of AN. So neither scenario seems ideal.

Another reason is related to the captcha issue, as well as related issues. As an example, most link shortener sites require you to complete a captcha before providing you with a link, and many will keep failing over and over, wasting the user's time and never allowing them to proceed. And some sites fail, every single time, to lead to a valid destination, unrelated to the captcha issue. Whereas some typically work well, with the captcha being successful and it leading you where you want to go. The problem with AN is that the user doesn't know until after attempting the captcha, or even after leaving the page, whether it's a decent one designed to work well or a lousy one designed to waste your time and (I assume) to get ad revenue without actually working. If all except blacklisted sites get clicks, these bad sites will get them, and will earn money despite being poorly designed. And users likely won't blacklist them either because by the time they realize it, it will be too late, or because they'll assume it's a fluke. If only whitelisted sites get clicks, the most likely scenario would be that users wouldn't whitelist these sites, and so, as mentioned above, AN won't be accomplishing its mission. Clicking links on sites with captchas only after the captcha is successfully completed (assuming that's even possible) would help with this problem.

Finally, as mentioned in the OP, the point of this would be to try and encourage more responsible ad usage by rewarding placement of fewer ads and punishing the use of excessive amounts of ads. This is something a simple white/blacklist wouldn't accomplish. And since the ads are hidden from the user, they can't even see how many ads a site uses in order to decide whether to whitelist or blacklist the site, assuming it even occurred to them to do so.

@dhowe
Owner

dhowe commented Nov 15, 2021

Thanks for the thoughtful response – some practical ideas below, but I wonder whether we really want to consider the click as a 'reward'? While it is true bad sites may be paid more in the short-term, one of the goals here is to diminish the appeal of the larger economic model, with networks refusing to pay those sites, and with clicks leading (ideally/eventually) to diminishing profits...

> And since the ads are hidden from the user, they can't even see how many ads a site uses in order to decide whether to whitelist or blacklist the site, assuming it even occurred to them to do so.

  1. The user does (or can) see the number of ads on each page, on the ADN menu badge

  2. But it seems this could be an (automatic) metric for disabling clicks on a site (perhaps via a global setting): sites with ad counts above a threshold are automatically whitelisted (or blacklisted, however you think about it) so they are never clicked

  3. Another option would be to set a max-click-count for sites or domains, per some time period
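A rough sketch of how options 2 and 3 could fit together. All names, the threshold of 8 ads, the 5-click cap, and the in-memory log are invented for illustration; this is not the extension's actual code:

```javascript
// Hypothetical combination of the two ideas above: skip clicking entirely
// when a page's ad count exceeds a global threshold (option 2), and cap
// clicks per domain within a rolling time window (option 3).
const MAX_ADS_BEFORE_DISABLE = 8;      // assumed global setting
const MAX_CLICKS_PER_WINDOW = 5;       // assumed per-domain cap
const WINDOW_MS = 24 * 60 * 60 * 1000; // one day

const clickLog = new Map(); // domain -> timestamps of recent clicks

function shouldClick(domain, adCount, now = Date.now()) {
  if (adCount > MAX_ADS_BEFORE_DISABLE) return false;       // option 2
  const recent = (clickLog.get(domain) || []).filter(t => now - t < WINDOW_MS);
  clickLog.set(domain, recent);
  if (recent.length >= MAX_CLICKS_PER_WINDOW) return false; // option 3
  recent.push(now);
  return true;
}
```

The filter step drops timestamps older than the window, so a domain's allowance replenishes over time rather than being a hard lifetime cap.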

@vertigo220
Author

Also, another possible criterion would be the type of ads; e.g. pop-up and video ads could each be counted as x ads for the above calculation (with x user-configurable) and/or could result in an automatic no-click policy for that site.
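That weighting idea could be sketched like this; the weight values and type names are invented examples, not a real AdNauseam ad taxonomy:

```javascript
// Hypothetical ad-type weighting: intrusive formats count as more than one
// ad toward whatever ad-count totals the clicking policy uses.
const AD_WEIGHTS = { banner: 1, popup: 3, video: 3 }; // example values

function weightedAdCount(ads) {
  // Unknown types default to a weight of 1.
  return ads.reduce((sum, ad) => sum + (AD_WEIGHTS[ad.type] || 1), 0);
}
```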

@vertigo220
Author

> Thanks for the thoughtful response – some practical ideas below, but I wonder whether we really want to consider the click as a 'reward'? While it is true bad sites may be paid more in the short-term, the goal is to diminish the appeal of the larger economic model, with networks refusing to pay those sites, and with clicks leading (ideally/eventually) to diminishing profits...

While I suppose it's possible that sites that plaster ads everywhere, and as a result get a lot of clicks, might draw attention from advertisers for suspiciously high click counts or percentages, and therefore be dropped by those advertisers, this seems highly unlikely. First, it could only happen if AN were popular enough to significantly shift the click ratios, which it's not. I doubt even uBO is, and I'd wager AN has far fewer users than uBO. Second, it would rely on the advertisers actually getting true, usable metrics from the ad agencies and from Google and Facebook (who make up an advertising duopoly), and we already know they've lied, and likely continue to lie, to their customers.

Unfortunately, this means the entire premise of AN is questionable, since the attempted corruption of user data relies on the advertisers, who are ultimately paying for the ads and losing out, realizing it's happening, whereas Google and Facebook not only don't care but are incentivized to hide the truth and only report the number of clicks, making it look like they're providing a great service. This means not only that it's the businesses trying to advertise, rather than the advertising networks that are the real problem, who are losing out, but also that AN may be making the problem worse by clicking. So it's very unlikely that a site using an egregious number of ads, and therefore receiving a higher number of clicks, will even stand out, much less have the advertising network or the advertisers themselves pull ads from it. I do think such sites will absolutely be rewarded, receiving more ad revenue for their actions.

> And since the ads are hidden from the user, they can't even see how many ads a site uses in order to decide whether to whitelist or blacklist the site, assuming it even occurred to them to do so.

> 1. The user does (or can) see the number of ads on each page, on the ADN menu badge

Again, most users, especially, I assume, those who would use AN vs. uBO or AB, simply want active protection and to be able to surf the web without really thinking about it. So the percentage of users who will even consider this factor, then pay attention to the number of ads, then know or keep track of whether the number on any particular page is excessive and act on it, seems very small to me.

> 2. But seems this could be a (automatic) metric for disabling clicks on a site (perhaps via a global setting): sites with ad counts above a threshold are automatically whitelisted (or blacklisted, however you think about it) so they are never clicked

That would probably be the easiest solution, but not necessarily the best, and it can get tricky. What if a site the user really likes and wants to support is auto-blacklisted and they don't realize it? What about a long site with a lot of good content that, due to its length, has more ads, versus a short site that's half content and half ads? What if the user still wants some ads clicked regardless, to support the sites and/or to help throw off the tracking?

> 3. Another option would be to set a max-click-count for sites or domains, per some time period

This seems like a better solution than just a white/blacklist, whether manual or automatic. I still don't like it quite as much as my idea*, just because it provides the same clicks to a site that abuses ads as to one that doesn't, but I'm sure it would be much simpler and at least would help. It also brings up another potential issue: how long after visiting a site are the ads clicked? If right away, not only would the site receive revenue even if the user doesn't actually stay on the site or use it, or if the site doesn't provide the intended functionality (e.g. as described with the shorteners above), but, if the networks did decide to do something about AN, they could simply dismiss clicks that happen right after pages load, since most people don't click on ads that quickly. The same goes for ads clicked simultaneously or very close together. So to simulate real clicks, they should happen after being on the site for a while, and be spaced out. And, ideally, this should be based on the time the user is actually viewing the site, not just when the page loads; i.e. if a page is loaded in an inactive tab, no clicking should be done until the tab is active, nor should the clock start until then.
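The timing part of this could look something like the sketch below: offsets are measured from when the tab becomes active, with a minimum dwell time before the first click and random gaps between clicks. The function name, dwell time, and gap range are invented example values:

```javascript
// Hypothetical click scheduler: delay the first click until the tab has
// been active for a minimum dwell time, then space further clicks out
// with random gaps so they don't all fire at page load.
const MIN_DWELL_MS = 30_000;           // wait before the first click
const GAP_RANGE_MS = [15_000, 90_000]; // random spacing between clicks

function scheduleClicks(adCount, rng = Math.random) {
  const [lo, hi] = GAP_RANGE_MS;
  const times = [];
  let t = MIN_DWELL_MS;
  for (let i = 0; i < adCount; i++) {
    times.push(t);
    t += lo + rng() * (hi - lo);       // next click after a random gap
  }
  return times; // offsets (ms) from when the tab became active
}
```

In a real extension these offsets would only start counting down while the tab is visible (e.g. pausing on the Page Visibility API), per the inactive-tab point above.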

*I want to clarify that I'm not saying my idea is the best solution. There very well may be a better way to handle it, and my suggestions are only meant as a provocation for discussion about the issue and the best way to deal with it. Thank you for caring enough to do so and to allow me to play devil's advocate with your suggestions without taking it personally. :)
