This repository has been archived by the owner on Apr 6, 2020. It is now read-only.

<all_urls> matching rule not working #14

Closed
ghost opened this issue Oct 22, 2017 · 10 comments

@ghost

ghost commented Oct 22, 2017

That global pattern does not seem to be working; only 'default' is showing in the blocking-rules pop-up.

To me, the meaning of "Blocking of functionality across all domains, with a fallback, "default" blocking rule." is not quite clear. How is 'default' a fallback, and a fallback for what?

From my perspective it would make sense to start with a global rule blocking everything by default, and then to be able to dynamically relax the rules (via the pop-up from the toolbar button) for each domain that needs relaxed rules, rather than tediously setting rules manually for each domain. For such dynamic setting there should be a general option to reload the page when new (relaxed) settings are applied.

@Thorin-Oakenpants

From my perspective it would make sense to start with a global rule blocking everything by default

Default-deny-all is a lot of work for end-users. Deny-all on this list will render the web almost useless IMO. Advanced users can still easily set this (one click). No need to enable it by default.

@ghost
Author

ghost commented Oct 23, 2017

Such an extension will likely not spread widely among the average user base, but rather among advanced users, or say privacy-conscious users, who are not in the majority. I understand it is a matter of perspective though, and that you have other preferences. The way the extension is currently designed, I would not be using it, but would rather work with user.js or set the preferences in about:config.

That still leaves the question of how the 'default' blocking rule works as a fallback, and of the <all_urls> matching rule not working.

@Thorin-Oakenpants

Regardless of the default settings, making it easier to use widens the base, and that is a definite aim. While a pref will disable (or, in some cases with privacy.resistFingerprinting, spoof) the API, this extension allows domain-level permissions to be applied, so it's a superior technical solution. Is it usably superior? That depends on the end user and on how easy toggling some presets on/off can be made (see my suggestion of creating some levels).

As for the default settings not seeming to work: I see the same. I block all cookies by default and allow only a very, very tiny handful of exceptions, and AFAIK this is what prevents WAM from working, since it needs to create a temp cookie (a bit over my head). Do you have cookies enabled?

@pes10k
Owner

pes10k commented Oct 23, 2017

If I'm understanding right, there are a couple of issues here. I'm going to try and break them out. Please let me know if I've missed anything:

  1. "(default)" is a bad name for the "rule that will be used if there are no more specific rules"

I'm open to suggestions for what would be a better name here, but I don't think "global" is an improvement, since, to my mind, "global" suggests something that's in place universally / globally, which isn't the case here (since more specific rules will cause the "global" ruleset to not be used). I'd be grateful for other suggestions though!

  2. Default block vs. Default Allow

I understand the points being made here, but nearly every site breaks in a "universal block" configuration (based on this measurement work we did a bit ago, most relevantly the table on the last page, showing that almost 90% of the web breaks from disabling DOM alone. The cumulative effect of disabling everything would be higher). I appreciate the suggestion, but I'm in favor of leaving it as is. As is, folks who know what they're doing (and not using EasyList style subscription-based rule sets, which I'll get in here as soon as all the CSP dust settles) can go in and just block everything from the "(default)" rule set if needed.

Does that address your concern? If not, happy to continue discussing, but wanted to give the motivation behind the current choice.

  3. Things aren't working

@Thorin-Oakenpants is right that the current functionality relies on cookies being enabled (at least session-length cookies). This is an odd choice I had to make since, as TO mentioned, it's the only synchronous method for the background script to send information to the content script, which is needed to make sure that the extension can modify the DOM before any page scripts operate.

If there is a better / other way of pushing the configuration information into the content script in a synchronous manner though, please let me know. It would help simplify a number of parts of the implementation.
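
For reference, the content-script side of the cookie approach is roughly the following. This is a simplified sketch only; the cookie name and payload format here are hypothetical placeholders, not the extension's actual keys:

```js
// Content script declared with "run_at": "document_start" in manifest.json,
// so it runs before any page script can execute.
(function readConfigFromCookie() {
  const COOKIE_NAME = "__wam_config"; // hypothetical name

  // document.cookie can be read synchronously, unlike runtime messaging.
  const entry = document.cookie
    .split("; ")
    .find(part => part.startsWith(COOKIE_NAME + "="));
  if (!entry) {
    return; // no settings were handed off for this origin
  }

  const config = JSON.parse(decodeURIComponent(entry.slice(COOKIE_NAME.length + 1)));

  // Expire the cookie right away so page scripts never observe it.
  document.cookie = COOKIE_NAME + "=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/";

  // ...use `config` to decide which DOM APIs to patch before the page runs...
})();
```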

Thanks for the comments!

@Thorin-Oakenpants

Thorin-Oakenpants commented Oct 23, 2017

I'm open to suggestions for what would be a better name here, but I don't think "global" is an improvement, since, to my mind, "global" suggests something that's in place universally / globally, which isn't the case here (since more specific rules will cause the "global" ruleset to not be used). I'd be grateful for other suggestions though!

Global is the correct term IMO. The <all_urls> rule is applied to every single site* (i.e. global) but is overruled/overridden by domain-specific rules (i.e. not global) ... so: default global rules + domain-specific ruleset overrides.

@Thorin-Oakenpants

[a temp cookie] it's the only synchronous method for the background script to send information to the content script, which is needed to make sure that the extension can modify the DOM before any page scripts operate. If there is a better / other way of pushing the configuration information into the content script in a synchronous manner though, please let me know.

I have no idea. Not doubting your skills/knowledge at all, snyderp, but someone like @gorhill's insight, if he could spare a minute or two, would be invaluable.

@gorhill

gorhill commented Oct 23, 2017

I pondered the cookie idea months ago, but I never went ahead with it because I think there were too many things I would end up having to worry about (making sure the cookie never leaks to a remote server, being unsure whether reading the cookie in the content script would cause synchronous calls between the content process and the main process, overhead, etc.).

Currently what looks promising is webRequest.filterResponseData, which would allow injecting stuff at the top of the page on the fly. I do plan to investigate using this Firefox-specific extension to webRequest: gorhill/uBlock#3069.
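
Roughly, the shape of it would be something like this. An untested sketch only; the injected payload is just a placeholder, and the real investigation is in the linked issue:

```js
// Firefox-only: intercept the main-frame response and prepend a <script>
// before any of the page's own bytes reach the parser.
// Requires the "webRequest" and "webRequestBlocking" permissions.
browser.webRequest.onBeforeRequest.addListener(details => {
  const filter = browser.webRequest.filterResponseData(details.requestId);
  const encoder = new TextEncoder();
  let injected = false;

  filter.ondata = event => {
    if (!injected) {
      // Placeholder payload; real code would inject the blocking logic and
      // would have to respect the page's charset and CSP.
      filter.write(encoder.encode("<script>/* config here */</script>"));
      injected = true;
    }
    filter.write(event.data);
  };
  filter.onstop = () => filter.close();
}, { urls: ["<all_urls>"], types: ["main_frame"] }, ["blocking"]);
```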

@psnyde2
Contributor

psnyde2 commented Oct 24, 2017

@Thorin-Oakenpants Thanks for roping @gorhill in ;)

@gorhill thank you for your thoughts. Not leaking the cookie is the easy part, I think: as long as it's deleted before any page script executes, and it's removed from any outgoing headers (with webRequest.onBeforeSendHeaders), it should never be leaked.
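
For the header stripping, what I have in mind is roughly the following (a sketch; the cookie name is the same hypothetical placeholder as above):

```js
// Background script: strip the handoff cookie from every outgoing request so
// it can never reach a remote server. Needs "webRequest" + "webRequestBlocking".
const COOKIE_NAME = "__wam_config"; // same hypothetical name as above

browser.webRequest.onBeforeSendHeaders.addListener(details => {
  const requestHeaders = [];
  for (const header of details.requestHeaders) {
    if (header.name.toLowerCase() !== "cookie") {
      requestHeaders.push(header);
      continue;
    }
    // Remove only our cookie; keep the site's own cookies intact.
    const value = header.value
      .split("; ")
      .filter(part => !part.startsWith(COOKIE_NAME + "="))
      .join("; ");
    if (value !== "") {
      requestHeaders.push({ name: header.name, value });
    }
  }
  return { requestHeaders };
}, { urls: ["<all_urls>"] }, ["blocking", "requestHeaders"]);
```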

The bigger concern I have is multiple requests to the same origin, which make it difficult to know when it's OK to delete the cookie. My current, not totally satisfying, strategy is to push the cookie value into localStorage, so that even if there are race conditions in managing the cookie across multiple requests, I can be sure that localStorage has the most recent settings for the domain. This raises possible fingerprinting issues if the extension ever got popular, but, at least for now, I'm going to punt on that (and I doubt there is enough entropy for it to be useful, but it's on my mind).
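
Concretely, the fallback in the content script looks something like this (again a sketch, with a hypothetical key name):

```js
// In the document_start content script: if the cookie was already consumed by
// a parallel request to the same origin, fall back to the copy cached in
// localStorage, which always holds the most recent settings for the domain.
const STORAGE_KEY = "__wam_config"; // hypothetical key, as above

function loadConfig(cookieValue) { // cookieValue: string or undefined
  if (cookieValue !== undefined) {
    localStorage.setItem(STORAGE_KEY, cookieValue); // cache the latest settings
    return JSON.parse(cookieValue);
  }
  const cached = localStorage.getItem(STORAGE_KEY);
  return cached === null ? null : JSON.parse(cached);
}
```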

Just out of curiosity, what's your interest in synchronously pushing values into the extension?

You might find this useful in the short term too, if you need to take the cookie approach, but might be worried about the size of options you're pushing into the cookie.

And, last, that extension sounds great (though I'm a little worried about the fragility of parsing a document fragment to shove more script into the document, and it'd take even more careful managing of the CSP header…). Is there anything I can do to support it more generally? I'm hoping to keep the code 100% browser agnostic for as long as I can…

@pes10k pes10k closed this as completed Nov 3, 2017
@ghostwords

Regarding conditionally injecting a before-anything-else script (or injecting one with configuration you don't have to wait for, which is a similar problem), there is also this "spam page frames from chrome.webNavigation.onCommitted with messages until the frame responds or message limit is hit" approach: gorhill/uBlock#1930 (comment)
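
In rough form, that approach looks something like this (a sketch only; the retry limit and message shape are placeholders, and the details are in the linked comment):

```js
// Background script (needs the "webNavigation" permission): once a frame
// commits, keep messaging it until the document_start content script in that
// frame acknowledges, or we give up.
const MAX_ATTEMPTS = 100; // arbitrary placeholder limit

browser.webNavigation.onCommitted.addListener(async details => {
  for (let attempt = 0; attempt < MAX_ATTEMPTS; attempt++) {
    try {
      // Hypothetical message shape; the content script replies once it is ready.
      await browser.tabs.sendMessage(
        details.tabId,
        { type: "config", settings: { /* per-domain rules */ } },
        { frameId: details.frameId }
      );
      return; // the frame answered, stop spamming
    } catch (err) {
      // No listener in the frame yet; retry after a short pause.
      await new Promise(resolve => setTimeout(resolve, 10));
    }
  }
});
```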

@pes10k
Owner

pes10k commented Feb 8, 2018

@ghostwords Thanks for the heads up. I tried something similar and found the same as the first note in that comment (that it seemed to work), but I had worries about situations like cached pages, opening the same frame twice, or just general cases where the page content could load quickly and the browser might not fire the desired onCommitted events before the page rendered.

So, I think it'd probably work, but the browser folks not promising one way or the other made me too nervous to keep at that approach, so I went with the cookie hack :-/
