This repository has been archived by the owner on Jun 24, 2022. It is now read-only.

For users on very slow connections, block document.written scripts #17

Closed
KenjiBaheux opened this issue Apr 22, 2016 · 73 comments

Comments

@KenjiBaheux
Collaborator

For users on high latency connections, parser-blocking scripts loaded via document.write cause significant delays in user perceived page load latency.

Most scripts inserted via document.write are for third-party content. A quick survey of the third parties suggests that async solutions are commonly offered. Given how bad the user experience can get for users on slow connections, it's quite likely that a large fraction of page visits never succeed. The hope is that these newly rescued page views would incentivize publishers to adopt async solutions.

Chrome is exploring the following intervention:

  • for users on slow connections, block network requests for document.written scripts
  • design doc
  • crbug
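
For illustration (the URL is a placeholder), the pattern the intervention targets, and the async insertion most third parties offer as an alternative, look roughly like this:

  <!-- Parser-blocking: the parser stalls until the script is fetched and executed -->
  <script>
    document.write('<script src="https://third-party.example/widget.js"><\/script>');
  </script>

  <!-- Async alternative: fetched and executed without blocking the parser -->
  <script>
    var s = document.createElement('script');
    s.src = 'https://third-party.example/widget.js';
    s.async = true;
    document.head.appendChild(s);
  </script>
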
@shivanigithub

Possible Spec edits:
The "Note" in 4.12.1 mentions about scripts inserted using document.write in https://html.spec.whatwg.org/multipage/scripting.html#the-script-element:
"When inserted using the document.write() method, script elements execute (typically blocking further script execution or HTML parsing), but when inserted using innerHTML and outerHTML attributes, they do not execute at all."
We can add to this saying:
"The user agent may elect not to load synchronously loaded script elements inserted using document.write e.g. document.write('<scr' + 'ipt src="' + src + '" ></scr' + 'ipt>'). This is because such usage can cause significant delays in user perceived page load latency."

And also add a similar statement to the warning section in https://html.spec.whatwg.org/multipage/webappapis.html#document.write()

@shivanigithub

Summarizing the mail thread discussing the spec changes here:

As per feedback from ojan@ and domenic@ on the process of spec change:
"In another thread, Domenic suggested that we not add a diff to the interventions repo. Instead we can do a pull request to the HTML spec at the WHATWG and link to it from the issue in the interventions repo."

@toddreifsteck

Per above, I've opened the linked issue on W3C HTML5 spec.

@domenic
Collaborator

domenic commented May 31, 2016

Per above, I've opened the linked issue on W3C HTML5 spec.

Wrong repo :). We were talking about a pull request to https://github.com/whatwg/html, which is what is implemented in browsers.

Possible Spec edits:

It's important to not include normative requirements (even "may"s) inside non-normative notes. Instead, we need to work through the normative algorithms.

Let's figure out instead how innerHTML normatively prevents script execution. It turns out it's done in the HTML parser itself. See e.g. https://html.spec.whatwg.org/multipage/syntax.html#parsing-main-inhead, where "A start tag whose tag name is "script"" step 4 says

If the parser was originally created for the HTML fragment parsing algorithm, then mark the script element as "already started". (fragment case)

(Note that "in body" delegates to "in head" for script, so this is the place to look.)

To fix this, your PR should probably just add an extra step that's very similar: something like

If the parser was invoked via the document.write() method, then optionally mark the script element as "already started". (For example, the user agent may wish to disallow third-party scripts inserted via document.write() under slow network conditions, or when page loading time is already high.)

I'm happy to write this PR if you'd prefer, although we're always happy to have more contributors to HTML. Instructions at https://github.com/whatwg/html#pull-requests

@domenic
Collaborator

domenic commented May 31, 2016

Oh, but of course we should also update the note you found, too! My suggestion for how to do that would be to give an id to the new <li> with that step, and then insert the word "usually" before "execute" and have the "usually" link to that <li>.

@shivanigithub

Thanks for the feedback, Domenic.
It would be great if you could write this PR. Thanks for offering.

domenic added a commit to whatwg/html that referenced this issue Jun 7, 2016
This allows user agents to experiment with better heuristics for not executing such scripts, as per WICG/interventions#17.
@domenic
Collaborator

domenic commented Jun 7, 2016

OK! I wrote up the pull request at whatwg/html#1400.

@shivanigithub

Great, thanks!

@KenjiBaheux
Collaborator Author

KenjiBaheux commented Aug 9, 2016

Let me document the current behavior of the intervention and its associated feedback loops in Chrome.

Intervention

Chrome will block the load of document.written scripts when the following conditions are met:

Script related criteria

  • Top level document. The blocking of scripts is restricted to the top level document (in other words, the intervention does not apply to document.written scripts within iframes)
  • Parser-blocking. Scripts that are parser-blocking are prohibitively expensive. Note: since asynchronous scripts are not parser blocking, the intervention does not apply to these scripts.
  • Cross site (i.e. hosts with different eTLD+1). To mitigate breakage of key functionality, the intervention only targets cross site scripts (i.e. scripts hosted on js.example.com will continue to work when inserted via document.write on www.example.com).
  • Browser HTTP cache miss. The blocking of scripts only applies in the case of a browser HTTP cache miss. If the document.written script is already available in the browser HTTP cache, it is allowed to execute, since there is no network delay incurred in this case.

Circumstances

  • Slow connections. Since the performance penalty of these scripts leads to a worst-case page load experience on slow connections, we intend to block scripts only if all of the above conditions are true and the connection is slow. Initially, the change will be restricted to 2G. In the future, the change might be extended to other users on slow connections, such as slow 3G or slow WiFi.
  • Not a reload. To further mitigate the possibility of breaking user functionality, Chrome will not intervene if the user triggered a reload.
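
Putting the criteria above together, a rough sketch of the decision, with illustrative names only (this is not Chrome's actual implementation):

  // Sketch: every condition below must hold for the block to apply.
  function shouldBlockDocumentWrittenScript(ctx) {
    return ctx.isTopLevelDocument &&       // not inside an iframe
           ctx.isParserBlocking &&         // async scripts are exempt
           ctx.isCrossSite &&              // different eTLD+1 than the document
           !ctx.isInHttpCache &&           // cache hits still execute
           ctx.effectiveConnectionIs2G &&  // initially restricted to 2G
           !ctx.userTriggeredReload;       // reloads are exempt
  }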

Browser HTTP cache hit case

When the script needed by the document.write statement is found in the browser HTTP cache, Chrome will use it even if stale, up to a lifetime of 1 year. To mitigate version skew issues, Chrome might queue a low priority asynchronous request in order to update stale assets.

Feedback loops

In order to make the intervention actionable for developers, we will offer the following feedback loops.

Warning in devtools

Since Chrome M53, DevTools has been issuing warnings for potentially problematic document.write statements. The warning is issued regardless of connectivity.

Example:
(index):34 A Parser-blocking, cross-origin script, http://www.example.org/a.js, is invoked via document.write. This may be blocked by the browser if the device has poor network connectivity.

Intervention HTTP header (strawman)

When a script inserted via document.write has been blocked, Chrome will send:

Intervention: <https://shorturl/relevant/spec>

When a script inserted via document.write is found and could be blocked in different circumstances, Chrome might send:

Intervention: <https://shorturl/relevant/spec>; level="warning"

The intervention header will be sent as part of the GET request for the script (asynchronously in case of an actual intervention).
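
For illustration, reusing the script URL from the DevTools example above, the request carrying the header would look roughly like:

  GET /a.js HTTP/1.1
  Host: www.example.org
  Intervention: <https://shorturl/relevant/spec>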

Updates

10/25/2016: fix incorrect usage of the term "cross origin" when we meant "cross site (i.e. hosts with different eTLD+1)".

zcorpan pushed a commit to whatwg/html that referenced this issue Sep 1, 2016
This allows user agents to experiment with better heuristics for not executing such scripts, as per WICG/interventions#17.
@matthewp

matthewp commented Sep 7, 2016

I think this is assuming that document.write() is only used for additive third-party scripts like ad networks, but one common use case of document.write is to fall back to a local script when a CDN fails. You can see an example of this from a very popular starter kit. This pattern has existed for as long as CDNs have been around and is probably encoded in a large number of websites.
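
The pattern in question looks roughly like this (the version and local path are placeholders):

  <script src="https://code.jquery.com/jquery-1.12.0.min.js"></script>
  <script>window.jQuery || document.write('<script src="/js/vendor/jquery-1.12.0.min.js"><\/script>')</script>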

@PaulKinlan

In your specific scenario this intervention will not trigger, because your fallback script is not cross-origin.

@matthewp

matthewp commented Sep 7, 2016

That's only true in Chrome. My concern is that giving UAs permission to do this will cause sites to behave differently in some browsers and break in others.

@hexalys

hexalys commented Sep 8, 2016

I object to blindly targeting cross-origin. It's too restrictive with potential to break some sites' core functionality. Including half a dozen of my own sites.

As I mentioned in a Tweet to @PaulKinlan I have been using cross-browser patterns such as:

document.write('<script src="/cdn/jquery/'+(!Object.defineProperty?'1.8.3' : ie<11|this.opera ? '1.11.1' : '2.2.4')+'min.js"><\/script>');

or with conditionals to load browser shims such as:

!Object.create && document.write('<script src="/cdn/lib/es5-shim.min.js"><\/script>');

As described, prohibiting cross-origin scripts from CDNs breaks the above use cases. In addition, given the above jQuery version method, I use jQuery fallbacks from CDNJS to Google's API. Both of them via document.write().

Some additional points:

  1. This is the only way to conditionally load synchronous scripts from a CDN, for good reasons.
  2. The described aim of this intervention is "reducing the impact of document.written scripts on user perceived page load latency". AFAICT that script usage early in the <head> makes no visible impact on performance.
  3. Browsers aren't likely able to properly evaluate such document.write(s) in the preload scanner, which was one of the initial motives for the crbug. And in this case, attempts at preloading those would defeat the very purpose of using them conditionally for polyfills with feature-detection conditions.

I am all for preventing poorly implemented document.write "for third party content such as ads and trackers" that inject late content in the page. But we need a reasonable compromise here.

As mitigation, here is an option to consider, which I'd be fine with:

Allow cross-origin scripts only when loaded in the <head>.
(easy and clear spec with no significant impact on performance).

@domenic
Collaborator

domenic commented Sep 8, 2016

Why do you believe that script usage (actually, it's document.write usage) early in the head makes no visible impact on performance? It janks the page just as bad as document.write anywhere else (it stops further parsing of the page, much less any rendering and layout, until the script is downloaded!). <head> is not a magical no-jank container.

@bzbarsky

bzbarsky commented Sep 8, 2016

it stops further parsing of the page

For what it's worth, this may not be true in all UAs. It needs to stop observable DOM construction, and it needs to act as if parsing had been stopped, but you can in fact parse ahead speculatively if you want, and some UAs do that.

@ojanvafai
Member

The combination of DOM construction blocking and speculative parsing is actually the problem here. The speculative parsing will load resources that are needed later in the page, delaying the load of the doc.written script. On a 2G connection, by the time you doc.write the script, the network connection is already flooded loading other resources, even if it's in the head.

You can't just reprioritize the resource loads at that point because it's the upstream network bandwidth that's flooded.

We were consistently seeing page load improvements of 10-30 seconds on many pages on 2G.

@hexalys do you have a page you could point us to that doesn't see a significant improvement on a 2G connection? I'd like to understand it better. Maybe we should be doing something more nuanced.

@matthewp

matthewp commented Sep 8, 2016

@ojanvafai A page might load quickly, but load incorrectly if a needed script is ignored. If I write the entire page using jQuery and jQuery doesn't load because of this algorithm, the page is broken on such a connection.

@hexalys

hexalys commented Sep 9, 2016

Why do you believe that script usage (actually, it's document.write usage) early in the head makes no visible impact on performance?
@domenic

By tests and observation of network waterfalls in practically all browsers. There is little to no timing penalty for the overall document in loading jQuery via document.write(). Compare these two tests as examples:

Test A : Nexus 7 - Chrome - 2G speed (using document.write jQuery).
Test B : Nexus 7 - Chrome - 2G speed (using normal sync jQuery).

The only difference is a small, negligible priority penalty of 10-20ms for the script itself.

It janks the page just as bad as document.write anywhere else (it stops further parsing of the page, much less any rendering and layout, until the script is downloaded!). <head> is not a magical no-jank container.
@domenic

It doesn't stop rendering any more than any other synchronous script. It's not that it's a magical no-jank container. As an aside, I just assume that having it at the top, where the body is not yet parsed, has a more minimal impact given the effect document.write can have on the document. The main point here is that it obviously loads those scripts much faster, and it is the only proper way to use document.write() without penalty. In contrast, see how bad document.write is at the bottom in Test C below.

Test C : Nexus 7 - Chrome - 2G speed (adding an ES6 shim using document.write at the bottom).

The impact on page load or 'Document Complete' there is minimal, as it is a small script that doesn't do anything other than being parsed. However, it clearly delays 'Render' by 1-2s and 'Interactive' by 4-5 seconds already. That is the kind of harmful practice everyone should be told not to use at all.

For what it's worth, this may not be true in all UAs. It needs to stop observable DOM construction, and it needs to act as if parsing had been stopped, but you can in fact parse ahead speculatively if you want, and some UAs do that.
@bzbarsky

Correct. That's true of Chrome since the Chrome PLT Improvements Q1 2013 work. According to that document, even if a script blocks, it may speculatively download images if the network is idle (i.e. a light head). The important thing here is that document.write() does not block assets in the head from being preloaded. Images are low priority anyway and won't start if you are busy with quite a few script + css downloads.

The combination of DOM construction blocking and speculative parsing is actually the problem here. The speculative parsing will load resources that are needed later in the page, delaying the load of the doc.written script.
@ojanvafai

Indeed, as shown in Test C. But it's only a performance problem past the <body>, depending on how many script resources or images you have in between, and definitely not when prioritized in the <head> as demonstrated.

You can't just reprioritize the resource loads at that point because it's the upstream network bandwidth that's flooded.

Agreed, past </head>. That's the proper recommendation to extract from this and promote.
Ultimately I would have no issue with preventing document.write() in the <body> entirely, even for same-origin scripts. But again, in the <head> I don't see any harm at all.

We were consistently seeing page load improvements of 10-30 seconds on many pages on 2G.

I can see that with any bottom document.write(s), especially if those also modify the DOM themselves, or do worse by loading additional assets. Those are really obvious performance killers and an anti-pattern.

I totally get that document.write() isn't ideal. But that's the only way that job gets done, at little to no cost, without recourse to a more modular approach, which usually requires an additional script and a performance cost of its own (and which mostly makes sense for a web-app-centric page as opposed to the average web site).

Clearly document.write() is not an enemy of performance when carefully used, and for the right reason.

@RByers
Member

RByers commented Sep 10, 2016

As anyone who follows blink intent threads knows, I'm very concerned about web compat (often arguing that risking breaking even 0.01% of page views is unacceptable). There's no doubt that this is going to break some content. But when evaluating the compat cost (even in isolation before considering the benefits) we have to consider that this fixes 10% of page loads on 2G that are otherwise so slow the user gives up waiting before anything paints. So to even be at neutral compat impact this would have to break at least 10% of 2G page loads. The initial evidence suggests it's well below that threshold. So this may be a rare intervention where we get to BOTH improve the user experience on many pages AND increase the number of pages that load correctly!

Of course there's still a risk of some class of pages being harmed more than they're helped. I suggest we focus on collecting examples of real pages that are broken with little benefit to see if there are any patterns that can be used to tweak the discrimination of this heuristic.

Also, interventions are premised on the idea that developers who follow best practices will not be impacted by them. @hexalys describes some use cases that seem pretty reasonable to me. Perhaps there's some way we can allow those to work without losing most of the benefit?

@FremyCompany

I think it is ok for an inline script to use document.write; technically an inline script is nearly identical to having the script tag in the html source.
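
That is, something like this (placeholder URL):

  <script>document.write('<script src="https://cdn.example.com/lib.js"><\/script>')</script>

as compared to putting <script src="https://cdn.example.com/lib.js"></script> directly in the markup.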

@RByers
Member

RByers commented Sep 10, 2016

Interventions generally have an opt-out. If there are some legit use cases, should this get an opt-out too - at least for now? Eg. What if the script loaded was previously listed as a <link rel=preload> target? @hexalys would that be enough to address your use cases? That would address the issue @ojanvafai describes with the load order getting hurt, but could obviously make things worse if the resource is large and usually not needed.
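
For example (a hypothetical opt-out, not shipped behavior; the URL is a placeholder):

  <link rel="preload" as="script" href="https://cdn.example.com/jquery.min.js">
  <script>document.write('<script src="https://cdn.example.com/jquery.min.js"><\/script>')</script>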

@michaelhood

@RByers If we could expand that to a <link rel=preconnect>, it would cover my use cases: where the URI for the script tag is assembled dynamically at runtime (usually based on the host page's URL), but one of them is loaded on 100% of requests. Always from the same CDN domain, cross-origin.
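
i.e. something along these lines (the domain and the URL assembly are placeholders):

  <link rel="preconnect" href="https://cdn.example.com">
  <script>
    // URI assembled at runtime from the host page's URL, always on the same CDN origin
    document.write('<script src="https://cdn.example.com/tags/' + location.hostname + '.js"><\/script>');
  </script>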

@RByers
Member

RByers commented Sep 10, 2016

I think it is ok for an inline script to use document.write; technically an inline script is nearly identical to having the script tag in the html source.

I'm not an expert here, but Ojan's argument above is that it's not - at least in Chrome. The script tag is exposed to the preload scanner (so can trigger loads early) while the doc.written script is not and so may block loading longer (especially due to all the later speculative resources that are now hogging the network).

Also the biggest problems come when one render-blocking script doc.writes another which doc.writes another and so on. As I understand it this can't happen with script tags, even when added via DOM APIs.
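
That is, chains like this (placeholder URLs), where each fetch has to finish before the next one is even discovered:

  <script>document.write('<script src="https://a.example/outer.js"><\/script>')</script>
  <!-- outer.js in turn does:  document.write('<script src="https://b.example/middle.js"><\/script>') -->
  <!-- middle.js in turn does: document.write('<script src="https://c.example/inner.js"><\/script>') -->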

@hexalys

hexalys commented Sep 10, 2016

Eg. What if the script loaded was previously listed as a <link rel=preload> target? @hexalys would that be enough to address your use cases?
@RByers

You mean to opt out of the restriction? It would for the multi-version jQuery load, yes. Or the <link rel=preconnect> like @michaelhood said, which would cover the 2nd case and CDN domains broadly. That's a good opt-out idea I could get along with.

Also, should you decide to preload document.write(s), it's not too hard to get around it for case 2, which I already do. If you look at my source on this site, I have two custom methods for document.write(s) and async loads for both contexts.

So I technically load this:
!Object.create && document.write('<script src="/cdn/lib/es5-shim.min.js"><\/script>');
as:
!Object.create&&cd.ws('es5-shim/4.5.9/es5-shim.',_);
which the preloader would not interfere with. That seems a reasonable workaround.

Also the biggest problems come when one render-blocking script doc.writes another which doc.writes another and so on. As I understand it this can't happen with script tags, even when added via DOM APIs.

Well, you can load synchronous AJAX with similar effects, though Chrome already emits a "Synchronous XMLHttpRequest on the main thread" deprecation console warning to deter the practice.
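
For example (placeholder URL), passing false for the async flag blocks the main thread in much the same way and already triggers that warning:

  var xhr = new XMLHttpRequest();
  xhr.open('GET', 'https://cdn.example.com/lib.js', false); // false = synchronous
  xhr.send();
  eval(xhr.responseText); // run the fetched script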

@RByers
Member

RByers commented Feb 21, 2017

First shipped in Chrome 55

@mrquincle

This ends up in my /var/log/syslog.

May 31 17:50:45 V compiz[2597]: A Parser-blocking, cross site (i.e. different eTLD+1) script, https://ssl.google-analytics.com/ga.js, is invoked via document.write. The network request for this script MAY be blocked by the browser in this or a future page load due to poor network connectivity. If blocked in this page load, it will be confirmed in a subsequent console message.See https://www.chromestatus.com/feature/5718547946799104 for more details.

And many similar ones that are fed into my syslog through compiz.

chrome://version/

Google Chrome	58.0.3029.110 (Official Build) (64-bit)
Revision	691bdb490962d4e6ae7f25c6ab1fdd0faaf19cd0-refs/branch-heads/3029@{#830}
OS		Linux
JavaScript	V8 5.8.283.38
Flash		25.0.0.171 $HOME/.config/google-chrome/PepperFlash/25.0.0.171/libpepflashplayer.so
User Agent	Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36
Command Line	/usr/bin/google-chrome-stable --flag-switches-begin --enable-experimental-web-platform-features --flag-switches-end
Executable Path	/opt/google/chrome/google-chrome

It looks like this is a place where issues like these are considered; if I'm wrong, I apologize, and please correct me in that case!

@bryanmcquade

Hmm, I'm not sure why messages Chrome writes to the console are ending up in your syslog. I'd recommend disabling that (not sure how, will depend on how your system is set up) if you don't want these to show up there.

domenic added a commit to whatwg/html that referenced this issue Jul 13, 2017
This allows user agents to experiment with better heuristics for not executing such scripts, as per WICG/interventions#17.
@verdy-p

verdy-p commented Aug 11, 2017

I see this symptom in Chrome over a fast FTTH connection (>100Mbps symmetric), which is definitely not a slow 2G connection.

It occurs with various websites using scripts to load maps or satellite views from Google Maps (and I doubt it is slow); not all sites are affected, and not Google Maps itself.

Those that are affected are using document.write() to insert a script that will then load the map asynchronously (this works) and then load additional data or markers on it: some of those markers are visible, then the map and markers disappear (the canvas turns gray), and then it fails.

Apparently Chrome does not correctly track the session between successive requests, then fails with the XSS issue when GoogleTagServices trackers are loaded, and then we are returned an obscure error about the daily quota being exceeded for using the map...

Example of what is logged in the console when visiting http://www.maplandia.com/belarus/mahilyow/podgoritsa/


2www.maplandia.com/:39 A Parser-blocking, cross site (i.e. different eTLD+1) script, http://www.googletagservices.com/tag/js/gpt.js, is invoked via document.write. The network request for this script MAY be blocked by the browser in this or a future page load due to poor network connectivity. If blocked in this page load, it will be confirmed in a subsequent console message.See https://www.chromestatus.com/feature/5718547946799104 for more details.
(anonymous) @ (index):39
www.maplandia.com/:43 Uncaught TypeError: googletag.sizeMapping is not a function
at www.maplandia.com/:43
2www.maplandia.com/:769 A Parser-blocking, cross site (i.e. different eTLD+1) script, http://www.google-analytics.com/ga.js, is invoked via document.write. The network request for this script MAY be blocked by the browser in this or a future page load due to poor network connectivity. If blocked in this page load, it will be confirmed in a subsequent console message.See https://www.chromestatus.com/feature/5718547946799104 for more details.

@verdy-p

verdy-p commented Aug 11, 2017

Apparently this is caused by Google Analytics when a page inserts its tiny script with document.write() near the top of the page, which will post an asynchronous request once the page is loaded.
Evidently the Google Analytics tools have bugs.

@verdy-p

verdy-p commented Aug 11, 2017

So Google Chrome blocks Google Tag Services (which uses such bad practices of user tracking with cross-site scripting). It is prevalent anyway on so many sites to track visitors and monetize websites via profiling of visitors in order to post "relevant" ads. These services won't work, which may be a good thing, but websites will no longer get monetized using the Tag Services they subscribed to with Google, so Google won't pay them.

Is Google Tag Services to be dead, if it cannot postpone some scripts that will run at the end of page load (even when they are scheduled to be executed, like here, after 10 seconds, when the page has long been completely loaded and the bandwidth is fully available for posting trackers and loading profiled ads)? Apparently there's a conflict between the site's own profiling needs and the tracking made by Google itself when using its map services, where Google will also profile users to send ads or customize the rendered map.

Note: it may as well be bugs in using GPT and Google Maps together on the same page, because I can see several objects in the console that should have been already initialized but that have been cleared when Google Maps was loaded first.

Note: I'm not the author of the "maplandia.com" website; it's just no longer usable in Chrome 60, but it still works in IE, Edge, Firefox, and Opera on Windows 10 x64. I did not test on macOS, iOS or Android.

@normanzb

The issue lies in how fast is fast and how slow is slow. A 2G network may seem slow to someone who is impatient to visit a large web page, but it is perfectly fine for a small web page viewed by somebody who occasionally visits it with a casual mind.

Google seems to have decided that it wants to be the authority that defines the words "fast" and "slow" for us, making them universal values that all Chrome users and developers should comply with.

@shivanigithub

Note that the logs mention this as a warning for developers so they know that their site may behave differently on slower networks. But from the logs given above (copying them here for reference), it does not seem that the network is considered slow or that scripts are blocked, since there is no subsequent error message.


2www.maplandia.com/:39 A Parser-blocking, cross site (i.e. different eTLD+1) script, http://www.googletagservices.com/tag/js/gpt.js, is invoked via document.write. The network request for this script MAY be blocked by the browser in this or a future page load due to poor network connectivity. If blocked in this page load, it will be confirmed in a subsequent console message.See https://www.chromestatus.com/feature/5718547946799104 for more details.
(anonymous) @ (index):39
www.maplandia.com/:43 Uncaught TypeError: googletag.sizeMapping is not a function
at www.maplandia.com/:43
2www.maplandia.com/:769 A Parser-blocking, cross site (i.e. different eTLD+1) script, http://www.google-analytics.com/ga.js, is invoked via document.write. The network request for this script MAY be blocked by the browser in this or a future page load due to poor network connectivity. If blocked in this page load, it will be confirmed in a subsequent console message.See https://www.chromestatus.com/feature/5718547946799104 for more details.

@fwebdev

fwebdev commented Aug 14, 2017

Google stopped the implementation of that browser intervention because the performance improvement for real users was too small.
In the current Chrome version there is also no logging in the dev console for me anymore.

https://bugs.chromium.org/p/chromium/issues/detail?id=575850

@domenic
Collaborator

domenic commented Aug 14, 2017

@fwebdev that bug you link to is not related to the intervention discussed here. It has to do with document.write() and script tags, but with executing them more aggressively, not with blocking them.

@EddieOne

I get this error on a fast connection.

cap.js:153 A Parser-blocking, cross site (i.e. different eTLD+1) script, https://back20.keycaptcha.com/swfs/caps.js?uid=79849&u=http%3A%2F%2Fstatic.y8.com%2Fupload&r=0.10667958468013206, is invoked via document.write.

I thought it was for 2g only?

@bryanmcquade

That's right, this is a warning to let site developers know that their scripts may be blocked for users on slow connections, even if they aren't currently on a slow connection.

The full message is:
A Parser-blocking, cross site (i.e. different eTLD+1) script, , is invoked via document.write. The network request for this script MAY be blocked by the browser in this or a future page load due to poor network connectivity. If blocked in this page load, it will be confirmed in a subsequent console message.See https://www.chromestatus.com/feature/5718547946799104 for more details.

soundasleep added a commit to soundasleep/statgit2 that referenced this issue Sep 22, 2017
This intervention was being triggered because we were using old Google Charts
loader code, and Chrome was warning that on slower connections the charts
may not be loaded (to help the user experience). Using the latest loader code
listed at https://developers.google.com/chart/interactive/docs/basic_load_libs
resolves this issue.

See WICG/interventions#17
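
Roughly, that change amounts to switching from the old jsapi loader to the current gstatic loader described in the linked docs (a sketch; drawChart stands in for the page's own callback):

  <!-- Old: <script src="https://www.google.com/jsapi"></script> plus google.load('visualization', '1', {packages: ['corechart']}); -->
  <script src="https://www.gstatic.com/charts/loader.js"></script>
  <script>
    google.charts.load('current', {packages: ['corechart']});
    google.charts.setOnLoadCallback(drawChart);
  </script>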
alice pushed a commit to alice/html that referenced this issue Jan 8, 2019
This allows user agents to experiment with better heuristics for not executing such scripts, as per WICG/interventions#17.
@Astara

Astara commented Oct 6, 2019

Am I to understand that I should not get this error in my console log on a 30/10Mbps (u/d) cable-modem connection? Or...why might I get this error if I'm on a home connection?

@zcorpan

zcorpan commented Oct 7, 2019

@Astara read the comment above #17 (comment)

@Astara

Astara commented Oct 7, 2019

I read that -- in fact I remember almost that exact wording in the message I saw in Opera's console window. Ok, "dev-level msg", about what could happen for any reason with cross-site content inserted by a doc.write. A bit arcane, but a useful feature to have to preserve interactivity.

Is there, or should there be, a configurable setting somewhere to define 'slow'? Or is it done algorithmically based on the performance of the static, cross-site content on the page (i.e. hypothetically, not in some browser-specific instance)? I'm guessing 'unspecified', possibly given that this is talking about allowed behavior.

Ok, thanks for the feedback!

@nucleare

nucleare commented Jul 14, 2020

Is there any way to override this or prevent this from occurring automatically in the browser?

My apologies in advance if I am in the wrong place to report this issue (I'm not sure if reporting it as a new issue was applicable), but I have tried resolving this matter for days to no avail and have been led to this discussion from the page shown in the console message here:

[Screenshot: DevTools console message]

I am hardwired on a desktop running Chrome Version 83.0.4103.116 on a connection of greater than 300Mbps as shown in the speedtest here:

[Screenshot: Speedtest by Ookla result]

This seems contrary to the remarks about it only happening to users on 2G, as confirmed by comment #17, and to all my efforts, which include but are not limited to disabling my firewall, enabling pop-ups, enabling insecure content from all websites associated with this page, changing default programs, verifying group policies, and whatever else I or any troubleshooting manual could suggest to determine why this page will not open the Java application; these console messages are all I have left to go off of.

Please correct me if something other than the browser may be preventing me from loading this page, or is it possibly something else going on, not shown in the console, that is preventing me from loading the application?

I am trying to load an application called MarketPro from merrilledge.com; it is a stock market trading application that is not loading as intended and continues to lead me to another page.

In case there is any question about my attempt to contact Merrill Edge directly: they have confirmed that it should be working, but even after going through the troubleshooting guide and taking what other steps I could think of unrelated to Chrome, this is all I have left. I realize this is not a support or technical support page, but it seems to be an error related to this particular parser warning in Chrome's console that I have yet to eliminate as a reason for my being unable to access the application. So any redirection or help would be greatly appreciated.

@shivanigithub

@nucleare : The console warnings do not imply that this resource was actually blocked since there was no error log that confirmed it. I suggest testing if this works in other browsers or not. If the issue is occurring only on Chrome, you might want to open an issue via https://bugs.chromium.org/p/chromium/issues/entry

@nucleare

@nucleare : The console warnings do not imply that this resource was actually blocked since there was no error log that confirmed it. I suggest testing if this works in other browsers or not. If the issue is occurring only on Chrome, you might want to open an issue via https://bugs.chromium.org/p/chromium/issues/entry

Thank you for the reply and insight. I've only tested it on another computer with a fresh install of Chrome, and with Microsoft Edge on that same computer, with no difference in results. Since it seems to produce an error on Microsoft Edge as well, it seems it's not exclusive to Chrome.

Once again, I appreciate the insight and pointing me in the right direction if it had been a chrome issue.

@shivanigithub

@nucleare : I would also suggest testing on a non-Chromium browser like Firefox, Safari, or some other (both Chrome and Edge use the Chromium code) to make sure it is not a bug in Chromium.

@CoderNie

CoderNie commented Jun 22, 2021

block network requests for document.written scripts

Does it mean this intervention will block both download and execution of the script?

I set my Chrome to 2G mode, but I still have not observed this intervention.

@domenic
Collaborator

domenic commented Jun 23, 2022

As part of shutting down this repo (see #72), I'll close this issue. The behavior here is standardized in HTML so further spec-level discussions aren't necessary.


For those interested in the Chromium project's implementation, and not the standardization process, I encourage you to discuss on the Chromium issue tracker, at https://crbug.com/. Here are some parting notes on things I discovered while investigating the status of Chromium's implementation:

Chromium shipped (in version 55):

  • Disabling such scripts on 2G connections
  • A warning, on any connection, for such scripts (so that you know that 2G users might get a different experience than the one you are getting on a good connection)

Chromium has a disabled-by-default feature to block such scripts on slow connections in general, not just connections that advertise as 2G.

I've updated the ChromeStatus for these two features: shipped 2G feature, not-shipped slow-connections feature. I've also inquired on the bug for the slow-connections feature about the future of that work.

Again, all discussion on these Chromium-specific things should happen on the Chromium issue tracker, and not here :).

@domenic domenic closed this as completed Jun 23, 2022
@CoderNie

CoderNie commented Jun 23, 2022 via email
