This repository has been archived by the owner on May 8, 2020. It is now read-only.

pyppeteer.errors.NetworkError: Protocol error Target.activateTarget: Target closed. #171

GigiisJiji opened this issue Dec 3, 2018 · 17 comments


@GigiisJiji

GigiisJiji commented Dec 3, 2018

I ran the demo on Windows 7 with Python 3.7, having downloaded Chromium with the command 'pyppeteer-install'.
The script is:

import asyncio
from pyppeteer import launch

async def main():
    browser = await launch(headless=False)
    page = await browser.newPage()
    await page.goto('https://www.google.com')
    await page.screenshot({'path': 'example.png'})
    await browser.close()

asyncio.get_event_loop().run_until_complete(main())

but it raises the error below:
Traceback (most recent call last):
File "F:/run.py", line 28, in
asyncio.get_event_loop().run_until_complete(main())
File "C:\Python37\Lib\asyncio\base_events.py", line 568, in run_until_complete
return future.result()
File "F:/run.py", line 25, in main
await page.screenshot({'path': 'example.png'})
File "C:\Users\admiunfd\Envs\gigipy3\lib\site-packages\pyppeteer\page.py", line 1227, in screenshot
return await self._screenshotTask(screenshotType, options)
File "C:\Users\admiunfd\Envs\gigipy3\lib\site-packages\pyppeteer\page.py", line 1232, in _screenshotTask
'targetId': self._target._targetId,
pyppeteer.errors.NetworkError: Protocol error Target.activateTarget: Target closed.

Process finished with exit code 1

@Lucifaer

Lucifaer commented Dec 7, 2018

Same problem here. Following this, I changed my code like this:

    async def spider(self, url):
        self.log.detail_info(f"[*] Start crawl {url}")
        self.log.detail_info(f"[*] {url} started at {time.strftime('%X')}")
        # Handle Error: pyppeteer.errors.NetworkError: Protocol error Runtime.callFunctionOn: Target closed.
        browser = await launch({
            'args': ['--no-sandbox']
        })
        page = await browser.newPage()
        await page.setViewport(self.set_view_port_option)
        await page.goto(url, self.goto_option)
        title = await page.title()
        filename = await self.translate_word(title)
        await page.evaluate(scroll_page_js)
        pdf = await page.pdf(self.pdf_option)
        await browser.close()
        self.log.detail_info(f"[*] {url} finished at {time.strftime('%X')}")
        return filename, pdf

But it doesn't work for me. This answer says the error happens if you call browser.close() while browser.newPage() has yet to resolve.
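
For illustration, a minimal sketch of that failure mode and the straightforward fix (hypothetical example code, only meant to show the ordering of the awaits):

import asyncio
from pyppeteer import launch

async def broken():
    browser = await launch()
    page_future = asyncio.ensure_future(browser.newPage())  # newPage() scheduled, not awaited
    await browser.close()  # closes the browser while newPage() is still pending
    await page_future      # raises NetworkError: ... Target closed.

async def fixed():
    browser = await launch()
    page = await browser.newPage()  # fully await newPage() before closing the browser
    await page.goto('https://example.org')
    await browser.close()

asyncio.get_event_loop().run_until_complete(fixed())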

@dp24678

dp24678 commented Dec 13, 2018

Has your problem been solved?

@Lucifaer

I solved my problem by rewriting the JS code as Python code like this:

async def scroll_page(page):
    # Scroll down in 500px steps until the page's initial scroll height is reached.
    cur_dist = 0
    height = await page.evaluate("() => document.body.scrollHeight")
    while cur_dist < height:
        await page.evaluate("window.scrollBy(0, 500);")
        await asyncio.sleep(0.1)
        cur_dist += 500

I guess the problem is that pyppeteer has a default running timeout when executing JS code, taking a screenshot, or saving a page to PDF. According to my tests, this default timeout is about 20 seconds. So when a task runs for more than 20 seconds, that is, when it times out, the running work is automatically closed, independently of the timeout settings you passed to the headless browser.

@dp24678

dp24678 commented Jan 22, 2019

In my case the cause was the wrong version of the websockets package: replacing websockets 7.0 with websockets 6.0 fixed it.

@boramalper

boramalper commented Mar 31, 2019

In case someone ends up here after having wasted hours... @dp24678 kind of hinted at it but did not explain properly: the issue is indeed with the websockets package. Downgrading back to 6.0 helped.

Run:

pip3 install websockets==6.0 --force-reinstall

and everything should be okay. =)

EDIT 2019-12-26: People also recommend websockets==8.1, which might be a better idea to try first since it's more up to date. =)
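
If you manage dependencies through a requirements file, the same workaround can be expressed as a pin (a minimal sketch; choose whichever websockets version works for your setup):

# requirements.txt
pyppeteer
websockets==6.0  # or websockets==8.1, see the edit above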

@pAulseperformance

@boramalper I just ran into this issue, with a similar problem scraping a webpage for longer than 20 seconds.

My initial traceback was "pyppeteer.errors.NetworkError: Protocol Error (Runtime.callFunctionOn): Session closed. Most likely the page has been closed."

That led me to #178, which offered this solution: https://github.com/miyakogi/pyppeteer/pull/160/files

I really wasn't ready to modify the pyppeteer codebase according to the above "solution", which to me is really more of a hack. So I tried some other things and got the traceback that led me here.

Downgrading from websockets==7.0 to 6.0 instantly fixed this issue and should be the solution for anyone running into issue #171 or #178

Thanks!!

bomanimc added a commit to bomanimc/black-health-scraper that referenced this issue Apr 26, 2019
@andreroggeri

Same here as described by @GrilledChickenThighs

Should this be fixed somehow in this repo?

@boramalper

boramalper commented May 25, 2019

It's a shame that this severe issue, which has a very simple fix, is yet to be resolved... It's been more than 6 months, @miyakogi, what are you waiting for exactly?

See #170

@pengisgood

pengisgood commented Aug 6, 2019

In case someone ends up here after having wasted hours... @dp24678 kind of hinted at it but did not explain properly: the issue is indeed with the websockets package. Downgrading back to 6.0 helped.

Run:

pip3 install websockets==6.0 --force-reinstall

and everything should be okay. =)

This suggestion saved me after hours of trying other approaches. Thanks!
I checked my version of websockets; 8.0 didn't work either.

@ColdHeat

ColdHeat commented Sep 7, 2019

@miyakogi would you consider transferring ownership of this repo to someone who's willing to maintain it? pyppeteer is a valuable tool, but it's effectively broken while this issue remains.

@Francesco149

In the meantime you could use my fork, which has the websockets fix (without downgrading) and supports the latest Chrome revisions on Windows: https://github.com/Francesco149/pyppeteer

pip install pyppeteer_fork

@anaselmhamdi

websockets==8.1 worked for me!

paul-tqh-nguyen added a commit to paul-tqh-nguyen/one_off_code that referenced this issue Dec 9, 2019
@vkdvamshi

vkdvamshi commented Dec 25, 2019

import pyppdf.patch_pyppeteer

The fix above worked for me when parsing websites and collecting links rapidly.
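
For context, a minimal usage sketch (assuming the third-party pyppdf package is installed; the patch takes effect simply by being imported before pyppeteer opens its browser connection):

import pyppdf.patch_pyppeteer  # noqa: F401 -- importing this module applies the patch
import asyncio
from pyppeteer import launch

async def main():
    browser = await launch()
    page = await browser.newPage()
    await page.goto('https://example.org')
    await browser.close()

asyncio.get_event_loop().run_until_complete(main())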

@Mattwmaster58

duplicate of #62

@LennyLip

LennyLip commented Mar 10, 2020

@marksteward

No idea what's wrong, I think it has something to do with the websockets dependency. With 6.0, there are no issues; with 8.1, issues left and right.

Same here! The waitForSelector function raises pyppeteer.errors.NetworkError: Protocol error Target.activateTarget: Target closed. With websockets 6.0 it's OK.

@mostafamirzaee

I had the same problem, which was resolved by:

In case someone ends up here after having wasted hours... @dp24678 kind of hinted at it but did not explain properly: the issue is indeed with the websockets package. Downgrading back to 6.0 helped.

Run:

pip3 install websockets==6.0 --force-reinstall

and everything should be okay. =)

EDIT 2019-12-26: People also recommend websockets==8.1, which might be a better idea to try first since it's more up to date. =)

I resolved the same issue with your solution, downgrading to 6.0. However, for your information, 8.1 still has the same problem, because that was the version I originally had when I ran into this. So the bug is yet to be fixed by the author.

@Mattwmaster58

Mattwmaster58 commented Apr 5, 2020

@mostafamirzaee It's actually not an issue in the websockets library per se, but rather in Chrome and pyppeteer. Chrome doesn't respond to pings over websockets, making websockets (rightly) believe Chrome is dead, so it closes the connection. Older versions of websockets simply didn't send pings to the server, which is why the problem doesn't show up with them.

The fix is to tell websockets not to send any pings at all, and this fix has been applied to pyppeteer2, the ongoing update to this library.

@boramalper since your comment is so far up would you mind including this info?
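
For anyone who wants to stay on a newer websockets release in the meantime, here is a minimal sketch of that kind of workaround, assuming (as the circulating patches do) that pyppeteer creates its DevTools connection through websockets.client.connect; this touches pyppeteer internals rather than a public API, so it may break with other versions:

import pyppeteer.connection

_original_connect = pyppeteer.connection.websockets.client.connect

def _patched_connect(*args, **kwargs):
    # Disable keepalive pings so websockets never gives up waiting for Chrome's pong.
    kwargs['ping_interval'] = None
    kwargs['ping_timeout'] = None
    return _original_connect(*args, **kwargs)

# Apply the patch before launching the browser.
pyppeteer.connection.websockets.client.connect = _patched_connect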
