Puppeteer dev branch! #704

Open
garris opened this issue Mar 16, 2018 · 45 comments

@garris
Owner

garris commented Mar 16, 2018

This is happening!
[screenshot]

This branch...
https://github.com/garris/BackstopJS/tree/puppeteer

is tracking this branch...
https://github.com/krisdigital/BackstopJS

Interested in contributing? ...please coordinate with @krisdigital.

What you need to know...

  • The Puppeteer engine option will be released as soon as we are at feature parity with Chromy and Phantom.
  • We will continue to support Phantom, Chromy and Puppeteer moving forward.
  • You are awesome.
@krisdigital
Contributor

krisdigital commented Mar 16, 2018

Hello to all potential puppet players!

Here is a little roadmap for all who want to try it out or even contribute. In general, for me it works much more reliably, and it is also faster.

  • Screenshots for viewport and document
  • Engine scripts onBefore and onReady, loadCookies and clickAndHover
  • readyEvent, readySelector, hideSelectors and removeSelectors
  • Screenshots of selectors, selectorExpansion
  • let Puppeteer and Chromy coexist
  • start cleanup by pulling functions shared between Chromy and Puppeteer into a separate module
  • more cleanup and testing

Take it for a spin and let me know what you think 🚀

@garris
Owner Author

garris commented Mar 20, 2018

Just bumped to 3.2.0 and pushed to NPM @beta!

TESTERS & COMMENTS! 🙏

npm install backstopjs@beta

@Aeotrin

Aeotrin commented Mar 21, 2018

Works for me :) NOTE: the engine needs to be set to puppet (see the config sketch below).
Node v8 or later is required as well.

This also fixed the issue of a white screenshot I was having when running against a local (.local or .dev) domain (#706).
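
For anyone else trying the beta, here is a minimal backstop.json sketch with the engine switched over (the URL and viewport are placeholder values, not from this thread):

  {
    "id": "backstop_default",
    "viewports": [{ "label": "desktop", "width": 1280, "height": 800 }],
    "scenarios": [{ "label": "Homepage", "url": "http://localhost:3000/" }],
    "engine": "puppet"
  }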

@matthiaskomarek
Contributor

matthiaskomarek commented Mar 23, 2018

I use the reference URLs for our test suite with the following command:

backstop reference && backstop test

It looks like the test suite hangs when an error occurs. In the console I get

COMMAND | Command `reference` ended with an error after [95.102s]
COMMAND | Error: Navigation Timeout Exceeded: 60000ms exceeded
                    at Promise.then (/Users/matthiaskomarek/Documents/Projects/zurich/visual-regression/node_modules/puppeteer/lib/NavigatorWatcher.js:71:21)
                    at <anonymous>
(node:35439) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): Error: Navigation Timeout Exceeded: 60000ms exceeded
(node:35439) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

After this it just stops running, but it doesn't terminate the terminal process.

If no error occurs, everything is running fine!
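
One way to keep CI from hanging on this until the engine handles the rejection itself: drive both runs through BackstopJS's Node API instead of && in the shell, and catch the failure. A minimal sketch, assuming backstopjs is installed as a local dependency:

  const backstop = require('backstopjs');

  backstop('reference')
    .then(() => backstop('test'))
    .catch((err) => {
      // Surface the navigation timeout instead of leaving a hung process behind.
      console.error('Backstop run failed:', err);
      process.exit(1);
    });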

@krisdigital
Contributor

@matthiaskomarek do you get different behavior using Chromy in the stable branch? Not sure if it has something to do with Puppeteer...

@AkA84
Contributor

AkA84 commented Mar 25, 2018

Hey @krisdigital, thanks a lot for doing this! I'm testing it out and I saw that when I set debugWindow: true, no Chrome window opens when I run the tests. Has the param name changed?

EDIT: I also see no difference between debug: false and debug: true

@krisdigital
Contributor

@AkA84 Both true: I did not implement the debugWindow option, but it would not be a problem to add (see the sketch below). I never could find any use for the debug option though, as it throws out just too much text; I think it only makes sense for an engine with no debug window, like PhantomJS. #713
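
If someone wants to wire debugWindow up, it should amount to flipping the headless flag when the engine launches the browser. A hypothetical sketch (launchBrowser and config are illustrative names, not BackstopJS internals; headless and slowMo are standard Puppeteer launch options):

  const puppeteer = require('puppeteer');

  async function launchBrowser (config) {
    return puppeteer.launch({
      headless: !config.debugWindow,        // show a visible browser window when debugging
      slowMo: config.debugWindow ? 250 : 0  // slow each action down enough to watch it
    });
  }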

@AkA84
Contributor

AkA84 commented Mar 26, 2018

@krisdigital thanks a lot again, and yeah, I agree: I didn't find the debug option useful unless I wanted to stare at a big dump of HTML for some reason!

@matthiaskomarek
Contributor

Hey @krisdigital, I have tried the same with the current 3.1.21 branch: if something fails with the Chrome engine, the script goes on and just shows an empty entry in the report.
So the hung script is an issue specific to the Puppeteer engine.

@krisdigital
Contributor

@matthiaskomarek Thanks for checking, I will look into it!

@AkA84
Contributor

AkA84 commented Mar 26, 2018

Has anyone had problems with running captures in parallel (say, 5 at a time)? I get random errors (Puppeteer not finding an element, for example) for scenarios with onReady scripts that run just fine when doing one capture at a time.

I'm still not sure if this is something wrong with my setup/scripts, but I just wanted to bring it up in case someone else has encountered the same issue. At the moment I can rely on the results only if I don't do parallel capturing (see the workaround sketch below), which really is a shame :(
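
Until that is tracked down, captures can be serialized from the config while debugging; a one-line backstop.json excerpt using the standard option:

  "asyncCaptureLimit": 1

Slower, of course, but it takes parallelism out of the equation when deciding whether a failure is a real regression.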

@garris
Owner Author

garris commented Mar 26, 2018

@krisdigital this issue will probably be addressed by the work happening in #712.

@garris
Owner Author

garris commented Mar 26, 2018

@AkA84 we are still working on the migration; once all of our feature tests pass we'll be looking at why some parallel cases seem to work fine while others have issues. This could be a performance issue in Backstop, but it could also be the network, the server, or something else in the environment.

@iain17
Contributor

iain17 commented Mar 27, 2018

Hey @krisdigital, we're super excited for Puppeteer. I'm currently trying to get it to run in Docker with the latest changes from https://github.com/garris/BackstopJS/tree/puppeteer.

One thing I noticed is that we seem to create a new browser instance for each test case. We're currently experimenting with having just one browser instance and creating only new page objects (tabs) for each test; see the sketch below. Let me know what you think.

My repo is at: https://github.com/iain17/BackstopJS/tree/puppeteer
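
Roughly the shape of the experiment, for discussion: one browser for the whole run, one fresh tab per scenario. A simplified sketch (the scenario list and Backstop's capture logic are elided):

  const puppeteer = require('puppeteer');

  const scenarios = [
    { label: 'home', url: 'http://localhost:3000/' }
  ];

  (async () => {
    const browser = await puppeteer.launch();  // one instance for the whole run
    for (const scenario of scenarios) {
      const page = await browser.newPage();    // a new tab, not a new browser
      await page.goto(scenario.url);
      await page.screenshot({ path: `${scenario.label}.png` });
      await page.close();
    }
    await browser.close();
  })();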

@kiran-redhat
Contributor

kiran-redhat commented Mar 27, 2018 via email

@krisdigital
Contributor

krisdigital commented Mar 27, 2018

@iain17 The current behaviour with new browser instances mirrors the behaviour of Chromy. Opening a tab is probably more resource-friendly; see this discussion: puppeteer/puppeteer#85
Maybe it could be provided as an option for users who do not need a new session for each scenario; as far as I know this would also be possible with Chromy.

@krisdigital
Contributor

krisdigital commented Mar 27, 2018

@garris @matthiaskomarek Just made a pull request that should handle that problem #718

@garris
Owner Author

garris commented Mar 28, 2018

📣 Puppeteer branch is merged to master and pushed to npm @canary! 🎆

npm install backstopjs@canary

@krisdigital
Contributor

🎉🍾🍻

@cactusa
Contributor

cactusa commented Apr 6, 2018

Just trying Puppeteer out in v3.2.3 on a Docker setup and got an error:

BackstopJS v3.2.3
Loading config:  /src/backstop.config.js 

COMMAND | Executing core for `test`
createBitmaps | Selected 45 of 45 scenarios.
      COMMAND | Command `test` ended with an error after [0.06s]
      COMMAND | Error: Failed to launch chrome!
                [0406/123126.823853:ERROR:zygote_host_impl_linux.cc(88)] Running as root without --no-sandbox is not supported. See https://crbug.com/638180.
                
                
                TROUBLESHOOTING: https://github.com/GoogleChrome/puppeteer/blob/master/docs/troubleshooting.md
                
                    at onClose (/usr/local/lib/node_modules/backstopjs/node_modules/puppeteer/lib/Launcher.js:246:14)
                    at Interface.helper.addEventListener (/usr/local/lib/node_modules/backstopjs/node_modules/puppeteer/lib/Launcher.js:235:50)
                    at emitNone (events.js:110:20)
                    at Interface.emit (events.js:207:7)
                    at Interface.close (readline.js:367:8)
                    at Socket.onend (readline.js:147:10)
                    at emitNone (events.js:110:20)
                    at Socket.emit (events.js:207:7)
                    at endReadableNT (_stream_readable.js:1059:12)
                    at _combinedTickCallback (internal/process/next_tick.js:138:11)
>> Exited with code: 1.
>> Error executing child process: Error: Process exited with code 1.

Anyone experiencing the same?

@krisdigital
Contributor

@cactusa see the link in the error, https://github.com/GoogleChrome/puppeteer/blob/master/docs/troubleshooting.md (especially the sandbox section), and also this PR: #722

@cactusa
Contributor

cactusa commented Apr 9, 2018

@krisdigital Just pulled the latest commit to try out the PR you linked above, but it didn't work for me: absolutely no change at all, I still get that same error. I rebuilt the Docker image as well and followed the documentation from the README.

FYI @VesterDe

@garris
Owner Author

garris commented Apr 9, 2018

@krisdigital or @here Anybody know how to handle CSP issues with Puppeteer? I am getting errors like...

######## Error running Puppeteer ######### Error: Evaluation failed: EvalError: Refused to evaluate a string as JavaScript because 'unsafe-eval' is not an allowed source of script in the following Content Security Policy directive ...

@VesterDe
Contributor

VesterDe commented Apr 9, 2018

@cactusa Hey, did you also include the engineOptions in your config file?

Specifically, this line:

"engineOptions": {"args": ["--no-sandbox", "--disable-setuid-sandbox"]}

That solved your exact problem for me. I'm not sure how to help if that doesn't change the error, at least...
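
For context, that line sits at the top level of backstop.json alongside the engine setting; a minimal excerpt:

  {
    "engine": "puppet",
    "engineOptions": {
      "args": ["--no-sandbox", "--disable-setuid-sandbox"]
    }
  }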

@garris
Owner Author

garris commented Apr 10, 2018

@VesterDe unfortunately no, same exact behavior. 😢

This is weird -- I didn't have this problem with the Chromy engine. 🤔

@krisdigital
Contributor

@garris This seems to be a known issue puppeteer/puppeteer#1814 ...

@cactusa
Contributor

cactusa commented Apr 10, 2018

@VesterDe Yes I did indeed include that exact line in my config. 😞

@garris
Owner Author

garris commented Apr 10, 2018

Thanks guys! Fixed by adding this onBefore script 👉 67801c2

@garris
Owner Author

garris commented Apr 10, 2018

I am hoping this will be a much better fix in the near future! puppeteer/puppeteer#2324
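
If that lands, the onBefore workaround should reduce to a single call per page. A hypothetical sketch of such an engine script, assuming the fix ships as Puppeteer's page.setBypassCSP:

  module.exports = async (page, scenario, viewport) => {
    // Ask Chromium to ignore the page's Content-Security-Policy so that
    // BackstopJS's injected evaluation scripts are allowed to run.
    await page.setBypassCSP(true);
  };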

@cactusa
Contributor

cactusa commented Apr 18, 2018

Cool.
Update from me: I finally had time to make Backstop run with Puppeteer, but most of my scenarios now fail, for two reasons.
For some reason the font renders differently now, which shouldn't be possible since I am running a Docker container with consistent versions.

[screenshot: screen shot 2018-04-18 at 14 47 21]

The second and bigger issue is that for most scenarios the captured test image is just grey, with nothing else on it. Naturally it detects close to 100% difference.

[screenshot: screen shot 2018-04-18 at 14 34 39]

So far I have tried adding a delay to each scenario, and also tried with 'asyncCaptureLimit': 1.
I couldn't see any other issues posted with this same problem.
Any suggestions?

@VesterDe
Contributor

VesterDe commented Apr 18, 2018

I can confirm that I get both of those errors intermittently as well, even if running reference and test one after another on the same machine.

Are your test images gray or empty? Mine are actually just WIDTHxHEIGHT of empty pixels; maybe the gray is just the CSS of the diffing tool?

Do any of your reference images appear empty as well? I also get empty references sometimes...

Although, as far as I can tell, this never happens if there are no selectors present for the run; if I'm just capturing the entire website in one image, it's never empty...

@cactusa
Contributor

cactusa commented Apr 18, 2018

Not sure, but I think it's just a grey image of the same size as the reference. I have spotted that some of the test images are half captured and half grey. I have one example where the test capture is of an area slightly above the intended selector...

I am also getting this error at the very start:

>> (node:1) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 exit listeners added. Use emitter.setMaxListeners() to increase limit

And this error on a couple scenarios where the test image is completely missing:

ENGINE ERROR: Navigation Timeout Exceeded: 60000ms exceeded

I guess I am not switching to Puppeteer just yet. I was really hoping for this to work, because it doesn't randomly hang executions as Chromy does, and it is faster 😢

@VesterDe
Contributor

VesterDe commented Apr 20, 2018

Ok, I've found a bit of a curve ball here... I'd be really grateful if someone can confirm this for me.

I've isolated the failing cases in one of my projects, and it seems that the screenshot will be empty if any of the numbers in the selected element's bounding box have decimals. So, if I log the bounding boxes, these values produce a screenshot:

Screenshoting header:
{"x":0,"y":0,"width":800,"height":66}
Screenshoting footer:
{"x":0,"y":948,"width":800,"height":703}

And these produce an empty file:

Screenshoting #testimonials:
{"x":0,"y":3475.296875,"width":800,"height":140}
Screenshoting #cta1:
{"x":0,"y":421.203125,"width":800,"height":918.109375}

This is... unusual... but it seems consistent on my machine. What also happens is that as soon as one selector in an array fails for this reason, all the remaining selectors in the scenario fail.

It does provide an insight into the randomness we've been seeing, because while width and height are constant, X and Y are relative to the current viewport, and may have different values on different runs.

I've tested this out in a standalone Puppeteer run and it seems to happen there as well (a repro sketch follows below). If someone could reproduce these findings, we can file an issue with the Puppeteer team.

Edit: Perhaps this issue will lead to a good outcome.

Edit2: Indeed, as per the above issue, Puppeteer@1.1.1 doesn't have this problem; it seems to start with 1.2.0.
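
A standalone repro along the lines described above (the URL and selector are placeholders; boundingBox() and element screenshots are standard Puppeteer API):

  const puppeteer = require('puppeteer');

  (async () => {
    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    await page.goto('http://localhost:3000/');
    const element = await page.$('#testimonials');
    const box = await element.boundingBox();
    console.log(box);  // e.g. {"x":0,"y":3475.296875,"width":800,"height":140}
    // Reportedly empty on Puppeteer >= 1.2.0 whenever the box has decimal coordinates.
    await element.screenshot({ path: 'testimonials.png' });
    await browser.close();
  })();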

@krisdigital
Contributor

@VesterDe Great debugging! Maybe the Puppeteer version should be locked to 1.1.1 until the bug is fixed, then...

@garris
Owner Author

garris commented Apr 21, 2018

@VesterDe Thank you for taking time to investigate this!

@cactusa
Contributor

cactusa commented May 9, 2018

@garris is it OK to pin puppeteer to 1.1.1, as it is more stable? I can test and make a pull request.

@garris
Owner Author

garris commented May 9, 2018

Looks like puppeteer is at 1.4.0. Yes, we definitely want to bump this.

@cactusa
Contributor

cactusa commented May 10, 2018

@garris Sadly, version 1.4.0 still exhibits the same bug: some test images are blank, some are partially captured. Will try 1.1.1 next.

Update: Weirdly, v1.1.1 has the exact same problem. Maybe Puppeteer is not the problem here 😭

@jsmecham

I am also experiencing this issue with the latest BackstopJS and Puppeteer. Sometimes the images are blank, sometimes they are cropped incorrectly, and sometimes they show part of the content with white for the rest. I have not been able to find a workaround yet (tried delays, etc.).

@garris
Owner Author

garris commented May 10, 2018

Is this fixable by adding a Math.floor() to our capture flow?

@cactusa
Contributor

cactusa commented May 14, 2018

@garris could you point me to the file I need to make the change in, and I'll try to test this as soon as I have some spare time.

@garris
Owner Author

garris commented May 14, 2018

Here is where selector screenshots are called by Backstop...

https://github.com/garris/BackstopJS/blob/master/core/util/runPuppet.js#L365

You could look into changing this to a different Puppeteer call where you specify the screen coordinates explicitly using rounded boundingBox values; something like the sketch below.
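
Concretely, the idea would be something like this hypothetical helper (captureSelector is an invented name, and elementHandle, page and filePath stand in for the variables in runPuppet.js):

  async function captureSelector (page, elementHandle, filePath) {
    // Clip a page-level screenshot to the element's bounding box,
    // rounding the coordinates to whole pixels.
    const box = await elementHandle.boundingBox();
    await page.screenshot({
      path: filePath,
      clip: {
        x: Math.floor(box.x),
        y: Math.floor(box.y),
        width: Math.ceil(box.width),   // ceil so rounding never crops the element
        height: Math.ceil(box.height)
      }
    });
  }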

@cactusa
Contributor

cactusa commented Jun 20, 2018

@VesterDe I have been doing some debugging of my own, and I managed to reproduce the same results as you, but in my case I get the empty screenshots even for entries that do not have decimals in their boundingBox object. I think the issue is caused by something else.

@garris I am testing a bug fix for this and will be making a pull request shortly. I have basically done what you suggested, but without rounding the numbers, and it worked.

@cactusa
Contributor

cactusa commented Jun 20, 2018

My pull request #808

@VesterDe
Contributor

VesterDe commented Jun 20, 2018

Cool, can't wait to test it out.

Edit: At first glance, your pull request seems to fix it for me! Great job @cactusa :)
