Yarn fails to install dependency from git+ssh #4282
Also, it seems that the format of the version in package.json has something to do with it. We have this:
and yarn 0.27.5 hangs on it. When it is changed to a full git SHA:
yarn works well.
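For illustration only (the exact entries discussed above were not preserved, and the repo path below is a placeholder), the difference being described is between a short ref such as #314f26f and the full 40-character commit SHA in the dependency specifier:

```json
{
  "dependencies": {
    "react-base-core": "git+ssh://git@bitbucket.org/example-org/react-base-core.git#314f26f"
  }
}
```

Replacing the short #314f26f suffix with the commit's full 40-character SHA is the change that reportedly made yarn 0.27.5 work.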
Hey there! We have substantially changed the Git SHA and tag handling logic quite recently and also added some more fixes around
I've run into a similar issue and noticed that I tried the recent nightly (
For the record, it currently is working on yarn
@naganowl Are you sure you have your SSH keys added to your agent, etc.? @Nopik - Yarn 1.0 is out with many fixes on top of 0.27, so I'd highly recommend you try the same thing with a recent 1.0 build.
The issue is not solved on 1.0.0. The behaviour is different, though. Now on a clean cache, after I call
@Nopik as I mentioned, I was unable to reproduce this, and the thing you describe makes me think
It doesn't seem that anything is killing yarn. The effect is 100% reproducible, even if I have
Here is the run with an empty cache and yarn 1.0.1, doing
After it printed the final line (typescript-2.5.2.tgz), it waited for 5-10 seconds and then quit silently. After it quit like this, re-running:
BUT, re-running it again the same way, it worked. So it seems a bit random. While writing this comment I tried it twice: on the first try yarn succeeded on about the 5th invocation, on the second try it was the 3rd.
And again, if I remove the git+ssh dep from my
@Nopik I bet the OOM killer is the reason. Can you try running
Turns out the OOM killer uses SIGKILL, which we don't handle, so it might explain the
Since it looks like you're using CircleCI, this may also be useful: https://circleci.com/docs/1.0/oom/
Well, as I mentioned earlier, OOM doesn't seem to be the case. Logs are silent,
@Nopik alright. Sorry for insisting on OOM. I still believe something kills Yarn, though, probably via SIGKILL. I can get a build up for you to try that catches
Since you mention
Hm, it is unlikely that there is some weird git/ssh config, but we cannot rule that out. This happens in the CircleCI environment, so I would suppose their git is pretty standard. Any hint where I could add my own debugging, e.g. for SIGKILL catching? I've added some
I'd try this file: https://github.com/yarnpkg/yarn/blob/master/src/util/signal-handler.js That said, I really can't speculate on how/why yarn is getting killed; it may be somewhere in the system logs even if it is not OOM. Yarn itself runs on CircleCI and we haven't had any problems like this, so I wonder what is different in our setups.
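If anyone wants to instrument this, a minimal sketch of the kind of signal logging that could be dropped into a debug build is below (my own sketch, not the contents of yarn's signal-handler.js); note that SIGKILL cannot be caught by any process, so a silent death with nothing logged here would still be consistent with the OOM-killer theory:

```js
// Minimal signal-logging sketch for debugging unexpected exits.
// SIGKILL cannot be handled by any process, so it will never show up here.
const log = msg => process.stderr.write(`[signal-debug] ${msg}\n`);

for (const sig of ['SIGTERM', 'SIGINT', 'SIGHUP']) {
  process.on(sig, () => {
    log(`received ${sig}, exiting`);
    process.exit(1);
  });
}

// Fires on any normal exit path, including process.exit() calls elsewhere.
process.on('exit', code => log(`exiting with code ${code}`));
```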
Btw, we've added in-process DNS caching after 1.0.2, so maybe give that a shot through the nightlies too: https://yarnpkg.com/en/docs/nightly If yarn was being killed due to too much network usage or too many DNS requests, this would help.
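As a rough illustration of what in-process DNS caching means (a sketch of the concept only, not yarn's actual implementation), one can memoize dns.lookup so repeated resolutions of the same host skip the resolver:

```js
// Conceptual sketch: cache dns.lookup results in memory for the lifetime of
// the process, so repeated lookups of the same host are answered locally.
const dns = require('dns');

const cache = new Map();
const originalLookup = dns.lookup;

dns.lookup = (hostname, options, callback) => {
  if (typeof options === 'function') {
    callback = options;
    options = {};
  }
  const key = `${hostname}:${JSON.stringify(options)}`;
  if (cache.has(key)) {
    const { address, family } = cache.get(key);
    return process.nextTick(() => callback(null, address, family));
  }
  originalLookup.call(dns, hostname, options, (err, address, family) => {
    if (!err) cache.set(key, { address, family });
    callback(err, address, family);
  });
};
```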
So, after modifying this code:
I do see
Hey, thanks a lot for sticking with this, very much appreciated! I think I have a lead based on all of the information you have provided so far. My guess is something going wrong at
Do you think you can instrument around that code to see if this theory is correct? If so, we may switch to using
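A generic way to instrument around a suspect stream pipeline is sketched below; the stream names and the usage line are hypothetical, since the exact code being referred to is not shown here:

```js
// Attach listeners to every lifecycle event of a stream and log them with a
// timestamp, so it becomes visible where the pipeline stalls.
function instrument(name, stream) {
  const events = ['end', 'close', 'error', 'finish', 'drain', 'pipe', 'unpipe'];
  for (const event of events) {
    stream.on(event, () => console.error(`[${Date.now()}] ${name}: ${event}`));
  }
  return stream;
}

// Hypothetical usage around a clone/extract pipeline:
// instrument('git stdout', child.stdout).pipe(instrument('tar extractor', extractor));
```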
I've seen a similar report about crashes with
Great! Tell me where to play with it (as my knowledge of the yarn codebase is next to zero). So far I've been playing with substituting my git+ssh dep with an empty repo, and realized that even if I have just an empty index.js + package.json + yarn.lock, the problem still occurs. I've been trying to reduce the dep list to find the culprit, but that is a slow and error-prone process (due to the apparent randomness); I had hoped to find the smallest subset of deps which cause the problem.
Indeed, we're getting somewhere. Patching this code:
gives me:
Digging deeper.
After a few more console.logs, it seems to hang around
Ah, so contrary to my guess, this is hanging at the cloning stage. Weird. Okay, I'll dig more to find out what else is going on.
Indeed,
produces:
with a very long (5-10 seconds) delay between
Another bit:
gives me this when yarn succeeds:
(roughly 200kb of data) and this when it fails:
roughly 150kb of data, no
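For reference, a byte-counting probe along the lines described (my own sketch; the user's actual patch is not shown above) could look like this:

```js
// Count the bytes flowing out of a readable stream and log its lifecycle
// events, to compare a successful run against a hanging one.
function countBytes(name, stream) {
  let total = 0;
  stream.on('data', chunk => { total += chunk.length; });
  stream.on('end', () => console.error(`${name}: end after ${total} bytes`));
  stream.on('close', () => console.error(`${name}: close after ${total} bytes`));
  return stream;
}

// Hypothetical usage: countBytes('clone stdout', child.stdout).pipe(extractor);
```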
@Nopik thanks a lot for debugging! I'll try to submit a PR that avoids piping here and see what happens.
Sure. Though that might just sweep the problem under the carpet; it would be great to find the real culprit ;)
@Nopik the culprit at this point seems like Node's inability to regulate the speed difference between the read and write streams through
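For context on this theory: the flow control that .pipe() is supposed to apply between a fast readable and a slow writable is usually called backpressure; handled by hand it looks roughly like this sketch (illustrative only, not yarn code):

```js
// Manually propagate backpressure from a slow writable to a fast readable:
// pause the source when the destination's buffer is full, resume on 'drain'.
function pump(source, dest) {
  source.on('data', chunk => {
    if (!dest.write(chunk)) {
      source.pause();
      dest.once('drain', () => source.resume());
    }
  });
  source.on('end', () => dest.end());
}
```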
Well, I don't think this is memory pressure. The whole thing is about 200k in size (and much of it has already gone through the pipe when the problem happens). When I disabled the tar extractor and removed the whole
Sigh, so weird... So is it inter-process communication then? What if you remove
So, I've been watching when
Now, when
It is still a mystery to me, though, why yarn quits 5 seconds later.
I've been digging deeper into the child spawning vs. stdout. It seems that
When I added something like
Calling
Not sure if that is a Node.js bug or a 'feature'; I didn't find any description of it in the docs.
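The behaviour described here matches a well-known child-process pitfall; since the code referenced above is not shown, here is a self-contained sketch of my own: if a child's piped stdout is never consumed, the OS pipe buffer (typically around 64KB) fills up, the child stalls mid-write, and its close event never arrives.

```js
const { spawn } = require('child_process');

// The child writes 1 MB to stdout and only exits once the write is flushed.
const child = spawn(process.execPath, [
  '-e',
  'process.stdout.write(Buffer.alloc(1024 * 1024), () => process.exit(0))',
]);

// Without this line the parent never drains the pipe, the child blocks once
// the pipe buffer is full, and "close" below is never logged.
child.stdout.resume();

child.on('close', (code, signal) => {
  console.log(`child closed: code=${code} signal=${signal}`);
});
```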
So, it looks more like a Node.js error, I'm afraid. When I add
For reference:
Also, I'm adding
I wrote a very simple program to reproduce the problem:
On Linux the program should output
When the bug happens,
I ran this program on CircleCI on Node.js 6.1.0 and higher. It seems that
I would put it forward for your consideration to add
For my part, I think I'll just bump up the Node.js version in my CircleCI builds, and it should be fine. If some problems arise, I will open another issue. Since the apparent solution has been found, I will let you close this bug at your discretion, after applying some fixes/workarounds or whatever. If I learn that this solution is not enough, I can always re-open the bug.
Is this truly not resolved? I am seeing similar issues on 1.7.0. Just wondering if this got lost in the shuffle of tickets. Also, this is on node@8.11.3.
I'm also struggling to have yarn
In yarn 1.22.0-4 I still can't install dependencies from git+ssh under Windows 10; I get the error:
but when I manually execute
When I use yarn --verbose with v1.22.4:
I have been working on this issue for two days.
Closing as fixed in v2
Do you want to request a feature or report a bug?
Bug.
What is the current behavior?
I have an application with a number of dependencies, including one of my own packages, which is hosted privately on Bitbucket. For some reason, that package causes problems for yarn in some environments, especially on the CircleCI VM.
It is not an access-denied issue - yarn apparently clones the repo properly, it just fails to handle it well.
So, on CircleCI (it works on my osx laptop!) when I'm trying to `yarn install` while my `package.json` contains `"react-base-core": "git+ssh://git@bitbucket.org/[cut]/react-base-core.git#314f26f"` in `dependencies`, yarn hangs forever (ok, at least 15 minutes). When I remove this single package from deps, yarn finishes in a timely manner. The package itself is quite small, about 150kb. When checked out, the top-level structure is pretty straightforward:

Now, provided that `~/.cache/yarn/v1/` does not contain the package in question, when yarn hangs (it always happens in the `[2/4] Fetching packages...` step), the following appear in `~/.cache/yarn/v1`:

- `.tmp/cbfade254ae228af4618ee371302fb66` with the full package content from git (cd into it and `git status` and `git pull` claim that the repo is up to date)
- `npm-react-base-core-1.1.0-314f26f/` with a copy of the repo

The content of those folders is almost identical:
Interesting fact:
Needless to say, `Gulpfile.js` is a file, contains regular gulpfile content, etc.; IIRC it never was a folder. `error.log` isn't too helpful:

Now, I've been trying on yarn 0.27.5, and the problem was appearing. In an act of curiosity/desperation I tried to roll back yarn to older versions: `npm install -g yarn@0.26` + 0.25 + 0.24 + 0.23 behaved the same, hanging forever, while yarn 0.22 installs the package successfully:
When developing on desktop macOS or Ubuntu Linux, it doesn't happen. My team wasn't hitting this issue for a long time (we had been doing `yarn install` a lot before), until we started to use CircleCI. On CCI it is 100% reproducible, no randomization.