proxy.golang.org: Failed to connect to github.com port 443: Connection refused #72121
Comments
Seems to be temporary, but whatever the cause, it's GitHub's load balancer that is rejecting the connection; I don't think there's much that can be done about it.
Still happening here, 15 hours later. I can push and pull and use GitHub just fine, and GitHubStatus.com reports no issues.
More information:
pkg.go.dev still does not have the new versions, even after 30 minutes. Not really sure what is going on here, since the issue is clearly not local or intermittent -- but the 30 minute thing seems oddly specific. And I'm not sure why it doesn't apply to the Go package docs. Either the code can be accessed or it can't, right? I'm supposing this is an issue with the Go module proxy, since GitHub hasn't had any connectivity issues on my end, or on their status page, in the last few days.
CC @golang/tools-team.
Still an issue: three tags pushed and none working, all hitting that cached connection error. I don't think a connection error should be cached for half an hour, or at least it should be retried.
Thanks to whoever kicked the thing; it started working again now (or maybe it did because I pulled a specific SHA first).
Though https://proxy.golang.org/fortio.org/progressbar/@latest is still broken :-( |
Still broken. Please repair it. Thank you.
@mholt I believe there's a 30m cache of errors. If there were transient GitHub connectivity issues, they may persist for that long. pkg.go.dev, on the other hand, has a work queue, which can sometimes get backed up. It looks like https://pkg.go.dev/github.com/caddyserver/certmagic@v0.22.0 is there now.

@vault-thirteen: can you please clarify what is still broken? This issue is reporting ingestion issues, perhaps related to connectivity between Google and GitHub. With which modules are you experiencing problems?
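For checking this kind of thing directly, both the proxy and the checksum database serve plain HTTP endpoints defined by the GOPROXY protocol, so ingestion status can be inspected with curl (a sketch; the module and version are the ones discussed in this thread):

```
# JSON metadata if the proxy has ingested the version; an error page otherwise.
curl -i https://proxy.golang.org/github.com/caddyserver/certmagic/@v/v0.22.0.info

# All versions the proxy currently knows about for the module.
curl https://proxy.golang.org/github.com/caddyserver/certmagic/@v/list

# The checksum database record for the version.
curl -i https://sum.golang.org/lookup/github.com/caddyserver/certmagic@v0.22.0
```

A 404 or 410 from these endpoints, possibly carrying a cached fetch error in the body, is what the go command then surfaces as errors like the one in this issue's title.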
@troian it looks like https://proxy.golang.org/github.com/akash-network/cosmos-sdk/@v/v0.45.16-akash.3.mod is now there as well.
https://sum.golang.org/lookup/github.com/vault-thirteen/auxie@v0.28.4
That git tag was added more than an hour ago, and it is still not visible to the Go tooling.
this:
The problem still exists; I had to run:
This worked for me:
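A common sequence in situations like this (an assumption with a placeholder version, not necessarily the commands used in the two comments above) is to clear the local download cache and request an exact version rather than @latest:

```
# Remove locally cached module data so a previously failed download isn't reused.
go clean -modcache

# Request a specific version; an explicit version fetch can succeed while
# @latest is still serving a cached error on the proxy side.
go get fortio.org/progressbar@v1.0.0
```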
Hi all, while we always have some flakiness in fetching from origin servers, the responses here indicate more connectivity issues than normal. I've investigated a bit, and have a few leads as to what could be causing this. I'm currently attempting one mitigation, but if it doesn't work, further intervention will have to wait until Monday.

I see that many of the failing versions are eventually being fetched (for example, both https://proxy.golang.org/fortio.org/progressbar/@latest and https://sum.golang.org/lookup/github.com/vault-thirteen/auxie@v0.28.4 are now there). As others have pointed out, fetching directly from the origin server is always an available workaround, though it is generally faster and safer to fetch from the proxy. Setting …
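A minimal sketch of that direct-fetch workaround (GOPROXY, GONOSUMDB, and GOPRIVATE are standard go command environment variables; the module path and version are examples from this thread):

```
# Bypass proxy.golang.org and fetch straight from the origin server.
GOPROXY=direct go get github.com/caddyserver/certmagic@v0.22.0

# If the sum.golang.org lookup is also failing, additionally skip
# checksum-database verification for this module (use with care).
GOPROXY=direct GONOSUMDB=github.com/caddyserver/certmagic go get github.com/caddyserver/certmagic@v0.22.0
```

Setting GOPRIVATE=github.com/caddyserver/certmagic has the same effect as setting both GONOPROXY and GONOSUMDB for that path.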
The problem is not fixed. I started updating my chain of dependencies and something is broken again: https://sum.golang.org/lookup/github.com/vault-thirteen/!simple-!file-!server@v0.16.3
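The '!' sequences in that URL are not corruption: in proxy and checksum-database URLs, module paths are case-encoded, with each uppercase letter replaced by '!' followed by its lowercase form, so "Simple-File-Server" becomes "!simple-!file-!server". For example (single quotes matter, since '!' triggers history expansion in interactive shells):

```
# Query the checksum database using the case-encoded module path.
curl -i 'https://sum.golang.org/lookup/github.com/vault-thirteen/!simple-!file-!server@v0.16.3'
```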
@vault-thirteen understood. Unfortunately, our next step in investigating this will require a rollout, and due to the timing this will likely have to wait until Monday. Versions are still flowing from GitHub, at a similar rate to a few days ago, so it's not yet clear to me why some seem to be significantly more flaky.
Should the …
It feels like that would be a nice-to-have. I understand the security implications, but it seems a little unreasonable to have builds fail just because a GitHub limit was hit inside the Go sum server systems...
From logs, it looks like these errors are much less frequent after a deployment today, and all of the versions mentioned above have been fetched. If people are still experiencing this with github.com modules, please let me know the failing module name and version and I'll take a look. Otherwise, I will close this tomorrow. Thanks.
That's great news -- can you share what was fixed/changed, or a postmortem result?
@ldemailly we still don't fully understand the root cause. However, our automated deployments of the service responsible for ingesting new module versions were stalled due to an infrastructure issue. The timing was suspicious: it's possible that the longer-than-usual uptime of the previous deployment led to some form of resource exhaustion... except that certain modules seemed to be more affected than others, indicating some sort of throttling by GitHub. In any case, I unblocked our deployments, and the problem seems to have subsided. We will have to keep an eye out for more reports, and I'll discuss with other team members when they are back from vacation.

Closing as this seems to be resolved.
The proxy is unable to connect to github.com and fetch new versions.
Looks like this issue was happening earlier today: #72118