[usage] Configure DB credentials and connect #10295


Merged: 1 commit merged into main from mp/db-conn-config on May 30, 2022

Conversation

@easyCZ (Member) commented May 27, 2022

Description

Configures usage component to receive DB connection details and attempts to establish a connection to ensure it works.
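
For a rough idea of what this involves, here is a minimal sketch, assuming the component reads credentials from environment variables and opens the connection via GORM's MySQL driver (the GORM-style error output further down suggests this stack); the environment variable names and configuration surface are assumptions, not the PR's actual code:

```go
// Minimal sketch (not the PR's actual code): read DB credentials from the
// environment, build a MySQL DSN, and open a GORM connection, failing fast
// when the database is unreachable. Env var names are hypothetical.
package main

import (
	"fmt"
	"log"
	"os"

	"gorm.io/driver/mysql"
	"gorm.io/gorm"
)

func main() {
	dsn := fmt.Sprintf("%s:%s@tcp(%s:%s)/%s?parseTime=true",
		os.Getenv("DB_USERNAME"),
		os.Getenv("DB_PASSWORD"),
		os.Getenv("DB_HOST"),
		os.Getenv("DB_PORT"),
		os.Getenv("DB_NAME"),
	)

	conn, err := gorm.Open(mysql.Open(dsn), &gorm.Config{})
	if err != nil {
		log.Fatalf("Failed to establish database connection: %v", err)
	}
	_ = conn // hand the connection to the rest of the usage component here
}
```

Opening the connection eagerly at startup is what surfaces the "connection refused" failure discussed in the review below when the database is not up yet.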

Related Issue(s)

How to test

The usage component starts in the Preview Environment k8s cluster.

Release Notes

NONE

Documentation

NONE

@easyCZ requested a review from a team May 27, 2022 09:04
@easyCZ changed the base branch from main to mp/db-conn May 27, 2022 09:04
@github-actions bot added the team: webapp label and removed the size/S label May 27, 2022
@easyCZ changed the title from [usage] Establish database connection to [usage] Define db.Workspace model May 27, 2022
@easyCZ changed the title from [usage] Define db.Workspace model to [usage] Configure DB credentials and connect May 27, 2022
@roboquat added size/M and removed size/S labels May 27, 2022
@easyCZ force-pushed the mp/db-conn-config branch from 9abced8 to d2336b7 May 27, 2022 09:49
@easyCZ (Member, Author) commented May 27, 2022

/werft run

👍 started the job as gitpod-build-mp-db-conn-config.3
(with .werft/ from main)

@geropl self-assigned this May 30, 2022
Base automatically changed from mp/db-conn to main May 30, 2022 08:21
@roboquat added size/L and removed size/M labels May 30, 2022
@easyCZ force-pushed the mp/db-conn-config branch from d2336b7 to 744d666 May 30, 2022 08:23
@roboquat added size/M and removed size/L labels May 30, 2022
@geropl (Member) commented May 30, 2022

The component is up and running.

But I noticed that it's the only component with container restarts, and it seems that's because the DB was not up yet:

[error] failed to initialize database, got error dial tcp 10.43.21.69:3306: connect: connection refused
{"@type":"type.googleapis.com/google.devtools.clouderrorreporting.v1beta1.ReportedErrorEvent","error":"dial tcp 10.43.21.69:3306: connect: connection refused","level":"fatal","message":"Failed to establish database connection.","serviceContext":{"service":"usage","version":"commit-744d6665468d952b21d0cc17fbf036b32df09039"},"severity":"CRITICAL","time":"2022-05-30T08:32:39Z"}

Other DB dependent pods use the service-waiter component, e.g. here.

One could argue that "failing fast" is a Kubernetes best practice, and that's true to some degree. But experience has shown that it's very valuable to be able to quickly tell whether "everything is fine", and that's not possible when there's too much noise. service-waiter effectively adds a delay of 30s, which is a nice compromise between "fail quickly enough" and "keep a good signal-to-noise ratio".
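
As a rough illustration of that compromise (not the actual service-waiter implementation), a bounded wait loop before the first connection attempt could look roughly like this; the address and timeout are assumptions:

```go
// Illustrative only: wait up to ~30s for the DB to accept TCP connections
// before failing, trading a short startup delay for fewer noisy container
// restarts. This is a sketch of the idea, not the service-waiter component.
package main

import (
	"log"
	"net"
	"time"
)

func waitForDB(addr string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err == nil {
			conn.Close()
			return nil
		}
		if time.Now().After(deadline) {
			return err
		}
		time.Sleep(1 * time.Second)
	}
}

func main() {
	// Address is hypothetical; in the preview env the DB listened on port 3306.
	if err := waitForDB("db:3306", 30*time.Second); err != nil {
		log.Fatalf("database did not become reachable: %v", err)
	}
	log.Println("database reachable, continuing startup")
}
```

With a bounded wait like this, a freshly deployed preview environment gets up to 30s for the DB to come up before the pod fails, instead of restarting immediately.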

@easyCZ (Member, Author) commented May 30, 2022

@geropl Yep, aware. Wanted to see whether this becomes a problem or not. I can add the waiter in a follow-up PR. As it stands, it's only really a problem in preview after a fresh deploy.

And absolutely agree, in an ideal world this is already self-healing in k8s envs.

@roboquat merged commit 9202ddd into main May 30, 2022
@roboquat deleted the mp/db-conn-config branch May 30, 2022 08:58
@roboquat added the deployed: webapp and deployed labels May 31, 2022
Labels: deployed: webapp, deployed, release-note-none, size/M, team: webapp