Replace print with logging #6138
Conversation
34eced3
@@ -754,7 +754,6 @@ def concat_cond(self, **kwargs):
         mask = torch.ones_like(noise)[:, :1]

         mask = torch.mean(mask, dim=1, keepdim=True)
-        print(mask.shape)
Cleanup debug logs.
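For context, a minimal sketch of the `print` → `logging` replacement pattern this PR applies elsewhere, as it would look for a debug statement like the one removed above (the logger setup and function name are illustrative, not the PR's actual code):

```python
import logging

# Hypothetical module-level logger, mirroring the pattern used in the PR.
logger = logging.getLogger(__name__)

def log_shape(shape) -> None:
    # Instead of print(mask.shape): a debug-level log line that is
    # silent unless debug logging is enabled for this module.
    logger.debug("mask shape: %s", shape)
```

Debug-level logging keeps the diagnostic available when needed while removing the unconditional console noise the review comments flag.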
@@ -160,7 +160,6 @@ def __init__(
         if isinstance(self.num_classes, int):
             self.label_emb = nn.Embedding(num_classes, time_embed_dim)
         elif self.num_classes == "continuous":
-            print("setting up linear c_adm embedding layer")
Cleanup debug logs.
@@ -518,7 +519,6 @@ def multistep_uni_pc_vary_update(self, x, model_prev_list, t_prev_list, t, order
         A_p = C_inv_p

         if use_corrector:
-            print('using corrector')
Cleanup debug logs.
Force-pushed from 34eced3 to e8274e9
@@ -41,8 +41,7 @@ def encode_token_weights(self, token_weight_pairs):
             to_encode.append(self.gen_empty_tokens(self.special_tokens, max_token_len))
         else:
             to_encode.append(gen_empty_tokens(self.special_tokens, max_token_len))
-        print(to_encode)
Cleanup debugging print added in #6055.
This PR enforces the ruff lint rule T201, which bans usage of `print`. All existing usage of `print` in core is replaced with `logging`. All usages in tests/deployment files are annotated with `# noqa`.
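A minimal sketch of the two patterns the description names, side by side (function names are illustrative, not taken from the PR):

```python
import logging

logger = logging.getLogger(__name__)

# Core code: print is replaced with a logger call, so output goes
# through the configurable logging pipeline.
def report_progress(step: int) -> None:
    logger.info("step %d complete", step)

# Tests/deployment files: print is kept, but annotated so ruff's
# T201 rule skips the line.
def deploy_banner() -> None:
    print("starting deployment")  # noqa: T201
```

The `# noqa: T201` suppression is per-line, so the lint rule still catches any new unannotated `print` added to the codebase.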