
About the variance of 0.016 #3

Open
232525 opened this issue Nov 14, 2022 · 3 comments

Comments

@232525

232525 commented Nov 14, 2022

Hi, authors. In your paper, you mentioned:

Specifically, we set $\epsilon$ to the mean $\ell_\infty$ norm of embedding differences between five captions that correspond to the same image. We estimated this based on captions of only 15 MS-COCO images.

I would like to know: have you released the code used for this variance estimate?

@DavidHuji
Owner

DavidHuji commented Nov 14, 2022

Hi,
Yes. We calculate it using the predictions_runner.py script with the flags --ablation_dist and --text_autoencoder (the latter specifies using the CLIP text encoder rather than an image encoder); see its main method there: calc_distances_of_ready_embeddings.
This will print the average max norm over 900 samples, and it will also save the 900 values to a pickle file, so you can verify that this estimate can be obtained with only 15 samples with high confidence (i.e., the variance of the max norm across different sets is negligible).
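For reference, the estimate described above can be sketched roughly as follows. This is a hypothetical helper, not the actual predictions_runner.py code; random arrays stand in for precomputed CLIP text embeddings, and the function name `mean_linf_of_caption_diffs` is an assumption for illustration.

```python
import numpy as np

def mean_linf_of_caption_diffs(embeddings_per_image):
    """Average L-infinity norm of pairwise embedding differences.

    embeddings_per_image: list of (num_captions, dim) arrays, one array
    per image, each row being one caption's text embedding.
    Returns the mean, over all images and caption pairs, of
    max_k |e_i[k] - e_j[k]|.
    """
    norms = []
    for emb in embeddings_per_image:
        n = emb.shape[0]
        for i in range(n):
            for j in range(i + 1, n):
                # L-infinity norm of the difference vector
                norms.append(np.abs(emb[i] - emb[j]).max())
    return float(np.mean(norms))

# Stand-in data: 15 images x 5 captions x 512-dim embeddings
# (real usage would load CLIP text embeddings of MS-COCO captions).
rng = np.random.default_rng(0)
fake = [rng.normal(size=(5, 512)).astype(np.float32) for _ in range(15)]
eps = mean_linf_of_caption_diffs(fake)
```

With 15 images and 5 captions each, this averages over 15 × C(5,2) = 150 caption pairs; the per-pair values could be pickled to check how stable the mean is across subsets.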

@232525
Author

232525 commented Nov 15, 2022

Thanks for your reply. I calculated the average on the val set, and the result seems normal. But something confused me: you mention the $\ell_\infty$ norm in your paper, but it seems that the $\ell_1$ norm result is adopted. Did I miss something, or is this a mistake?
[screenshot attached]

@DavidHuji
Owner

DavidHuji commented Nov 15, 2022

It prints a few different metrics, though the infinity norm is the one printed here (line 86).
