Multiple negatives for MNRL #3139
All your hard negatives are used for all your positives, i.e. say you have 10 (anchor, positive) pairs; then each (anchor, positive) pair uses all the other positives as negatives. You can also add explicit negatives, but those are used for every (anchor, positive) pair as well, i.e. if you have 10 (anchor, positive) pairs and 5 negatives, then each (anchor, positive) pair has 9 + 5 = 14 negatives. That means your negatives are not specific to an (anchor, positive) pair (if you want that, you should use the
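The counting above (9 in-batch negatives plus 5 shared hard negatives = 14) can be sketched in a few lines of plain Python; this is only an illustration of the arithmetic, not the library's implementation:

```python
# Sketch: how many negatives each (anchor, positive) pair sees in one
# MNRL-style batch, assuming in-batch negatives plus shared hard negatives.
def negatives_per_pair(num_pairs: int, num_hard_negatives: int) -> int:
    # Every other pair's positive acts as an in-batch negative,
    # and every provided hard negative is shared across all pairs.
    return (num_pairs - 1) + num_hard_negatives

# 10 (anchor, positive) pairs and 5 shared hard negatives:
print(negatives_per_pair(10, 5))  # 9 in-batch + 5 shared = 14
```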
@Jakobhenningjensen
=> You mean, if I train with 32 data samples, each consisting of {anchor, positive, and 3 negatives},
In MNRL (if I'm correct) you don't have negatives for each anchor/positive pair; you just provide a list of general hard negatives, i.e. if your "negative" column has 5 elements, then you get 5 additional negatives (the same ones) for each sample, since those 5 elements are added to the hard negatives of each batch.
I thought it worked as if you provided hard negatives for that specific anchor/positive pair, but you don't. I'm currently looking at implementing a loss that does exactly that.
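A pair-specific variant like the one described here could look roughly like the following: an InfoNCE-style cross-entropy where each anchor is scored only against its own positive and its own hard negatives, rather than against every candidate in the batch. This is a hypothetical minimal sketch (not a sentence-transformers loss), operating directly on precomputed similarity scores:

```python
import math

# Sketch (hypothetical): pair-specific contrastive loss.
# sims_per_anchor[i] = [sim(a_i, p_i), sim(a_i, n_i_1), ..., sim(a_i, n_i_k)]
# with the positive's similarity always at index 0.
def pair_specific_loss(sims_per_anchor):
    total = 0.0
    for sims in sims_per_anchor:
        # Softmax cross-entropy with the positive as the target class;
        # only this anchor's own negatives appear in the denominator.
        denom = sum(math.exp(s) for s in sims)
        total += -math.log(math.exp(sims[0]) / denom)
    return total / len(sims_per_anchor)

# When the positive scores much higher than the negatives, the loss is small:
print(pair_specific_loss([[10.0, 0.0, 0.0]]))
```

In a real implementation the similarities would come from encoding (anchor, positive, negatives) with the model, and the loss would be computed with differentiable tensor ops instead of Python floats.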
@Jakobhenningjensen But in the docs, the input shape for MNRL is (anchor, positive, negative_1, …, negative_n).
@tomaarsen
Where in your code can I read how positive + multiple negatives map to n_columns?
I can't find it...
And I have the following questions:
Should all data samples have the same number of negatives for MNRL?
How is hard-negative sampling guaranteed?
e.g. {a1, p1, n1-1, n1-2, n1-3}, {a2, p2, n2-1, n2-2, n2-3} => a training sample in the batch: {a1, p1, n1-1, n1-2, n1-3, n2-2, n3-1, ...}
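The batch construction sketched above can be made concrete: each row contributes its positive and its negatives to a shared candidate pool, and every anchor in the batch scores against that whole pool. A minimal illustration (assumed behavior, not the library's code):

```python
# Sketch: build the shared candidate pool for one batch, assuming each
# row's positive and negatives become candidates for every anchor.
def build_candidates(rows):
    # rows: list of dicts like {"anchor": ..., "positive": ..., "negatives": [...]}
    candidates = [r["positive"] for r in rows]
    for r in rows:
        candidates.extend(r["negatives"])
    return candidates

rows = [
    {"anchor": "a1", "positive": "p1", "negatives": ["n1-1", "n1-2", "n1-3"]},
    {"anchor": "a2", "positive": "p2", "negatives": ["n2-1", "n2-2", "n2-3"]},
]
# Every anchor scores against all positives and all negatives in the batch:
print(build_candidates(rows))
```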
I read your explanation in Understanding how (hard) negatives in MNRL are used #3097.