Score Fix for Binary Quantized Vector and Setting Default value in case of shard level rescoring is disabled for oversampling factor #2183
Description
Change 1:
Problem:
In Disk Vector Mode, the score for a binary quantized vector was incorrectly calculated using Inner Product as the similarity function. For binary quantized vectors, the correct distance metric is Hamming distance.
Impact:
This incorrect scoring led to poor results, particularly in cases with multiple segments. While reducing to the top k candidates, the best candidates (those closest to the query) were filtered out. As a result, recall was zero: no relevant results were returned, which critically affected the accuracy of search results.
Expected Behavior:
The system should use Hamming distance for binary quantized vectors, ensuring the correct similarity metric is applied. This allows the best candidates to be retained during the top-k reduction, maintaining high recall and improving overall search performance in multi-segment scenarios.
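A minimal sketch of the intended scoring, assuming binary quantized vectors are packed into byte arrays and using the conventional 1 / (1 + distance) translation from distance to score; the class and method names are illustrative, not the plugin's actual API:

```java
// Illustrative sketch only: names are assumptions, not the plugin's API.
public final class BinaryQuantizedScorer {

    // Hamming distance between two binary quantized vectors packed as bytes.
    static int hammingDistance(byte[] a, byte[] b) {
        int distance = 0;
        for (int i = 0; i < a.length; i++) {
            // XOR marks the differing bits; bitCount counts them.
            distance += Integer.bitCount((a[i] ^ b[i]) & 0xFF);
        }
        return distance;
    }

    // Translate distance into a similarity score so that smaller distances
    // rank higher during top-k reduction across segments.
    static float score(byte[] query, byte[] doc) {
        return 1.0f / (1.0f + hammingDistance(query, doc));
    }
}
```

Because the score decreases monotonically with Hamming distance, merging per-segment results keeps the nearest candidates instead of filtering them out.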
Change 2:
Problem:
The system already had a setting to disable shard-level rescoring and rescore at the segment level instead. However, a recent PR reintroduced a default oversampling factor, applied when the user did not explicitly provide one. This default was applied in both cases, regardless of whether shard-level rescoring was enabled or disabled. Segment-level rescoring does not need a high oversampling factor.
Solution:
This PR ensures that the default oversampling factor is applied only when shard-level rescoring is enabled. When shard-level rescoring is disabled (i.e., rescoring happens at the segment level), the system overrides the default so that the correct oversampling factor, chosen based on dimension, is applied.
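A minimal sketch of the resolution logic described above; the helper name, default values, and dimension threshold are illustrative assumptions, not the plugin's actual constants:

```java
// Illustrative sketch only: name, constants, and threshold are assumptions.
static float resolveOversampleFactor(Float userFactor,
                                     boolean shardLevelRescoringEnabled,
                                     int dimension) {
    // An explicitly provided factor always wins.
    if (userFactor != null) {
        return userFactor;
    }
    if (shardLevelRescoringEnabled) {
        // Shard-level rescoring: dimension-based default
        // (threshold and values assumed for illustration).
        return dimension <= 1000 ? 5.0f : 3.0f;
    }
    // Shard-level rescoring disabled: rescoring happens per segment,
    // so a high oversampling factor is unnecessary (assumed value).
    return 1.0f;
}
```

With this separation, a user-supplied factor is always honored, and the high dimension-based default is confined to the shard-level rescoring path.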
Related Issues
Resolves #[Issue number to be closed when this PR is merged]
Check List
Commits are signed per the DCO using --signoff
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following Developer Certificate of Origin and signing off your commits, please check here.