
Conversation

Contributor

@troelsy commented Feb 5, 2024

After I started using a mask with knnMatchAsync, I found that the result from knnMatchConvert would be clipped at random.

Investigating the issue, I found that knnMatchAsync initializes all trainIdx entries to -1, expecting the CUDA kernel to overwrite them. A mask can be used to prevent certain features from being matched, and for masked-out features the CUDA kernel never writes a match, so their trainIdx stays -1. knnMatchConvert does not increment its read pointers when trainIdx == -1, so an unmatched feature leaves the conversion stuck at the if (trainIdx == -1) check. Eventually the outer for-loop finishes and returns a vector containing only the matches up to the first missing match distance.

My solution is to increment the counters the same way a successful iteration would.
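The fix can be sketched with a simplified, self-contained model of the conversion loop. This is not the actual OpenCV implementation: the convert function, the Match struct, and the flat row-major trainIdx/distance layout here are illustrative assumptions, but the pointer-increment pattern mirrors the change described above.

```cpp
#include <vector>

// Hypothetical stand-in for cv::DMatch, for illustration only.
struct Match { int queryIdx; int trainIdx; float distance; };

// Simplified model of a knnMatchConvert-style loop: trainIdx and distance
// are flat arrays of nQuery * k entries, with -1 marking an unmatched slot.
std::vector<std::vector<Match>> convert(const std::vector<int>& trainIdx,
                                        const std::vector<float>& distance,
                                        int k)
{
    std::vector<std::vector<Match>> matches;
    const int nQuery = static_cast<int>(trainIdx.size()) / k;
    const int* idxPtr = trainIdx.data();
    const float* distPtr = distance.data();

    for (int q = 0; q < nQuery; ++q)
    {
        matches.emplace_back();

        // The fix: advance both pointers on every iteration, exactly as a
        // successful match would. Before the fix, the increments happened
        // only after a successful match, so a -1 sentinel left the pointers
        // stuck and every later query read stale entries.
        for (int i = 0; i < k; ++i, ++idxPtr, ++distPtr)
        {
            if (*idxPtr == -1)
                continue; // masked-out feature: skip it, pointers still advance

            matches.back().push_back(Match{q, *idxPtr, *distPtr});
        }
    }
    return matches;
}
```

With the increments in the loop header, a masked-out feature simply yields an empty slot instead of truncating every match after it.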

Pull Request Readiness Checklist

See details at https://github.com/opencv/opencv/wiki/How_to_contribute#making-a-good-pull-request

  • I agree to contribute to the project under Apache 2 License.
  • To the best of my knowledge, the proposed patch is not based on code under GPL or another license that is incompatible with OpenCV
  • The PR is proposed to the proper branch
  • There is a reference to the original bug report and related work
  • There are accuracy tests, performance tests, and test data in the opencv_extra repository, if applicable.
    The patch to opencv_extra has the same branch name.
  • The feature is well documented and sample code can be built with the project CMake

Contributor

@asmorkalov left a comment


👍

@asmorkalov merged commit 48b5ded into opencv:4.x Feb 6, 2024
@asmorkalov mentioned this pull request Feb 8, 2024