
Calibration of a 250deg fisheye lens #242

Closed
caselitz opened this issue Nov 13, 2018 · 6 comments

@caselitz

Hi all,

my goal is to find a suitable camera model for the Entaniya Fisheye M12 250 lens and calibrate it.

My first attempt was to go for the Double Sphere ds_none model recently contributed to Kalibr (thanks!).

@NikolausDemmel and @VladyslavUsenko, in your paper you experimented with multiple wide-angle/fish-eye lenses and showed that the model performed well, however the lens with the biggest FOV was the BF2M2020S23 with 195deg. What's your intuition about a 250deg lens - do you think the model can properly describe it? Or would you recommend a different model available in Kalibr?
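For reference, the Double Sphere projection function is compact enough to sketch. This follows the equations from the Double Sphere paper (parameters fx, fy, cx, cy, xi, alpha) as a minimal numpy example, not Kalibr's actual implementation:

```python
import numpy as np

def ds_project(p, fx, fy, cx, cy, xi, alpha):
    """Double Sphere projection (Usenko, Demmel, Cremers 2018), minimal sketch.

    p: 3D point in the camera frame; the remaining arguments are the
    calibrated intrinsics. Returns pixel coordinates (u, v).
    """
    x, y, z = p
    d1 = np.sqrt(x * x + y * y + z * z)
    d2 = np.sqrt(x * x + y * y + (xi * d1 + z) ** 2)
    denom = alpha * d2 + (1.0 - alpha) * (xi * d1 + z)
    return np.array([fx * x / denom + cx, fy * y / denom + cy])
```

A point on the optical axis projects to the principal point (cx, cy) regardless of xi and alpha, which is a quick sanity check for any implementation.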

I could provide a calibration sequence of my lens in case you are willing to experiment with it (maybe also interesting for future publication?).

Anyhow, maybe multiple camera models are suitable. The actual problem with calibration seems to be the point Niko raised here. When I try to calibrate my lens, the majority of points get removed because of the 80deg check (or, more precisely, because pinhole-undistorted 2D coordinates are used instead of bearing vectors).

It seems the loss of 195/2 - 80 = 17.5deg was acceptable for you to still fit an accurate model, but maybe 250/2 - 80 = 45deg is too much - what do you think? Kalibr failed on most of the calibration sequences I tried, probably due to the removal of so many points. One sequence actually worked, but I don't expect the results to be very precise (I have to investigate further), at least for the outer parts of the image, since the keypoints that were used lay only in a fairly small inner circle of the image space.

As Niko said, the proper fix would be to use a PnP implementation that accepts bearing vectors as input instead of pinhole-undistorted 2D coordinates. I guess no one is planning to take action in the near future here? ;-) Unfortunately, my knowledge of the Kalibr code is a bit limited, but in case I want to give it a try, can anyone give me an idea of what has to be done?

From what I understand, the estimateTransformation function needs to be changed to use all points to solve the PNP problem. I see 2 sub-tasks here:

  1. Getting the bearing vectors
  2. Using a PnP solver that accepts bearing vectors

For 1: Using keypointToEuclidean() I get a 3D point (with possibly negative z) that can be used to compute the bearing vector. Straightforward?
For 2: Where to get such a solver? Niko mentioned OpenGV. I'm not sure how easy or desirable using an external library in Kalibr is. (However, for my own purposes I can just try to hack it in, if it's not too complicated - any hints on how to do this?) A re-implementation of such a PnP solver in Kalibr would probably be considerable effort?
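For sub-task 1, the normalization step is indeed straightforward; a minimal sketch, assuming a keypointToEuclidean-style unprojection has already returned a 3D point (the function name and calling convention here are illustrative, not Kalibr's actual API):

```python
import numpy as np

def bearing_from_euclidean(p3d):
    """Turn the 3D point returned by a keypointToEuclidean-style
    unprojection (possibly with negative z for points behind the
    pinhole plane) into a unit bearing vector, the input format that
    bearing-vector PnP solvers such as OpenGV's expect."""
    p = np.asarray(p3d, dtype=float)
    return p / np.linalg.norm(p)
```

The point of working with bearing vectors is exactly that negative z is unproblematic: the vector still encodes a valid viewing direction, whereas pinhole undistortion breaks down past 90deg.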

Any help is greatly appreciated, thanks! :-)
Best, Tim

@VladyslavUsenko
Contributor

Hi Tim,

Yes, it would be nice if you could provide a dataset with your lens. The PnP problem is used only for initialization of the camera pose, but after that all observations (even larger than 80 degrees) should be used for optimization.

As you mentioned, the proper solution would be to use OpenGV with bearing vectors, but I'm not sure if we should add new dependencies to kalibr. An alternative could be to use several (virtual) rotated pinhole cameras for pose initialization. This way it would be possible to use the same OpenCV function as now.
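The virtual-pinhole idea could be sketched roughly like this (an illustration, not actual Kalibr or basalt code): rotate the bearing vectors into each virtual camera's frame, keep the ones well in front of it, and project them as a plain pinhole camera would.

```python
import numpy as np

def rot_y(theta):
    """Rotation matrix about the y-axis, used to aim a virtual camera."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def project_via_virtual_pinhole(bearings, R_virt, max_angle_deg=70.0):
    """Rotate unit bearing vectors (one per row) into a virtual pinhole
    camera frame and project those that fall well inside its field of
    view (z component above cos(max_angle)). Returns normalized image
    coordinates and a mask of the points usable for this virtual camera."""
    b = bearings @ R_virt.T                        # apply R_virt to each row
    mask = b[:, 2] > np.cos(np.deg2rad(max_angle_deg))
    uv = b[mask, :2] / b[mask, 2:3]                # plain pinhole projection
    return uv, mask
```

Each group of projected points could then go into cv2.solvePnP with identity intrinsics; since the recovered pose lives in the virtual camera frame, its rotation would be left-multiplied by the transpose of R_virt to obtain the real camera pose. Function names and the 70deg margin are assumptions for illustration.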

Best,
Vlad

@caselitz
Author

caselitz commented Nov 14, 2018

Yes, it would be nice if you could provide a dataset with your lens.

Ok, cool. I'll follow up on this.

The PnP problem is used only for initialization of the camera pose, but after that all observations (even larger than 80 degrees) should be used for optimization.

Thanks for clarifying this! Then this is probably not so crucial, at least for the quality of the resulting calibration? But I guess this assumes the optimization still converges properly (to a not-too-bad local minimum). And proper initial pose estimates (obtained by solving PnP) are actually important to make such a (non-convex NLLS) problem converge properly, right?
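As a toy illustration of why initialization matters for a non-convex least-squares problem (unrelated to Kalibr's actual cost function): Gauss-Newton on the scalar residual r(x) = x^2 - 1 has two minima, and which one it reaches depends entirely on the starting point.

```python
def gauss_newton_1d(x0, steps=50):
    """Minimize r(x)^2 for r(x) = x**2 - 1 (minima at x = +1 and x = -1)
    with Gauss-Newton. The basin of attraction is decided by x0 alone;
    a poor initial guess lands in the 'wrong' minimum."""
    x = x0
    for _ in range(steps):
        r = x * x - 1.0   # residual
        J = 2.0 * x       # Jacobian (nonzero for the x0 used here)
        x -= r / J        # Gauss-Newton step for a scalar residual
    return x
```

Starting at x0 = 2.0 converges to +1, starting at x0 = -0.5 converges to -1; the same mechanism is why a bad PnP initialization can pull a calibration into a poor local minimum.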

That all observations are used is because the variable success in this line is kinda ignored and all obs are added to the problem? For the non-success case, I actually wonder which value the initial pose T_t_c has, as out_T_t_c is not written in estimateTransform() (is that hidden somewhere in the Python/C++ wrapping?).

all observations (even larger than 80 degrees) should be used for optimization.

Actually, this does not fit my experience, but admittedly N is only 1 ;) - as I said in my last post, one sequence "worked", but the report showed reprojection errors only for the inner part of the image, which would suggest the outer ones were not used, right?

So, from a practical point of view, I need to ensure good enough initialization to make the optimization converge. Any hints on how to do that?

  1. I guess I should make sure the sequence has enough pictures where the target is in the center part of the image, so that a proper initial pose estimate (obtained by solving PnP) is possible?
  2. Niko said that how initialization is done (starting optimization with a single image and consecutively adding more) also matters. [Why is this? If I read calibrateIntrinsics() correctly, first all error terms of all observations are added, and then it is optimized. Or am I on the wrong page here?] If that is the case, any recommendations on how to record a calibration sequence (something like first target in the center, then on the sides)?
  3. In this issue it is suggested that smaller calibration sequences work better than bigger ones.

As you mentioned, the proper solution would be to use OpenGV with bearing vectors, but I'm not sure if we should add new dependencies for kalibr. An alternative could be to uses several (virtual) rotated pinhole cameras for pose initialization. This way it would be possible to use the same OpenCV function as used now.

Understood regarding the dependencies. That's why a re-implementation inside Kalibr could be a way to go. I like your idea with the multiple pinhole cameras, though I still feel bearing vectors would be the best solution. But probably much effort, I guess more than the multiple-pinholes idea?

Thanks for your quick reply, Vlad!

Best,
Tim

@caselitz
Author

Niko said that how initialization is done (starting optimization with a single image and consecutively adding more) also matters. [Why is this? If I read calibrateIntrinsics() correctly, first all error terms of all observations are added, and then it is optimized. Or am I on the wrong page here?] If that is the case, any recommendations on how to record a calibration sequence (something like first target in the center, then on the sides)?

I should probably read the whole code from the beginning once. I guess you referred to the main function where it loops over the views? From there it makes sense that the order has an influence, also since the views (potentially) get shuffled. The question would still be whether there is a good order to choose while recording (and then using the --no-shuffle option).

@caselitz
Author

Yes, it would be nice if you could provide a dataset with your lens.

Ok, cool. I'll follow up on this.

Finally I found the time to record some (11) bag files that you can download here.

@VladyslavUsenko, @NikolausDemmel: I would appreciate it if you share any findings/thoughts in case you have a look at the data.

Thanks and happy XMas,
Tim

@VladyslavUsenko
Contributor

Hi Tim,

Sorry, but we didn't really have time to check this issue in Kalibr. We've recently open-sourced our internal calibration tool (https://gitlab.com/VladyslavUsenko/basalt) and there it seems to work.

[Image: 250_deg calibration result]

The command I used is:

basalt_calibrate --dataset-path bag02.bag --dataset-type bag --result-path calib_result --cam-types eucm

You can give it a try if you still need to calibrate the lens.

Best,
Vlad

@caselitz
Author

Hey Vlad (and Niko),

thanks for the update, great work! I'll let you know once I find the time...

Best,
Tim
