XM is a scalable and initialization-free solver for global bundle adjustment that leverages learned depth and convex optimization. This repository implements XM and its full structure-from-motion (SfM) pipeline, XM-SfM, achieving a large speedup over existing solvers.
- Enable joint estimation of camera intrinsics.
- Speed up the preprocessing part.
- 04.23.2025: RSS camera ready.
- 03.12.2025: Release beta version.
- If you already have observations of the 3D landmarks in each camera frame, you can pass the view-graph and observations directly to the XM solver. See example2.
- If you find the result is not good, the observations likely contain too much noise (the solver converges to the global optimum, but the quality of the observations still affects accuracy). You can refer to example4 and example5 to use XM$^2$ and Ceres refinement. More details can be found in our paper.
- If you have images, camera intrinsics, and corresponding depth maps, you will need to install COLMAP and GLOMAP to match features and create the view-graph.
- If you only have images and intrinsics, you will also need to install a depth model to estimate depth maps. Here we use UniDepth.
- If you do not have intrinsics: TODO. We are working on this right now.
Before installation, download the test datasets from the Google Drive for Datasets. The SIMPLE1 and SIMPLE2 datasets are binary files that can be sent directly to our solver, while SIMPLE3 and SIMPLE4 are image datasets. There are also datasets you can put in the assets/Experiment/ folder to check the accuracy reported in the paper. All datasets should be under the assets folder, e.g. assets/SIMPLE1 or assets/Experiment/BAL.
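A quick sanity check of the expected layout (folder names taken from above; the check itself is just a sketch):

```shell
# Create the experiment folder and verify the layout described above
mkdir -p assets/Experiment
# After downloading, you should see something like:
#   assets/SIMPLE1  assets/SIMPLE2  assets/Experiment/BAL
ls assets
```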
After cloning the XM repo, make sure you are in the root path of the XM folder, and run these in a terminal:
conda create -n XM python=3.10
conda activate XM
pip install -r requirements.txt
Directly run
cd XM
cmake -B build .
cmake --build build
cd ..
Note that your terminal should be in the XM environment. You can now run example1 and example2.
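To confirm the right conda environment is active before building, you can check the `CONDA_DEFAULT_ENV` variable that conda sets on activation (a small sketch, not part of the pipeline):

```shell
# Abort early if the XM conda environment is not active
if [ "${CONDA_DEFAULT_ENV:-}" != "XM" ]; then
  echo "Please run: conda activate XM" >&2
else
  echo "XM environment active"
fi
```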
This part will be replaced in our final release, but for now you will need to build these as components of our pipeline.
Following the instructions for Ceres, GLOMAP, and COLMAP, you should first build Ceres and pyceres. You can install them into the /deps/ folder together with GLOMAP. Though you can install pyceres and pycolmap through pip, we highly recommend building from source because that enables CUDA support.
Build Ceres from source code, and build pyceres from source in the XM environment.
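The Ceres and pyceres builds might look like the following sketch. The repository URLs are the upstream ones; the exact CMake options and dependencies vary per system, so consult the official install docs:

```shell
# Build and install Ceres from source (sketch; run inside the XM environment)
git clone https://ceres-solver.googlesource.com/ceres-solver deps/ceres-solver
cmake -S deps/ceres-solver -B deps/ceres-solver/build
cmake --build deps/ceres-solver/build -j"$(nproc)"
sudo cmake --install deps/ceres-solver/build

# Build pyceres from source so it picks up the local Ceres (with CUDA support)
git clone https://github.com/cvg/pyceres.git deps/pyceres
pip install -e deps/pyceres
```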
Build COLMAP from source code, and build pycolmap from source in the XM environment.
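A sketch of the COLMAP and pycolmap builds. Recent COLMAP versions ship pycolmap inside the main repository; if yours does not, build it from its separate repo instead. See the COLMAP docs for the required system dependencies:

```shell
# Build and install COLMAP from source (sketch; run inside the XM environment)
git clone https://github.com/colmap/colmap.git deps/colmap
cmake -S deps/colmap -B deps/colmap/build -GNinja
ninja -C deps/colmap/build
sudo ninja -C deps/colmap/build install

# pycolmap lives inside the COLMAP repository in recent releases
pip install -e deps/colmap/pycolmap
```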
We modified GLOMAP a bit to fit our pipeline, so you can build it directly from our repository. Note that GLOMAP needs COLMAP.
Run the following in the root path of XM:
cd deps/glomap/
mkdir build
cd build
cmake .. -GNinja
ninja && sudo ninja install
cd ../../../
For those who want to try the experiments in our XM paper, you need to install TEASER++ to calculate accuracy. Otherwise, you can comment out the last part of our code.
Now you can run example3.
Our choice is UniDepth, but you may change to your own custom model.
To build UniDepth, directly run this:
cd deps/
git clone git@github.com:lpiccinelli-eth/UniDepth.git
cd UniDepth
# Change to your own CUDA version
pip install -e . --extra-index-url https://download.pytorch.org/whl/cu124
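The `cu124` suffix in the index URL should match your installed CUDA version. One way to check it (either command may be absent depending on your setup):

```shell
# Print the CUDA toolkit / driver version to pick the right PyTorch wheel index
nvcc --version 2>/dev/null | grep -i release \
  || nvidia-smi 2>/dev/null | grep -i "cuda version" \
  || echo "No CUDA tooling found on PATH; check your CUDA install"
```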
You may encounter the same issues we did:
- If pytorch3d cannot build, comment out the line about pytorch in `Unidepth/requirement.txt` and retry. After successfully installing the other dependencies, build pytorch3d again.
- If you see `name 'warnings' is not defined`, you may need to add `import warnings` in the corresponding file.
- Some warnings about timm will show up, but they do not hurt.
- If loaded together with XM, pycolmap, or pyceres using `import`, UniDepth must be imported before them.
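The import-order constraint can be checked with a quick one-liner. The module names are assumed from the respective repos; run this inside the XM environment:

```shell
# UniDepth must be imported before XM / pycolmap / pyceres
python - <<'EOF'
import unidepth   # load UniDepth first
import pycolmap   # then the COLMAP / Ceres bindings
import pyceres
print("import order OK")
EOF
```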
Now you can run example4 and example5.
We recommend that you read examples 1 through 5 in order.
This is purely the XM solver; the input is the
Before the XM solver, we add code about how to build the
Now we add COLMAP and GLOMAP to match features and build the view-graph, but use ground-truth depth to lift 2D features to 3D.
We add UniDepth to estimate depth information instead of ground-truth depth. We also add XM$^2$ refinement.
If you still find the result not accurate enough, try running Ceres after XM. Note this is only needed when your 2D matching is accurate but your 3D estimation is bad.
If you want to run the experiments in the paper, please also install TEASER++. Note that the runtime can vary a lot across different GPUs. Please download the datasets from Google Drive and uncompress them in the assets/Experiment folder. We test several datasets in the XM paper:
- BAL datasets: please use our pre-processed data, since we down-sampled from the original datasets.
- Replica datasets: download from the nice-slam repository; contains ground-truth poses and depth.
- IMC and MipNeRF datasets: contain images and ground-truth poses.
- TUM datasets: images and ground truth at given timestamps.
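The download-and-uncompress step might look like this. The archive name is hypothetical; substitute whatever file you downloaded from Google Drive:

```shell
# Put the experiment datasets where the pipeline expects them
mkdir -p assets/Experiment
# Hypothetical archive name; use your actual download
unzip -q ~/Downloads/XM-experiment-data.zip -d assets/Experiment
ls assets/Experiment
```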