
Running on scannet++ problem #13

Open
Horizon-srt opened this issue Jan 18, 2025 · 0 comments

First of all, thank you very much to the authors for sharing this model!

I successfully tested the TUM dataset according to the instructions. However, when I tried to run mm3dgs on the ScanNet++ dataset (scene 8b5caf3398), it crashed after a few minutes with the following message:

get_scale_shift failed

local variable 'scale' referenced before assignment

SLAM failed. Saving map and results.
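For context, a `local variable 'scale' referenced before assignment` error usually means `scale` is only bound inside a branch that did not execute for that frame. A minimal sketch of the pattern I suspect (hypothetical reproduction, not the actual mm3dgs code): if `get_scale_shift` only assigns `scale` when the frame has valid depth pixels, a frame with no valid overlap triggers exactly this error.

```python
import numpy as np

def get_scale_shift(est_depth, ref_depth):
    # Hypothetical reproduction of the failure mode: 'scale' and 'shift'
    # are only assigned when the mask selects at least one valid pixel.
    mask = (ref_depth > 0) & np.isfinite(est_depth)
    if mask.sum() > 0:
        # Least-squares alignment of estimated depth to reference depth.
        A = np.stack([est_depth[mask], np.ones(mask.sum())], axis=-1)
        scale, shift = np.linalg.lstsq(A, ref_depth[mask], rcond=None)[0]
    # If every pixel was masked out, 'scale' was never bound:
    return scale, shift  # UnboundLocalError on all-invalid frames

# A frame whose reference depth is all zeros reproduces the crash:
try:
    get_scale_shift(np.ones((4, 4)), np.zeros((4, 4)))
except UnboundLocalError as e:
    print("reproduced:", e)
```

If this is indeed the cause, the frames around the crash likely have empty or all-zero depth (e.g. a depth-scale or path mismatch for ScanNet++), so checking the loaded depth maps for that scene may be more useful than patching the function itself.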

Here is one of the results:

[image attachment]

I would like to know how to solve this problem, and whether additional code modifications are required to use the ScanNet++ dataset.

Here is the config file I used:

dataset: "scannetpp"
device: "cuda:0"
method: "mm3dgs"
inputdir: datasets/
scene: "8b5caf3398"
outputdir: output/scannetpp/8b5caf3398
dataloader: "gradslam"
use_gt_depth: false # If False, a monocular depth estimator is used
dpt_model: "midas"
white_background: false
scene_radius_depth_ratio: 2
# iteration: 592
start_idx: 0
stride: 1
# early_stop_idx: 100
desired_height: 480
desired_width: 640
save_iterations:
  - 0
  - 1
eval_every: 5
debug:
  get_runtime_stats: true
  create_video: true
  save_keyframes: false
pipeline:
  convert_SHs_python: false
  compute_cov3D_python: false
  transform_means_python: true
  force_isotropic: false
  use_rgb: false
tracking:
  iters: 100
  use_gt_pose: false
  dynamics_model: "const_velocity"
  use_imu_loss: false
  imu_T_weight: 0.0
  imu_q_weight: 0.0
  use_depth_estimate_loss: false
  pearson_weight: 0.05
  # learning rates
  position_lr: 0.001
  rotation_lr: 0.003
mapping:
  iters: 150
  kf_every: 5
  niqe_kf: true
  niqe_window_size: 5
  kf_window_size: 25
  covisibility_level: 1
  min_covisibility: 0.95
  kf_covisibility: 0.1
  do_BA: false
  use_depth_estimate_loss: true
  pearson_weight: 0.05
  # model params
  sh_degree: 0
  # learning rates
  cam_t_lr: 0.001
  cam_q_lr: 0.003
  position_lr_init: 0.0001
  position_lr_final: 0.0000016
  position_lr_delay_mult: 0.01
  position_lr_max_steps: 30000
  feature_lr: 0.0025
  opacity_lr: 0.05
  scaling_lr: 0.001
  rotation_lr: 0.001
  rgb_lr: 0.0025
  spatial_lr_scale: 1
  percent_dense: 0.01
  lambda_dssim: 0.2
  # TODO: modify these
  min_opacity: 0.005
  densification_interval: 50
  pruning_interval: 50
  size_threshold: 100
  opacity_reset_interval: 500
  densify_from_iter: 0
  densify_until_iter: 50
  densify_grad_threshold: 0.0002
cam:
  image_height: 1168
  image_width: 1752
  fx: 791.3840796518665
  fy: 798.3946606879174
  cx: 875.2340044565483
  cy: 584.9916699995767
  crop_edge: 8
  png_depth_scale: 5000.0
  fps: 60
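One thing worth double-checking in this config: the intrinsics above are for the raw 1752x1168 ScanNet++ images, while `desired_width`/`desired_height` request 640x480. GradSLAM-style loaders usually rescale fx/fy/cx/cy to the target resolution themselves, but if that does not happen for this dataset class, tracking can diverge badly. A quick sketch of the expected rescaled values, assuming simple per-axis scaling:

```python
# Raw intrinsics from the config (1752x1168 ScanNet++ images).
fx, fy = 791.3840796518665, 798.3946606879174
cx, cy = 875.2340044565483, 584.9916699995767
raw_w, raw_h = 1752, 1168
out_w, out_h = 640, 480  # desired_width / desired_height

# Per-axis scaling: x-quantities scale with width, y-quantities with height.
sx, sy = out_w / raw_w, out_h / raw_h
fx_s, cx_s = fx * sx, cx * sx
fy_s, cy_s = fy * sy, cy * sy
print(f"fx={fx_s:.2f} fy={fy_s:.2f} cx={cx_s:.2f} cy={cy_s:.2f}")
```

Note that `crop_edge: 8` additionally shifts the principal point by the cropped margin; whether that crop is applied before or after resizing depends on the loader, so it is worth verifying against the actual dataset class.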

To load the dataset, I also modified SLAM.py:

def get_dataset_type(name):
    if name.lower() == "replica":
        return ReplicaDataset
    elif name.lower() == "tum":
        return TUMDataset
    elif name.lower() == "utmm":
        return UTMMDataset
    elif name.lower() == "scannetpp":
        return ScannetPPDataset
    else:
        raise ValueError(f"Unknown dataset {name}")

In addition, I changed scannetpp.py as follows:

class ScannetPPDataset(GradSLAMDataset):
    def __init__(
        self,
        config_dict,
        basedir,
        sequence,
        ignore_bad: Optional[bool] = False,
        use_train_split: Optional[bool] = True,
        stride: Optional[int] = None,
        start: Optional[int] = 0,
        end: Optional[int] = -1,
        desired_height: Optional[int] = 1168,
        desired_width: Optional[int] = 1752,
        load_embeddings: Optional[bool] = False,
        embedding_dir: Optional[str] = "embeddings",
        embedding_dim: Optional[int] = 512,
        **kwargs,
    ):
        self.input_folder = os.path.join(basedir, sequence)
        # config_dict = {}
        config_dict["dataset_name"] = "scannetpp"
        self.pose_path = None
        self.ignore_bad = ignore_bad
        self.use_train_split = use_train_split
...