
Semantic Segmentation Labeler


This is the annotation tool for labeling a surface mesh segmentation with semantic labels. It is used to label reconstructed scans for the ScanNet and Matterport3D projects.

If you use this annotation tool, please cite:

@inproceedings{dai2017scannet,
    title={ScanNet: Richly-annotated 3D Reconstructions of Indoor Scenes},
    author={Dai, Angela and Chang, Angel X. and Savva, Manolis and Halber, Maciej and Funkhouser, Thomas and Nie{\ss}ner, Matthias},
    booktitle={Proc. Computer Vision and Pattern Recognition (CVPR), IEEE},
    year={2017}
}

If you have a PLY format mesh of the scan, use the ScanNet Segmentator to create a vertex-based segmentation using Felzenszwalb and Huttenlocher's graph-based image segmentation algorithm on computed mesh normals. See Preparing assets for annotation for how to prepare your assets.
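
For example, a typical Segmentator invocation might look like the sketch below; the exact arguments and their defaults are described in the Segmentator README, so treat the numbers here as placeholders:

# Hypothetical invocation (check the Segmentator README for the exact arguments):
# the first number is assumed to be the segmentation threshold and the second the
# minimum number of vertices per segment.
./segmentator scene.ply 0.01 20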

See Segment Annotation Interface for details on the annotation UI, Segment Annotation Server API for how to view and retrieve annotations from the server, and Segment Annotation Data Model for details of the data model.

Annotation workflow

  1. Scan is uploaded for annotation
  2. Scan is submitted to AMT for multiple workers to annotate
  3. Annotated scan is cleaned and normalized
  4. If judged bad by a reviewer, the annotation is rejected and the scan is put back into the queue for annotation
  5. Annotations for the scan are aggregated
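
Step 5 merges the labels from multiple workers into a single annotation per segment. The actual aggregation logic lives in the server code; the snippet below is only a toy majority-vote illustration of the idea, using made-up segment ids and labels:

// Toy majority-vote illustration of per-segment label aggregation (not the server's actual logic).
// Input: one map of segmentId -> label per worker; output: segmentId -> most frequent label.
function aggregateAnnotations(workerAnnotations) {
  const votes = {};  // segmentId -> { label -> count }
  for (const ann of workerAnnotations) {
    for (const [segId, label] of Object.entries(ann)) {
      votes[segId] = votes[segId] || {};
      votes[segId][label] = (votes[segId][label] || 0) + 1;
    }
  }
  const aggregated = {};
  for (const [segId, counts] of Object.entries(votes)) {
    aggregated[segId] = Object.keys(counts).reduce((a, b) => (counts[a] >= counts[b] ? a : b));
  }
  return aggregated;
}

// Example: three workers labeling two segments
console.log(aggregateAnnotations([
  { '12': 'chair', '15': 'table' },
  { '12': 'chair', '15': 'desk' },
  { '12': 'sofa',  '15': 'desk' }
]));  // -> { '12': 'chair', '15': 'desk' }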

Task Interface

Webpage: http://localhost:8010/scans/segment-annotator-single?condition=manual&userId=username&taskMode=new&modelId=xxx.yyy

Specifying Task Configuration

Task configurations are stored in server/proj/scannet/tasks/segment_annotation

You also need to update the default configuration server/proj/scannet/tasks/segment_annotation/default.config.yml to point to the assets you want to annotate by updating the following fields:

scansToAnnotate: <filename of your file with the list of scans that you want people to annotate>
idPrefix: <your asset source>
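
For example, if your scan list lives in a file called my-scans.txt and your asset source is named myscans (both hypothetical names), the relevant part of default.config.yml would look like:

scansToAnnotate: my-scans.txt
idPrefix: myscans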

You can also specify separate configurations by updating server/proj/scannet/tasks/segment_annotation/index.js and passing condition=<configName> in the URL parameters to segment-annotator-single to select the matching configuration.
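
For example, if you add a configuration named myconfig (hypothetical) to index.js, and assuming modelId takes the form <idPrefix>.<scanId>, the task URL for a scan called scene0001 would look something like:

http://localhost:8010/scans/segment-annotator-single?condition=myconfig&userId=username&taskMode=new&modelId=myscans.scene0001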

Viewing annotations

Webpage: http://localhost:8010/scans/segment-annotations

Examples

To get a list of annotations:

Use format=json to get the result as JSON.
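
As a sketch, assuming the annotation listing is served under the segment-annotations route shown above (see the Segment Annotation Server API page for the actual endpoints), fetching it as JSON might look like:

# Hypothetical endpoint; consult the Segment Annotation Server API page for the real route.
curl "http://localhost:8010/scans/segment-annotations/list?format=json"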

Exporting annotations

Before you can use the ssc scripts to export annotations, make sure that you have added your asset metadata file to [ssc/data/assets.json](../blob/master/ssc/data/assets.json).

Use [ssc/export-annotated-plys.js](../blob/master/ssc/export-annotated-plys.js) to export annotations as PLY files with vertex colors and annotations. Replace NODE_BASE_URL=http://localhost:8010 with the URL of the appropriate server.

NODE_BASE_URL=http://localhost:8010 ./export-annotated-ply.js --id bedroom_0065  --source nyuv2 --ann_type raw -n 1 --label_mapping data/nyuv2-scannet-labels-mpr40.combined.tsv --outlabels labels.csv

Output files

  • *.semseg.json - Semantic segmentation json file (format described under "Aggregated semantic annotation file" at https://github.com/ScanNet/ScanNet)
  • *.annotated.ply - Exported PLY with vertex-based segmentation and annotation. This PLY has the vertex position (x,y,z), color (red,green,blue), and the annotation fields objectId, categoryId (mapping to be provided), and NYU40 (attempted automatic mapping to the NYU40 label set). categoryId and NYU40 are redundant and can be derived from objectId together with the semseg.json (see the sketch after this list). The vertex color is the original scan color. NOTE: the normals from the original reconstructed PLY are dropped.
  • *.categories.annotated.ply - Same as above except vertex colors map to categories (provided only for visualization purposes)
  • *.instances.annotated.ply - Same as above except vertex colors map to objectId (provided only for visualization purposes)
  • *.nyu40.annotated.ply - Same as above except vertex colors map to nyu40 (provided only for visualization purposes)
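
As a sketch of how to use the exported files together, the snippet below reads a *.semseg.json file and builds a map from objectId to label, which is the lookup needed to re-derive category information from the objectId field in the exported PLY. The segGroups/objectId/label field names are assumed from the aggregated annotation format documented in the ScanNet repository; double-check them against your own files.

// Sketch: build an objectId -> label lookup from a *.semseg.json file.
// Field names (segGroups, objectId, label) are assumed from the ScanNet aggregated
// annotation format; verify them against your exported files.
const fs = require('fs');

function loadObjectLabels(semsegFile) {
  const semseg = JSON.parse(fs.readFileSync(semsegFile, 'utf8'));
  const labels = {};
  for (const group of semseg.segGroups) {
    labels[group.objectId] = group.label;
  }
  return labels;
}

// Example usage: look up the label for objectId 3 in an exported scan.
const labels = loadObjectLabels('bedroom_0065.semseg.json');
console.log(labels[3]);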