
Advanced guide for larger areas


Supported raster formats

Name           Extension
GeoTIFF        .tif
Erdas Imagine  .img

This guide is intended for advanced users who want to use the custom terrain feature of Terra++ and need to generate a dataset for one or more larger areas, such as a whole country. The only condition is that the raster data in each source (a folder containing an area's raster data) has similar characteristics (e.g. the same CRS or projection) and that each raster has its CRS metadata defined. You can check this with a tool like gdalinfo.

For example, if you run gdalinfo dem.tif (where dem.tif is a file from your raster data) and the output contains 'Coordinate System is:...', then this guide is suitable for you. This assumes that all raster files in your raster data contain CRS metadata. If not, you can set the projection of your raster data with set_projection.py, or you can use the regular procedure with QGIS here.
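If you prefer to do this check from the QGIS Python Console rather than the command line, here is a minimal sketch, assuming the osgeo GDAL Python bindings that ship with QGIS and a placeholder file name dem.tif:

```python
# Minimal sketch: check whether a raster has CRS metadata defined.
# Assumes the osgeo GDAL Python bindings (bundled with QGIS); dem.tif is a placeholder.
from osgeo import gdal

ds = gdal.Open("dem.tif")
crs_wkt = ds.GetProjection()
print(crs_wkt if crs_wkt else "No CRS metadata found - set the projection first")
```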

Note: If your raster data has a positive North-South resolution (the second value in Pixel Size from gdalinfo is a positive number, meaning the image is vertically flipped), the script will still work, but it will correct the file by running gdalwarp on it. This overwrites the original source file, as GeoTIFFs typically have a negative North-South resolution.
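If you want to check the North-South resolution yourself before running the script (the script handles the correction automatically), here is a small sketch along the same lines, again with a placeholder file name:

```python
# Sketch: inspect the North-South resolution (pixel height) of a raster.
# The 6th geotransform value is the pixel height; it is normally negative for GeoTIFFs.
from osgeo import gdal

ds = gdal.Open("dem.tif")  # placeholder file name
ns_resolution = ds.GetGeoTransform()[5]
if ns_resolution > 0:
    print("Positive N-S resolution: the image is vertically flipped and the script will gdalwarp it")
else:
    print("Negative N-S resolution: nothing to correct")
```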

Another way is with QGIS: simply drag one raster file from your raster data into the window. If the imported file is in position and you don't need to manually set the CRS (i.e. when you right-click the layer on the left side and select Set CRS/Set Layer CRS..., the CRS is already set), then this guide is suitable for you. This also assumes that all raster files in your raster data contain CRS metadata. Again, if not, you can set the projection of your raster data with set_projection.py or use the regular procedure with QGIS here.

The following script also supports uploading the dataset via FTP/SFTP. Also, note that the script scans for supported raster files recursively (all files in each source folder).
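For illustration only (the actual script may be implemented differently), a recursive scan of a source folder for the supported formats could look like this, with a made-up folder path:

```python
# Illustration: recursively collect supported raster files (.tif, .img) in a source folder.
import glob
import os

source = "C:/RasterData/CountryA"  # made-up source folder
rasters = [path
           for ext in ("*.tif", "*.img")
           for path in glob.glob(os.path.join(source, "**", ext), recursive=True)]
print(f"Found {len(rasters)} supported raster files")
```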

A. QGIS & preparation

  • First, download and install the latest release of QGIS in OSGeo4W
  • I highly recommend that you also install gdal, as it is useful throughout this guide

B. Dataset bounds

The create_dataset script creates a heightsTemplate.json file in the output folder and outputs the dataset extent (in EPSG:4326 [WGS84]) for each source in the Log Messages panel (The dataset bounds are,...). You can then use this info to follow Part two: Generating/using your dataset, starting at step F. Using your generated dataset, once you finish this guide.

Note: If you wish to generate the heights config again or get each dataset's bounds again after running the create_dataset.py script, without re-generating the tiles, use create_heights_config.py. It uses the same input variables as create_dataset.py, as described in step D. Creating the dataset.


If you wish to find the dataset extent manually and you've installed gdal, follow the next steps; otherwise continue to step C. Note that if you wish to do this, complete the following steps before you run the create_dataset script for the first time.
  • Set cleanup on line:62 to False and continue with step C, but read the steps below first
  • After you've run the script (once you finish step D), navigate to your source folder and, with a tool like gdalinfo (e.g. gdalinfo Source.vrt), find the extent of the file Source.vrt. The EPSG of Source.vrt is 3857 (Web Mercator)
  • An example of gdalinfo output and how you would use it in step E. Dataset bounds in Part two: Generating/using your dataset can be seen in the image above; a Python alternative to the gdalinfo call is sketched below
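If you'd rather read the extent from the QGIS Python Console than from gdalinfo, here is a minimal sketch (the path is a placeholder, and a north-up raster without rotation is assumed):

```python
# Sketch: compute the extent of Source.vrt (EPSG:3857) from its geotransform.
# Assumes a north-up raster (no rotation terms in the geotransform).
from osgeo import gdal

ds = gdal.Open("C:/RasterData/CountryA/Source.vrt")  # placeholder path
gt = ds.GetGeoTransform()
min_x, max_y = gt[0], gt[3]
max_x = gt[0] + gt[1] * ds.RasterXSize
min_y = gt[3] + gt[5] * ds.RasterYSize
print("Extent (EPSG:3857):", min_x, min_y, max_x, max_y)
```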

C. Resampling algorithms

After testing the various resampling algorithms, I concluded that near produces the most accurate colors. By default, the script uses the near resampling algorithm; you can choose another one if you wish, but be sure to take a careful look at the image above first. You can change the algorithm on line:50 in the script.
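As a hedged example (the exact variable name on line:50 may differ in your copy of the script), the GDAL resampling names you can choose from include near, bilinear, cubic, cubicspline, lanczos, average and mode:

```python
# Hypothetical example of the resampling setting around line:50;
# check the actual variable name in your copy of create_dataset.py.
resampling_algorithm = "near"  # e.g. "near", "bilinear", "cubic", "average", ...
```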

D. Creating the dataset

D.0 [Optional] VSIZIP Version

If your dataset consists of only one zipped raster file (*.zip), you can use the create_dataset_vsizip.py script. If not, skip this sub-section. Be warned that this method takes considerably longer, so it's advisable to unzip the file first and use the normal script instead.
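As the name suggests, this script presumably relies on GDAL's /vsizip/ virtual file system, which lets GDAL read a raster directly inside a .zip archive without unzipping it; a minimal illustration with made-up paths:

```python
# Illustration of GDAL's /vsizip/ virtual file system (made-up paths):
# /vsizip/<path to the .zip archive>/<path of the raster inside the archive>
from osgeo import gdal

ds = gdal.Open("/vsizip/C:/Datasets/dem_area.zip/dem_area.tif")
print(ds.RasterXSize, ds.RasterYSize)
```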

  • Download create_dataset_vsizip.py
  • Open QGIS as an administrator (Right-click Run as administrator on Windows)
  • Next, navigate to Plugins/Python Console in the toolbar
  • In the opened bottom panel click on the Show Editor button (Script icon)
  • Using the Open Script... button open the create_dataset_vsizip.py file
  • Once the script opens navigate to line:36
  • Here set source_file_name to the name of the source raster file inside the archive (*.zip), not the name of the archive
  • Next, set the source_file variable to the archive path (*.zip), not just the folder path where the archive is located. Make sure to use / and not \ (a hedged example of these two variables follows this list). From here on, continue with D.1 Main part, but start where [Continue from here if you followed the D.0 sub-section] is mentioned
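A hedged example of these two variables, with placeholder names and paths:

```python
# Hypothetical values for create_dataset_vsizip.py (around line:36); placeholders only.
source_file_name = "dem_area.tif"          # name of the raster inside the .zip, not the archive name
source_file = "C:/Datasets/dem_area.zip"   # full path to the .zip archive, using / separators
```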

D.1 Main part

  • First download the create_dataset.py script
  • Open QGIS as an administrator (Right-click Run as administrator on Windows)
  • Next, navigate to Plugins/Python Console in the toolbar
  • In the opened bottom panel click on the Show Editor button (Script icon)
  • Using the Open Script... button open the create_dataset.py file
  • Once the script opens navigate to line:44
  • Here set the sources array to one or more paths to the folder where your raster data is saved. You can use either \\ or / for the path separator
  • [Continue from here if you followed the D.0 sub-section] Next change the output variable value to the folder where you want to save the dataset (where the zoom folder will be created).
    *Note, if you wish to upload the dataset via FTP/SFTP, change it to the folder path on the server itself, or set it to None to upload to the root path of your FTP/SFTP user
  • Change the zoom variable value to the zoom you wish to render the dataset to. To find out the proper zoom level follow the sub-step Finding out the zoom level in Part two: Generating/using your dataset and return here
  • If your dataset doesn't have the NODATA value defined (when you run gdalinfo on the source file, NODATA value doesn't show up), you can define it via the manual_nodata_value variable
  • If your dataset's heights are in feet, set the convert_feet_to_meters variable to True; otherwise leave it at False. A hedged example of these input variables follows this list
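A hedged example of the input variables described above (all paths, the zoom level and the NODATA value are placeholders, so substitute your own):

```python
# Hypothetical input values for create_dataset.py; variable names follow the steps above,
# but every value here is a placeholder.
sources = ["C:/RasterData/CountryA", "C:/RasterData/CountryB"]  # one folder per source
output = "C:/Datasets/Tiled"        # folder where the zoom folder will be created
zoom = 17                           # target zoom level (see "Finding out the zoom level")
manual_nodata_value = -9999         # only needed if the rasters have no NODATA value defined
convert_feet_to_meters = False      # True if the heights are in feet
```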

D.2 Uploading via FTP/SFTP (optional)

If you wish to upload the dataset via FTP/SFTP, and not store the dataset locally, follow this sub-section, or else skip it. The script can either upload each PNG individually or create a zip archive named RenderedDataset.zip and upload it via FTP/SFTP.

  • First, set the ftp_upload variable value to True
  • If you wish to upload one zip archive, instead of individual files, set ftp_one_file to True
  • If you wish to upload via SFTP, change ftp_s to True. Else keep it at False
  • Next, set ftp_upload_url to the URL of your server, ex. an IP address (192.168.0.26) or domain (ftp.us.debian.org).
    **It's essential that you don't include 'ftp://' or 'sftp://', only specify the hostname and no additional path (ex. ftp://192.168.0.26:2121/Dataset/Tiled is wrong)
  • Set the variable ftp_upload_port value to your FTP port (by default 21) or SFTP port
  • If you wish to log in anonymously, skip this step; otherwise set ftp_user and ftp_password to your FTP login credentials. A hedged example of these FTP/SFTP settings follows this list
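A hedged example of the FTP/SFTP settings (host, port and credentials are placeholders):

```python
# Hypothetical FTP/SFTP settings for create_dataset.py; all values are placeholders.
ftp_upload = True                  # upload the dataset instead of only storing it locally
ftp_one_file = False               # True = upload a single RenderedDataset.zip archive
ftp_s = True                       # True = SFTP, False = plain FTP
ftp_upload_url = "192.168.0.26"    # hostname or IP only, no ftp:// or sftp:// and no path
ftp_upload_port = 22               # 21 by default for FTP; use your server's SFTP port for SFTP
ftp_user = "myuser"                # skip the credentials for anonymous FTP login
ftp_password = "mypassword"
```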

D.3 Saving and running the script for the first time

  • Once you are done with setting up the input parameters, go through them once again to make sure they are set correctly, and then save the changes using the Save button (Floppy disk icon)
  • Navigate to View/Panels/Log Messages in the toolbar
  • Now, run the script by clicking on the Run Script button (green play button)
  • It may lag a bit occasionally, but that is normal. Again, this can take a while
  • The script outputs the progress into the Log Messages panel, and the overall progress is also shown in the progress bar. From 0% to 20% it creates the VRT file in the terrarium format, and from 20% to 80% the actual tiling process runs
  • The script skips empty tiles, so the reported number of rendered tiles at the end will usually be less than the number of tiles to render reported at the start
  • Once it's done, QGIS will send you a notification, and the Log Messages panel will tell you how long the script ran

For comparison, using the regular procedure with the loadraster_folder.py script and the QMetaTiles plugin, it took 23 minutes to render an area roughly the size of Canberra, Australia at zoom level 17, while it took 13 minutes with the create_dataset script.

From here on forward, follow Part two: Generating/using your dataset starting at step F. Using your generated dataset, using the heightsTemplate.json file (in the output folder) or you can use info from Log Messages (The dataset bounds are,...).

Need help or have a question?

You can contact me on Discord under davixdevelop#3914, or you can join us on our BTE Development Hub on Discord and ask away in the #terraplusplus-support channel.