SRTM downloads failing for bigger domain sizes #529
Comments
…ing for bigger domain sizes, by increasing the nSearchPixels/dfMaxSearchDist from the default value of 100 to a value of 120 in gdal_util GDALFillBandNoData(). This at least makes it work for domains up to 30 km x 30 km in size, though something else is needed for even bigger domains. For issue #529.
Should we call this good enough for now? I'm not sure what else we can do since the issue is no data values arising from the reprojection to UTM coordinates. There's no option to specify a different projection during the clipping. The file is stored in EPSG 4326. Only other options I can think of:
1. seems like it should work, though it would probably still require some kind of scaling to make sure the larger area is always big enough for clipping after the reprojection. The current method would require some kind of scaling to work for larger dem selections anyway; my fix for the 30 km x 30 km grids was just a temporary fix, and I have no idea at what grid size the method starts to break down again without scaling. Scaling the current method could probably be done by taking the difference between the original corners after reprojection and the new grid corners and finding the largest run of no data values, but I'm not sure how to scale the required additional area for 1.
Properly implemented, 1. would also be better than the current method, as there would no longer be any no data values left to fill in the first place. It seems like we already do something like this for weather model data downloads as well, though I'm not sure if the requested weather model bounding box is just some percent larger than the dem or if it is scaled somehow to the dem.
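For what it's worth, here is a rough sketch of how the request box for 1. could be derived: reproject all four corners of the desired UTM grid back to EPSG 4326 with GDAL/OGR, take the enclosing box, and pad it before clipping, so that the warp to UTM has full coverage and there are no no data values left to fill. The helper name, the 5% buffer fraction, and the GDAL 3.x axis-mapping call are placeholders/assumptions, not existing WindNinja code:

```cpp
// Sketch only: derive the EPSG 4326 request box from the desired UTM output grid.
// computeGeographicRequestBox() is a hypothetical helper, not existing WindNinja code.
#include "ogr_spatialref.h"
#include <algorithm>

bool computeGeographicRequestBox(double utmXMin, double utmXMax,
                                 double utmYMin, double utmYMax,
                                 int utmEpsg,
                                 double &lonMin, double &lonMax,
                                 double &latMin, double &latMax)
{
    OGRSpatialReference utmSrs, geoSrs;
    if (utmSrs.importFromEPSG(utmEpsg) != OGRERR_NONE ||
        geoSrs.importFromEPSG(4326) != OGRERR_NONE)
        return false;
    // GDAL 3.x: keep x = lon, y = lat instead of the authority (lat, lon) order.
    geoSrs.SetAxisMappingStrategy(OAMS_TRADITIONAL_GIS_ORDER);

    OGRCoordinateTransformation *poCT =
        OGRCreateCoordinateTransformation(&utmSrs, &geoSrs);
    if (poCT == nullptr)
        return false;

    // Reproject all four corners; a rectangle in UTM is not a rectangle in lat/lon.
    double x[4] = {utmXMin, utmXMax, utmXMin, utmXMax};
    double y[4] = {utmYMin, utmYMin, utmYMax, utmYMax};
    bool ok = poCT->Transform(4, x, y) != 0;
    delete poCT;
    if (!ok)
        return false;

    lonMin = *std::min_element(x, x + 4);
    lonMax = *std::max_element(x, x + 4);
    latMin = *std::min_element(y, y + 4);
    latMax = *std::max_element(y, y + 4);

    // Pad so the warped result fully covers the UTM grid. 5% per side is a
    // guess and would need testing against large domains.
    const double buffer = 0.05;
    const double lonPad = (lonMax - lonMin) * buffer;
    const double latPad = (latMax - latMin) * buffer;
    lonMin -= lonPad; lonMax += lonPad;
    latMin -= latPad; latMax += latPad;
    return true;
}
```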
@latwood I agree, I think we should implement 1. Do you want to take a stab at implementing this, using what we do in the wx model downloads as a reference? We can target this for 3.12.0.
Example: Missoula, aka 46.8721, -113.9940: the WindNinja gui srtm download works for this point with a radius of 17.937 miles (28.8668033 km) but breaks for anything 17.938 miles or bigger. fetch_dem breaks at even smaller sizes, though I got it to work for a 15 km radius (9.32057 miles). I ran into this when trying to do 30 km x 30 km (18.6411 mile) srtm dem downloads with fetch_dem.
Narrowed down the issue and found a temporary fix: it appears to break when filling the no data values after successfully downloading and warping the dataset; the fill doesn't quite reach all the no data values for these cases. Setting nSearchPixels/dfMaxSearchDist to 120 instead of the default 100 appears to fix the problem for at least the 30 km x 30 km case. I originally thought it also needed nSmoothingIterations increased from 0 to 1 to work, but it turns out it was just the increase to nSearchPixels/dfMaxSearchDist.
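For reference, a minimal sketch of the kind of call being tuned here, assuming the gdal_util wrapper ultimately goes through GDAL's GDALFillNodata(); the helper name and structure are illustrative, not the actual WindNinja code:

```cpp
// Sketch only: the kind of fill call being tuned, assuming the wrapper ends up in
// GDAL's GDALFillNodata(). The 120 vs 100 change above corresponds to dfMaxSearchDist.
#include "gdal_priv.h"
#include "gdal_alg.h"

bool fillBandNoData(GDALDataset *poDS, int nBand, double dfMaxSearchDist) // hypothetical name
{
    GDALRasterBand *poBand = poDS->GetRasterBand(nBand);
    if (poBand == nullptr)
        return false;

    // Passing a null mask band lets GDAL derive the mask from the band's nodata value.
    CPLErr eErr = GDALFillNodata((GDALRasterBandH)poBand,
                                 nullptr,
                                 dfMaxSearchDist, // search distance in pixels
                                 0,               // deprecated option, must be 0
                                 0,               // nSmoothingIterations, left at 0
                                 nullptr, nullptr, nullptr);
    return eErr == CE_None;
}

// The temporary fix described above amounts to: fillBandNoData(poDS, 1, 120.0);
```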
https://github.com/firelab/windninja/blob/345bce3d73f21c6336ff79b3bf04756bac27623a/src/ninja/srtmclient.cpp#L180
https://github.com/firelab/windninja/blob/345bce3d73f21c6336ff79b3bf04756bac27623a/src/ninja/gdal_util.cpp#L1021
https://github.com/firelab/windninja/blob/345bce3d73f21c6336ff79b3bf04756bac27623a/src/ninja/gdal_util.cpp#L1003C24-L1003C42
https://github.com/firelab/windninja/blob/345bce3d73f21c6336ff79b3bf04756bac27623a/src/ninja/gdal_util.cpp#L596
Looking over the other dem download methods, it appears that they use different no data filling methods, specifically the ascii_grid.cpp functions rather than the gdal_util.cpp functions, if no data filling is even being done at all for some of the datasets. In fact, it seems to mostly be done separately from the dem downloading for most of the other dem types.
https://github.com/firelab/windninja/blob/345bce3d73f21c6336ff79b3bf04756bac27623a/src/ninja/ascii_grid.cpp#L926
Anyhow, this appears to be related to issue #205; I stumbled on that issue by accident. It sounds like how to scale this has been thought about before, but maybe not in this way. In addition, looking at the resulting no data filled srtm dems in ParaView, I suspect that further investigation into how the no data filling affects the various dems in OpenFOAM might prove to be interesting.
One possibly crazy idea I thought of: technically we should be checking that there aren't too many no data values to fill in the first place before the warp, just to make sure the dem isn't straight no data values, and then we could do some kind of try/retry of the fill no data function, incrementing the value by 50 until it works. Calculating some nSearchPixels/dfMaxSearchDist based off the pixel sizes before and after the warp might be smarter and more efficient, though. Have to think on this.
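A rough sketch of that try/retry idea, where the no data count doubles as the "is the dem just straight no data values" check and the retry termination condition. All names, the 50% guard, and the 400 pixel cap are made-up placeholders, and this just wraps GDAL's GDALFillNodata():

```cpp
// Sketch only: retry the no-data fill with a progressively larger search distance.
#include "gdal_priv.h"
#include "gdal_alg.h"
#include <vector>

// Count cells still equal to the band's no-data value (hypothetical helper).
static long long countNoData(GDALRasterBand *poBand)
{
    int hasNoData = FALSE;
    double noData = poBand->GetNoDataValue(&hasNoData);
    if (!hasNoData)
        return 0;

    int nX = poBand->GetXSize();
    int nY = poBand->GetYSize();
    std::vector<float> row(nX);
    long long count = 0;
    for (int j = 0; j < nY; j++)
    {
        if (poBand->RasterIO(GF_Read, 0, j, nX, 1, row.data(),
                             nX, 1, GDT_Float32, 0, 0) != CE_None)
            return -1;
        for (int i = 0; i < nX; i++)
            if (row[i] == (float)noData)
                count++;
    }
    return count;
}

bool fillNoDataWithRetry(GDALRasterBand *poBand)
{
    long long total = (long long)poBand->GetXSize() * poBand->GetYSize();
    long long missing = countNoData(poBand);
    if (missing < 0 || missing > total / 2)  // arbitrary "mostly no data" guard
        return false;

    // Start at the current default and bump the search distance by 50 each try.
    for (double searchDist = 100.0; searchDist <= 400.0; searchDist += 50.0)
    {
        if (GDALFillNodata((GDALRasterBandH)poBand, nullptr, searchDist,
                           0, 0, nullptr, nullptr, nullptr) != CE_None)
            return false;
        if (countNoData(poBand) == 0)
            return true;
    }
    return false;
}
```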
Jason brought up the idea that, if you have a flat dem and then a quick drop off at the edge where there are no data values to be filled, the averaging could produce bumps in the filled no data locations, and with a bigger number of pixels being averaged it could produce jumps in the terrain no data fill values. Say x0 = nan, x1 = nan, x2 = nan, x3 = 50 m, x4 = 100 m, x5 = 150 m, and x6 onward = 200 m. Averaging three cells gives (50+100+150)/3 = 100 m, which is higher than the 50 m at the edge; averaging two gives (50+100)/2 = 75 m, still higher than the 50 m at the edge but a smaller bump; and as nSearchPixels/dfMaxSearchDist grows to pull in more and more of the 200 m cells, the result approaches 200 m, a big jump from 50 m.
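A toy version of that calculation (plain averaging of the nearest valid cells, which is a simplification of the inverse-distance weighting the GDAL fill actually does, just to show the fill value climbing toward the plateau as the search window grows):

```cpp
// Toy calculation: average the nearest N valid cells next to the no-data edge.
#include <cstdio>
#include <vector>
#include <numeric>

int main()
{
    // Valid cells walking away from the no-data edge: 50, 100, 150, then a 200 m plateau.
    std::vector<double> valid = {50, 100, 150, 200, 200, 200, 200};
    for (size_t n = 2; n <= valid.size(); n++)
    {
        double avg = std::accumulate(valid.begin(), valid.begin() + n, 0.0) / n;
        std::printf("averaging %zu cells -> fill value %.1f m\n", n, avg);
    }
    // Prints 75.0, 100.0, 125.0, ... 157.1 m: the fill next to the 50 m edge
    // keeps growing toward the 200 m plateau as the search distance increases.
    return 0;
}
```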