[FEATURE] <feature request> #327

Open
jedalong opened this issue Jun 15, 2023 · 3 comments

Comments

@jedalong

Hi Mark et al!
I am working on building some simple automated tools for combining OSM data with animal tracking data for movement ecology problems (e.g., where and when did the chicken cross the road?). Package osmdata is perfect for this!

One constant issue we run into is that queries fail when the bounding box (or the volume of features requested) is too big. But these issues are all situation dependent. Is there any tool/method for getting an estimate of query size in advance, which could be used to guess whether a query will time out / throw an error (or to estimate how long it will take)?

Specifically, what I'm interested in knowing is whether there are ways to estimate a priori where queries need to be split, made smaller, etc., or if that is simply something you find out after trying and failing...

SIDENOTE: The query-splitting workflow from vignette 4 was not working as expected (for me), but I can explore what was going wrong when I get a bit more time and provide proper feedback there...

Cheers
Jed

@jmaspons
Collaborator

Hello,

I don't think it's possible to know the runtime of a query beforehand. I would build the minimal query that you need (e.g., filter for roads if that's the only feature type you are interested in) and add a big timeout.
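A minimal sketch of that suggestion with osmdata (the bounding box and timeout values here are illustrative, not recommendations):

```r
library (osmdata)

# Illustrative bounding box: (xmin, ymin, xmax, ymax)
bb <- c (-2.5, 51.4, -2.4, 51.5)

# Restrict the query to the one feature type needed, and raise the
# server-side timeout (in seconds) well above the default:
dat <- opq (bbox = bb, timeout = 600) |>
    add_osm_feature (key = "highway") |>
    osmdata_sf ()
```

Keeping the query to a single key like this also reduces the volume of data the overpass server has to assemble, which matters as much as the timeout itself.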

@mpadge
Member

mpadge commented Jun 15, 2023

Great to hear from you Jed! My workflow for jobs that are too big for the overpass API is to manually download .pbf files from https://geofabrik.de (or with the osmextract package if you like, but use that only to download, not to read into R), use osmium-tool to filter by both geography and key-value pairs, output that to .osm/.xml format, and then read that in with osmdata.

One day I'll find time to wrap the osmium C++ code into an R package to enable that workflow within R, but until then you can also easily set it up via bunches of system calls, like these examples (Linux only). Feel free to ask further questions.
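A rough sketch of that pipeline follows; the download URL, file names, bounding box, and tag filter are all placeholders, and it assumes osmium-tool is installed:

```shell
# 1. Download a regional extract from Geofabrik (URL is illustrative)
wget https://download.geofabrik.de/europe/great-britain-latest.osm.pbf

# 2. Clip to the study-area bounding box (min_lon,min_lat,max_lon,max_lat)
osmium extract --bbox -2.5,51.4,-2.4,51.5 \
    great-britain-latest.osm.pbf -o study-area.osm.pbf

# 3. Keep only ways tagged with a "highway" key (i.e. roads)
osmium tags-filter study-area.osm.pbf w/highway -o roads.osm.pbf

# 4. Convert to .osm XML (osmium infers the format from the extension)
osmium cat roads.osm.pbf -o roads.osm
```

The resulting `roads.osm` file can then be read back in R, for example via the `doc` argument of `osmdata_sf ()` pointing at the saved XML file.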

@jedalong
Author

OK thanks both for the comments. I will look further into downloading pbf files and then see what might work.
