list_tables() returns a maximum of 1000 table names #58
so the BQ API has a limit on how many results it will return.
From past experience, the slick thing to do here is to fetch lazily, which we should make sure we do.
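A minimal sketch of what page-by-page fetching could look like against the raw BigQuery tables.list endpoint, using httr. The endpoint URL, the maxResults/pageToken parameters, and the list_all_tables() helper name are assumptions for illustration here, not bigrquery's actual internals:

library(httr)

# Sketch only: collect every table id in a dataset by following nextPageToken.
# `token` is assumed to be a valid OAuth2 token (e.g. from httr::oauth2.0_token()).
list_all_tables <- function(project, dataset, token, page_size = 1000) {
  url <- sprintf(
    "https://www.googleapis.com/bigquery/v2/projects/%s/datasets/%s/tables",
    project, dataset
  )
  ids <- character()
  page_token <- NULL
  repeat {
    resp <- GET(
      url,
      query = list(maxResults = page_size, pageToken = page_token),
      config(token = token)
    )
    stop_for_status(resp)
    page <- content(resp, "parsed")
    ids <- c(ids, vapply(page$tables, function(t) t$tableReference$tableId, character(1)))
    page_token <- page$nextPageToken
    if (is.null(page_token)) break  # last page reached
  }
  ids
}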
I'm not sure what
@craigcitro do you know of any big public datasets that we could use for testing here?
@hadley I think the easier approach is to expose the param that controls the number of items per API call (either as a real param or just for tests), and then set that to 1 and try something with at least 2 items.
Oh, duh, yeah.
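In testthat terms, that might look roughly like the sketch below. The page_size argument is hypothetical (whatever name the per-call limit ends up being exposed under), and project/dataset are placeholders for a dataset known to contain at least two tables:

library(testthat)

test_that("list_tables() fetches across multiple pages", {
  # `page_size` is a hypothetical name for the per-API-call limit;
  # `project` and `dataset` are placeholders for a dataset with >= 2 tables.
  tables <- list_tables(project, dataset, page_size = 1)
  expect_gte(length(tables), 2)
})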
Duplicate of #108
A little late to this... and it was two jobs ago, but IIRC bq_get() was super slow for this large a number of tables because it was returning a bunch of additional, unnecessary info... just a warning.
Good call @TroyHernandez -- just filed #153 for that.
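For what it's worth, one way to trim that extra payload at the API level is the standard Google API fields parameter for partial responses; whether or how bigrquery should expose that is a separate question (this is an illustration, not what #153 implements):

# Sketch: ask the API to return only table ids plus the page token,
# instead of full table metadata. `token` is an OAuth2 token as above;
# the project/dataset names in the URL are placeholders.
resp <- httr::GET(
  "https://www.googleapis.com/bigquery/v2/projects/my-project/datasets/my-dataset/tables",
  query = list(
    maxResults = 1000,
    fields = "tables/tableReference/tableId,nextPageToken"
  ),
  httr::config(token = token)
)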
length(list_tables(project, dataset, max_results = 1001))
[1] 1000
I have over 3000 tables in the dataset and could really use all of their names.