add example of creating AutoML Tables client with non-default endpoint ('new' sdk) #3929
Conversation
Hi Amy - try running
@sirtorry Correct me if I'm wrong, but I think we want to consolidate all of our AutoML samples into automl/.
I was just following what appeared to be the convention for the location of the 'new sdk' examples — LMK.
tables/automl/endpoint_test.py (outdated)

```python
def test_client_creation(capsys):
    client = automl_tables_set_endpoint.create_client_with_endpoint(PROJECT)
    print(client.list_datasets())
```
Would it be possible to have the create_client_with_endpoint method print something, and then assert on your print? Then you would be testing your sample rather than the underlying AutoML function.
Also, if testing is something you're not 100% comfortable with, do feel free to put some time on my calendar and we can take 20-30 min to have a faster back and forth about what you think is most appropriate for this sample :)
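For illustration, a minimal sketch of what that kind of test could look like, assuming the sample prints the endpoint it configures; the environment variable name and the printed string are assumptions, not part of this PR:

```python
# Hypothetical sketch of the suggested test: assert on the sample's own
# printed output rather than on the AutoML API response.
import os

import automl_tables_set_endpoint

# Assumption: the project ID is read from an environment variable.
PROJECT = os.environ["GCLOUD_PROJECT"]


def test_client_creation(capsys):
    automl_tables_set_endpoint.create_client_with_endpoint(PROJECT)
    out, _ = capsys.readouterr()
    # Assumption: the sample prints the endpoint it was configured with.
    assert "eu-automl.googleapis.com" in out
```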
+1 to @leahecole's suggestion.
For the set endpoint kind of samples, we typically use a really simple API call to verify that the client was created and operates correctly.
See example here.
I suggest copying the output of the regular Tables list_datasets sample.
In this case, no need to check anything re: the 'list_datasets' output specifics. There may or may not be any defined, but the details aren't relevant here. (I picked that call arbitrarily as a simple connectivity check, so I don't think that sample is particularly relevant).
In fact, I think we can remove an assertion on the output altogether-- if the call fails, it will throw an exception, which will fail the test. Would you prefer the test explicitly catch the exception + then raise an error, or just let it be thrown?
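For context, the sample under discussion might look roughly like the sketch below; the exact endpoint value, the region argument, and the print statement are assumptions rather than the committed code:

```python
# Hypothetical sketch of automl_tables_set_endpoint.py: create a Tables client
# that targets a regional endpoint instead of the default one.
from google.cloud import automl_v1beta1 as automl


def create_client_with_endpoint(gcp_project_id):
    """Create an AutoML Tables client pointed at a non-default endpoint."""
    # Assumption: client_options is forwarded to the underlying API clients.
    client_options = {"api_endpoint": "eu-automl.googleapis.com"}
    client = automl.TablesClient(
        project=gcp_project_id, region="eu", client_options=client_options
    )
    # Print something identifiable so a test (or a reader) can see which
    # endpoint the client was built against.
    print("Client created with endpoint: {}".format(client_options["api_endpoint"]))
    return client
```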
+1 to just executing list_datasets, since the goal is to check that the client was constructed correctly.
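Under that approach, the test could collapse to something like the sketch below (the environment variable name is an assumption); any failure in the call raises and fails the test on its own:

```python
# Hypothetical final shape of endpoint_test.py: just exercise a cheap API call.
import os

import automl_tables_set_endpoint

PROJECT = os.environ["GCLOUD_PROJECT"]  # assumed env var name


def test_client_creation():
    client = automl_tables_set_endpoint.create_client_with_endpoint(PROJECT)
    # No assertion on the contents: listing datasets is only a connectivity
    # check, and an exception here fails the test by itself.
    client.list_datasets()
```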
+1 to separate move.
I'm fine with moving these samples as part of a separate PR.
Your PR has attempted to merge for 3 hours. Please check that all required checks have passed, you have an automerge label, and that all your reviewers have approved the PR.
Merge-on-green attempted to merge your PR for 6 hours, but it was not mergeable because either one of your required status checks failed, or one of your required reviews was not approved. Learn more about your required status checks here: https://help.github.com/en/github/administering-a-repository/enabling-required-status-checks. You can remove and reapply the label to re-run the bot.
add example of creating AutoML Tables client with non-default endpoint ('new' sdk) (GoogleCloudPlatform#3929)
* add example of creating client with non-default endpoint
* more test file cleanup
* move connectivity print stmt out of test fn
Co-authored-by: Leah E. Cole <6719667+leahecole@users.noreply.github.com>
Co-authored-by: Torry Yang <sirtorry@users.noreply.github.com>
* Tables Notebooks [(#2090)](GoogleCloudPlatform/python-docs-samples#2090) * initial commit * update census * update notebooks * remove the reference to a bug [(#2100)](GoogleCloudPlatform/python-docs-samples#2100) as the bug has been fixed in the public client lib * delete this file. [(#2102)](GoogleCloudPlatform/python-docs-samples#2102) * rename file name [(#2103)](GoogleCloudPlatform/python-docs-samples#2103) * trying to fix images [(#2101)](GoogleCloudPlatform/python-docs-samples#2101) * remove typo in installation [(#2110)](GoogleCloudPlatform/python-docs-samples#2110) * Rename census_income_prediction.ipynb to getting_started_notebook.ipynb [(#2115)](GoogleCloudPlatform/python-docs-samples#2115) renaming the notebooks as Getting Started (will be in sync with the doc). It will be great if the folder could be renamed too * added back missing file package import [(#2150)](GoogleCloudPlatform/python-docs-samples#2150) * added back missing file import [(#2145)](GoogleCloudPlatform/python-docs-samples#2145) * remove incorrect reference to Iris dataset [(#2203)](GoogleCloudPlatform/python-docs-samples#2203) * conversion to jupyter/colab [(#2340)](GoogleCloudPlatform/python-docs-samples#2340) plus bug fixes * updated for the Jupyter support [(#2337)](GoogleCloudPlatform/python-docs-samples#2337) * updated readme for support Jupyter [(#2336)](GoogleCloudPlatform/python-docs-samples#2336) to approve with the updated notebook supporting jupyter * conversion to jupyer/colab [(#2339)](GoogleCloudPlatform/python-docs-samples#2339) plus bug fixes * conversion of notebook for jupyter/Colab [(#2338)](GoogleCloudPlatform/python-docs-samples#2338) conversion of the notebook to support both Jupyter and Colab + bug fixes * [BLOCKED] AutoML Tables: Docs samples updated to use new (pending) client [(#2276)](GoogleCloudPlatform/python-docs-samples#2276) * AutoML Tables: Docs samples updated to use new (pending) client * Linter warnings * add product recommendation for automl tables notebook [(#2257)](GoogleCloudPlatform/python-docs-samples#2257) * added colab filtering notebook * update to tables client * update readme * tell user to restart kernel for automl * AutoML Tables: Notebook samples updated to use new tables client [(#2424)](GoogleCloudPlatform/python-docs-samples#2424) * fix users bug and emphasize kernal restart [(#2407)](GoogleCloudPlatform/python-docs-samples#2407) * fix problems with automl docs [(#2501)](GoogleCloudPlatform/python-docs-samples#2501) Today when we try to use the function `batch_predict` follow the docs we receive and error saying: `the paramaters should be a pandas.Dataframe` it’s happens because the first parameter of the function `batch_predict` is a pandas.Dataframe. To solve this problem we need to use de named parameters of python. * Fix typo in GCS URI parameter [(#2459)](GoogleCloudPlatform/python-docs-samples#2459) * fix: fix tables notebook links and bugs [(#2601)](GoogleCloudPlatform/python-docs-samples#2601) * feat(tables): update samples to show explainability [(#2523)](GoogleCloudPlatform/python-docs-samples#2523) * show xai * local feature importance * use updated client * use fixed library * use new model * Auto-update dependencies. [(#2005)](GoogleCloudPlatform/python-docs-samples#2005) * Auto-update dependencies. * Revert update of appengine/flexible/datastore. 
* revert update of appengine/flexible/scipy * revert update of bigquery/bqml * revert update of bigquery/cloud-client * revert update of bigquery/datalab-migration * revert update of bigtable/quickstart * revert update of compute/api * revert update of container_registry/container_analysis * revert update of dataflow/run_template * revert update of datastore/cloud-ndb * revert update of dialogflow/cloud-client * revert update of dlp * revert update of functions/imagemagick * revert update of functions/ocr/app * revert update of healthcare/api-client/fhir * revert update of iam/api-client * revert update of iot/api-client/gcs_file_to_device * revert update of iot/api-client/mqtt_example * revert update of language/automl * revert update of run/image-processing * revert update of vision/automl * revert update testing/requirements.txt * revert update of vision/cloud-client/detect * revert update of vision/cloud-client/product_search * revert update of jobs/v2/api_client * revert update of jobs/v3/api_client * revert update of opencensus * revert update of translate/cloud-client * revert update to speech/cloud-client Co-authored-by: Kurtis Van Gent <31518063+kurtisvg@users.noreply.github.com> Co-authored-by: Doug Mahugh <dmahugh@gmail.com> * Update dependency google-cloud-automl to v0.10.0 [(#3033)](GoogleCloudPlatform/python-docs-samples#3033) Co-authored-by: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com> Co-authored-by: Leah E. Cole <6719667+leahecole@users.noreply.github.com> * Simplify noxfile setup. [(#2806)](GoogleCloudPlatform/python-docs-samples#2806) * chore(deps): update dependency requests to v2.23.0 * Simplify noxfile and add version control. * Configure appengine/standard to only test Python 2.7. * Update Kokokro configs to match noxfile. * Add requirements-test to each folder. * Remove Py2 versions from everything execept appengine/standard. * Remove conftest.py. * Remove appengine/standard/conftest.py * Remove 'no-sucess-flaky-report' from pytest.ini. * Add GAE SDK back to appengine/standard tests. * Fix typo. * Roll pytest to python 2 version. * Add a bunch of testing requirements. * Remove typo. * Add appengine lib directory back in. * Add some additional requirements. * Fix issue with flake8 args. * Even more requirements. * Readd appengine conftest.py. * Add a few more requirements. * Even more Appengine requirements. * Add webtest for appengine/standard/mailgun. * Add some additional requirements. * Add workaround for issue with mailjet-rest. * Add responses for appengine/standard/mailjet. Co-authored-by: Renovate Bot <bot@renovateapp.com> * chore: some lint fixes [(#3750)](GoogleCloudPlatform/python-docs-samples#3750) * automl: tables code sample clean-up [(#3571)](GoogleCloudPlatform/python-docs-samples#3571) * delete unused tables_dataset samples * delete args code associated with unused automl_tables samples * delete tests associated with unused automl_tables samples * restore get_dataset method/yargs without region tagging * Restore update_dataset methodsa without region tagging Co-authored-by: Takashi Matsuo <tmatsuo@google.com> Co-authored-by: Leah E. Cole <6719667+leahecole@users.noreply.github.com> * add example of creating AutoML Tables client with non-default endpoint ('new' sdk) [(#3929)](GoogleCloudPlatform/python-docs-samples#3929) * add example of creating client with non-default endpoint * more test file cleanup * move connectivity print stmt out of test fn Co-authored-by: Leah E. 
Cole <6719667+leahecole@users.noreply.github.com> Co-authored-by: Torry Yang <sirtorry@users.noreply.github.com> * Replace GCLOUD_PROJECT with GOOGLE_CLOUD_PROJECT. [(#4022)](GoogleCloudPlatform/python-docs-samples#4022) * chore(deps): update dependency google-cloud-automl to v1 [(#4127)](GoogleCloudPlatform/python-docs-samples#4127) This PR contains the following updates: | Package | Update | Change | |---|---|---| | [google-cloud-automl](https://togithub.com/googleapis/python-automl) | major | `==0.10.0` -> `==1.0.1` | --- ### Release Notes <details> <summary>googleapis/python-automl</summary> ### [`v1.0.1`](https://togithub.com/googleapis/python-automl/blob/master/CHANGELOG.md#​101-httpswwwgithubcomgoogleapispython-automlcomparev100v101-2020-06-18) [Compare Source](https://togithub.com/googleapis/python-automl/compare/v0.10.0...v1.0.1) </details> --- ### Renovate configuration :date: **Schedule**: At any time (no schedule defined). :vertical_traffic_light: **Automerge**: Disabled by config. Please merge this manually once you are satisfied. :recycle: **Rebasing**: Never, or you tick the rebase/retry checkbox. :no_bell: **Ignore**: Close this PR and you won't be reminded about this update again. --- - [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check this box --- This PR has been generated by [WhiteSource Renovate](https://renovate.whitesourcesoftware.com). View repository job log [here](https://app.renovatebot.com/dashboard#GoogleCloudPlatform/python-docs-samples). * [tables/automl] fix: update the csv file and the dataset name [(#4188)](GoogleCloudPlatform/python-docs-samples#4188) fixes #4177 fixes #4178 * samples: Automl table batch test [(#4267)](GoogleCloudPlatform/python-docs-samples#4267) * added rtest req.txt * samples: added automl batch predict test * added missing package * Update tables/automl/batch_predict_test.py Co-authored-by: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com> Co-authored-by: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com> * samples: fixed wrong format on GCS input Uri [(#4270)](GoogleCloudPlatform/python-docs-samples#4270) ## Description Current predict sample indicates that it can multiples GCS URI inputs but it should be singular. ## Checklist - [X] Please **merge** this PR for me once it is approved. * chore(deps): update dependency pytest to v5.4.3 [(#4279)](GoogleCloudPlatform/python-docs-samples#4279) * chore(deps): update dependency pytest to v5.4.3 * specify pytest for python 2 in appengine Co-authored-by: Leah Cole <coleleah@google.com> * Update automl_tables_predict.py with batch_predict_bq sample [(#4142)](GoogleCloudPlatform/python-docs-samples#4142) Added a new method `batch_predict_bq` demonstrating running batch_prediction using BigQuery. Added notes in comments about asynchronicity for `batch_predict` method. The region `automl_tables_batch_predict_bq` will be used on cloud.google.com (currently both sections for GCS and BigQuery use the same sample code which is incorrect). Fixes #4141 Note: It's a good idea to open an issue first for discussion. - [x] Please **merge** this PR for me once it is approved. 
* Update dependency pytest to v6 [(#4390)](GoogleCloudPlatform/python-docs-samples#4390) * chore: exclude notebooks * chore: update templates * chore: add codeowners and fix tests * chore: ignore warnings from sphinx * chore: fix tables client * test: fix unit tests Co-authored-by: Torry Yang <sirtorry@users.noreply.github.com> Co-authored-by: florencep <florenceperot@google.com> Co-authored-by: Mike Burton <mb-github@niskala.org> Co-authored-by: Lars Wander <lwander@users.noreply.github.com> Co-authored-by: Michael Hu <Michael.an.hu@gmail.com> Co-authored-by: Michael Hu <michaelanhu@gmail.com> Co-authored-by: Alefh Sousa <alefh.sousa@gmail.com> Co-authored-by: DPEBot <dpebot@google.com> Co-authored-by: Kurtis Van Gent <31518063+kurtisvg@users.noreply.github.com> Co-authored-by: Doug Mahugh <dmahugh@gmail.com> Co-authored-by: WhiteSource Renovate <bot@renovateapp.com> Co-authored-by: Leah E. Cole <6719667+leahecole@users.noreply.github.com> Co-authored-by: Takashi Matsuo <tmatsuo@google.com> Co-authored-by: Anthony <wens.ajw@gmail.com> Co-authored-by: Amy <amy@infosleuth.net> Co-authored-by: Mike <45373284+munkhuushmgl@users.noreply.github.com> Co-authored-by: Leah Cole <coleleah@google.com> Co-authored-by: Sergei Dorogin <github@dorogin.com>
Torry, this is related to b/146659153. @sirtorry
I may not be using the right conventions for this sample code — LMK. (Also not sure I'm doing the right things w.r.t. the tests...)
Edit: also, can you share what linter to run? It looks like you all require 4 spaces, not 2, for one thing.
/assign @sirtorry