feat!: migrate to use microgen (#71)
* feat!: migrate to use microgen
* update
* update
* update
* update
1 parent f6eb330 · commit ce25e94
Showing 196 changed files with 48,837 additions and 47,388 deletions.
@@ -0,0 +1,168 @@
# 2.0.0 Migration Guide

The 2.0 release of the `google-cloud-dataproc` client is a significant upgrade based on a [next-gen code generator](https://github.com/googleapis/gapic-generator-python), and includes substantial interface changes. Existing code written for earlier versions of this library will likely require updates to use this version. This document describes the changes that have been made, and what you need to do to update your usage.

If you experience issues or have questions, please file an [issue](https://github.com/googleapis/python-dataproc/issues).

## Supported Python Versions

> **WARNING**: Breaking change
> The 2.0.0 release requires Python 3.6+.
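Before upgrading, you may want a quick runtime guard based on the requirement above. A minimal sketch:

```py
import sys

# Sketch: fail fast if the interpreter does not meet the 2.x requirement.
assert sys.version_info >= (3, 6), "google-cloud-dataproc 2.x requires Python 3.6+"
```
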
## Method Calls

> **WARNING**: Breaking change
> Methods expect request objects. We provide a script that will convert most common use cases.

* Install the library

```sh
python3 -m pip install google-cloud-dataproc
```

* The script `fixup_dataproc_v1_keywords.py` is shipped with the library. It expects an input directory (with the code to convert) and an empty destination directory.

```sh
$ fixup_dataproc_v1_keywords.py --input-directory .samples/ --output-directory samples/
```

**Before:**

```py
from google.cloud import dataproc

client = dataproc.ClusterControllerClient()

clusters = client.list_clusters(project_id="project_id", region="region")
```

**After:**

```py
from google.cloud import dataproc

client = dataproc.ClusterControllerClient()

clusters = client.list_clusters(request={
    "project_id": "project_id", "region": "region"
})
```

### More Details

In `google-cloud-dataproc<2.0.0`, parameters required by the API were positional parameters and optional parameters were keyword parameters.

**Before:**

```py
def get_cluster(
    self,
    project_id,
    region,
    cluster_name,
    retry=google.api_core.gapic_v1.method.DEFAULT,
    timeout=google.api_core.gapic_v1.method.DEFAULT,
    metadata=None,
):
```

In the 2.0.0 release, all methods have a single positional parameter `request`. Method docstrings indicate whether a parameter is required or optional.

Some methods have additional keyword-only parameters. The available parameters depend on the [`google.api.method_signature` annotation](https://github.com/googleapis/googleapis/blob/master/google/cloud/dataproc/v1/clusters.proto#L88) specified by the API producer.

**After:**

```py
def get_cluster(
    self,
    request: clusters.GetClusterRequest = None,
    *,
    project_id: str = None,
    region: str = None,
    cluster_name: str = None,
    retry: retries.Retry = gapic_v1.method.DEFAULT,
    timeout: float = None,
    metadata: Sequence[Tuple[str, str]] = (),
) -> clusters.Cluster:
```

> **NOTE:** The `request` parameter and flattened keyword parameters for the API are mutually exclusive.
> Passing both will result in an error.

Both of these calls are valid:

```py
response = client.get_cluster(
    request={
        "project_id": project_id,
        "region": region,
        "cluster_name": cluster_name
    }
)
```

```py
response = client.get_cluster(
    project_id=project_id,
    region=region,
    cluster_name=cluster_name
)
```
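
The request can also be built as a typed message rather than a plain dict. A minimal sketch, assuming `GetClusterRequest` is re-exported at the top-level package in the same way as the other types shown later in this guide:

```py
from google.cloud import dataproc

# Sketch: construct the request as a typed message instead of a dict.
# Assumes GetClusterRequest is exported at the package top level.
request = dataproc.GetClusterRequest(
    project_id=project_id,
    region=region,
    cluster_name=cluster_name,
)
response = client.get_cluster(request=request)
```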

This call is invalid because it mixes `request` with a keyword argument `cluster_name`. Executing this code will result in an error.

```py
response = client.get_cluster(
    request={
        "project_id": project_id,
        "region": region
    },
    cluster_name=cluster_name
)
```

## Enums and Types

> **WARNING**: Breaking change
> The submodules `enums` and `types` have been removed.

**Before:**

```py
from google.cloud import dataproc

status = dataproc.enums.ClusterStatus.State.CREATING
cluster = dataproc.types.Cluster(cluster_name="name")
```

**After:**

```py
from google.cloud import dataproc

status = dataproc.ClusterStatus.State.CREATING
cluster = dataproc.Cluster(cluster_name="name")
```
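
The same messages and enums are also importable from the versioned package, which is useful when you want to target a specific API version explicitly. A sketch, assuming the `google.cloud.dataproc_v1` module layout referenced by the docs added in this commit:

```py
from google.cloud import dataproc_v1

# Sketch: the versioned package exposes the same top-level names.
status = dataproc_v1.ClusterStatus.State.CREATING
cluster = dataproc_v1.Cluster(cluster_name="name")
```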

## Path Helper Methods

The following path helper methods have been removed. Please construct the paths manually.

```py
project = 'my-project'
location = 'project-location'
region = 'project-region'
workflow_template = 'template'
autoscaling_policy = 'policy'

location_path = f'projects/{project}/locations/{location}'
region_path = f'projects/{project}/regions/{region}'
workflow_template_path = f'projects/{project}/regions/{region}/workflowTemplates/{workflow_template}'
autoscaling_policy_path = f'projects/{project}/locations/{location}/autoscalingPolicies/{autoscaling_policy}'
```
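
If you prefer to keep call sites close to the old helper style, small local functions can stand in for the removed helpers. This is only a sketch; the function names below are illustrative and not part of the library API:

```py
# Illustrative stand-ins for the removed helpers; not part of the library API.
def location_path(project: str, location: str) -> str:
    return f"projects/{project}/locations/{location}"


def workflow_template_path(project: str, region: str, workflow_template: str) -> str:
    return f"projects/{project}/regions/{region}/workflowTemplates/{workflow_template}"


# Example usage with the placeholder values above.
path = workflow_template_path("my-project", "project-region", "template")
```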
@@ -0,0 +1 @@
../UPGRADING.md
15 changes: 15 additions & 0 deletions
packages/google-cloud-dataproc/docs/dataproc_v1/services.rst
@@ -0,0 +1,15 @@
Services for Google Cloud Dataproc v1 API
=========================================

.. automodule:: google.cloud.dataproc_v1.services.autoscaling_policy_service
    :members:
    :inherited-members:
.. automodule:: google.cloud.dataproc_v1.services.cluster_controller
    :members:
    :inherited-members:
.. automodule:: google.cloud.dataproc_v1.services.job_controller
    :members:
    :inherited-members:
.. automodule:: google.cloud.dataproc_v1.services.workflow_template_service
    :members:
    :inherited-members:
@@ -0,0 +1,5 @@
Types for Google Cloud Dataproc v1 API
======================================

.. automodule:: google.cloud.dataproc_v1.types
    :members:
15 changes: 15 additions & 0 deletions
packages/google-cloud-dataproc/docs/dataproc_v1beta2/services.rst
@@ -0,0 +1,15 @@
Services for Google Cloud Dataproc v1beta2 API
==============================================

.. automodule:: google.cloud.dataproc_v1beta2.services.autoscaling_policy_service
    :members:
    :inherited-members:
.. automodule:: google.cloud.dataproc_v1beta2.services.cluster_controller
    :members:
    :inherited-members:
.. automodule:: google.cloud.dataproc_v1beta2.services.job_controller
    :members:
    :inherited-members:
.. automodule:: google.cloud.dataproc_v1beta2.services.workflow_template_service
    :members:
    :inherited-members:
5 changes: 5 additions & 0 deletions
packages/google-cloud-dataproc/docs/dataproc_v1beta2/types.rst
@@ -0,0 +1,5 @@
Types for Google Cloud Dataproc v1beta2 API
===========================================

.. automodule:: google.cloud.dataproc_v1beta2.types
    :members: