diff --git a/gcloud/datastore/_generated/__init__.py b/gcloud/datastore/_generated/__init__.py
new file mode 100644
index 000000000000..19a0f26e68de
--- /dev/null
+++ b/gcloud/datastore/_generated/__init__.py
@@ -0,0 +1,15 @@
+# Copyright 2015 Google Inc. All rights reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Generated protobuf modules for Google Cloud Datastore API."""
diff --git a/gcloud/datastore/_datastore_pb2.py b/gcloud/datastore/_generated/datastore_pb2.py
similarity index 100%
rename from gcloud/datastore/_datastore_pb2.py
rename to gcloud/datastore/_generated/datastore_pb2.py
diff --git a/gcloud/datastore/_entity_pb2.py b/gcloud/datastore/_generated/entity_pb2.py
similarity index 100%
rename from gcloud/datastore/_entity_pb2.py
rename to gcloud/datastore/_generated/entity_pb2.py
diff --git a/gcloud/datastore/_query_pb2.py b/gcloud/datastore/_generated/query_pb2.py
similarity index 100%
rename from gcloud/datastore/_query_pb2.py
rename to gcloud/datastore/_generated/query_pb2.py
diff --git a/gcloud/datastore/batch.py b/gcloud/datastore/batch.py
index f29cef8a566d..d6dbd98ded54 100644
--- a/gcloud/datastore/batch.py
+++ b/gcloud/datastore/batch.py
@@ -23,7 +23,7 @@
 from gcloud.datastore import helpers
 from gcloud.datastore.key import _dataset_ids_equal
-from gcloud.datastore import _datastore_pb2
+from gcloud.datastore._generated import datastore_pb2 as _datastore_pb2


 class Batch(object):
@@ -101,7 +101,7 @@ def connection(self):
     def _add_partial_key_entity_pb(self):
         """Adds a new mutation for an entity with a partial key.

-        :rtype: :class:`gcloud.datastore._entity_pb2.Entity`
+        :rtype: :class:`gcloud.datastore._generated.entity_pb2.Entity`
         :returns: The newly created entity protobuf that will be
                   updated and sent with a commit.
         """
@@ -110,7 +110,7 @@ def _add_partial_key_entity_pb(self):
     def _add_complete_key_entity_pb(self):
         """Adds a new mutation for an entity with a completed key.

-        :rtype: :class:`gcloud.datastore._entity_pb2.Entity`
+        :rtype: :class:`gcloud.datastore._generated.entity_pb2.Entity`
         :returns: The newly created entity protobuf that will be
                   updated and sent with a commit.
         """
@@ -119,7 +119,7 @@ def _add_complete_key_entity_pb(self):
     def _add_delete_key_pb(self):
         """Adds a new mutation for a key to be deleted.

-        :rtype: :class:`gcloud.datastore._entity_pb2.Key`
+        :rtype: :class:`gcloud.datastore._generated.entity_pb2.Key`
         :returns: The newly created key protobuf that will be
                   deleted when sent with a commit.
         """
@@ -136,7 +136,7 @@ def mutations(self):
         This getter returns the Mutation protobuf that
         has been built-up so far.

-        :rtype: :class:`gcloud.datastore._datastore_pb2.Mutation`
+        :rtype: :class:`gcloud.datastore._generated.datastore_pb2.Mutation`
         :returns: The Mutation protobuf to be sent in the commit request.
         """
         return self._mutation
@@ -249,7 +249,7 @@ def _assign_entity_to_pb(entity_pb, entity):

     Helper method for ``Batch.put``.

-    :type entity_pb: :class:`gcloud.datastore._entity_pb2.Entity`
+    :type entity_pb: :class:`gcloud.datastore._generated.entity_pb2.Entity`
     :param entity_pb: The entity owned by a mutation.

     :type entity: :class:`gcloud.datastore.entity.Entity`
diff --git a/gcloud/datastore/client.py b/gcloud/datastore/client.py
index 0767a54c33a6..e4a2ee3cb276 100644
--- a/gcloud/datastore/client.py
+++ b/gcloud/datastore/client.py
@@ -90,7 +90,7 @@ def _extended_lookup(connection, dataset_id, key_pbs,
     :type dataset_id: string
     :param dataset_id: The ID of the dataset of which to make the request.

-    :type key_pbs: list of :class:`gcloud.datastore._entity_pb2.Key`
+    :type key_pbs: list of :class:`gcloud.datastore._generated.entity_pb2.Key`
     :param key_pbs: The keys to retrieve from the datastore.

    :type missing: an empty list or None.
@@ -113,7 +113,7 @@ def _extended_lookup(connection, dataset_id, key_pbs,
                         the given transaction.  Incompatible with
                         ``eventual==True``.

-    :rtype: list of :class:`gcloud.datastore._entity_pb2.Entity`
+    :rtype: list of :class:`gcloud.datastore._generated.entity_pb2.Entity`
    :returns: The requested entities.
    :raises: :class:`ValueError` if missing / deferred are not null or
             empty list.
diff --git a/gcloud/datastore/connection.py b/gcloud/datastore/connection.py
index 74657a13c054..6b48f74e1d30 100644
--- a/gcloud/datastore/connection.py
+++ b/gcloud/datastore/connection.py
@@ -19,8 +19,8 @@
 from gcloud import connection
 from gcloud.environment_vars import GCD_HOST
 from gcloud.exceptions import make_exception
-from gcloud.datastore import _datastore_pb2
-from gcloud.datastore import _entity_pb2
+from gcloud.datastore._generated import datastore_pb2 as _datastore_pb2
+from gcloud.datastore._generated import entity_pb2 as _entity_pb2


 class Connection(connection.Connection):
@@ -151,8 +151,8 @@ def lookup(self, dataset_id, key_pbs,
         Maps the ``DatastoreService.Lookup`` protobuf RPC.

         This method deals only with protobufs
-        (:class:`gcloud.datastore._entity_pb2.Key` and
-        :class:`gcloud.datastore._entity_pb2.Entity`) and is used
+        (:class:`gcloud.datastore._generated.entity_pb2.Key` and
+        :class:`gcloud.datastore._generated.entity_pb2.Entity`) and is used
         under the hood in :func:`gcloud.datastore.get`:

         >>> from gcloud import datastore
@@ -168,7 +168,8 @@ def lookup(self, dataset_id, key_pbs,
         :type dataset_id: string
         :param dataset_id: The ID of the dataset to look up the keys.

-        :type key_pbs: list of :class:`gcloud.datastore._entity_pb2.Key`
+        :type key_pbs: list of
+                       :class:`gcloud.datastore._generated.entity_pb2.Key`
         :param key_pbs: The keys to retrieve from the datastore.

         :type eventual: boolean
@@ -184,9 +185,9 @@ def lookup(self, dataset_id, key_pbs,
         :rtype: tuple
         :returns: A triple of (``results``, ``missing``, ``deferred``) where
                   both ``results`` and ``missing`` are lists of
-                  :class:`gcloud.datastore._entity_pb2.Entity` and
+                  :class:`gcloud.datastore._generated.entity_pb2.Entity` and
                   ``deferred`` is a list of
-                  :class:`gcloud.datastore._entity_pb2.Key`.
+                  :class:`gcloud.datastore._generated.entity_pb2.Key`.
         """
         lookup_request = _datastore_pb2.LookupRequest()
         _set_read_options(lookup_request, eventual, transaction_id)
@@ -239,7 +240,7 @@ def run_query(self, dataset_id, query_pb, namespace=None,
         :type dataset_id: string
         :param dataset_id: The ID of the dataset over which to run the query.

-        :type query_pb: :class:`gcloud.datastore._query_pb2.Query`
+        :type query_pb: :class:`gcloud.datastore._generated.query_pb2.Query`
         :param query_pb: The Protobuf representing the query to run.

         :type namespace: string
@@ -279,7 +280,7 @@ def begin_transaction(self, dataset_id):
         :type dataset_id: string
         :param dataset_id: The ID dataset to which the transaction applies.

-        :rtype: :class:`._datastore_pb2.BeginTransactionResponse`
+        :rtype: :class:`._generated.datastore_pb2.BeginTransactionResponse`
         :returns: the result protobuf for the begin transaction request.
         """
         request = _datastore_pb2.BeginTransactionRequest()
@@ -297,7 +298,7 @@ def commit(self, dataset_id, mutation_pb, transaction_id):
         :type dataset_id: string
         :param dataset_id: The ID dataset to which the transaction applies.

-        :type mutation_pb: :class:`._datastore_pb2.Mutation`
+        :type mutation_pb: :class:`._generated.datastore_pb2.Mutation`
         :param mutation_pb: The protobuf for the mutations being saved.

         :type transaction_id: string or None
@@ -307,8 +308,8 @@ def commit(self, dataset_id, mutation_pb, transaction_id):
         :rtype: tuple
         :returns: The pair of the number of index updates and a list of
-                  :class:`._entity_pb2.Key` for each incomplete key that was
-                  completed in the commit.
+                  :class:`._generated.entity_pb2.Key` for each incomplete key
+                  that was completed in the commit.
         """
         request = _datastore_pb2.CommitRequest()
@@ -351,10 +352,11 @@ def allocate_ids(self, dataset_id, key_pbs):
         :param dataset_id: The ID of the dataset to which the transaction
                            belongs.

-        :type key_pbs: list of :class:`gcloud.datastore._entity_pb2.Key`
+        :type key_pbs: list of
+                       :class:`gcloud.datastore._generated.entity_pb2.Key`
         :param key_pbs: The keys for which the backend should allocate IDs.

-        :rtype: list of :class:`gcloud.datastore._entity_pb2.Key`
+        :rtype: list of :class:`gcloud.datastore._generated.entity_pb2.Key`
         :returns: An equal number of keys, with IDs filled in by the backend.
         """
         request = _datastore_pb2.AllocateIdsRequest()
@@ -390,10 +392,10 @@ def _prepare_key_for_request(key_pb):  # pragma: NO COVER copied from helpers
     This is copied from `helpers` to avoid a cycle:
     _implicit_environ -> connection -> helpers -> key -> _implicit_environ

-    :type key_pb: :class:`gcloud.datastore._entity_pb2.Key`
+    :type key_pb: :class:`gcloud.datastore._generated.entity_pb2.Key`
     :param key_pb: A key to be added to a request.

-    :rtype: :class:`gcloud.datastore._entity_pb2.Key`
+    :rtype: :class:`gcloud.datastore._generated.entity_pb2.Key`
     :returns: A key which will be added to a request. It will be the
               original if nothing needs to be changed.
     """
@@ -411,7 +413,7 @@ def _add_keys_to_request(request_field_pb, key_pbs):
     :type request_field_pb: `RepeatedCompositeFieldContainer`
     :param request_field_pb: A repeated proto field that contains keys.

-    :type key_pbs: list of :class:`gcloud.datastore._entity_pb2.Key`
+    :type key_pbs: list of :class:`gcloud.datastore._generated.entity_pb2.Key`
     :param key_pbs: The keys to add to a request.
     """
     for key_pb in key_pbs:
@@ -422,13 +424,13 @@ def _add_keys_to_request(request_field_pb, key_pbs):
 def _parse_commit_response(commit_response_pb):
     """Extract response data from a commit response.

-    :type commit_response_pb: :class:`._datastore_pb2.CommitResponse`
+    :type commit_response_pb: :class:`._generated.datastore_pb2.CommitResponse`
     :param commit_response_pb: The protobuf response from a commit request.

     :rtype: tuple
     :returns: The pair of the number of index updates and a list of
-              :class:`._entity_pb2.Key` for each incomplete key that was
-              completed in the commit.
+              :class:`._generated.entity_pb2.Key` for each incomplete key
+              that was completed in the commit.
     """
     mut_result = commit_response_pb.mutation_result
     index_updates = mut_result.index_updates
diff --git a/gcloud/datastore/helpers.py b/gcloud/datastore/helpers.py
index 383207a6ebfd..b7488e1b3d3b 100644
--- a/gcloud/datastore/helpers.py
+++ b/gcloud/datastore/helpers.py
@@ -24,7 +24,7 @@
 from gcloud._helpers import _datetime_from_microseconds
 from gcloud._helpers import _microseconds_from_datetime
-from gcloud.datastore import _entity_pb2
+from gcloud.datastore._generated import entity_pb2 as _entity_pb2
 from gcloud.datastore.entity import Entity
 from gcloud.datastore.key import Key
@@ -76,7 +76,7 @@ def find_true_dataset_id(dataset_id, connection):
 def _get_meaning(value_pb, is_list=False):
     """Get the meaning from a protobuf value.

-    :type value_pb: :class:`gcloud.datastore._entity_pb2.Value`
+    :type value_pb: :class:`gcloud.datastore._generated.entity_pb2.Value`
     :param value_pb: The protobuf value to be checked for an
                      associated meaning.
@@ -121,7 +121,7 @@ def entity_from_protobuf(pb):
     The protobuf should be one returned from the Cloud Datastore
     Protobuf API.

-    :type pb: :class:`gcloud.datastore._entity_pb2.Entity`
+    :type pb: :class:`gcloud.datastore._generated.entity_pb2.Entity`
     :param pb: The Protobuf representing the entity.

     :rtype: :class:`gcloud.datastore.entity.Entity`
@@ -173,7 +173,7 @@ def entity_to_protobuf(entity):
     :type entity: :class:`gcloud.datastore.entity.Entity`
     :param entity: The entity to be turned into a protobuf.

-    :rtype: :class:`gcloud.datastore._entity_pb2.Entity`
+    :rtype: :class:`gcloud.datastore._generated.entity_pb2.Entity`
     :returns: The protobuf representing the entity.
     """
     entity_pb = _entity_pb2.Entity()
@@ -223,7 +223,7 @@ def key_from_protobuf(pb):
     The protobuf should be one returned from the Cloud Datastore
     Protobuf API.

-    :type pb: :class:`gcloud.datastore._entity_pb2.Key`
+    :type pb: :class:`gcloud.datastore._generated.entity_pb2.Key`
     :param pb: The Protobuf representing the key.

     :rtype: :class:`gcloud.datastore.key.Key`
@@ -317,7 +317,7 @@ def _get_value_from_value_pb(value_pb):
     Some work is done to coerce the return value into a more useful type
     (particularly in the case of a timestamp value, or a key value).

-    :type value_pb: :class:`gcloud.datastore._entity_pb2.Value`
+    :type value_pb: :class:`gcloud.datastore._generated.entity_pb2.Value`
     :param value_pb: The Value Protobuf.

     :returns: The value provided by the Protobuf.
@@ -364,7 +364,7 @@ def _set_protobuf_value(value_pb, val):
     Some value types (entities, keys, lists) cannot be directly
     assigned; this function handles them correctly.

-    :type value_pb: :class:`gcloud.datastore._entity_pb2.Value`
+    :type value_pb: :class:`gcloud.datastore._generated.entity_pb2.Value`
     :param value_pb: The value protobuf to which the value is being assigned.

     :type val: :class:`datetime.datetime`, boolean, float, integer, string,
@@ -394,10 +394,10 @@ def _set_protobuf_value(value_pb, val):
 def _prepare_key_for_request(key_pb):
     """Add protobuf keys to a request object.

-    :type key_pb: :class:`gcloud.datastore._entity_pb2.Key`
+    :type key_pb: :class:`gcloud.datastore._generated.entity_pb2.Key`
     :param key_pb: A key to be added to a request.

-    :rtype: :class:`gcloud.datastore._entity_pb2.Key`
+    :rtype: :class:`gcloud.datastore._generated.entity_pb2.Key`
     :returns: A key which will be added to a request. It will be the
               original if nothing needs to be changed.
     """
diff --git a/gcloud/datastore/key.py b/gcloud/datastore/key.py
index abeff12385fe..48b33fb5475d 100644
--- a/gcloud/datastore/key.py
+++ b/gcloud/datastore/key.py
@@ -17,7 +17,7 @@
 import copy
 import six

-from gcloud.datastore import _entity_pb2
+from gcloud.datastore._generated import entity_pb2 as _entity_pb2


 class Key(object):
@@ -235,7 +235,7 @@ def completed_key(self, id_or_name):
     def to_protobuf(self):
         """Return a protobuf corresponding to the key.

-        :rtype: :class:`gcloud.datastore._entity_pb2.Key`
+        :rtype: :class:`gcloud.datastore._generated.entity_pb2.Key`
         :returns: The protobuf representing the key.
         """
         key = _entity_pb2.Key()
diff --git a/gcloud/datastore/query.py b/gcloud/datastore/query.py
index 93cf5a9db225..efa0075b1a64 100644
--- a/gcloud/datastore/query.py
+++ b/gcloud/datastore/query.py
@@ -17,7 +17,7 @@
 import base64

 from gcloud._helpers import _ensure_tuple_or_list
-from gcloud.datastore import _query_pb2
+from gcloud.datastore._generated import query_pb2 as _query_pb2
 from gcloud.datastore import helpers
 from gcloud.datastore.key import Key
@@ -456,7 +456,7 @@ def _pb_from_query(query):
     :type query: :class:`Query`
     :param query: The source query.

-    :rtype: :class:`gcloud.datastore._query_pb2.Query`
+    :rtype: :class:`gcloud.datastore._generated.query_pb2.Query`
     :returns: A protobuf that can be sent to the protobuf API.  N.b. that
               it does not contain "in-flight" fields for ongoing query
               executions (cursors, offset, limit).
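Every library module in this patch follows the same mechanical pattern: the generated module moves under `gcloud.datastore._generated`, and existing call sites keep their old private names (`_datastore_pb2`, `_entity_pb2`, `_query_pb2`) through an import alias, so only import lines and docstrings change. A minimal runnable sketch of that aliasing idea, using a stdlib module as a stand-in for the relocated protobuf module (the names below are illustrative, not the real gcloud packages):

```python
# Sketch of the backward-compatible import pattern applied throughout
# this patch:
#
#   before:  from gcloud.datastore import _datastore_pb2
#   after:   from gcloud.datastore._generated import datastore_pb2 \
#                as _datastore_pb2
#
# Here json stands in for the relocated module; the point is that the
# local alias (and therefore every call site) stays unchanged.
import json as _json_pb2  # old-style private alias preserved


def roundtrip(obj):
    """Call sites keep using the old alias; only the import line moved."""
    return _json_pb2.loads(_json_pb2.dumps(obj))


print(roundtrip({"kind": "Task", "done": False}))
```

Keeping the old local name is what lets the non-test diffs above touch nothing but imports and docstring references while leaving runtime behavior identical; only the test files rebind to the new public-style names directly.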
diff --git a/gcloud/datastore/test_batch.py b/gcloud/datastore/test_batch.py
index 8e542469ee10..1261eacb7269 100644
--- a/gcloud/datastore/test_batch.py
+++ b/gcloud/datastore/test_batch.py
@@ -26,7 +26,7 @@ def _makeOne(self, client):
         return self._getTargetClass()(client)

     def test_ctor(self):
-        from gcloud.datastore._datastore_pb2 import Mutation
+        from gcloud.datastore._generated import datastore_pb2
         _DATASET = 'DATASET'
         _NAMESPACE = 'NAMESPACE'
         connection = _Connection()
@@ -37,7 +37,7 @@ def test_ctor(self):
         self.assertEqual(batch.connection, connection)
         self.assertEqual(batch.namespace, _NAMESPACE)
         self.assertTrue(batch._id is None)
-        self.assertTrue(isinstance(batch.mutations, Mutation))
+        self.assertTrue(isinstance(batch.mutations, datastore_pb2.Mutation))
         self.assertEqual(batch._partial_key_entities, [])

     def test_current(self):
@@ -350,8 +350,8 @@ def is_partial(self):
         return self._id is None

     def to_protobuf(self):
-        from gcloud.datastore import _entity_pb2
-        key = self._key = _entity_pb2.Key()
+        from gcloud.datastore._generated import entity_pb2
+        key = self._key = entity_pb2.Key()
         # Don't assign it, because it will just get ripped out
         # key.partition_id.dataset_id = self.dataset_id
diff --git a/gcloud/datastore/test_client.py b/gcloud/datastore/test_client.py
index 58b92bb3a7a3..c7f540d7052f 100644
--- a/gcloud/datastore/test_client.py
+++ b/gcloud/datastore/test_client.py
@@ -16,9 +16,9 @@

 def _make_entity_pb(dataset_id, kind, integer_id, name=None, str_val=None):
-    from gcloud.datastore import _entity_pb2
+    from gcloud.datastore._generated import entity_pb2

-    entity_pb = _entity_pb2.Entity()
+    entity_pb = entity_pb2.Entity()
     entity_pb.key.partition_id.dataset_id = dataset_id
     path_element = entity_pb.key.path_element.add()
     path_element.kind = kind
@@ -314,14 +314,14 @@ def test_get_multi_miss(self):
         self.assertEqual(results, [])

     def test_get_multi_miss_w_missing(self):
-        from gcloud.datastore import _entity_pb2
+        from gcloud.datastore._generated import entity_pb2
         from gcloud.datastore.key import Key

         KIND = 'Kind'
         ID = 1234

         # Make a missing entity pb to be returned from mock backend.
-        missed = _entity_pb2.Entity()
+        missed = entity_pb2.Entity()
         missed.key.partition_id.dataset_id = self.DATASET_ID
         path_element = missed.key.path_element.add()
         path_element.kind = KIND
@@ -378,7 +378,7 @@ def test_get_multi_miss_w_deferred(self):
                          [key.to_protobuf()])

     def test_get_multi_w_deferred_from_backend_but_not_passed(self):
-        from gcloud.datastore import _entity_pb2
+        from gcloud.datastore._generated import entity_pb2
         from gcloud.datastore.entity import Entity
         from gcloud.datastore.key import Key
@@ -387,9 +387,9 @@ def test_get_multi_w_deferred_from_backend_but_not_passed(self):
         key2 = Key('Kind', 2345, dataset_id=self.DATASET_ID)
         key2_pb = key2.to_protobuf()

-        entity1_pb = _entity_pb2.Entity()
+        entity1_pb = entity_pb2.Entity()
         entity1_pb.key.CopyFrom(key1_pb)
-        entity2_pb = _entity_pb2.Entity()
+        entity2_pb = entity_pb2.Entity()
         entity2_pb.key.CopyFrom(key2_pb)

         creds = object()
diff --git a/gcloud/datastore/test_connection.py b/gcloud/datastore/test_connection.py
index 730e02231f5e..2f49f4bde06d 100644
--- a/gcloud/datastore/test_connection.py
+++ b/gcloud/datastore/test_connection.py
@@ -30,8 +30,8 @@ def _make_key_pb(self, dataset_id, id=1234):
         return Key(*path_args, dataset_id=dataset_id).to_protobuf()

     def _make_query_pb(self, kind):
-        from gcloud.datastore import _query_pb2
-        pb = _query_pb2.Query()
+        from gcloud.datastore._generated import query_pb2
+        pb = query_pb2.Query()
         pb.kind.add().name = kind
         return pb
@@ -234,11 +234,11 @@ def test_build_api_url_w_explicit_base_version(self):
             URI)

     def test_lookup_single_key_empty_response(self):
-        from gcloud.datastore import _datastore_pb2
+        from gcloud.datastore._generated import datastore_pb2

         DATASET_ID = 'DATASET'
         key_pb = self._make_key_pb(DATASET_ID)
-        rsp_pb = _datastore_pb2.LookupResponse()
+        rsp_pb = datastore_pb2.LookupResponse()
         conn = self._makeOne()
         URI = '/'.join([
             conn.api_base_url,
@@ -255,7 +255,7 @@ def test_lookup_single_key_empty_response(self):
         self.assertEqual(len(deferred), 0)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.LookupRequest
+        rq_class = datastore_pb2.LookupRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         keys = list(request.key)
@@ -263,11 +263,11 @@ def test_lookup_single_key_empty_response(self):
         _compare_key_pb_after_request(self, key_pb, keys[0])

     def test_lookup_single_key_empty_response_w_eventual(self):
-        from gcloud.datastore import _datastore_pb2
+        from gcloud.datastore._generated import datastore_pb2

         DATASET_ID = 'DATASET'
         key_pb = self._make_key_pb(DATASET_ID)
-        rsp_pb = _datastore_pb2.LookupResponse()
+        rsp_pb = datastore_pb2.LookupResponse()
         conn = self._makeOne()
         URI = '/'.join([
             conn.api_base_url,
@@ -285,14 +285,14 @@ def test_lookup_single_key_empty_response_w_eventual(self):
         self.assertEqual(len(deferred), 0)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.LookupRequest
+        rq_class = datastore_pb2.LookupRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         keys = list(request.key)
         self.assertEqual(len(keys), 1)
         _compare_key_pb_after_request(self, key_pb, keys[0])
         self.assertEqual(request.read_options.read_consistency,
-                         _datastore_pb2.ReadOptions.EVENTUAL)
+                         datastore_pb2.ReadOptions.EVENTUAL)
         self.assertEqual(request.read_options.transaction, b'')

     def test_lookup_single_key_empty_response_w_eventual_and_transaction(self):
@@ -304,12 +304,12 @@ def test_lookup_single_key_empty_response_w_eventual_and_transaction(self):
                           eventual=True, transaction_id=TRANSACTION)

     def test_lookup_single_key_empty_response_w_transaction(self):
-        from gcloud.datastore import _datastore_pb2
+        from gcloud.datastore._generated import datastore_pb2

         DATASET_ID = 'DATASET'
         TRANSACTION = b'TRANSACTION'
         key_pb = self._make_key_pb(DATASET_ID)
-        rsp_pb = _datastore_pb2.LookupResponse()
+        rsp_pb = datastore_pb2.LookupResponse()
         conn = self._makeOne()
         URI = '/'.join([
             conn.api_base_url,
@@ -327,7 +327,7 @@ def test_lookup_single_key_empty_response_w_transaction(self):
         self.assertEqual(len(deferred), 0)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.LookupRequest
+        rq_class = datastore_pb2.LookupRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         keys = list(request.key)
@@ -336,13 +336,13 @@ def test_lookup_single_key_empty_response_w_transaction(self):
         self.assertEqual(request.read_options.transaction, TRANSACTION)

     def test_lookup_single_key_nonempty_response(self):
-        from gcloud.datastore import _datastore_pb2
-        from gcloud.datastore import _entity_pb2
+        from gcloud.datastore._generated import datastore_pb2
+        from gcloud.datastore._generated import entity_pb2

         DATASET_ID = 'DATASET'
         key_pb = self._make_key_pb(DATASET_ID)
-        rsp_pb = _datastore_pb2.LookupResponse()
-        entity = _entity_pb2.Entity()
+        rsp_pb = datastore_pb2.LookupResponse()
+        entity = entity_pb2.Entity()
         entity.key.CopyFrom(key_pb)
         rsp_pb.found.add(entity=entity)
         conn = self._makeOne()
@@ -362,7 +362,7 @@ def test_lookup_single_key_nonempty_response(self):
         self.assertEqual(found.key.path_element[0].id, 1234)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.LookupRequest
+        rq_class = datastore_pb2.LookupRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         keys = list(request.key)
@@ -370,12 +370,12 @@ def test_lookup_single_key_nonempty_response(self):
         _compare_key_pb_after_request(self, key_pb, keys[0])

     def test_lookup_multiple_keys_empty_response(self):
-        from gcloud.datastore import _datastore_pb2
+        from gcloud.datastore._generated import datastore_pb2

         DATASET_ID = 'DATASET'
         key_pb1 = self._make_key_pb(DATASET_ID)
         key_pb2 = self._make_key_pb(DATASET_ID, id=2345)
-        rsp_pb = _datastore_pb2.LookupResponse()
+        rsp_pb = datastore_pb2.LookupResponse()
         conn = self._makeOne()
         URI = '/'.join([
             conn.api_base_url,
@@ -392,7 +392,7 @@ def test_lookup_multiple_keys_empty_response(self):
         self.assertEqual(len(deferred), 0)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.LookupRequest
+        rq_class = datastore_pb2.LookupRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         keys = list(request.key)
@@ -401,12 +401,12 @@ def test_lookup_multiple_keys_empty_response(self):
         _compare_key_pb_after_request(self, key_pb2, keys[1])

     def test_lookup_multiple_keys_w_missing(self):
-        from gcloud.datastore import _datastore_pb2
+        from gcloud.datastore._generated import datastore_pb2

         DATASET_ID = 'DATASET'
         key_pb1 = self._make_key_pb(DATASET_ID)
         key_pb2 = self._make_key_pb(DATASET_ID, id=2345)
-        rsp_pb = _datastore_pb2.LookupResponse()
+        rsp_pb = datastore_pb2.LookupResponse()
         er_1 = rsp_pb.missing.add()
         er_1.entity.key.CopyFrom(key_pb1)
         er_2 = rsp_pb.missing.add()
@@ -428,7 +428,7 @@ def test_lookup_multiple_keys_w_missing(self):
                          [key_pb1, key_pb2])
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.LookupRequest
+        rq_class = datastore_pb2.LookupRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         keys = list(request.key)
@@ -437,12 +437,12 @@ def test_lookup_multiple_keys_w_missing(self):
         _compare_key_pb_after_request(self, key_pb2, keys[1])

     def test_lookup_multiple_keys_w_deferred(self):
-        from gcloud.datastore import _datastore_pb2
+        from gcloud.datastore._generated import datastore_pb2

         DATASET_ID = 'DATASET'
         key_pb1 = self._make_key_pb(DATASET_ID)
         key_pb2 = self._make_key_pb(DATASET_ID, id=2345)
-        rsp_pb = _datastore_pb2.LookupResponse()
+        rsp_pb = datastore_pb2.LookupResponse()
         rsp_pb.deferred.add().CopyFrom(key_pb1)
         rsp_pb.deferred.add().CopyFrom(key_pb2)
         conn = self._makeOne()
@@ -466,7 +466,7 @@ def test_lookup_multiple_keys_w_deferred(self):
         self.assertEqual(cw['headers']['Content-Type'],
                          'application/x-protobuf')
         self.assertEqual(cw['headers']['User-Agent'], conn.USER_AGENT)
-        rq_class = _datastore_pb2.LookupRequest
+        rq_class = datastore_pb2.LookupRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         keys = list(request.key)
@@ -475,18 +475,18 @@ def test_lookup_multiple_keys_w_deferred(self):
         _compare_key_pb_after_request(self, key_pb2, keys[1])

     def test_run_query_w_eventual_no_transaction(self):
-        from gcloud.datastore import _datastore_pb2
-        from gcloud.datastore import _query_pb2
+        from gcloud.datastore._generated import datastore_pb2
+        from gcloud.datastore._generated import query_pb2

         DATASET_ID = 'DATASET'
         KIND = 'Nonesuch'
         CURSOR = b'\x00'
         q_pb = self._make_query_pb(KIND)
-        rsp_pb = _datastore_pb2.RunQueryResponse()
+        rsp_pb = datastore_pb2.RunQueryResponse()
         rsp_pb.batch.end_cursor = CURSOR
-        no_more = _query_pb2.QueryResultBatch.NO_MORE_RESULTS
+        no_more = query_pb2.QueryResultBatch.NO_MORE_RESULTS
         rsp_pb.batch.more_results = no_more
-        rsp_pb.batch.entity_result_type = _query_pb2.EntityResult.FULL
+        rsp_pb.batch.entity_result_type = query_pb2.EntityResult.FULL
         conn = self._makeOne()
         URI = '/'.join([
             conn.api_base_url,
@@ -505,29 +505,29 @@ def test_run_query_w_eventual_no_transaction(self):
         self.assertEqual(skipped, 0)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.RunQueryRequest
+        rq_class = datastore_pb2.RunQueryRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         self.assertEqual(request.partition_id.namespace, '')
         self.assertEqual(request.query, q_pb)
         self.assertEqual(request.read_options.read_consistency,
-                         _datastore_pb2.ReadOptions.EVENTUAL)
+                         datastore_pb2.ReadOptions.EVENTUAL)
         self.assertEqual(request.read_options.transaction, b'')

     def test_run_query_wo_eventual_w_transaction(self):
-        from gcloud.datastore import _datastore_pb2
-        from gcloud.datastore import _query_pb2
+        from gcloud.datastore._generated import datastore_pb2
+        from gcloud.datastore._generated import query_pb2

         DATASET_ID = 'DATASET'
         KIND = 'Nonesuch'
         CURSOR = b'\x00'
         TRANSACTION = b'TRANSACTION'
         q_pb = self._make_query_pb(KIND)
-        rsp_pb = _datastore_pb2.RunQueryResponse()
+        rsp_pb = datastore_pb2.RunQueryResponse()
         rsp_pb.batch.end_cursor = CURSOR
-        no_more = _query_pb2.QueryResultBatch.NO_MORE_RESULTS
+        no_more = query_pb2.QueryResultBatch.NO_MORE_RESULTS
         rsp_pb.batch.more_results = no_more
-        rsp_pb.batch.entity_result_type = _query_pb2.EntityResult.FULL
+        rsp_pb.batch.entity_result_type = query_pb2.EntityResult.FULL
         conn = self._makeOne()
         URI = '/'.join([
             conn.api_base_url,
@@ -546,46 +546,46 @@ def test_run_query_wo_eventual_w_transaction(self):
         self.assertEqual(skipped, 0)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.RunQueryRequest
+        rq_class = datastore_pb2.RunQueryRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         self.assertEqual(request.partition_id.namespace, '')
         self.assertEqual(request.query, q_pb)
         self.assertEqual(request.read_options.read_consistency,
-                         _datastore_pb2.ReadOptions.DEFAULT)
+                         datastore_pb2.ReadOptions.DEFAULT)
         self.assertEqual(request.read_options.transaction, TRANSACTION)

     def test_run_query_w_eventual_and_transaction(self):
-        from gcloud.datastore import _datastore_pb2
-        from gcloud.datastore import _query_pb2
+        from gcloud.datastore._generated import datastore_pb2
+        from gcloud.datastore._generated import query_pb2

         DATASET_ID = 'DATASET'
         KIND = 'Nonesuch'
         CURSOR = b'\x00'
         TRANSACTION = b'TRANSACTION'
         q_pb = self._make_query_pb(KIND)
-        rsp_pb = _datastore_pb2.RunQueryResponse()
+        rsp_pb = datastore_pb2.RunQueryResponse()
         rsp_pb.batch.end_cursor = CURSOR
-        no_more = _query_pb2.QueryResultBatch.NO_MORE_RESULTS
+        no_more = query_pb2.QueryResultBatch.NO_MORE_RESULTS
         rsp_pb.batch.more_results = no_more
-        rsp_pb.batch.entity_result_type = _query_pb2.EntityResult.FULL
+        rsp_pb.batch.entity_result_type = query_pb2.EntityResult.FULL
         conn = self._makeOne()
         self.assertRaises(ValueError, conn.run_query, DATASET_ID, q_pb,
                           eventual=True, transaction_id=TRANSACTION)

     def test_run_query_wo_namespace_empty_result(self):
-        from gcloud.datastore import _datastore_pb2
-        from gcloud.datastore import _query_pb2
+        from gcloud.datastore._generated import datastore_pb2
+        from gcloud.datastore._generated import query_pb2

         DATASET_ID = 'DATASET'
         KIND = 'Nonesuch'
         CURSOR = b'\x00'
         q_pb = self._make_query_pb(KIND)
-        rsp_pb = _datastore_pb2.RunQueryResponse()
+        rsp_pb = datastore_pb2.RunQueryResponse()
         rsp_pb.batch.end_cursor = CURSOR
-        no_more = _query_pb2.QueryResultBatch.NO_MORE_RESULTS
+        no_more = query_pb2.QueryResultBatch.NO_MORE_RESULTS
         rsp_pb.batch.more_results = no_more
-        rsp_pb.batch.entity_result_type = _query_pb2.EntityResult.FULL
+        rsp_pb.batch.entity_result_type = query_pb2.EntityResult.FULL
         conn = self._makeOne()
         URI = '/'.join([
             conn.api_base_url,
@@ -603,21 +603,21 @@ def test_run_query_wo_namespace_empty_result(self):
         self.assertEqual(skipped, 0)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.RunQueryRequest
+        rq_class = datastore_pb2.RunQueryRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         self.assertEqual(request.partition_id.namespace, '')
         self.assertEqual(request.query, q_pb)

     def test_run_query_w_namespace_nonempty_result(self):
-        from gcloud.datastore import _datastore_pb2
-        from gcloud.datastore import _entity_pb2
+        from gcloud.datastore._generated import datastore_pb2
+        from gcloud.datastore._generated import entity_pb2

         DATASET_ID = 'DATASET'
         KIND = 'Kind'
-        entity = _entity_pb2.Entity()
+        entity = entity_pb2.Entity()
         q_pb = self._make_query_pb(KIND)
-        rsp_pb = _datastore_pb2.RunQueryResponse()
+        rsp_pb = datastore_pb2.RunQueryResponse()
         rsp_pb.batch.entity_result.add(entity=entity)
         rsp_pb.batch.entity_result_type = 1  # FULL
         rsp_pb.batch.more_results = 3  # NO_MORE_RESULTS
@@ -635,18 +635,18 @@ def test_run_query_w_namespace_nonempty_result(self):
         self.assertEqual(len(pbs), 1)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.RunQueryRequest
+        rq_class = datastore_pb2.RunQueryRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         self.assertEqual(request.partition_id.namespace, 'NS')
         self.assertEqual(request.query, q_pb)

     def test_begin_transaction(self):
-        from gcloud.datastore import _datastore_pb2
+        from gcloud.datastore._generated import datastore_pb2

         DATASET_ID = 'DATASET'
         TRANSACTION = b'TRANSACTION'
-        rsp_pb = _datastore_pb2.BeginTransactionResponse()
+        rsp_pb = datastore_pb2.BeginTransactionResponse()
         rsp_pb.transaction = TRANSACTION
         conn = self._makeOne()
         URI = '/'.join([
@@ -661,20 +661,20 @@ def test_begin_transaction(self):
         self.assertEqual(conn.begin_transaction(DATASET_ID), TRANSACTION)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.BeginTransactionRequest
+        rq_class = datastore_pb2.BeginTransactionRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         self.assertEqual(request.isolation_level, rq_class.SERIALIZABLE)

     def test_commit_wo_transaction(self):
         from gcloud._testing import _Monkey
-        from gcloud.datastore import _datastore_pb2
+        from gcloud.datastore._generated import datastore_pb2
         from gcloud.datastore import connection as MUT

         DATASET_ID = 'DATASET'
         key_pb = self._make_key_pb(DATASET_ID)
-        rsp_pb = _datastore_pb2.CommitResponse()
-        mutation = _datastore_pb2.Mutation()
+        rsp_pb = datastore_pb2.CommitResponse()
+        mutation = datastore_pb2.Mutation()
         insert = mutation.upsert.add()
         insert.key.CopyFrom(key_pb)
         prop = insert.property.add()
@@ -705,7 +705,7 @@ def mock_parse(response):
         self.assertTrue(result is expected_result)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.CommitRequest
+        rq_class = datastore_pb2.CommitRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         self.assertEqual(request.transaction, b'')
@@ -715,13 +715,13 @@ def mock_parse(response):

     def test_commit_w_transaction(self):
         from gcloud._testing import _Monkey
-        from gcloud.datastore import _datastore_pb2
+        from gcloud.datastore._generated import datastore_pb2
         from gcloud.datastore import connection as MUT

         DATASET_ID = 'DATASET'
         key_pb = self._make_key_pb(DATASET_ID)
-        rsp_pb = _datastore_pb2.CommitResponse()
-        mutation = _datastore_pb2.Mutation()
+        rsp_pb = datastore_pb2.CommitResponse()
+        mutation = datastore_pb2.Mutation()
         insert = mutation.upsert.add()
         insert.key.CopyFrom(key_pb)
         prop = insert.property.add()
@@ -752,7 +752,7 @@ def mock_parse(response):
         self.assertTrue(result is expected_result)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.CommitRequest
+        rq_class = datastore_pb2.CommitRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         self.assertEqual(request.transaction, b'xact')
@@ -761,11 +761,11 @@ def mock_parse(response):
         self.assertEqual(_parsed, [rsp_pb])

     def test_rollback_ok(self):
-        from gcloud.datastore import _datastore_pb2
+        from gcloud.datastore._generated import datastore_pb2

         DATASET_ID = 'DATASET'
         TRANSACTION = b'xact'
-        rsp_pb = _datastore_pb2.RollbackResponse()
+        rsp_pb = datastore_pb2.RollbackResponse()
         conn = self._makeOne()
         URI = '/'.join([
             conn.api_base_url,
@@ -779,16 +779,16 @@ def test_rollback_ok(self):
         self.assertEqual(conn.rollback(DATASET_ID, TRANSACTION), None)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.RollbackRequest
+        rq_class = datastore_pb2.RollbackRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         self.assertEqual(request.transaction, TRANSACTION)

     def test_allocate_ids_empty(self):
-        from gcloud.datastore import _datastore_pb2
+        from gcloud.datastore._generated import datastore_pb2

         DATASET_ID = 'DATASET'
-        rsp_pb = _datastore_pb2.AllocateIdsResponse()
+        rsp_pb = datastore_pb2.AllocateIdsResponse()
         conn = self._makeOne()
         URI = '/'.join([
             conn.api_base_url,
@@ -802,13 +802,13 @@ def test_allocate_ids_empty(self):
         self.assertEqual(conn.allocate_ids(DATASET_ID, []), [])
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.AllocateIdsRequest
+        rq_class = datastore_pb2.AllocateIdsRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         self.assertEqual(list(request.key), [])

     def test_allocate_ids_non_empty(self):
-        from gcloud.datastore import _datastore_pb2
+        from gcloud.datastore._generated import datastore_pb2

         DATASET_ID = 'DATASET'
         before_key_pbs = [
@@ -819,7 +819,7 @@ def test_allocate_ids_non_empty(self):
             self._make_key_pb(DATASET_ID),
             self._make_key_pb(DATASET_ID, id=2345),
         ]
-        rsp_pb = _datastore_pb2.AllocateIdsResponse()
+        rsp_pb = datastore_pb2.AllocateIdsResponse()
         rsp_pb.key.add().CopyFrom(after_key_pbs[0])
         rsp_pb.key.add().CopyFrom(after_key_pbs[1])
         conn = self._makeOne()
@@ -836,7 +836,7 @@ def test_allocate_ids_non_empty(self):
                          after_key_pbs)
         cw = http._called_with
         self._verifyProtobufCall(cw, URI, conn)
-        rq_class = _datastore_pb2.AllocateIdsRequest
+        rq_class = datastore_pb2.AllocateIdsRequest
         request = rq_class()
         request.ParseFromString(cw['body'])
         self.assertEqual(len(request.key), len(before_key_pbs))
@@ -851,30 +851,30 @@ def _callFUT(self, commit_response_pb):
         return _parse_commit_response(commit_response_pb)

     def test_it(self):
-        from gcloud.datastore import _datastore_pb2
-        from gcloud.datastore import _entity_pb2
+        from gcloud.datastore._generated import datastore_pb2
+        from gcloud.datastore._generated import entity_pb2

         index_updates = 1337
         keys = [
-            _entity_pb2.Key(
+            entity_pb2.Key(
                 path_element=[
-                    _entity_pb2.Key.PathElement(
+                    entity_pb2.Key.PathElement(
                         kind='Foo',
                         id=1234,
                     ),
                 ],
             ),
-            _entity_pb2.Key(
+            entity_pb2.Key(
                 path_element=[
-                    _entity_pb2.Key.PathElement(
+                    entity_pb2.Key.PathElement(
                         kind='Bar',
                         name='baz',
                     ),
                 ],
             ),
         ]
-        response = _datastore_pb2.CommitResponse(
-            mutation_result=_datastore_pb2.MutationResult(
+        response = datastore_pb2.CommitResponse(
+
mutation_result=datastore_pb2.MutationResult( index_updates=index_updates, insert_auto_id_key=keys, ), diff --git a/gcloud/datastore/test_helpers.py b/gcloud/datastore/test_helpers.py index 36177c9334f2..de2f7ed41d78 100644 --- a/gcloud/datastore/test_helpers.py +++ b/gcloud/datastore/test_helpers.py @@ -22,12 +22,12 @@ def _callFUT(self, val): return entity_from_protobuf(val) def test_it(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 _DATASET_ID = 'DATASET' _KIND = 'KIND' _ID = 1234 - entity_pb = _entity_pb2.Entity() + entity_pb = entity_pb2.Entity() entity_pb.key.partition_id.dataset_id = _DATASET_ID entity_pb.key.path_element.add(kind=_KIND, id=_ID) prop_pb = entity_pb.property.add() @@ -71,12 +71,12 @@ def test_it(self): self.assertEqual(key.id, _ID) def test_mismatched_value_indexed(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 _DATASET_ID = 'DATASET' _KIND = 'KIND' _ID = 1234 - entity_pb = _entity_pb2.Entity() + entity_pb = entity_pb2.Entity() entity_pb.key.partition_id.dataset_id = _DATASET_ID entity_pb.key.path_element.add(kind=_KIND, id=_ID) @@ -96,18 +96,18 @@ def test_mismatched_value_indexed(self): self._callFUT(entity_pb) def test_entity_no_key(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 - entity_pb = _entity_pb2.Entity() + entity_pb = entity_pb2.Entity() entity = self._callFUT(entity_pb) self.assertEqual(entity.key, None) self.assertEqual(dict(entity), {}) def test_entity_with_meaning(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 - entity_pb = _entity_pb2.Entity() + entity_pb = entity_pb2.Entity() prop = entity_pb.property.add() prop.value.meaning = meaning = 9 prop.value.string_value = val = u'something' @@ -119,7 +119,7 @@ def test_entity_with_meaning(self): self.assertEqual(entity._meanings, {name: (meaning, val)}) def 
test_nested_entity_no_key(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 DATASET_ID = 's~FOO' KIND = 'KIND' @@ -127,12 +127,12 @@ def test_nested_entity_no_key(self): OUTSIDE_NAME = 'OBAR' INSIDE_VALUE = 1337 - entity_inside = _entity_pb2.Entity() + entity_inside = entity_pb2.Entity() inside_prop = entity_inside.property.add() inside_prop.name = INSIDE_NAME inside_prop.value.integer_value = INSIDE_VALUE - entity_pb = _entity_pb2.Entity() + entity_pb = entity_pb2.Entity() entity_pb.key.partition_id.dataset_id = DATASET_ID element = entity_pb.key.path_element.add() element.kind = KIND @@ -175,15 +175,15 @@ def _compareEntityProto(self, entity_pb1, entity_pb2): self.assertEqual(val1, val2) def test_empty(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 from gcloud.datastore.entity import Entity entity = Entity() entity_pb = self._callFUT(entity) - self._compareEntityProto(entity_pb, _entity_pb2.Entity()) + self._compareEntityProto(entity_pb, entity_pb2.Entity()) def test_key_only(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 from gcloud.datastore.entity import Entity from gcloud.datastore.key import Key @@ -193,7 +193,7 @@ def test_key_only(self): entity = Entity(key=key) entity_pb = self._callFUT(entity) - expected_pb = _entity_pb2.Entity() + expected_pb = entity_pb2.Entity() expected_pb.key.partition_id.dataset_id = dataset_id path_elt = expected_pb.key.path_element.add() path_elt.kind = kind @@ -202,7 +202,7 @@ def test_key_only(self): self._compareEntityProto(entity_pb, expected_pb) def test_simple_fields(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 from gcloud.datastore.entity import Entity entity = Entity() @@ -212,7 +212,7 @@ def test_simple_fields(self): entity[name2] = value2 = u'some-string' entity_pb = self._callFUT(entity) - 
expected_pb = _entity_pb2.Entity() + expected_pb = entity_pb2.Entity() prop1 = expected_pb.property.add() prop1.name = name1 prop1.value.integer_value = value1 @@ -223,20 +223,20 @@ def test_simple_fields(self): self._compareEntityProto(entity_pb, expected_pb) def test_with_empty_list(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 from gcloud.datastore.entity import Entity entity = Entity() entity['foo'] = [] entity_pb = self._callFUT(entity) - self._compareEntityProto(entity_pb, _entity_pb2.Entity()) + self._compareEntityProto(entity_pb, entity_pb2.Entity()) def test_inverts_to_protobuf(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 from gcloud.datastore.helpers import entity_from_protobuf - original_pb = _entity_pb2.Entity() + original_pb = entity_pb2.Entity() # Add a key. original_pb.key.partition_id.dataset_id = dataset_id = 'DATASET' elem1 = original_pb.key.path_element.add() @@ -259,7 +259,7 @@ def test_inverts_to_protobuf(self): # Add a nested (entity) property. 
prop3 = original_pb.property.add() prop3.name = 'entity-baz' - sub_pb = _entity_pb2.Entity() + sub_pb = entity_pb2.Entity() sub_prop1 = sub_pb.property.add() sub_prop1.name = 'x' sub_prop1.value.double_value = 3.14 @@ -291,7 +291,7 @@ def test_inverts_to_protobuf(self): self._compareEntityProto(original_pb, new_pb) def test_meaning_with_change(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 from gcloud.datastore.entity import Entity entity = Entity() @@ -300,7 +300,7 @@ def test_meaning_with_change(self): entity._meanings[name] = (9, 1337) entity_pb = self._callFUT(entity) - expected_pb = _entity_pb2.Entity() + expected_pb = entity_pb2.Entity() prop = expected_pb.property.add() prop.name = name prop.value.integer_value = value @@ -317,8 +317,8 @@ def _callFUT(self, val): return key_from_protobuf(val) def _makePB(self, dataset_id=None, namespace=None, path=()): - from gcloud.datastore._entity_pb2 import Key - pb = Key() + from gcloud.datastore._generated import entity_pb2 + pb = entity_pb2.Key() if dataset_id is not None: pb.partition_id.dataset_id = dataset_id if namespace is not None: @@ -474,9 +474,9 @@ def _callFUT(self, pb): return _get_value_from_value_pb(pb) def _makePB(self, attr_name, value): - from gcloud.datastore._entity_pb2 import Value + from gcloud.datastore._generated import entity_pb2 - pb = Value() + pb = entity_pb2.Value() setattr(pb, attr_name, value) return pb @@ -491,10 +491,10 @@ def test_datetime(self): self.assertEqual(self._callFUT(pb), utc) def test_key(self): - from gcloud.datastore._entity_pb2 import Value + from gcloud.datastore._generated import entity_pb2 from gcloud.datastore.key import Key - pb = Value() + pb = entity_pb2.Value() expected = Key('KIND', 1234, dataset_id='DATASET').to_protobuf() pb.key_value.CopyFrom(expected) found = self._callFUT(pb) @@ -521,10 +521,10 @@ def test_unicode(self): self.assertEqual(self._callFUT(pb), u'str') def test_entity(self): - from 
gcloud.datastore._entity_pb2 import Value + from gcloud.datastore._generated import entity_pb2 from gcloud.datastore.entity import Entity - pb = Value() + pb = entity_pb2.Value() entity_pb = pb.entity_value entity_pb.key.path_element.add(kind='KIND') entity_pb.key.partition_id.dataset_id = 'DATASET' @@ -536,9 +536,9 @@ def test_entity(self): self.assertEqual(entity['foo'], 'Foo') def test_list(self): - from gcloud.datastore._entity_pb2 import Value + from gcloud.datastore._generated import entity_pb2 - pb = Value() + pb = entity_pb2.Value() list_pb = pb.list_value item_pb = list_pb.add() item_pb.string_value = 'Foo' @@ -548,9 +548,9 @@ def test_list(self): self.assertEqual(items, ['Foo', 'Bar']) def test_unknown(self): - from gcloud.datastore._entity_pb2 import Value + from gcloud.datastore._generated import entity_pb2 - pb = Value() + pb = entity_pb2.Value() self.assertEqual(self._callFUT(pb), None) @@ -562,9 +562,8 @@ def _callFUT(self, value_pb, val): return _set_protobuf_value(value_pb, val) def _makePB(self): - from gcloud.datastore._entity_pb2 import Value - - return Value() + from gcloud.datastore._generated import entity_pb2 + return entity_pb2.Value() def test_datetime(self): import calendar @@ -699,19 +698,19 @@ def _callFUT(self, key_pb): return _prepare_key_for_request(key_pb) def test_prepare_dataset_id_valid(self): - from gcloud.datastore import _entity_pb2 - key = _entity_pb2.Key() + from gcloud.datastore._generated import entity_pb2 + key = entity_pb2.Key() key.partition_id.dataset_id = 'foo' new_key = self._callFUT(key) self.assertFalse(new_key is key) - key_without = _entity_pb2.Key() + key_without = entity_pb2.Key() new_key.ClearField('partition_id') self.assertEqual(new_key, key_without) def test_prepare_dataset_id_unset(self): - from gcloud.datastore import _entity_pb2 - key = _entity_pb2.Key() + from gcloud.datastore._generated import entity_pb2 + key = entity_pb2.Key() new_key = self._callFUT(key) self.assertTrue(new_key is key) @@ -776,25 
+775,25 @@ def _callFUT(self, *args, **kwargs): return _get_meaning(*args, **kwargs) def test_no_meaning(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 - value_pb = _entity_pb2.Value() + value_pb = entity_pb2.Value() result = self._callFUT(value_pb) self.assertEqual(result, None) def test_single(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 - value_pb = _entity_pb2.Value() + value_pb = entity_pb2.Value() value_pb.meaning = meaning = 22 value_pb.string_value = u'hi' result = self._callFUT(value_pb) self.assertEqual(meaning, result) def test_empty_list_value(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 - value_pb = _entity_pb2.Value() + value_pb = entity_pb2.Value() value_pb.list_value.add() value_pb.list_value.pop() @@ -802,9 +801,9 @@ def test_empty_list_value(self): self.assertEqual(None, result) def test_list_value(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 - value_pb = _entity_pb2.Value() + value_pb = entity_pb2.Value() meaning = 9 sub_value_pb1 = value_pb.list_value.add() sub_value_pb2 = value_pb.list_value.add() @@ -817,9 +816,9 @@ def test_list_value(self): self.assertEqual(meaning, result) def test_list_value_disagreeing(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 - value_pb = _entity_pb2.Value() + value_pb = entity_pb2.Value() meaning1 = 9 meaning2 = 10 sub_value_pb1 = value_pb.list_value.add() @@ -834,9 +833,9 @@ def test_list_value_disagreeing(self): self._callFUT(value_pb, is_list=True) def test_list_value_partially_unset(self): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 - value_pb = _entity_pb2.Value() + value_pb = entity_pb2.Value() meaning1 = 9 sub_value_pb1 = value_pb.list_value.add() sub_value_pb2 = 
value_pb.list_value.add() @@ -858,7 +857,7 @@ def __init__(self, prefix, from_missing=False): self.from_missing = from_missing def lookup(self, dataset_id, key_pbs): - from gcloud.datastore import _entity_pb2 + from gcloud.datastore._generated import entity_pb2 # Store the arguments called with. self._called_dataset_id = dataset_id @@ -866,7 +865,7 @@ def lookup(self, dataset_id, key_pbs): key_pb, = key_pbs - response = _entity_pb2.Entity() + response = entity_pb2.Entity() response.key.CopyFrom(key_pb) response.key.partition_id.dataset_id = self.prefix + dataset_id diff --git a/gcloud/datastore/test_key.py b/gcloud/datastore/test_key.py index 5432004b3748..77e7b5156565 100644 --- a/gcloud/datastore/test_key.py +++ b/gcloud/datastore/test_key.py @@ -333,11 +333,11 @@ def test_completed_key_on_complete(self): self.assertRaises(ValueError, key.completed_key, 5678) def test_to_protobuf_defaults(self): - from gcloud.datastore._entity_pb2 import Key as KeyPB + from gcloud.datastore._generated import entity_pb2 _KIND = 'KIND' key = self._makeOne(_KIND, dataset_id=self._DEFAULT_DATASET) pb = key.to_protobuf() - self.assertTrue(isinstance(pb, KeyPB)) + self.assertTrue(isinstance(pb, entity_pb2.Key)) # Check partition ID. 
self.assertEqual(pb.partition_id.dataset_id, self._DEFAULT_DATASET) diff --git a/gcloud/datastore/test_query.py b/gcloud/datastore/test_query.py index a0e944b98b61..f1d3a3a169a8 100644 --- a/gcloud/datastore/test_query.py +++ b/gcloud/datastore/test_query.py @@ -326,13 +326,13 @@ def _makeOne(self, *args, **kw): return self._getTargetClass()(*args, **kw) def _addQueryResults(self, connection, cursor=_END, more=False): - from gcloud.datastore import _entity_pb2 - from gcloud.datastore import _query_pb2 + from gcloud.datastore._generated import entity_pb2 + from gcloud.datastore._generated import query_pb2 - MORE = _query_pb2.QueryResultBatch.NOT_FINISHED - NO_MORE = _query_pb2.QueryResultBatch.MORE_RESULTS_AFTER_LIMIT + MORE = query_pb2.QueryResultBatch.NOT_FINISHED + NO_MORE = query_pb2.QueryResultBatch.MORE_RESULTS_AFTER_LIMIT _ID = 123 - entity_pb = _entity_pb2.Entity() + entity_pb = entity_pb2.Entity() entity_pb.key.partition_id.dataset_id = self._DATASET path_element = entity_pb.key.path_element.add() path_element.kind = self._KIND @@ -531,7 +531,7 @@ def _callFUT(self, query): return _pb_from_query(query) def test_empty(self): - from gcloud.datastore import _query_pb2 + from gcloud.datastore._generated import query_pb2 pb = self._callFUT(_Query()) self.assertEqual(list(pb.projection), []) @@ -540,7 +540,7 @@ def test_empty(self): self.assertEqual(list(pb.group_by), []) self.assertEqual(pb.filter.property_filter.property.name, '') cfilter = pb.filter.composite_filter - self.assertEqual(cfilter.operator, _query_pb2.CompositeFilter.AND) + self.assertEqual(cfilter.operator, query_pb2.CompositeFilter.AND) self.assertEqual(list(cfilter.filter), []) self.assertEqual(pb.start_cursor, b'') self.assertEqual(pb.end_cursor, b'') @@ -559,12 +559,12 @@ def test_kind(self): def test_ancestor(self): from gcloud.datastore.key import Key from gcloud.datastore.helpers import _prepare_key_for_request - from gcloud.datastore import _query_pb2 + from gcloud.datastore._generated 
import query_pb2 ancestor = Key('Ancestor', 123, dataset_id='DATASET') pb = self._callFUT(_Query(ancestor=ancestor)) cfilter = pb.filter.composite_filter - self.assertEqual(cfilter.operator, _query_pb2.CompositeFilter.AND) + self.assertEqual(cfilter.operator, query_pb2.CompositeFilter.AND) self.assertEqual(len(cfilter.filter), 1) pfilter = cfilter.filter[0].property_filter self.assertEqual(pfilter.property.name, '__key__') @@ -572,15 +572,15 @@ def test_ancestor(self): self.assertEqual(pfilter.value.key_value, ancestor_pb) def test_filter(self): - from gcloud.datastore import _query_pb2 + from gcloud.datastore._generated import query_pb2 query = _Query(filters=[('name', '=', u'John')]) query.OPERATORS = { - '=': _query_pb2.PropertyFilter.EQUAL, + '=': query_pb2.PropertyFilter.EQUAL, } pb = self._callFUT(query) cfilter = pb.filter.composite_filter - self.assertEqual(cfilter.operator, _query_pb2.CompositeFilter.AND) + self.assertEqual(cfilter.operator, query_pb2.CompositeFilter.AND) self.assertEqual(len(cfilter.filter), 1) pfilter = cfilter.filter[0].property_filter self.assertEqual(pfilter.property.name, 'name') @@ -589,16 +589,16 @@ def test_filter(self): def test_filter_key(self): from gcloud.datastore.key import Key from gcloud.datastore.helpers import _prepare_key_for_request - from gcloud.datastore import _query_pb2 + from gcloud.datastore._generated import query_pb2 key = Key('Kind', 123, dataset_id='DATASET') query = _Query(filters=[('__key__', '=', key)]) query.OPERATORS = { - '=': _query_pb2.PropertyFilter.EQUAL, + '=': query_pb2.PropertyFilter.EQUAL, } pb = self._callFUT(query) cfilter = pb.filter.composite_filter - self.assertEqual(cfilter.operator, _query_pb2.CompositeFilter.AND) + self.assertEqual(cfilter.operator, query_pb2.CompositeFilter.AND) self.assertEqual(len(cfilter.filter), 1) pfilter = cfilter.filter[0].property_filter self.assertEqual(pfilter.property.name, '__key__') @@ -606,15 +606,15 @@ def test_filter_key(self): 
self.assertEqual(pfilter.value.key_value, key_pb) def test_order(self): - from gcloud.datastore import _query_pb2 + from gcloud.datastore._generated import query_pb2 pb = self._callFUT(_Query(order=['a', '-b', 'c'])) self.assertEqual([item.property.name for item in pb.order], ['a', 'b', 'c']) self.assertEqual([item.direction for item in pb.order], - [_query_pb2.PropertyOrder.ASCENDING, - _query_pb2.PropertyOrder.DESCENDING, - _query_pb2.PropertyOrder.ASCENDING]) + [query_pb2.PropertyOrder.ASCENDING, + query_pb2.PropertyOrder.DESCENDING, + query_pb2.PropertyOrder.ASCENDING]) def test_group_by(self): pb = self._callFUT(_Query(group_by=['a', 'b', 'c'])) diff --git a/gcloud/datastore/test_transaction.py b/gcloud/datastore/test_transaction.py index 6569aa7003ae..beb1ce0f76d9 100644 --- a/gcloud/datastore/test_transaction.py +++ b/gcloud/datastore/test_transaction.py @@ -25,7 +25,7 @@ def _makeOne(self, client, **kw): return self._getTargetClass()(client, **kw) def test_ctor_defaults(self): - from gcloud.datastore._datastore_pb2 import Mutation + from gcloud.datastore._generated import datastore_pb2 _DATASET = 'DATASET' connection = _Connection() @@ -35,7 +35,7 @@ def test_ctor_defaults(self): self.assertEqual(xact.connection, connection) self.assertEqual(xact.id, None) self.assertEqual(xact._status, self._getTargetClass()._INITIAL) - self.assertTrue(isinstance(xact.mutations, Mutation)) + self.assertTrue(isinstance(xact.mutations, datastore_pb2.Mutation)) self.assertEqual(len(xact._partial_key_entities), 0) def test_current(self): @@ -160,9 +160,9 @@ class Foo(Exception): def _make_key(kind, id, dataset_id): - from gcloud.datastore._entity_pb2 import Key + from gcloud.datastore._generated import entity_pb2 - key = Key() + key = entity_pb2.Key() key.partition_id.dataset_id = dataset_id elem = key.path_element.add() elem.kind = kind
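A note on the pattern this PR leans on: production modules such as `batch.py` import the relocated module under its old local name (`from gcloud.datastore._generated import datastore_pb2 as _datastore_pb2`), so their call sites need no edits, while the tests above are rewritten to use the new names directly. A minimal sketch of that aliasing technique, using `json.decoder` from the standard library as a stand-in for a relocated `*_pb2` module (the gcloud names are not importable here):

```python
# A module moves to a new package path, but an "as" alias preserves the
# local name that the rest of the file was written against.
from json import decoder as _decoder_pb2  # new location, old-style alias

# Call sites written against the old local name keep working unchanged:
d = _decoder_pb2.JSONDecoder()
assert d.decode('{"kind": "Foo"}') == {"kind": "Foo"}
```

The trade-off is visible in this diff: aliasing minimizes churn in library code, while tests migrate fully to the new `_generated` names so the old top-level module names can eventually disappear.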