Current plan of attack:
- Remove `--plugin=protoc-gen-grpc=$(GRPC_PLUGIN)` from "helper" modules that don't use gRPC (e.g. `google.bigtable.v1.bigtable_data`), so that sub-packages that don't need gRPC can use them (Updating Makefile to only use gRPC when needed. #1354)
- Ditch shared generated modules in favor of those included with `protobuf` and `googleapis-common-protos` (Use googleapis common protos #1353)
- Move shared (i.e. non-Bigtable-specific) modules from `gcloud.bigtable._generated` to `gcloud._generated`, so that all sub-packages have access to shared protobuf message classes, then update the `bigtable` imports to accommodate this change
- Pull out `_get_pb_property_value` in `gcloud.bigtable.cluster` into core (`gcloud._helpers`) and separate out a `_has_field` piece, since `pb_message.HasField` doesn't work on non-message fields in `proto3` (Moving _get_pb_property_value from bigtable into core. #1329) (see the `_has_field` sketch after this list)
- Update `Makefile` to reflect the changes above (i.e. make sure it can be run without changing the code) (See Bringing auto-gen import re-writing up to date with files. #1316)
- Update `Makefile` to automatically add the `_foo.proto` files to the repo (See Making pb2 auto-gen script also copy over .proto files. #1317)
- Fold `Batch.add_auto_id_entity` into `Batch.put` (the `auto_id_entity` mutation is removed in favor of just using `insert` with a partial key) (see Removing datastore Batch.add_auto_id_entity. #1296) (see the mutation sketch after this list)
- Loosen the `Batch` implementation's reliance on the `Mutation` message class (it is used in a `LookupRequest` and their combined structure changes in `v1beta3`) (See Renaming datastore Batch.mutation to mutations. #1306, Puting helpers in datastore Batch for getting new mutations. #1319, Using protobuf CommitRequest in datastore Connection.commit. #1341)
- Remove use of `serializable` in `Client.transaction` and `Transaction` (it is no longer an option in `v1beta3`, so we will curtail its use in `v1beta2` before the switch) (See Removing serializable option from datastore Transaction. #1294)
- Manually partition the `_datastore_v1_pb2.py` module into 3 modules that just import portions of the namespace. These modules (`datastore_pb2.py`, `entity_pb2.py`, `query_pb2.py`) will be split according to membership in the new proto definitions for `v1beta3`. (See Replacing datastore pb uses with entity shim. #1297, Replacing datastore pb uses with query shim. #1299, Replacing datastore pb uses with "datastore" shim. #1301, Creating _generated pb directory for datastore #1328) (see the shim sketch after this list)
- Switch over imports from directly using `_datastore_v1_pb2.py` to using our "shim" imports to mock the structure of `v1beta3`. (See Replacing datastore pb uses with entity shim. #1297, Replacing datastore pb uses with query shim. #1299, Replacing datastore pb uses with "datastore" shim. #1301, Creating _generated pb directory for datastore #1328)
- Make sure `API_BASE_URL` for the `Connection` class is used (rather than a parent's version) (See Explicitly using API_BASE_URL from current connection in datastore. #1293)
- Change `Client.dataset_id` to `Client.project` (in advance of the rename to `PartitionId.project_id` in `v1beta3`) (Replace dataset id with project in datastore #1330)
- Change the return type of `Connection.commit` to a tuple of `index_updates` and `mutation_results` (they were previously on the same result object but are being split apart in `v1beta3`) (See Reducing datastore commit reliance on structure of response. #1314)
- Replace uses of `HasField` for non-message values with `_has_field` (see above) (Moving _get_pb_property_value from bigtable into core. #1329)
- Remove the `v1beta2` generated `pb2` file and old `.proto` definition (can also remove `_datastore_v1_pb2.py` from the `pylintrc_default` ignored-files list) and delete our "shim" modules (Upgrading Makefile to generate datastore v1beta3. #1355, Upgrading Makefile to generate datastore v1beta3. #1428)
- Update `Makefile` to incorporate protobuf definitions for `v1beta3` (I did this in a side project; Use googleapis common protos #1353, Updating Makefile to only use gRPC when needed. #1354, Upgrading Makefile to generate datastore v1beta3. #1355, Upgrading Makefile to generate datastore v1beta3. #1428)
- Rewrite our "shim" imports to use the actual `datastore._generated` files for `v1beta3`.
- Switch `API_BASE_URL` from `https://www.googleapis.com` to `https://datastore.googleapis.com` and drop `'https://www.googleapis.com/auth/userinfo.email'` from the scope list (Updating datastore URI template for v1beta3. #1339, Updating datastore URI template for v1beta3. #1406)
- Accommodate renamed / retyped / removed fields:
  - `CommitRequest.mutation` --> `CommitRequest.mutations` (Updating CommitRequest, Mutation and helpers for v1beta3. #1461)
  - `LookupRequest.key` --> `LookupRequest.keys` (Handling datastore renames key -> keys #1358, Handling datastore renames key -> keys #1456)
  - `AllocateIdsResponse.key` --> `AllocateIdsResponse.keys` (Handling datastore renames key -> keys #1358, Handling datastore renames key -> keys #1456)
  - `AllocateIdsRequest.key` --> `AllocateIdsRequest.keys` (Handling datastore renames key -> keys #1358, Handling datastore renames key -> keys #1456)
  - `Key.path_element` --> `Key.path` (Renaming path_element->path in Key. #1360, Renaming path_element->path in Key. #1457)
  - `Entity.property` --> `Entity.properties` (Upgrading Entity.property to properties map in datastore. #1458)
  - `Query.group_by` --> `Query.distinct_on` (Handling datastore renames on Query and QueryResultBatch. #1357, Handling datastore renames on Query and QueryResultBatch. #1455)
  - `Query.limit` --> `Query.limit.value` (Handling datastore renames on Query and QueryResultBatch. #1357, Handling datastore renames on Query and QueryResultBatch. #1455) (see the wrapper-type sketch after this list)
  - `QueryResultBatch.entity_result` --> `QueryResultBatch.entity_results` (Handling datastore renames on Query and QueryResultBatch. #1357, Handling datastore renames on Query and QueryResultBatch. #1455)
  - `PartitionId.namespace` --> `PartitionId.namespace_id` (Handle datastore renames on PartitionId #1359, Handle datastore renames on PartitionId #1452)
  - `PartitionId.dataset_id` --> `PartitionId.project_id` (Handle datastore renames on PartitionId #1359, Handle datastore renames on PartitionId #1452)
  - `Value.indexed` --> `Value.exclude_from_indexes` (Rename Value.indexed->exclude_from_indexes. #1365, Rename Value.indexed->exclude_from_indexes. #1453)
  - `Value.list_value` --> `Value.array_value` (Upgrading list_value -> array_value for v1beta3. #1460)
  - `Value.null_value` added (Adding support for null and geo point values in v1beta3. #1464)
  - `Value.timestamp_microseconds_value` --> `Value.timestamp_value` (with type change) (Moving _pb_timestamp_to_datetime into core. #1361, Upgrading timestamp_microseconds_value to timestamp_value. #1459) (see the timestamp sketch after this list)
  - `Value.geo_point_value` added (Adding support for null and geo point values in v1beta3. #1464)
  - `Value.blob_key_value` removed
  - `CompositeFilter.filter` --> `CompositeFilter.filters` (Handling datastore renames on CompositeFilter and PropertyFilter. #1356, Handling datastore renames on CompositeFilter and PropertyFilter. #1454)
  - `CompositeFilter.operation` --> `CompositeFilter.op` (Handling datastore renames on CompositeFilter and PropertyFilter. #1356, Handling datastore renames on CompositeFilter and PropertyFilter. #1454)
  - `PropertyFilter.operation` --> `PropertyFilter.op` (Handling datastore renames on CompositeFilter and PropertyFilter. #1356, Handling datastore renames on CompositeFilter and PropertyFilter. #1454)
  - `CompositeFilter.AND` --> `CompositeFilter.OPERATOR_UNSPECIFIED` (as default value) (Handling datastore renames on CompositeFilter and PropertyFilter. #1356, Handling datastore renames on CompositeFilter and PropertyFilter. #1454)
  - `ReadOptions.DEFAULT` --> `ReadOptions.READ_CONSISTENCY_UNSPECIFIED` (as default value) (Handling datastore renames on CompositeFilter and PropertyFilter. #1356, Handling datastore renames on CompositeFilter and PropertyFilter. #1454)
- Deal with the fact that `Entity.property` is now a `map` called `Entity.properties` (somewhat different semantics) (Adding helpers for interacting with properties in Entity protobuf. #1340) (see the properties sketch after this list)
- Tear out dataset (project) prefix code (Removing hacks that avoid using project ID in key protos. #1466)
- Re-vamp environment variable usage (`dataset_id` no longer needed?) (Removing custom dataset ID environment variable. #1465)
- Remove use of `isolation_level` in `Connection.begin_transaction` (Removing use of isolation level in datastore. #1343, Removing use of isolation level in datastore. #1407)
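
The `_has_field` split mentioned above exists because proto3 raises `ValueError` when `HasField` is called on a scalar (non-message) field. A minimal sketch of what such a helper could look like; the `ListFields` fallback is an assumption, not necessarily the exact implementation that landed in `gcloud._helpers`:

```python
def _has_field(pb_message, field_name):
    """Check whether ``field_name`` is set on ``pb_message``.

    Sketch only: proto3 ``HasField`` raises ``ValueError`` for scalar
    (non-message) fields, so fall back to ``ListFields``, which only
    reports fields set to a non-default value.
    """
    try:
        return pb_message.HasField(field_name)
    except ValueError:
        return any(field.name == field_name
                   for field, _ in pb_message.ListFields())
```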
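The `Batch.add_auto_id_entity` fold works because, in the new mutation model, an `insert` mutation whose key has no trailing id/name is enough to ask the backend to allocate one. A hedged sketch of the mutation-building side (the `_generated` import path follows the plan above; the helper name is illustrative):

```python
from gcloud.datastore._generated import datastore_pb2


def _insert_mutation(entity_pb):
    # Sketch: a partial key (no id/name on the last path element) on an
    # ``insert`` mutation asks the backend to allocate the id, which is
    # what ``add_auto_id_entity`` used to signal explicitly.
    mutation_pb = datastore_pb2.Mutation()
    mutation_pb.insert.CopyFrom(entity_pb)
    return mutation_pb
```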
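The "shim" modules referenced above are plain re-export modules that make the old generated code importable under the `v1beta3` layout. A hypothetical sketch (the module path and the exact set of re-exported names are illustrative):

```python
# gcloud/datastore/_generated/entity_pb2.py  (hypothetical shim)
# Re-export entity-related names from the legacy v1beta2 module so
# callers can already import them from the v1beta3-style location.
from gcloud.datastore._datastore_v1_pb2 import Entity
from gcloud.datastore._datastore_v1_pb2 import Key
from gcloud.datastore._datastore_v1_pb2 import PartitionId
from gcloud.datastore._datastore_v1_pb2 import Property
from gcloud.datastore._datastore_v1_pb2 import Value
```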
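For the `timestamp_microseconds_value` --> `timestamp_value` retyping, the value moves from a single microseconds-since-epoch integer to a `google.protobuf.Timestamp` (seconds plus nanos). A sketch of the kind of conversion helper this implies; only the `_pb_timestamp_to_datetime` name comes from the linked PRs, the body here is an assumption:

```python
import datetime

_EPOCH = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)


def _pb_timestamp_to_datetime(timestamp_pb):
    """Convert a ``google.protobuf.Timestamp`` to an aware ``datetime``."""
    return _EPOCH + datetime.timedelta(
        seconds=timestamp_pb.seconds,
        # Timestamp carries nanosecond precision; datetime only goes to
        # microseconds, so the remainder is truncated.
        microseconds=timestamp_pb.nanos // 1000,
    )
```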
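Two of the retypings above change how values are written rather than just their names: `Entity.properties` becomes a protobuf `map<string, Value>` instead of a repeated `Property` message, and `Query.limit` becomes an `Int32Value` wrapper instead of a plain `int32`. A sketch of the difference, using the `_generated` modules described above (variable names are illustrative):

```python
from gcloud.datastore._generated import entity_pb2
from gcloud.datastore._generated import query_pb2

entity_pb = entity_pb2.Entity()
# v1beta2: repeated Property with explicit name/value sub-fields:
#     prop = entity_pb.property.add()
#     prop.name = 'age'
#     prop.value.integer_value = 21
# v1beta3: map field, so entries are created and set by key:
entity_pb.properties['age'].integer_value = 21

query_pb = query_pb2.Query()
# v1beta2: query_pb.limit = 20  (plain int32)
# v1beta3: limit is an Int32Value wrapper, set through .value:
query_pb.limit.value = 20
```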
New Features:
- Add `GqlQuery` support (Missing GQL support #304) (see the sketch after this list)
- `QueryResultBatch.skipped_cursor` added (only set when `skipped_results != 0`)
- `EntityResult.cursor` added (set by the backend when the entity result is part of a `RunQueryResponse.batch.entity_results` response)
- Parse protobuf errors
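
A hypothetical sketch of building a GQL query against the `v1beta3` protos (field names follow the public `v1beta3` proto definitions; the module path and the idea of working with the generated classes directly are assumptions, not the final client-side design):

```python
from gcloud.datastore._generated import query_pb2

gql_pb = query_pb2.GqlQuery()
gql_pb.query_string = 'SELECT * FROM Person WHERE age > @min_age'
# Named bindings are a map<string, GqlQueryParameter>; each parameter
# wraps a regular Value message.
gql_pb.named_bindings['min_age'].value.integer_value = 21
```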