…1374 and other performance/bug fixes

Issue #1374:
    Use the latest Dataflow SDK version.

Issue #1373: Unable to deal with new cloudbuild.googleapis.com/Build assets:

    The core issue was that the discovery_name of this new asset type is
incorrectly reported as 'cloudbuild.googleapis.com/Build' rather than 'Build'.
Work around this by correcting any discovery_name that contains a '/'. Other
fixes were also necessary to speed up processing.
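
A minimal sketch of that correction (the helper name is hypothetical; the
actual fix lives in api_schema.py):

    # Hypothetical helper illustrating the discovery_name workaround.
    def _correct_discovery_name(discovery_name):
        """Strip a service prefix such as 'cloudbuild.googleapis.com/'."""
        if '/' in discovery_name:
            # e.g. 'cloudbuild.googleapis.com/Build' -> 'Build'
            return discovery_name.split('/')[-1]
        return discovery_name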

Other performance/bug fixes:

- Prefer the discovery-document-generated schema, when we have one, over any
  resource-generated one. This is a big performance improvement: determining
  the schema from the resource is time consuming, and it is also unproductive,
  because when an API resource schema exists it should match the resource JSON
  anyway.

- Add ancestors, update_time, location and json_data to the
  discovery-generated schema. This prevents those properties from being
  dropped now that we always rely on that schema.

- Sanitize discovery-document-generated schemas. If we are to always rely on
  them, they could be invalid, so enforce the BigQuery rules on them as well
  (see the sanitization sketch after this list).

- Use copy.deepcopy less: only when we copy a source into a destination field.

- Prevent BigQuery columns with BQ_FORBIDDEN_PREFIXES from being created. Some
  Bigtable resources can include these prefixes.

- Some BigQuery model resources had NaN and Infinity values for numeric
  fields. Try to handle those during sanitization (also sketched after this
  list).

- When merging schemas, stop once we have BQ_MAX_COLUMNS fields. This ends the
  merge earlier; it can otherwise take a very long time when there are many
  unique fields and many elements. (See the merge sketch after this list.)

- When enforcing a schema on a resource, recognize when we are handling
  additional properties and, in push_down_additional_properties, add the
  additional-property fields to the value of the additional-property key/value
  list. This produces more regular schemas (see the sketch after this list).

- Add ignore_unknown_values to the load job so that we don't fail if a
  resource contains fields not present in the schema (see the load-job sketch
  after this list).

- Accept and pass --add-load-date-suffix via main.py.

- Better naming of some local variables for readability.

- Some formatting changes suggested by IntelliJ.
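
A rough sketch of the sanitization described above. The constant value and
helper names are assumptions, not the shipped code (BigQuery documents
_TABLE_, _FILE_ and _PARTITION as reserved column-name prefixes):

    import math

    # Assumed constant; BigQuery rejects column names with these prefixes.
    BQ_FORBIDDEN_PREFIXES = ('_TABLE_', '_FILE_', '_PARTITION')

    def allowed_column(name):
        """Keep only columns whose names avoid the forbidden prefixes."""
        return not any(name.upper().startswith(p)
                       for p in BQ_FORBIDDEN_PREFIXES)

    def sanitize_numeric(value):
        """BigQuery numeric columns reject NaN/Infinity; map them to None."""
        if isinstance(value, float) and (math.isnan(value) or
                                         math.isinf(value)):
            return None
        return value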
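
The merge early-exit could look like the following. BQ_MAX_COLUMNS is assumed
to track BigQuery's 10,000-column table limit, and the function shape is
illustrative only:

    BQ_MAX_COLUMNS = 10000  # assumed to mirror BigQuery's per-table limit

    def merge_schemas(schemas):
        """Merge field lists by name, stopping at the column limit."""
        merged = {}
        for schema in schemas:
            for field in schema:
                if len(merged) >= BQ_MAX_COLUMNS:
                    # Extra fields couldn't be loaded anyway; stop early.
                    return list(merged.values())
                merged.setdefault(field['name'], field)
        return list(merged.values())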
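
One way to picture push_down_additional_properties: a JSON object with
arbitrary keys becomes a repeated list of name/value records, so every
resource fits one regular schema. This sketch assumes that representation
rather than reproducing the actual function:

    def push_down_additional_properties(mapping):
        """Turn {'k1': v1, 'k2': v2} into
        [{'name': 'k1', 'value': v1}, {'name': 'k2', 'value': v2}]."""
        return [{'name': key, 'value': value}
                for key, value in mapping.items()]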
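
Setting ignore_unknown_values with the google-cloud-bigquery client looks like
this (the URI and table name below are placeholders):

    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        # Don't fail the load when a row has fields the schema lacks.
        ignore_unknown_values=True)
    job = client.load_table_from_uri(
        'gs://my-bucket/assets.json',      # placeholder
        'my-project.my_dataset.assets',    # placeholder
        job_config=job_config)
    job.result()  # wait for the load to complete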
bmenasha committed Nov 15, 2024
1 parent ca49968 commit 04f1059
Showing 13 changed files with 812 additions and 315 deletions.
214 changes: 156 additions & 58 deletions tools/asset-inventory/asset_inventory/api_schema.py

Large diffs are not rendered by default.

365 changes: 248 additions & 117 deletions tools/asset-inventory/asset_inventory/bigquery_schema.py

Large diffs are not rendered by default.

24 changes: 13 additions & 11 deletions tools/asset-inventory/asset_inventory/export.py
@@ -51,7 +51,7 @@ def export_to_gcs(parent, gcs_destination, content_type, asset_types):
     Invoke either the cloudasset.organizations.exportAssets or
     cloudasset.projects.exportAssets method depending on if parent is a project
-    or orgniaztion.
+    or organization.
     Args:
         parent: Either `project/<project-id>` or `organization/<organization#>`.
         gcs_destination: GCS uri to export to.
@@ -65,10 +65,10 @@ def export_to_gcs(parent, gcs_destination, content_type, asset_types):
     output_config = asset_v1.types.OutputConfig()
     output_config.gcs_destination.uri = gcs_destination
     operation = Clients.cloudasset().export_assets(
-        parent,
-        output_config,
-        content_type=content_type,
-        asset_types=asset_types)
+        {'parent': parent,
+         'output_config': output_config,
+         'content_type': content_type,
+         'asset_types': asset_types})
     return operation.result()


@@ -128,17 +128,19 @@ def add_argparse_args(ap, required=False):
         'This MUST be run with a service account owned by a project with the '
         'Cloud Asset API enabled. The gcloud generated user credentials'
         ' do not work. This requires:\n\n'
-        ' 1. Enable the Cloud Asset Inventory API on a project (https://console.cloud.google.com/apis/api/cloudasset.googleapis.com/overview)\n'
-        ' 2. Create a service acocunt owned by this project\n'
+        '1. Enable the Cloud Asset Inventory API on a project ('
+        'https://console.cloud.google.com/apis/api/cloudasset.googleapis.com/overview)\n'
+        ' 2. Create a service account owned by this project\n'
         ' 3. Give the service account roles/cloudasset.viewer at the organization layer\n'
         ' 4. Run on a GCE instance started with this service account,\n'
-        ' or downloadthe private key and set GOOGLE_APPLICATION_CREDENTIALS to the file name\n'
+        ' or download the private key and set GOOGLE_APPLICATION_CREDENTIALS to the file name\n'
         ' 5. Run this command.\n\n'
         'If the GCS bucket being written to is owned by a different project then'
         ' the project that you enabled the API on, then you must also grant the'
         ' "service-<project-id>@gcp-sa-cloudasset.iam.gserviceaccount.com" account'
-        ' objectAdmin privleges to the bucket:\n'
-        ' gsutil iam ch serviceAccount:service-<project-id>@gcp-sa-cloudasset.iam.gserviceaccount.com:objectAdmin gs://<bucket>\n'
+        ' objectAdmin privileges to the bucket:\n'
+        'gsutil iam ch serviceAccount:service-<project-id>@gcp-sa-cloudasset.iam.gserviceaccount.com:objectAdmin '
+        'gs://<bucket>\n'
         '\n\n')
     ap.add_argument(
         '--parent',
@@ -172,7 +174,7 @@ def content_types_argument(string):

     ap.add_argument(
         '--asset-types',
-        help=('Comma seprated list of asset types to export such as '
+        help=('Comma separated list of asset types to export such as '
               '"google.compute.Firewall,google.compute.HealthCheck"'
               ' default is `*` for everything'),
         type=lambda x: [y.strip() for y in x.split(',')],