Add Google Cloud Platform support #13598
Until Go modules are fully supported in Beats, Functionbeat on GCP uses vendoring. I have applied the required patches; it should not change the code a lot. Packaging still needs some adjustments to include the vendor folder.
Only compile with TLS 1.3 support when go1.13 or newer is used. If an older Go version is used, we stick with TLS 1.2 as the maximum. The change also introduces TLSVersionMin/Max and TLSVersionDefaultMin/Max constants to keep the tests intact.
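As a sketch of how such version gating typically looks (this is an illustrative stand-alone file, not the actual beats layout; the `TLSVersion` type and constant names mirror the ones discussed here):

```go
package main

import (
	"crypto/tls"
	"fmt"
)

// TLSVersion mirrors the named uint16 type used for TLS version config.
type TLSVersion uint16

// Pre-go1.13 bounds: TLS 1.2 is the highest supported version. A second
// file guarded by a `// +build go1.13` constraint would instead set the
// max constants to tls.VersionTLS13.
const (
	TLSVersionMin        TLSVersion = tls.VersionSSL30
	TLSVersionMax        TLSVersion = tls.VersionTLS12
	TLSVersionDefaultMin TLSVersion = tls.VersionTLS10
	TLSVersionDefaultMax TLSVersion = tls.VersionTLS12
)

func main() {
	fmt.Printf("supported range: %#x-%#x\n", uint16(TLSVersionMin), uint16(TLSVersionMax))
}
```

Keeping both the absolute and the default min/max as constants lets tests assert the compiled-in ceiling without hard-coding which Go version built the binary.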
```go
import "crypto/tls"

const (
	TLSVersionSSL30 TLSVersion = tls.VersionSSL30
```

> exported const TLSVersionSSL30 should have comment (or a comment on this block) or be unexported
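For reference, this lint finding is typically resolved by documenting the const block; a minimal sketch (the type and the second constant are stand-ins for illustration):

```go
package main

import (
	"crypto/tls"
	"fmt"
)

// TLSVersion is a stand-in for the exported version type under review.
type TLSVersion uint16

// Constants for the supported TLS versions, mapped to the values used by
// crypto/tls. A doc comment on the block (or on each exported constant)
// satisfies the "exported const should have comment" rule.
const (
	TLSVersionSSL30 TLSVersion = tls.VersionSSL30
	TLSVersionTLS12 TLSVersion = tls.VersionTLS12
)

func main() {
	fmt.Printf("SSL 3.0 = %#x, TLS 1.2 = %#x\n", uint16(TLSVersionSSL30), uint16(TLSVersionTLS12))
}
```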
Failing tests are unrelated.
This PR introduces support for Google Cloud Platform to Functionbeat. This branch is located in the `elastic/beats` repository, so anyone on our team has access to it.

### Manager

#### Authentication

To use the API to deploy, remove and update functions, users need to set the environment variable `GOOGLE_APPLICATION_CREDENTIALS`. This variable should point to a JSON file which contains all the relevant information for Google to authenticate. (About authentication for GCP libs: https://cloud.google.com/docs/authentication/getting-started)

#### Required roles

* Cloud Functions Developer
* Cloud Functions Service Agent
* Service Account User
* Storage Admin
* Storage Object Admin

Note: the Cloud Functions Developer role is in beta. We should not make GCP support GA until it becomes stable.

#### Configuration

```yaml
# Configure functions to run on Google Cloud Platform. Currently, we assume that the credentials
# are present in the environment to correctly create the function when using the CLI.
#
# Configure which region your project is located in.
functionbeat.provider.gcp.location_id: "europe-west1"
# Configure which Google Cloud project to deploy your functions to.
functionbeat.provider.gcp.project_id: "my-project-123456"
# Configure the Google Cloud Storage bucket the function artifact should be uploaded to.
functionbeat.provider.gcp.storage_name: "functionbeat-deploy"
functionbeat.provider.gcp.functions:
```

#### Export

Function templates can be exported into YAML. With this YAML configuration, users can deploy the function using the [Google Cloud Deployment Manager](https://cloud.google.com/deployment-manager/).

### New functions

#### Google Pub/Sub

A function under the folder `pkg/pubsub` is available to get events from Google Pub/Sub.

##### Configuration

```yaml
# Define the list of available functions; each function is required to have a unique name.
# Create a function that accepts events coming from Google Pub/Sub.
- name: pubsub
  enabled: false
  type: pubsub

  # Description of the method to help identify it when you run multiple functions.
  description: "Google Cloud Function for Pub/Sub"

  # The maximum memory allocated for this function; the configured size must be a factor of 64.
  # Default is 256MiB.
  #memory_size: 256MiB

  # Execution timeout in seconds. If the function does not finish in time,
  # it is considered failed and terminated. Default is 60s.
  #timeout: 60s

  # Email of the service account of the function. Defaults to {projectid}@appspot.gserviceaccount.com
  #service_account_email: {projectid}@appspot.gserviceaccount.com

  # Labels of the function.
  #labels:
  #  mylabel: label

  # VPC Connector this function can connect to.
  # Format: projects/*/locations/*/connectors/* or fully-qualified URI
  #vpc_connector: ""

  # Number of maximum instances running at the same time. Default is unlimited.
  #maximum_instances: 0

  trigger:
    event_type: "providers/cloud.pubsub/eventTypes/topic.publish"
    resource: "projects/_/pubsub/myPubSub"
    #service: "pubsub.googleapis.com"

  # Optional fields that you can specify to add additional information to the
  # output. Fields can be scalar values, arrays, dictionaries, or any nested
  # combination of these.
  #fields:
  #  env: staging

  # Define custom processors for this function.
  #processors:
  #  - dissect:
  #      tokenizer: "%{key1} %{key2}"
```

#### Google Cloud Storage

A function under the folder `pkg/storage` is available to get events from Google Cloud Storage.

##### Configuration

```yaml
# Create a function that accepts events coming from Google Cloud Storage.
- name: storage
  enabled: false
  type: storage

  # Description of the method to help identify it when you run multiple functions.
  description: "Google Cloud Function for Cloud Storage"

  # The maximum memory allocated for this function; the configured size must be a factor of 64.
  # Default is 256MiB.
  #memory_size: 256MiB

  # Execution timeout in seconds. If the function does not finish in time,
  # it is considered failed and terminated. Default is 60s.
  #timeout: 60s

  # Email of the service account of the function. Defaults to {projectid}@appspot.gserviceaccount.com
  #service_account_email: {projectid}@appspot.gserviceaccount.com

  # Labels of the function.
  #labels:
  #  mylabel: label

  # VPC Connector this function can connect to.
  # Format: projects/*/locations/*/connectors/* or fully-qualified URI
  #vpc_connector: ""

  # Number of maximum instances running at the same time. Default is unlimited.
  #maximum_instances: 0

  # Optional fields that you can specify to add additional information to the
  # output. Fields can be scalar values, arrays, dictionaries, or any nested
  # combination of these.
  #fields:
  #  env: staging

  # Define custom processors for this function.
  #processors:
  #  - dissect:
  #      tokenizer: "%{key1} %{key2}"
```

### Vendor

* `cloud.google.com/go/functions/metadata`
* `cloud.google.com/go/storage`

(cherry picked from commit e8e18d0, with conflicts in `vendor/vendor.json`)
Opened a PR to fix the indexing issue for storage: #16000
### How to test

#### Manager

Make sure you have a GCP account with the required roles listed in the PR.

#### Package

Functionbeat is able to zip the functions which can be deployed to cloud providers. Make sure all three zip files are generated and the `--output` flag is applied correctly.

Expected packages:

* `package-aws.zip`
* `package-gcp-pubsub.zip`
* `package-gcp-storage.zip`
#### Deployment

Download a credentials file for your user. Pass the path to Functionbeat as an environment variable:

`export GOOGLE_APPLICATION_CREDENTIALS=/path/to/file.json`
##### General

Adjust the following general settings for GCP:

* `location_id`: where your function is going to be deployed. (Available locations: https://cloud.google.com/about/locations/)
* `project_id`: the ID of your project.
* `storage_name`: the name of the storage bucket to upload the function to. If it does not exist, it will be created, given you have the appropriate role.

##### Pub/Sub function
With the following minimal configuration, you can deploy a function which is triggered by new Pub/Sub events. Set the option `trigger.resource` to the ID of your Pub/Sub topic.

Besides deploying a function with a minimal config, try to experiment with the other available options to see if all of them are working.
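A minimal configuration for this test, trimmed down from the full example in the PR description (`myPubSub` is the placeholder topic from that example; substitute your own topic ID):

```yaml
functionbeat.provider.gcp.location_id: "europe-west1"
functionbeat.provider.gcp.project_id: "my-project-123456"
functionbeat.provider.gcp.storage_name: "functionbeat-deploy"
functionbeat.provider.gcp.functions:
  - name: pubsub
    enabled: true
    type: pubsub
    trigger:
      event_type: "providers/cloud.pubsub/eventTypes/topic.publish"
      resource: "projects/_/pubsub/myPubSub"
```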
Run Functionbeat's `deploy` subcommand to deploy the function. The deployed function shows up in the functions list if everything went correctly.

To trigger the function, go to the configured Pub/Sub topic and publish a message. The logs of the function invocation can be found under Stackdriver/Logging/Logs Viewer.
##### Storage

Configure a trigger for the storage function under `trigger.resource`, just like in the case of the Pub/Sub trigger. Furthermore, you can configure multiple trigger types for the function as `event_type`:

* `google.storage.object.finalize`
* `google.storage.object.delete`
* `google.storage.object.archive`
* `google.storage.object.metadataUpdate`

The function can be triggered by creating/deleting/archiving objects in the bucket or by any metadata change. Logs are located under the same log viewer as in the case of the Pub/Sub function.
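To summarize the trigger matrix above, a small illustrative sketch (the operation descriptions follow standard Cloud Storage trigger semantics and are not taken verbatim from this PR):

```go
package main

import "fmt"

// storageEvents maps the event_type values accepted by the storage function
// to the bucket operation that fires each one.
var storageEvents = map[string]string{
	"google.storage.object.finalize":       "a new object (or a new version of an object) is written",
	"google.storage.object.delete":         "an object is deleted",
	"google.storage.object.archive":        "an object version is archived",
	"google.storage.object.metadataUpdate": "an object's metadata changes",
}

func main() {
	for event, cause := range storageEvents {
		fmt.Printf("%s -> fires when %s\n", event, cause)
	}
}
```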