Route Cloud Dataplex events to Cloud Run

An Eventarc trigger declares your interest in a certain event or set of events. You can configure event routing by specifying filters for the trigger, including the event source, and the target Cloud Run service.

Eventarc delivers events to the event receiver in a CloudEvents format through an HTTP request.
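
For example, a Cloud Run event receiver sees an HTTP POST request whose CloudEvents attributes arrive as ce-* headers (per the CloudEvents HTTP binding). The following sketch is illustrative only; the exact attribute values depend on the event type and resource:

  POST / HTTP/1.1
  Content-Type: application/json
  ce-id: 1234567890
  ce-specversion: 1.0
  ce-type: google.cloud.dataplex.dataTaxonomy.v1.updated
  ce-source: //dataplex.googleapis.com/projects/my-project/locations/us-central1
  ce-time: 2025-06-01T12:00:00Z

  { ...JSON event payload... }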

These instructions show you how to configure event routing so that a direct Cloud Dataplex event triggers your Cloud Run service. For more details, see the list of supported direct events.

Prepare to create a trigger

Before you create a trigger, complete these prerequisites:

Console

  1. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  2. Enable the Cloud Logging, Eventarc, and Eventarc Publishing APIs.

    Enable the APIs

  3. If applicable, enable the API related to the direct events. For example, for Cloud Dataplex events, enable the Cloud Dataplex API.

  4. If you don't already have one, create a user-managed service account, then grant it the roles and permissions necessary so that Eventarc can manage events for your target service.

    1. In the Google Cloud console, go to the Create service account page.

      Go to Create service account

    2. Select your project.

    3. In the Service account name field, enter a name. The Google Cloud console fills in the Service account ID field based on this name.

      In the Service account description field, enter a description. For example, Service account for event trigger.

    4. Click Create and continue.

    5. To provide appropriate access, in the Select a role list, select the required Identity and Access Management (IAM) roles to grant to your service account for authenticated or unauthenticated invocations. For more information, see Roles and permissions for Cloud Run targets.

      For additional roles, click Add another role and add each additional role.

    6. Click Continue.

    7. To finish creating the account, click Done.

gcloud

  1. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

    At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.

  2. Enable the Cloud Logging, Eventarc, and Eventarc Publishing APIs.

    gcloud services enable logging.googleapis.com \
      eventarc.googleapis.com \
      eventarcpublishing.googleapis.com
    
  3. If applicable, enable the API related to the direct events. For example, for Cloud Dataplex events, enable dataplex.googleapis.com.
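
    For example, you can enable the Cloud Dataplex API by running the following command:

    gcloud services enable dataplex.googleapis.com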

  4. If you don't already have one, create a user-managed service account, then grant it the roles and permissions necessary so that Eventarc can manage events for your target service.

    1. Create the service account:

      gcloud iam service-accounts create SERVICE_ACCOUNT_NAME
      

      Replace SERVICE_ACCOUNT_NAME with the name of the service account. It must be between 6 and 30 characters, and can contain lowercase alphanumeric characters and dashes. After you create a service account, you cannot change its name.

    2. Grant the required Identity and Access Management (IAM) roles or permissions for authenticated or unauthenticated invocations. For more information, see Roles and permissions for Cloud Run targets.
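
      For example, if your Cloud Run service requires authenticated invocations, one option is to grant the Cloud Run Invoker role (roles/run.invoker) to the service account at the project level; adjust the roles to match the guidance in Roles and permissions for Cloud Run targets:

      gcloud projects add-iam-policy-binding PROJECT_ID \
          --member="serviceAccount:SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com" \
          --role="roles/run.invoker"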

Create a trigger

You can create an Eventarc trigger using the Google Cloud CLI or through the Google Cloud console.

Console

  1. In the Google Cloud console, go to the Eventarc Triggers page.

    Go to Triggers

  2. Click Create trigger.
  3. Type a Trigger name.

    This is the ID of the trigger and it must start with a letter. It can contain up to 63 lowercase letters, numbers, or hyphens.

  4. For the Trigger type, select Google sources.
  5. In the Event provider list, select Cloud Dataplex.

    Note that the provider name shown in the console might include a Cloud or Google Cloud prefix that the provider's documentation doesn't use. For example, the console refers to Memorystore for Redis as Google Cloud Memorystore for Redis.

  6. In the Event type list, from the list of direct events, select an event type.
  7. To specify the encoding of the event payload, in the Event data content type list, select application/json or application/protobuf.

    Note that an event payload formatted in JSON is larger than one formatted in Protobuf. This might impact reliability depending on your event destination and its limits on event size. For more information, see Known issues.

  8. In the Region list, select the same region as the Google Cloud service that is generating events.

    For more information, see Eventarc locations.

  9. If applicable to the event provider, click Add filter and specify the following:
    1. In the Attribute 1 field, depending on the direct event you chose, select a resource ID that can act as an event filter.
    2. Select an operator.
    3. In the Attribute value 1 field, depending on the operator that you chose, type the exact value or apply a path pattern.
    4. If more attribute filters are applicable, click Add filter and specify the appropriate values.
  10. Select the Service account that will invoke your service or workflow.

    Or, you can create a new service account.

    This specifies the Identity and Access Management (IAM) service account email associated with the trigger and to which you previously granted specific roles required by Eventarc.

  11. In the Event destination list, select Cloud Run.
  12. Select a service.

    This is the name of the service that receives the events for the trigger. The service must be in the same project as the trigger and will receive events as HTTP POST requests sent to its root URL path (/) whenever the event is generated.

  13. Optionally, you can specify the Service URL path to send the incoming request to.

    This is the relative path on the destination service to which the events for the trigger should be sent. For example: /, /route, route, route/subroute.

  14. Click Create.

    Note that after a trigger is created, the event source filters cannot be modified. Instead, create a new trigger and delete the old one. For more information, see Manage triggers.

gcloud

You can create a trigger by running a gcloud eventarc triggers create command along with required and optional flags.

  gcloud eventarc triggers create TRIGGER \
      --location=LOCATION \
      --destination-run-service=DESTINATION_RUN_SERVICE \
      --destination-run-region=DESTINATION_RUN_REGION \
      --event-filters="type=EVENT_FILTER_TYPE" \
      --event-filters="COLLECTION_ID=RESOURCE_ID" \
      --event-filters-path-pattern="COLLECTION_ID=PATH_PATTERN" \
      --event-data-content-type="EVENT_DATA_CONTENT_TYPE" \
      --service-account=SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com

Replace the following:

  • TRIGGER: the ID of the trigger or a fully qualified identifier.
  • LOCATION: the location of the Eventarc trigger. Alternatively, you can set the eventarc/location property; for example, gcloud config set eventarc/location us-central1.

    To avoid any performance and data residency issues, the location must match the location of the Google Cloud service that is generating events. For more information, see Eventarc locations.

  • DESTINATION_RUN_SERVICE: the name of the Cloud Run service that receives the events for the trigger. The service can be in any of the locations supported by Cloud Run and does not need to be in the same location as the trigger. However, the service must be in the same project as the trigger and will receive events as HTTP POST requests sent to its root URL path (/) whenever the event is generated.
  • DESTINATION_RUN_REGION: (optional) the region in which the destination Cloud Run service can be found. If not specified, it is assumed that the service is in the same region as the trigger.
  • EVENT_FILTER_TYPE: the identifier of the event. An event is generated when an API call for the method succeeds. For long-running operations, the event is only generated at the end of the operation, and only if the action is performed successfully. For a list of supported event types, see Google event types supported by Eventarc.
  • COLLECTION_ID: (optional) the resource component that can act as an event filter, and is one of the following:
    • asset
    • dataattributebindingid
    • dataattributeid
    • datascan
    • datataxonomy
    • datataxonomyid
    • environment
    • lake
    • task
    • zone
  • RESOURCE_ID: the identifier of the resource used as the filtering value for the associated collection. For more information, see Resource ID.
  • PATH_PATTERN: the path pattern to apply when filtering for the resource.
  • EVENT_DATA_CONTENT_TYPE: (optional) the encoding of the event payload. This can be application/json or application/protobuf. The default encoding is application/json.

    Note that an event payload formatted in JSON is larger than one formatted in Protobuf. This might impact reliability depending on your event destination and its limits on event size. For more information, see Known issues.

  • SERVICE_ACCOUNT_NAME: the name of your user-managed service account.
  • PROJECT_ID: your Google Cloud project ID.

Notes:

  • The --event-filters="type=EVENT_FILTER_TYPE" flag is required. If no other event filter is set, events for all resources are matched.
  • EVENT_FILTER_TYPE cannot be changed after creation. To change EVENT_FILTER_TYPE, create a new trigger and delete the old one.
  • Each trigger can have multiple event filters, comma delimited in one --event-filters=[ATTRIBUTE=VALUE,...] flag, or you can repeat the flag to add more filters. Only events that match all the filters are sent to the destination. Wildcards and regular expressions are not supported; however, when using the --event-filters-path-pattern flag, you can define a resource path pattern.
  • The --service-account flag is used to specify the Identity and Access Management (IAM) service account email associated with the trigger.
  • Optionally, specify a relative path on the destination Cloud Run service to which the events for the trigger should be sent by using the --destination-run-path flag.

Example:

  gcloud eventarc triggers create helloworld-trigger \
      --location=us-central1 \
      --destination-run-service=helloworld-events \
      --destination-run-region=us-central1 \
      --event-filters="type=google.cloud.dataplex.dataTaxonomy.v1.updated" \
      --event-filters-path-pattern="datataxonomyid=my-datataxonomyid-*" \
      --service-account=${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com

This command creates a trigger called helloworld-trigger for the event identified as google.cloud.dataplex.dataTaxonomy.v1.updated and matches events for datataxonomyid IDs starting with my-datataxonomyid-.
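
To verify the trigger's configuration after creating it, you can describe it:

  gcloud eventarc triggers describe helloworld-trigger \
      --location=us-central1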

Terraform

You can create a trigger for a Cloud Run destination using Terraform. For details, see Create a trigger using Terraform.

List a trigger

You can confirm the creation of a trigger by listing Eventarc triggers using the Google Cloud CLI or through the Google Cloud console.

Console

  1. In the Google Cloud console, go to the Eventarc Triggers page.

    Go to Triggers

    This page lists your triggers in all locations, and includes details such as names, regions, event providers, destinations, and more.

  2. To filter your triggers:

    1. Click Filter or the Filter triggers field.

    2. In the Properties list, select an option to filter the triggers by.

      You can select a single property or use the logical operator OR to add more properties.

  3. To sort your triggers, beside any supported column heading, click Sort.

gcloud

Run the following command to list your triggers:

  gcloud eventarc triggers list --location=-

This command lists your triggers in all locations, and includes details such as names, types, destinations, and statuses.
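
To list triggers in a specific location instead of all locations, set the --location flag to that location. For example:

  gcloud eventarc triggers list --location=us-central1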

What's next