Search data assets with Data Catalog

This document explains how you can use Data Catalog to perform a search of data assets.

Data assets that you can search for include the following:

  • Analytics Hub linked datasets
  • BigQuery datasets, tables, views, and models
  • Bigtable instances, clusters, and tables (including column family details)
  • Data Catalog tag templates, entry groups, and custom entries
  • Dataplex lakes, zones, tables, and filesets
  • Dataproc Metastore services, databases, and tables
  • Pub/Sub data streams
  • Spanner instances, databases, tables, and views
  • Vertex AI Models, Datasets, and Vertex AI Feature Store resources
  • Assets in enterprise data silos connected to Data Catalog

Search scope

Search results can vary based on your permissions: Data Catalog scopes search results according to your role.

You can review the different IAM roles and permissions available for Data Catalog.

For example, if you have BigQuery metadata read access to an object, that object appears in your Data Catalog search results. The following list describes the minimum permissions required:

  • To search for a table, you need bigquery.tables.get permission for that table.

  • To search for a dataset, you need bigquery.datasets.get permission for that dataset.

  • To search for metadata for a dataset or a table, you need the roles/bigquery.metadataViewer role.

  • To search for all resources within a project or organization, you need the datacatalog.catalogs.searchAll permission. This permission applies to all resources, regardless of the source system.

If you have access to a BigQuery table but not to the dataset containing that table, the table still shows up as expected in Data Catalog search results. The same access logic applies to all supported systems, such as Pub/Sub and Data Catalog itself.

Data Catalog search queries don't guarantee full recall. Results that match your query might not be returned, even in subsequent result pages. Additionally, the results that are (and aren't) returned can vary when you repeat a search query.

If you are experiencing recall issues and you don't have to fetch the results in any specific order, consider setting the orderBy parameter to default when calling the catalog.search method.
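
For example, with the Python client library, a minimal sketch of such a call might look like the following (the project ID and query are placeholders):

    from google.cloud import datacatalog_v1

    datacatalog = datacatalog_v1.DataCatalogClient()

    scope = datacatalog_v1.SearchCatalogRequest.Scope()
    scope.include_project_ids.append("my-project-id")  # placeholder project ID

    # Set orderBy to "default", as recommended above, to reduce recall issues.
    request = datacatalog_v1.SearchCatalogRequest(
        scope=scope,
        query="trips",
        order_by="default",
    )

    for result in datacatalog.search_catalog(request=request):
        print(result.relative_resource_name)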

Use the admin_search flag

Using the admin_search flag on the search request ensures full recall. Administrator search requires the datacatalog.catalogs.searchAll permission on all projects and organizations in the search scope. When admin_search is used, only the default orderBy is allowed.
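
As a sketch with the Python client library, you might set this flag on the request as follows (the project ID is a placeholder, and the caller needs the permission described above):

    from google.cloud import datacatalog_v1

    datacatalog = datacatalog_v1.DataCatalogClient()

    scope = datacatalog_v1.SearchCatalogRequest.Scope()
    scope.include_project_ids.append("my-project-id")  # placeholder project ID

    # Administrator search: requires datacatalog.catalogs.searchAll on every
    # project and organization in the scope; only the default ordering is allowed.
    request = datacatalog_v1.SearchCatalogRequest(
        scope=scope,
        query="trips",
        admin_search=True,
    )

    for result in datacatalog.search_catalog(request=request):
        print(result.relative_resource_name)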

Date-sharded tables

Data Catalog aggregates date-sharded tables into a single logical entry. This entry has the same schema as the table shard with the most recent date, and contains aggregate information about the total number of shards. The entry derives its access level from the dataset it belongs to. Data Catalog search only shows these logical entries if the user has access to the dataset that contains them. Individual date-sharded tables are not visible in Data Catalog search, even though they are present in Data Catalog and can be tagged.

Filters

Filters let you narrow down search results. All filters are grouped in sections:

  • Scope to limit search to starred items only.
  • Systems such as BigQuery, Pub/Sub, Dataplex, Dataproc Metastore, custom systems, Vertex AI, and Data Catalog itself. The Data Catalog system contains filesets and custom entries.
  • Lakes and zones come from Dataplex.
  • Data types such as data streams, datasets, lakes, zones, filesets, models, tables, views, services, databases, and custom types.
  • Projects lists all projects available to you.
  • Tags lists all tag templates (and their individual fields) available to you.
  • Datasets come from BigQuery and Vertex AI.
  • Public datasets are publicly available data from BigQuery.

You can combine filters from multiple sections to find assets that match at least one condition from every selected section. Multiple filters selected within a single section are evaluated using the "OR" logical operator. For example, given the following filter combination:

Tag value filter panel with multiple sections selected.

Data Catalog looks for:

  • BigQuery datasets tagged with MyTemplate1 template.

  • BigQuery datasets tagged with MyTemplate2 template.

  • BigQuery tables tagged with MyTemplate1 template.

  • BigQuery tables tagged with MyTemplate2 template.

Filter by tag value

The Tags filters let you query for assets tagged using a specific template. You can use the Customize menu to further refine results and filter by specific tag values. The tag value filter conditions depend on that tag field's data type. For example, for datetime and number fields you can specify a specific date or a range.
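
As a sketch, for string fields a roughly equivalent search expression uses the tag: keyword described later in this document (the project, template, and field names here are hypothetical):

    tag:my-project.data_governance.data_owner:analytics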

Filters visibility

The filters displayed in each section depend on the current query in the Search box. The full set of search results might include entries that match the current query even though the filters corresponding to those entries are not shown on the Filters panel.

How to search for data assets

Console

  1. To launch a Dataplex search query in the Google Cloud console, go to the Dataplex Search page.

    Go to Dataplex Search

  2. For Choose search platform, select Data Catalog as the search mode.

  3. In the search field, enter your query or use the Filters panel to refine the search parameters.

You can manually add the following filters:

  • In Projects, add a project filter: click the ADD PROJECT button, search for a specific project, select it, and click OPEN.
  • In Tags, add a tag template filter: click the Add more tag templates drop-down, search for a specific template, select it, and click OK.

Additionally, you can:

  • Check Include public datasets to search for data assets publicly available in Google Cloud in addition to the assets available to you.

Search example

For example, to search for the trips table that you set up in Configure tag templates, tags, overviews, and data stewards:

  1. Enter trips in the search field and click Search.
  2. Select BigQuery from the Systems section to exclude data assets with the same name that belong to other systems.
  3. Select your project ID from the Projects section to exclude data assets from other projects. If your project is not shown in the section, click ADD PROJECT and select it in the dialog window.
  4. Select the Demo Tag Template from the Tag templates section to see if a tag that uses this template is attached to the trips table. If this template is not shown in the section, click the Add more tags drop-down, find and select it, and click OK.

With all the selected filters, the search results contain only one entry—the BigQuery trips table in your project with an attached tag that uses the Demo Tag Template.

Additionally, you can do the following:

  • Filter your search by adding a keyword:value to your search terms in the search field:

    Keyword        Description
    name:          Match data asset name
    column:        Match column name or nested column name
    description:   Match table description

  • Perform a tag search by adding one of the following tag keyword prefixes to your search terms in the search field:

    Tag                                            Description
    tag:project-name.tag_template_name            Match tag name
    tag:project-name.tag_template_name.key        Match a tag key
    tag:project-name.tag_template_name.key:value  Match tag key:string value pair
  • Search expression tips

    • Enclose your search expression in quotes ("search terms") if it contains spaces.

    • You can precede a keyword with "NOT" (all caps required) to match the logical negation of the keyword:term filter. You can also use the "AND" and "OR" boolean operators (all caps required) to combine search expressions.

      For example, NOT column:term lists all columns except those that match the specified term. For a list of keywords and other terms you can use in a Data Catalog search expression, see Data Catalog search syntax. A few combined examples follow.
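
    Combining these keywords and operators, queries like the following are possible (the project, template, and field names here are hypothetical):

        name:trips AND column:fare_amount
        description:taxi OR description:rideshare
        tag:my-project.data_governance.data_owner:analytics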

    Java

    Before trying this sample, follow the Java setup instructions in the Data Catalog quickstart using client libraries. For more information, see the Data Catalog Java API reference documentation.

    To authenticate to Data Catalog, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

    import com.google.cloud.datacatalog.v1.DataCatalogClient;
    import com.google.cloud.datacatalog.v1.DataCatalogClient.SearchCatalogPagedResponse;
    import com.google.cloud.datacatalog.v1.SearchCatalogRequest;
    import com.google.cloud.datacatalog.v1.SearchCatalogRequest.Scope;
    import com.google.cloud.datacatalog.v1.SearchCatalogResult;
    import java.io.IOException;
    
    // Sample to search catalog
    public class SearchAssets {
    
      public static void main(String[] args) throws IOException {
        // TODO(developer): Replace these variables before running the sample.
        String projectId = "my-project-id";
        String query = "type=dataset";
        searchCatalog(projectId, query);
      }
    
      public static void searchCatalog(String projectId, String query) throws IOException {
        // Create a scope object setting search boundaries to the given organization.
        // Scope scope = Scope.newBuilder().addIncludeOrgIds(orgId).build();
    
        // Alternatively, search using project scopes.
        Scope scope = Scope.newBuilder().addIncludeProjectIds(projectId).build();
    
        // Initialize client that will be used to send requests. This client only needs to be created
        // once, and can be reused for multiple requests. After completing all of your requests, call
        // the "close" method on the client to safely clean up any remaining background resources.
        try (DataCatalogClient dataCatalogClient = DataCatalogClient.create()) {
          // Search the catalog.
          SearchCatalogRequest searchCatalogRequest =
              SearchCatalogRequest.newBuilder().setScope(scope).setQuery(query).build();
          SearchCatalogPagedResponse response = dataCatalogClient.searchCatalog(searchCatalogRequest);
    
          System.out.println("Search results:");
          for (SearchCatalogResult result : response.iterateAll()) {
            System.out.println(result);
          }
        }
      }
    }

    Node.js

    Before trying this sample, follow the Node.js setup instructions in the Data Catalog quickstart using client libraries. For more information, see the Data Catalog Node.js API reference documentation.

    To authenticate to Data Catalog, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

    // Import the Google Cloud client library.
    const {DataCatalogClient} = require('@google-cloud/datacatalog').v1;
    const datacatalog = new DataCatalogClient();
    
    async function searchAssets() {
      // Search data assets.
    
      /**
       * TODO(developer): Uncomment the following lines before running the sample.
       */
      // const projectId = 'my_project'; // Google Cloud Platform project
    
      // Set custom query.
      const query = 'type=lake';
    
      // Create request.
      const scope = {
        includeProjectIds: [projectId],
        // Alternatively, search using Google Cloud Organization scopes.
        // includeOrgIds: [organizationId],
      };
    
      const request = {
        scope: scope,
        query: query,
      };
    
      const [result] = await datacatalog.searchCatalog(request);
    
      console.log(`Found ${result.length} entries in project ${projectId}.`);
      console.log('Entries:');
      result.forEach(entry => {
        console.log(entry.relativeResourceName);
      });
    }
    searchAssets();

    Python

    Before trying this sample, follow the Python setup instructions in the Data Catalog quickstart using client libraries. For more information, see the Data Catalog Python API reference documentation.

    To authenticate to Data Catalog, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

    from google.cloud import datacatalog_v1
    
    datacatalog = datacatalog_v1.DataCatalogClient()
    
    # TODO: Set these values before running the sample.
    project_id = "project_id"
    
    # Set custom query.
    search_string = "type=dataset"
    scope = datacatalog_v1.types.SearchCatalogRequest.Scope()
    scope.include_project_ids.append(project_id)
    
    # Alternatively, search using organization scopes.
    # scope.include_org_ids.append("my_organization_id")
    
    search_results = datacatalog.search_catalog(scope=scope, query=search_string)
    
    print("Results in project:")
    for result in search_results:
        print(result)

    REST

    If you do not have access to Cloud Client libraries for your language or want to test the API using REST requests, see the following examples and refer to the Data Catalog REST API documentation.

    1. Search catalog.

    Before using any of the request data, make the following replacements:

    • organization-id: GCP organization ID
    • project-id: GCP project ID

    HTTP method and URL:

    POST https://datacatalog.googleapis.com/v1/catalog:search

    Request JSON body:

    {
      "query":"trips",
      "scope":{
        "includeOrgIds":[
          "organization-id"
        ]
      }
    }
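
    Alternatively, to scope the search to a single project instead of an organization, the request body can use includeProjectIds (a sketch; replace project-id with your project ID):

    {
      "query":"trips",
      "scope":{
        "includeProjectIds":[
          "project-id"
        ]
      }
    }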
    

    Send the request using an HTTP client such as curl.

    You should receive a JSON response similar to the following:

    {
      "results":[
        {
          "searchResultType":"ENTRY",
          "searchResultSubtype":"entry.table",
    "relativeResourceName":"projects/project-id/locations/US/entryGroups/@bigquery/entries/entry1-id",
          "linkedResource":"//bigquery.googleapis.com/projects/project-id/datasets/demo_dataset/tables/taxi_trips"
        },
        {
          "searchResultType":"ENTRY",
          "searchResultSubtype":"entry.table",
          "relativeResourceName":"projects/project-id/locations/US/entryGroups/@bigquery/entries/entry2-id",
          "linkedResource":"//bigquery.googleapis.com/projects/project-id/datasets/demo_dataset/tables/tlc_yellow_trips_2018"
        }
      ]
    }
    

    View table details

    In the Google Cloud console, you can use Data Catalog to view table details.

    1. Go to the Dataplex search page.

      Go to Data Catalog

    2. For Choose search platform, select Data Catalog as the search mode.

    3. In the search box, enter the name of a dataset that has a table.

      For example, if you completed the Quickstart, you can search for demo-dataset and select the trips table.

    4. Click the table.

      A BigQuery table details page opens.

    The table details include the following sections:

    • BigQuery table details. Includes information such as the time of creation, time of last modification, time of expiration, resource URLs, and labels.

    • Tags. Lists the applied tags. You can edit the tags from this page and view the tag template by clicking the Actions icon.

    • Schema and column tags. Lists the table schema and the applied column-level tags with their values.

    Star your favorite entries and search for them

    If you frequently browse the same data assets, you can include their entries in a personalized list by marking them with stars. To do this in the Dataplex UI:

    1. Go to the Dataplex search page.

      Go to Data Catalog

    2. For Choose search platform, select Data Catalog as the search mode.

    3. Find your asset, and then star its entry in one of two ways:

      • Click the star icon next to the entry in the search results.
      • Click the entry name to open its details page and click the STAR button on the action bar at the top.

    You can star up to 200 entries.

    Starred entries appear in the Starred Entries list on the search page before you enter a search query in the search bar. This list is visible only to you.

    To search for only starred entries, select the Scope > Starred option on the Filters panel.

    You can also use the corresponding Data Catalog API methods to star and unstar entries. When searching for assets, use the starredOnly parameter in the scope object. See the catalog.search method.
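
    For example, with the Python client library, a minimal sketch of a starred-only search might look like the following (the project ID and query are placeholders):

    from google.cloud import datacatalog_v1

    datacatalog = datacatalog_v1.DataCatalogClient()

    # Restrict the search scope to entries that you have starred.
    scope = datacatalog_v1.SearchCatalogRequest.Scope()
    scope.include_project_ids.append("my-project-id")  # placeholder project ID
    scope.starred_only = True

    for result in datacatalog.search_catalog(scope=scope, query="trips"):
        print(result.relative_resource_name)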