Export and import using SQL dump files

This page describes how to export data from and import data into Cloud SQL instances using SQL dump files.

Before you begin

Exports use database resources, but they do not interfere with normal database operations unless the instance is under-provisioned.

For best practices, see Best Practices for Importing and Exporting Data.

After completing an import operation, verify the results.

Export data from Cloud SQL for MySQL

Required roles and permissions for exporting from Cloud SQL for MySQL

To export data from Cloud SQL into Cloud Storage, the user initiating the export must have one of the following roles:

  • The Cloud SQL Editor role
  • A custom role, including the following permissions:
    • cloudsql.instances.get
    • cloudsql.instances.export

Additionally, the service account for the Cloud SQL instance must have one of the following roles:

  • The storage.objectAdmin Identity and Access Management (IAM) role
  • A custom role, including the following permissions:
    • storage.objects.create
    • storage.objects.list (for exporting files in parallel only)
    • storage.objects.delete (for exporting files in parallel only)

For help with IAM roles, see Identity and Access Management.
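
If you choose a custom role instead of storage.objectAdmin, you can create one with gcloud. The following is a minimal sketch; the role ID cloudSqlExportRole and its title are placeholders chosen for this example:

gcloud iam roles create cloudSqlExportRole --project=PROJECT_ID \
  --title="Cloud SQL export" \
  --permissions=storage.objects.create,storage.objects.list,storage.objects.delete

You can then grant the role to the instance's service account on the export bucket by referring to it as projects/PROJECT_ID/roles/cloudSqlExportRole.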

Export to a SQL dump file from Cloud SQL for MySQL

To create a SQL dump file, you export data from Cloud SQL to Cloud Storage. Once the file is in Cloud Storage, you can import it into another Cloud SQL database. You can also download the file from Cloud Storage to your local environment.

Exporting from Cloud SQL uses the mysqldump utility with the --single-transaction and --hex-blob options. With the --single-transaction option, mysqldump starts a transaction before it runs. Rather than locking the entire database, this lets mysqldump read the database in its current state, producing a consistent data dump.

If your SQL dump file contains DEFINER clauses (on views, triggers, stored procedures, and so on), then depending on the order in which these statements are executed, using this file for import could fail. Learn more about DEFINER usage and potential workarounds in Cloud SQL.
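
A commonly used workaround, shown here as an illustrative sketch rather than an official Cloud SQL procedure, is to strip DEFINER clauses from the dump file before importing it:

# Remove DEFINER=`user`@`host` clauses so that objects are created
# with the privileges of the user performing the import (GNU sed).
sed -i 's/DEFINER=`[^`]*`@`[^`]*`//g' SQL_FILE.sql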

To export data from a database on a Cloud SQL instance to a SQL dump file in a Cloud Storage bucket:

Console

  1. In the Google Cloud console, go to the Cloud SQL Instances page.

    Go to Cloud SQL Instances

  2. To open the Overview page of an instance, click the instance name.
  3. Click Export.
  4. In the File format section, click SQL to create a SQL dump file.
  5. In the Data to export section, click One or more databases in this instance to export specific databases.
  6. Use the drop-down menu to select the databases you want to export from.
  7. In the Destination section, select Browse to search for a Cloud Storage bucket or folder for your export.
  8. Click Export to begin the export.

gcloud

  1. Create a Cloud Storage bucket.
  2. Find the service account for the Cloud SQL instance you're exporting from. You can do this by running the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.
    gcloud sql instances describe INSTANCE_NAME
      
  3. Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account; a sketch of the grant (and the later revoke) appears after these steps. For help with setting IAM permissions, see Using IAM permissions.
  4. Export the database to your Cloud Storage bucket:
    gcloud sql export sql INSTANCE_NAME gs://BUCKET_NAME/sqldumpfile.gz \
    --database=DATABASE_NAME \
    --offload
      

    The export sql command does not include triggers or stored procedures, but does include views. To export triggers and stored procedures, use the mysqldump tool.

    For more information about using the export sql command, see the sql export sql command reference page.

  5. If you do not need to retain the IAM role you set previously, revoke it now.
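
The following sketch shows steps 3 and 5 together, assuming SERVICE_ACCOUNT_EMAIL is the serviceAccountEmailAddress value you found in step 2:

# Grant the role before the export:
gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
  --member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
  --role=roles/storage.objectAdmin

# Revoke the role after the export completes:
gcloud storage buckets remove-iam-policy-binding gs://BUCKET_NAME \
  --member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
  --role=roles/storage.objectAdmin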

REST v1

  1. Create a bucket for the export:
    gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_NAME --location=LOCATION_NAME

    This step is not required, but it is strongly recommended so that you do not open up access to any other data.

  2. Provide your instance with the legacyBucketWriter IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  3. Export your database:

    Before using any of the request data, make the following replacements:

    • PROJECT_ID: the project ID
    • INSTANCE_ID: the instance ID
    • BUCKET_NAME: the Cloud Storage bucket name
    • PATH_TO_DUMP_FILE: the path to the SQL dump file
    • DATABASE_NAME_1: the name of a database inside the Cloud SQL instance
    • DATABASE_NAME_2: the name of a database inside the Cloud SQL instance
    • true | false: for the offload field, set true to use serverless export, which runs the export on a temporary instance so that it doesn't affect the performance of the primary instance; otherwise, set false
  4. HTTP method and URL:

    POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export

    Request JSON body:

    {
      "exportContext":
        {
           "fileType": "SQL",
           "uri": "gs://BUCKET_NAME/PATH_TO_DUMP_FILE",
           "databases": ["DATABASE_NAME_1", "DATABASE_NAME_2"],
           "offload": true | false
         }
    }
    

    To send the request, use an HTTP client that can attach OAuth 2.0 credentials, such as curl; a sketch appears after these steps. A successful request returns a JSON response that describes the long-running export operation.

  5. If you do not need to retain the IAM role you set previously, remove it now.
For the complete list of parameters for the request, see the instances:export page.
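
The following curl sketch sends the export request, assuming the JSON body above is saved in a file named request.json (the file name is an assumption made for this example):

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d @request.json \
  "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export"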

REST v1beta4

  1. Create a bucket for the export:
    gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_NAME --location=LOCATION_NAME
    

    This step is not required, but it is strongly recommended so that you do not open up access to any other data.

  2. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  3. Export your database:

    Before using any of the request data, make the following replacements:

    • PROJECT_ID: the project ID
    • INSTANCE_ID: the instance ID
    • BUCKET_NAME: the Cloud Storage bucket name
    • PATH_TO_DUMP_FILE: the path to the SQL dump file
    • DATABASE_NAME_1: the name of a database inside the Cloud SQL instance
    • DATABASE_NAME_2: the name of a database inside the Cloud SQL instance
    • true | false: for the offload field, set true to use serverless export, which runs the export on a temporary instance so that it doesn't affect the performance of the primary instance; otherwise, set false
  4. HTTP method and URL:

    POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export

    Request JSON body:

    {
      "exportContext":
        {
           "fileType": "SQL",
           "uri": "gs://BUCKET_NAME/PATH_TO_DUMP_FILE",
           "databases": ["DATABASE_NAME_1", "DATABASE_NAME_2"],
           "offload": true | false
         }
    }
    

    To send the request, use an HTTP client such as curl, as in the REST v1 example; only the endpoint URL differs. A successful request returns a JSON response that describes the long-running export operation.

  5. If you do not need to retain the IAM role you set previously, revoke it now.
For the complete list of parameters for the request, see the instances:export page.

Export from your local MySQL server using mysqldump

If you are exporting data from an on-premises MySQL database for import into a Cloud SQL database, you must use the mysqldump utility with the following flags:

  • --databases You must use the --databases option to specify an explicit list of databases to export, and this list must not contain the mysql system database.
  • --hex-blob If your database contains any binary fields, you must use this flag to ensure that your binary fields are imported correctly.
  • --set-gtid-purged=OFF This flag ensures that GTID information is not included in the SQL dump file and that the dump file does not disable binary logging. (Not required for MySQL 5.5 or for external replication.)
  • --single-transaction Starts a transaction before running. Rather than locking the entire database, this lets mysqldump read the database in its current state, producing a consistent data dump.

From a command line, run mysqldump:

mysqldump --databases DATABASE_NAME -h INSTANCE_IP -u USERNAME -p \
--hex-blob --single-transaction --set-gtid-purged=OFF \
--default-character-set=utf8mb4 > SQL_FILE.sql

For help with mysqldump, see the mysqldump reference.
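
Because Cloud SQL can import compressed (.gz) dump files directly, you can optionally pipe the dump through gzip as you create it. A minimal sketch using the same flags as above:

mysqldump --databases DATABASE_NAME -h INSTANCE_IP -u USERNAME -p \
--hex-blob --single-transaction --set-gtid-purged=OFF \
--default-character-set=utf8mb4 | gzip > SQL_FILE.sql.gz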

External replication to Cloud SQL for MySQL

To create a dump file for use in an external server configuration, see Replicating from an external server.

Import data to Cloud SQL for MySQL

Required roles and permissions for importing to Cloud SQL for MySQL

To import data from Cloud Storage into Cloud SQL, the user initiating the import must have one of the following roles:

  • The Cloud SQL Admin role
  • A custom role, including the following permissions:
    • cloudsql.instances.get
    • cloudsql.instances.import

Additionally, the service account for the Cloud SQL instance must have one of the following roles:

  • The storage.objectAdmin IAM role
  • A custom role, including the following permissions:
    • storage.objects.get
    • storage.objects.list (for importing files in parallel only)

For help with IAM roles, see Identity and Access Management.
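
As with the export example earlier, you can create a custom role that carries the import-side permissions; the role ID cloudSqlImportRole is again a placeholder for this sketch:

gcloud iam roles create cloudSqlImportRole --project=PROJECT_ID \
  --title="Cloud SQL import" \
  --permissions=storage.objects.get,storage.objects.list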

Import a SQL dump file to Cloud SQL for MySQL

SQL files are plain text files with a sequence of SQL commands.

Console

  1. In the Google Cloud console, go to the Cloud SQL Instances page.

    Go to Cloud SQL Instances

  2. To open the Overview page of an instance, click the instance name.
  3. Click Import.
  4. In the Choose the file you'd like to import data from section, enter the path to the bucket and SQL dump file to use for the import, or browse to an existing file.

    You can import a compressed (.gz) or an uncompressed (.sql) file.

  5. For Format, select SQL.
  6. Select the database you want the data to be imported into.

    This causes Cloud SQL to run the USE DATABASE statement before the import.

  7. If you want to specify a user to perform the import, select the user.

    If your import file contains statements that must be performed by a specific user, use this field to specify that user.

  8. Click Import to start the import.

gcloud

  1. Create a Cloud Storage bucket.
  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Describe the instance you are importing to:
    gcloud sql instances describe INSTANCE_NAME
  4. Copy the serviceAccountEmailAddress field.
  5. Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the instance's service account on the bucket:
    gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
      --member=serviceAccount:SERVICE-ACCOUNT \
      --role=roles/storage.objectAdmin
      
    For help with setting IAM permissions, see Using IAM permissions.
  6. Import the database:
    gcloud sql import sql INSTANCE_NAME gs://BUCKET_NAME/IMPORT_FILE_NAME \
    --database=DATABASE_NAME

    For information about using the import sql command, see the sql import sql command reference page.

    If the command returns an error like ERROR_RDBMS, review the permissions; this error is often caused by insufficient permissions.

  7. If you do not need to retain the IAM permissions you set previously, remove them using gcloud storage buckets remove-iam-policy-binding.
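
The import sql command also accepts a compressed (.gz) dump file directly, and you can name the database user that performs the import. A sketch, assuming the --user flag in current gcloud releases:

gcloud sql import sql INSTANCE_NAME gs://BUCKET_NAME/IMPORT_FILE_NAME.gz \
  --database=DATABASE_NAME \
  --user=USERNAME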

REST v1

  1. Create a SQL dump file. The linked instructions set certain flags that make the dump file compatible with Cloud SQL.

    • If you're importing data from an on-premises MySQL server, create the SQL dump file as described in Export from your local MySQL server using mysqldump.
    • If you're importing data from another Cloud SQL instance, see the instructions in Export to a SQL dump file from Cloud SQL for MySQL.
  2. Create a Cloud Storage bucket.
  3. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  4. Provide your instance with the legacyBucketWriter and objectViewer IAM roles for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  5. Import your dump file:

    Before using any of the request data, make the following replacements:

    • PROJECT_ID: the project ID
    • INSTANCE_ID: the instance ID
    • BUCKET_NAME: the Cloud Storage bucket name
    • PATH_TO_SQL_FILE: the path to the SQL file
    • DATABASE_NAME: the name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import

    Request JSON body:

    {
      "importContext":
        {
           "fileType": "SQL",
           "uri": "gs://BUCKET_NAME/PATH_TO_SQL_FILE",
           "database": "DATABASE_NAME"
         }
    }
    
    

    To send the request, use an HTTP client such as curl; a sketch appears after these steps. A successful request returns a JSON response that describes the long-running import operation.

    For the complete list of parameters for the request, see the instances:import page.
  6. If you do not need to retain the IAM permissions you set previously, remove them now.
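
A curl sketch for sending the import request, again assuming the JSON body above is saved as request.json (an example file name):

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d @request.json \
  "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"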

REST v1beta4

  1. Create a SQL dump file. The linked instructions set certain flags that make the dump file compatible with Cloud SQL.

    • If you're importing data from an on-premises MySQL server, create the SQL dump file as described in Export from your local MySQL server using mysqldump.
    • If you're importing data from another Cloud SQL instance, see the instructions in Export to a SQL dump file from Cloud SQL for MySQL.
  2. Create a Cloud Storage bucket.
  3. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  4. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  5. Import your dump file:

    Before using any of the request data, make the following replacements:

    • PROJECT_ID: the project ID
    • INSTANCE_ID: the instance ID
    • BUCKET_NAME: the Cloud Storage bucket name
    • PATH_TO_SQL_FILE: the path to the SQL file
    • DATABASE_NAME: the name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import

    Request JSON body:

    {
      "importContext":
        {
           "fileType": "SQL",
           "uri": "gs://BUCKET_NAME/PATH_TO_SQL_FILE",
           "database": "DATABASE_NAME"
         }
    }
    
    

    To send the request, use an HTTP client such as curl, as in the REST v1 example; only the endpoint URL differs. A successful request returns a JSON response that describes the long-running import operation.

    For the complete list of parameters for the request, see the instances:import page.
  6. If you do not need to retain the IAM permissions you set previously, remove them now.

What's next