Cloud SQL Imports and Migrations for the PCA Exam

GCP Study Hub
Ben Makansi
December 17, 2025

Cloud SQL is a fully managed relational database, but the data it holds usually starts somewhere else. For the Professional Cloud Architect exam, you need to know the four main ways data lands in Cloud SQL, the tooling that produces and consumes dump files, and the gotchas that show up in import scenarios.

The four import paths

There are four ways to get data into Cloud SQL, and each one fits a different situation.

SQL dump files are a complete logical backup of a database, including schema and data. They are the standard choice for a one-time restore or migration when you have an existing MySQL or PostgreSQL database and you want to recreate it inside Cloud SQL.
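Once the dump is staged in Cloud Storage, the import is a single command. A minimal sketch; the instance, bucket, and database names are placeholders:

```shell
# Import a SQL dump staged in Cloud Storage into a Cloud SQL instance.
# "my-instance", "my-migration-bucket", and "mydb" are hypothetical names.
gcloud sql import sql my-instance \
    gs://my-migration-bucket/mydb.sql \
    --database=mydb
```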

CSV files import cleanly into Cloud SQL; each import loads one file into a single existing table. If your source is flat tabular data, exports from a reporting system, or extracts from another database, CSV is usually the simplest path.
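Because a CSV import targets one table, the command names both the database and the destination table. A sketch with placeholder names:

```shell
# Load a CSV from Cloud Storage into an existing table.
# The table ("orders" here, a hypothetical name) must already exist
# with a schema that matches the CSV columns.
gcloud sql import csv my-instance \
    gs://my-migration-bucket/orders.csv \
    --database=mydb \
    --table=orders
```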

Replication or direct transfer covers ongoing synchronization between databases or exporting directly from an on-prem MySQL or PostgreSQL instance into Cloud SQL. This is what you reach for when downtime needs to stay small and the source needs to keep serving traffic during the cutover.

Database Migration Service is the managed option from Google. It handles the data transfer for you, minimizes downtime, and provides assessment and optimization tooling. When the exam describes a migration project where the team wants Google to do the heavy lifting, DMS is the answer.

Dump files: the tools by engine

Dump files capture a database's structure and data in a format that can be reloaded into another instance. The tooling depends on the engine.

For PostgreSQL, you generate a dump with pg_dump and reimport it with pg_restore. pg_dump supports several output formats: plain-text SQL dumps are replayed with psql or Cloud SQL's import tooling, while pg_restore handles the custom and archive formats and lets you selectively restore specific tables or schemas rather than the whole dump.
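A sketch of both formats, with a hypothetical user, database, and table:

```shell
# Plain-text dump: the format Cloud SQL's SQL import expects.
# --no-owner avoids ownership statements the target may reject.
pg_dump -U app_user --no-owner --format=plain mydb > mydb.sql

# Custom-format archive: not importable via `gcloud sql import sql`,
# but restorable selectively with pg_restore over a direct connection.
pg_dump -U app_user --format=custom mydb > mydb.dump
pg_restore -U app_user -d mydb --table=orders mydb.dump
```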

For MySQL, you generate a dump with mysqldump and load text-based data files into existing tables with mysqlimport, a command-line wrapper around the LOAD DATA INFILE statement.
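A minimal sketch of both tools; user and database names are placeholders:

```shell
# Full logical dump of one database, schema plus data.
mysqldump -u app_user -p --databases mydb > mydb.sql

# mysqlimport loads a delimited text file into the table named after
# the file: /tmp/orders.txt goes into the existing "orders" table.
mysqlimport --local -u app_user -p mydb /tmp/orders.txt
```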

The Professional Cloud Architect exam likes to test recognition here. If a question describes a team using pg_dump output and asks how to bring that into Cloud SQL, you should recognize that as a PostgreSQL dump file import scenario, not a CSV or DMS scenario.

Best practices for imports

A few rules apply to almost every Cloud SQL import.

Stage the file in Cloud Storage first. Don't try to push directly from a local workstation. Upload the dump or CSV to a GCS bucket, then point Cloud SQL at the GCS object. This is more reliable, especially for large files, and it is the pattern Google expects in exam scenarios.
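The two-step pattern looks like this; bucket and instance names are placeholders, and the instance's service account needs read access to the bucket before the import will succeed:

```shell
# Step 1: stage the dump in a GCS bucket.
gsutil cp mydb.sql gs://my-migration-bucket/

# Step 2: point Cloud SQL at the GCS object.
# (The Cloud SQL instance's service account must have read access
# to the bucket, e.g. via the Storage Object Viewer role.)
gcloud sql import sql my-instance \
    gs://my-migration-bucket/mydb.sql \
    --database=mydb
```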

Compress your data. Cloud SQL accepts compressed .gz files for import. Compressing the dump before upload reduces both storage cost in GCS and data transfer cost during the import. There is no reason to skip this on a real migration.
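Cloud SQL decompresses gzipped files during import, so the .gz never needs to be expanded by hand. A sketch of the compressed pipeline:

```shell
# Compress before upload: smaller GCS object, less transfer.
gzip mydb.sql                     # produces mydb.sql.gz

gsutil cp mydb.sql.gz gs://my-migration-bucket/

# Cloud SQL accepts the .gz directly and decompresses on import.
gcloud sql import sql my-instance \
    gs://my-migration-bucket/mydb.sql.gz \
    --database=mydb
```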

Use the right flags on the dump. When you generate the dump file, options like --databases, --hex-blob, --skip-triggers, --set-gtid-purged=OFF, and --ignore-table matter. They control how the dump represents binary data, whether it includes triggers, and how it interacts with replication metadata. Cloud SQL has specific requirements, and a dump produced with the wrong flags will fail at import time.
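On the MySQL side, a dump command combining the flags above might look like this; the database and excluded table are hypothetical:

```shell
mysqldump --databases mydb \
    --hex-blob \               # encode binary columns as hex literals
    --skip-triggers \          # omit triggers (recreate them after import)
    --set-gtid-purged=OFF \    # strip GTID replication metadata
    --ignore-table=mydb.audit_log \  # exclude a table you don't need
    -u app_user -p > mydb.sql
```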

SQL dump files imported into Cloud SQL cannot contain triggers, views, or stored procedures. This is the single most common Cloud SQL import gotcha. If your source database relies on these objects, you have to strip them from the dump (that is what --skip-triggers is for on the MySQL side) and recreate them inside Cloud SQL after the data lands. If an exam question describes an import failing on a dump that contains stored procedures, this is why.
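After the import succeeds, the stripped objects get recreated directly against the Cloud SQL instance. A sketch with a hypothetical trigger, connecting over a placeholder IP (in practice you would typically go through the Cloud SQL Auth Proxy):

```shell
# Recreate a trigger that --skip-triggers removed from the dump.
# The host, credentials, and trigger definition are all placeholders.
mysql -h 10.0.0.5 -u app_user -p mydb <<'SQL'
CREATE TRIGGER orders_audit AFTER INSERT ON orders
FOR EACH ROW INSERT INTO audit_log (order_id) VALUES (NEW.id);
SQL
```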

Choosing between the methods

For a one-time migration of a small or medium database, a SQL dump staged in GCS is the simplest answer. For flat tabular data, CSV. For a migration where downtime needs to be near zero or the source database is large and complex, Database Migration Service. For ongoing replication from an external source, replication or direct transfer.

The Professional Cloud Architect exam will give you the constraints (downtime tolerance, source engine, file format on hand, managed vs. self-managed preference) and expect you to map them to one of these four paths.

My Professional Cloud Architect course covers Cloud SQL imports and migrations alongside the rest of the databases material.
