
Encryption questions show up reliably on the Professional Cloud Architect exam, and BigQuery has its own quirks worth knowing. The default behavior covers most workloads, but the moment compliance enters the picture, you have to make a deliberate choice about who controls the keys. I want to walk through the three options BigQuery gives you and the one workaround you need when none of them fit.
By default, BigQuery encrypts every dataset at rest using keys that Google owns and rotates on its own schedule. You do not configure anything. You do not see the keys. You do not pay extra. For the majority of analytics workloads, this is the right answer, and it is what the Professional Cloud Architect exam expects you to recognize as the baseline.
Google handles the rotation cadence, key storage, and access control on the underlying key material. From your perspective, the data is just encrypted, and any authorized user querying the dataset never notices the encryption layer at all.
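You can confirm which mode a table is in from the CLI. This is a sketch with hypothetical dataset and table names; with Google-managed default encryption the table metadata simply carries no key reference, while a CMEK-protected table shows the key it uses.

```shell
# Inspect a table's metadata (hypothetical names; requires gcloud auth and a project).
# Under default Google-managed encryption there is no encryptionConfiguration
# block in the output; under CMEK it contains the kmsKeyName in use.
bq show --format=prettyjson mydataset.mytable
```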
When you need more control, BigQuery supports customer-managed encryption keys backed by Cloud KMS. You create a key in KMS, grant the BigQuery service account permission to use it, and then point your dataset or table at that key. Google still does the actual encryption work, but the key material lives in a KMS keyring that you own.
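The three steps above can be sketched with the CLI. Names, locations, and the project identifiers are placeholders; the BigQuery service account follows the documented `bq-<PROJECT_NUMBER>@bigquery-encryption.iam.gserviceaccount.com` form.

```shell
# 1. Create a keyring and key in Cloud KMS (names and location are placeholders).
gcloud kms keyrings create bq-keyring --location=us
gcloud kms keys create bq-key --location=us --keyring=bq-keyring \
  --purpose=encryption

# 2. Grant the BigQuery service account permission to encrypt/decrypt with it.
gcloud kms keys add-iam-policy-binding bq-key \
  --location=us --keyring=bq-keyring \
  --member=serviceAccount:bq-PROJECT_NUMBER@bigquery-encryption.iam.gserviceaccount.com \
  --role=roles/cloudkms.cryptoKeyEncrypterDecrypter

# 3. Create a dataset whose tables default to that key.
bq mk --dataset \
  --default_kms_key=projects/PROJECT_ID/locations/us/keyRings/bq-keyring/cryptoKeys/bq-key \
  mydataset
```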
This matters when you have compliance or regulatory requirements that demand auditability over key usage, or the ability to revoke access by disabling a key. CMEK can be configured at the dataset level or the table level. Configuring it at the dataset level is usually cleaner, because any data copied within that dataset inherits the encryption settings automatically. If you only set CMEK on individual tables, you have to specify the key during copy operations, which is easy to forget.
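The table-level pitfall looks like this in practice. A sketch with hypothetical table names: when the destination dataset has no default key, the copy must restate the key explicitly.

```shell
# Copying into a dataset without a default CMEK: the destination key must be
# passed on the copy itself, or the new table falls back to default encryption.
bq cp \
  --destination_kms_key=projects/PROJECT_ID/locations/us/keyRings/bq-keyring/cryptoKeys/bq-key \
  mydataset.source_table otherdataset.dest_table
```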
For the Professional Cloud Architect exam, the trigger phrases are compliance, regulatory control, and key rotation policy under your management. When you see those, CMEK is the answer.
Here is where it gets interesting. BigQuery does not natively support customer-supplied encryption keys. CSEKs are keys that you generate and provide directly with each request, and Google never stores them. Cloud Storage and Compute Engine support this model, but BigQuery does not.
If a question on the exam describes a requirement to use CSEKs with BigQuery, the workaround is simple: you encrypt the data yourself before uploading it. The flow is to encrypt the data client-side with a key only you hold, load the resulting ciphertext into BigQuery, and decrypt it client-side after you read it back.
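The encrypt-before-load flow can be sketched with openssl. The file names and passphrase are hypothetical demo values; in practice the key would come from your own key management system, and the commented `bq load` line marks where the ciphertext would enter BigQuery.

```shell
# Create sample data standing in for the sensitive export (hypothetical names).
printf 'id,payload\n1,alpha\n2,beta\n' > sensitive.csv

# Encrypt locally; -pbkdf2 derives the AES key from the passphrase.
# Only you ever hold this key -- Google never sees it.
openssl enc -aes-256-cbc -pbkdf2 -pass pass:local-demo-key \
  -in sensitive.csv -out sensitive.csv.enc

# Load the ciphertext into BigQuery as opaque bytes (sketch only; needs a
# real project and dataset):
#   bq load --source_format=CSV mydataset.opaque_table sensitive.csv.enc ...

# On read, decrypt client-side with the same key.
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:local-demo-key \
  -in sensitive.csv.enc -out roundtrip.csv

cmp sensitive.csv roundtrip.csv && echo "roundtrip OK"
```

BigQuery plays no part in the cryptography here, which is exactly the point: it stores and returns the bytes, and the key never leaves your side.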
BigQuery stores whatever bytes you give it. It will not re-encrypt the payload with a different key, and it will not decrypt anything for you on read. Only someone holding your CSEK can make sense of the contents. This is a real limitation, not a feature, and the exam tests whether you understand both the limitation and the workaround.
The decision tree is short. If you have no specific compliance requirement, use the default Google-managed encryption. If you need control over key rotation, key disablement, or audit logging on key usage, use CMEK with Cloud KMS. If your policy specifically mandates that Google never holds your key material at all, you cannot use BigQuery natively, so you encrypt before upload and accept that BigQuery will treat your data as opaque ciphertext.
That last constraint is the catch. Encrypted-before-upload data cannot be queried with SQL in any meaningful way. You can store it, you can retrieve it, but you cannot run aggregations or joins over the encrypted columns. This trade-off is usually why teams settle for CMEK instead, and it is worth flagging on the Professional Cloud Architect exam when a scenario asks about both encryption requirements and analytical queries.
My Professional Cloud Architect course covers BigQuery encryption alongside the rest of the storage and analytics material.