
Cloud Logging questions on the Professional Cloud Architect exam tend to follow a predictable pattern. You are given a scenario with multiple projects, a compliance retention requirement, or a security team that needs real-time visibility, and you have to pick the right export architecture. The service itself is straightforward, but the architectural decisions around sinks, aggregated exports, and SIEM integration are where the exam questions live.
I want to walk through the Cloud Logging concepts that I see show up most often in Professional Cloud Architect scenarios, with a focus on the export patterns you need to recognize quickly.
Cloud Logging stores, searches, and analyzes log data and events from Google Cloud resources and applications. It used to be called Stackdriver Logging, and you will still occasionally see older documentation refer to it that way. It integrates directly with Cloud Monitoring, which means the same log data you are storing for compliance can drive alerting and analytics.
For the exam, the integration with Cloud Monitoring matters because it explains why scenarios often combine logs and alerts in a single question. If a scenario describes detecting an unusual event and notifying an on-call team, the answer usually involves Cloud Logging feeding into a Cloud Monitoring alert.
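As a rough sketch of the logging half of that pattern, here is how you might define a log-based metric with the google-cloud-logging Python client. The project ID, metric name, and filter are my own illustrative choices; the alerting policy itself would then be configured in Cloud Monitoring on top of the metric.

```python
# Sketch: a log-based metric that counts matching entries, which a
# Cloud Monitoring alerting policy can then watch and alert on.
# Project ID, metric name, and filter are illustrative assumptions.
from google.cloud import logging

client = logging.Client(project="my-project")

metric = client.metric(
    "gce-error-count",  # hypothetical metric name
    filter_='resource.type="gce_instance" AND severity>=ERROR',
    description="ERROR-level entries from Compute Engine instances",
)
metric.create()
```

Once created, the metric appears in Cloud Monitoring under the `logging.googleapis.com/user/` prefix, which is where you attach the alerting policy and the notification channel for the on-call team.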
Cloud Logging ingests three categories of logs, and you should be able to identify which category a scenario is describing.
Platform Logs originate from GCP services themselves. Compute Engine instances, Cloud SQL databases, and GKE clusters all emit platform logs. If a scenario talks about resource health or service operations, it is platform logs.
Application Logs come from your own code running on GCP. These capture application-specific events and errors. When a scenario mentions debugging a custom workload or capturing exceptions from your own service, that is application logs.
Audit Logs record user and system activities that have implications for security and compliance. Data access, administrative actions, API calls. When the exam mentions compliance, who-did-what, or regulatory retention, you are looking at audit logs.
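To make the categories concrete, here is a sketch of Logging query filters that would select each one, using the Python client's `list_entries`. The project ID and log names are illustrative assumptions.

```python
# Sketch: one Logging query filter per log category.
# Project ID and log names are illustrative assumptions.
from google.cloud import logging

client = logging.Client(project="my-project")

FILTERS = {
    # Platform logs: emitted by the GCP service itself
    "platform": 'resource.type="cloudsql_database"',
    # Application logs: written by your own code to a named log
    "application": 'logName="projects/my-project/logs/my-app"',
    # Audit logs: Admin Activity and Data Access entries
    "audit": 'logName:"cloudaudit.googleapis.com"',
}

for category, log_filter in FILTERS.items():
    entries = client.list_entries(filter_=log_filter, max_results=5)
    print(category, [entry.log_name for entry in entries])
```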
The default log retention is 30 days (that is the retention on the _Default log bucket). You can configure a longer retention period on the bucket or export logs to other destinations for longer storage.
This is a small detail that drives a lot of exam questions. If a scenario requires retaining logs for seven years to satisfy a regulatory requirement, you cannot rely on default retention. You either extend retention on the log bucket itself or you set up an export sink to a destination designed for long-term storage. The exam usually wants the export answer because it is cheaper and more flexible.
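For the first option, here is a sketch of extending bucket retention through the v2 config API. The project ID and the 2,557-day figure are illustrative assumptions, and keep in mind that extended retention in Cloud Logging is billed, which is part of why the exam prefers the export answer.

```python
# Sketch: extend retention on the _Default log bucket to roughly seven years.
# Project ID and retention period are illustrative assumptions.
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import LogBucket, UpdateBucketRequest
from google.protobuf import field_mask_pb2

client = ConfigServiceV2Client()

request = UpdateBucketRequest(
    name="projects/my-project/locations/global/buckets/_Default",
    bucket=LogBucket(retention_days=2557),  # ~7 years
    update_mask=field_mask_pb2.FieldMask(paths=["retention_days"]),
)
client.update_bucket(request=request)
```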
A log sink is an export destination for your logs based on a filter you define. You write a filter that selects which log entries you care about, you point it at a destination, and Cloud Logging continuously routes matching entries to that destination.
The destinations you need to know are Cloud Storage, BigQuery, and Pub/Sub. Each one maps to a different intent: Cloud Storage for low-cost long-term retention, BigQuery for analysis with SQL, and Pub/Sub for streaming to external systems.
Two scenarios I expect you to recognize on the Professional Cloud Architect exam. First, storing DAG execution logs from a Cloud Composer environment in BigQuery for analysis. Second, storing error logs from Compute Engine in a Cloud Storage bucket for long-term retention. Both of these are log sink configurations.
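As a sketch, both scenarios come down to the same three ingredients: a sink name, a filter, and a destination. The project, dataset, and bucket names here are illustrative assumptions.

```python
# Sketch: the two sink configurations from the scenarios above.
# Project, dataset, and bucket names are illustrative assumptions.
from google.cloud import logging

client = logging.Client(project="my-project")

# Scenario 1: Cloud Composer logs routed to BigQuery for analysis.
composer_sink = client.sink(
    "composer-logs-to-bq",
    filter_='resource.type="cloud_composer_environment"',
    destination="bigquery.googleapis.com/projects/my-project/datasets/composer_logs",
)
composer_sink.create()

# Scenario 2: Compute Engine error logs routed to Cloud Storage for retention.
gce_sink = client.sink(
    "gce-errors-to-gcs",
    filter_='resource.type="gce_instance" AND severity>=ERROR',
    destination="storage.googleapis.com/my-error-log-bucket",
)
gce_sink.create()
```

One operational detail worth knowing: creating a sink also creates a writer identity (a service account), and entries do not flow until you grant that identity write access on the destination.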
This is where exam questions get more interesting. When you have multiple projects under a folder or organization, you do not want to configure log exports project by project. You want a single aggregated export at the folder or organization level that captures logs from every project below it.
The standard pattern looks like this. You have a folder, call it PROD, containing Project A, Project B, and Project C. Each project generates logs. Instead of configuring an export inside each project, you create a separate Centralized Operations Project, and you configure an aggregated sink at the folder level that routes logs from all three projects into Cloud Storage inside that operations project.
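Here is a sketch of that folder-level sink using the v2 config API. The folder ID, sink name, and bucket are illustrative assumptions; the `include_children` flag is what makes the sink aggregated.

```python
# Sketch: an aggregated sink at the folder level, routing logs from every
# project under the PROD folder into a bucket in a central operations project.
# Folder ID, sink name, and bucket name are illustrative assumptions.
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import CreateSinkRequest, LogSink

client = ConfigServiceV2Client()

sink = LogSink(
    name="prod-aggregated-sink",
    destination="storage.googleapis.com/central-ops-logs",
    include_children=True,  # capture logs from every project under the folder
)

client.create_sink(
    request=CreateSinkRequest(parent="folders/123456789", sink=sink)
)
```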
The benefits the exam wants you to recognize:

- One sink configuration instead of one per project, so there is less to maintain and audit.
- New projects created under the folder or organization are covered automatically.
- Logs land in a single destination that the security or compliance team can control centrally.
If a scenario describes an organization with many projects and a security or compliance team that needs unified visibility, the answer is an aggregated export to a centralized operations project. Project-level sinks are the wrong answer in that scenario because they do not scale.
Audit logs are the canonical use case for long-term retention. Compliance frameworks often require retaining audit logs for years, and the cost of keeping them in default Cloud Logging storage adds up fast.
The architecture the exam expects:

- An aggregated sink at the organization level, filtered to audit logs, so every project is covered.
- A Cloud Storage bucket in a centralized operations project as the destination.
- The Archive storage class on that bucket, since the logs are written once and rarely read.
- Optionally, a bucket retention policy to guarantee the logs cannot be deleted before the compliance window ends.
If a scenario asks about the most cost-effective way to retain audit logs for seven years across an organization, the answer combines aggregated export and Archive storage. Recognizing that combination is the whole question.
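A sketch of the two pieces together, with all names and IDs as illustrative assumptions:

```python
# Sketch: the cost-optimized audit retention pattern — an Archive-class
# bucket in the operations project, plus an organization-level aggregated
# sink filtered to audit logs. All names and IDs are illustrative assumptions.
from google.cloud import storage
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import CreateSinkRequest, LogSink

# 1. An Archive storage-class bucket for the seven-year retention window.
storage_client = storage.Client(project="central-ops-project")
bucket = storage_client.bucket("org-audit-archive")
bucket.storage_class = "ARCHIVE"
storage_client.create_bucket(bucket, location="us")

# 2. An organization-level sink that captures audit logs from every project.
logging_client = ConfigServiceV2Client()
sink = LogSink(
    name="org-audit-to-archive",
    destination="storage.googleapis.com/org-audit-archive",
    filter='logName:"cloudaudit.googleapis.com"',
    include_children=True,
)
logging_client.create_sink(
    request=CreateSinkRequest(parent="organizations/123456789012", sink=sink)
)
```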
SIEM stands for Security Information and Event Management. It is the security tooling category that includes products like Splunk, Chronicle, and a handful of others. A SIEM collects, analyzes, and acts on security events and logs from many sources, and the value depends on real-time ingestion. A SIEM that sees events ten minutes late is not useful for incident response.
The integration pattern is consistent. You configure a log sink in Cloud Logging that filters for the security-relevant logs, typically audit logs and logs from services like Security Command Center and Cloud IAM. The destination is Pub/Sub. The SIEM subscribes to the Pub/Sub topic and ingests events as they arrive.
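A sketch of that sink, with the project, topic, and filter as illustrative assumptions:

```python
# Sketch: a sink that streams security-relevant logs to a Pub/Sub topic
# for SIEM ingestion. Project, topic, and sink names are assumptions.
from google.cloud import logging

client = logging.Client(project="my-project")

siem_sink = client.sink(
    "security-logs-to-siem",
    filter_='logName:"cloudaudit.googleapis.com"',
    destination="pubsub.googleapis.com/projects/my-project/topics/siem-export",
)
siem_sink.create()
```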
The reason Pub/Sub is the right destination, and the reason the exam will give you alternatives that look reasonable, is real-time delivery. Cloud Storage sinks write objects in batches, and BigQuery is a destination for analysis, not a mechanism that pushes events to an external system. Neither is appropriate for security incident detection. Pub/Sub is the sink destination built for streaming delivery to external consumers, which is what a SIEM needs.
If a scenario describes a security team that needs to detect threats in real time, integrate with an external SIEM, or stream audit logs to Splunk or Chronicle, the answer is a log sink to Pub/Sub.
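On the consuming side, here is a sketch of what a SIEM-facing subscriber, or a small forwarder in front of one, might look like. The subscription name is an illustrative assumption; each Pub/Sub message carries one log entry serialized as JSON.

```python
# Sketch: consuming exported log entries from the Pub/Sub topic as they
# arrive. Project and subscription names are illustrative assumptions.
import json

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("my-project", "siem-export-sub")

def callback(message):
    # Each message body is a single LogEntry serialized as JSON.
    entry = json.loads(message.data)
    print(entry.get("logName"), entry.get("severity"))
    message.ack()

# Streaming pull: entries are delivered as they arrive, not in batches.
future = subscriber.subscribe(subscription, callback=callback)
future.result()  # blocks until cancelled
```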
Most Cloud Logging questions on the Professional Cloud Architect exam are really architecture questions wearing a logging costume. The service does what it does, but the question is about choosing the right export pattern for the requirement.
A few mappings to keep in mind:

- Long-term, low-cost retention: sink to Cloud Storage, ideally with the Archive storage class.
- Log analysis with SQL: sink to BigQuery.
- Real-time streaming or SIEM integration: sink to Pub/Sub.
- Many projects, one security or compliance team: aggregated sink at the folder or organization level, routed to a centralized operations project.
- Regulatory retention measured in years: aggregated export plus Archive storage.
If you can read a scenario and immediately identify which of these patterns it is describing, you will get the Cloud Logging questions right without much hesitation.
My Professional Cloud Architect course covers Cloud Logging architecture alongside the rest of the IAM and governance material.