Audit logs are a subset of Cloud Logging designed specifically to answer one question: who did what to which resource and when. They are the record that compliance teams need when demonstrating that access controls worked, that security teams need when investigating incidents, and that the Associate Cloud Engineer exam tests when presenting scenarios about governance, compliance, and access visibility.
GCP has three categories of audit logs, and understanding what each captures is important for the exam.
Admin Activity audit logs record actions that modify the configuration or metadata of GCP resources. Creating a VM, changing an IAM policy, enabling or disabling an API, modifying a firewall rule — these all generate Admin Activity audit logs. These logs are always enabled and cannot be disabled. They are retained for 400 days, far longer than the 30-day default for regular Cloud Logging, reflecting their importance for compliance and governance.
Data Access audit logs record operations that read or write user data in resources. Querying a BigQuery dataset, reading an object from Cloud Storage, reading a Firestore document — these generate Data Access logs. Unlike Admin Activity logs, Data Access logs are disabled by default for most services because they generate very high volume and incur additional cost. You enable them selectively for the specific services and data types where you need that level of visibility.
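Data Access logs are enabled through the project's IAM policy, which carries an auditConfigs section. A minimal sketch, assuming a placeholder project ID (the service and log types shown are real values; DATA_READ, DATA_WRITE, and ADMIN_READ are the three supported log types):

```shell
# Export the current IAM policy, which includes any auditConfigs section
# (my-project is a placeholder):
gcloud projects get-iam-policy my-project --format=yaml > policy.yaml

# Edit policy.yaml to add an auditConfigs block, e.g. to enable Data
# Access logs for Cloud Storage reads and writes:
#
#   auditConfigs:
#   - service: storage.googleapis.com
#     auditLogConfigs:
#     - logType: DATA_READ
#     - logType: DATA_WRITE
#
# Then apply the edited policy back to the project:
gcloud projects set-iam-policy my-project policy.yaml
```

Because the audit configuration lives in the IAM policy, it can also be set at the folder or organization level to enable Data Access logging across many projects at once.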
System Event audit logs record automated GCP system actions that modify resource configurations, like a GKE node being automatically upgraded or a Compute Engine VM being migrated during host maintenance. These are always enabled and do not incur log ingestion charges because they are generated by GCP itself rather than by user or application actions.
Each audit log entry includes the identity of the principal who performed the action (a user email or a service account email), the resource that was affected, the type of operation, the timestamp, and the outcome. For Admin Activity logs, you can see exactly which IAM role was granted to which principal, or exactly which firewall rule was deleted and by whom.
This traceability is what makes audit logs valuable for compliance. When an auditor asks "who granted themselves admin access last Tuesday," the Admin Activity audit logs provide the answer. When a security team asks "did anyone access this specific Cloud Storage bucket last month," the Data Access audit logs (if enabled for that bucket) provide the answer.
The default 400-day retention for Admin Activity logs may not be sufficient for regulatory requirements. Healthcare, finance, and other regulated industries often require seven years or more of log retention. The standard pattern for extended audit log retention is a log sink that exports to a Cloud Storage bucket with a locked retention policy.
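As a sketch of that pattern, assuming placeholder bucket and sink names: create a bucket with a retention policy, route Admin Activity logs into it with a sink, and lock the policy so the retention period cannot be shortened or removed. Locking is irreversible for the life of the bucket.

```shell
# Create the destination bucket with a 7-year retention policy
# (names and region are placeholders):
gsutil mb -c archive -l us-central1 gs://audit-archive-example
gsutil retention set 7y gs://audit-archive-example

# Route Admin Activity audit logs into the bucket:
gcloud logging sinks create audit-archive-sink \
  storage.googleapis.com/audit-archive-example \
  --log-filter='logName:"cloudaudit.googleapis.com%2Factivity"'

# Grant the sink's writer identity (printed by the command above)
# object-creation access on the bucket, then lock the retention policy:
gsutil retention lock gs://audit-archive-example
```

The sink command prints a writer service account identity; that identity needs permission to write objects into the bucket before any logs arrive.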
When you need to retain audit logs from multiple projects — common in large organizations — a folder-level or organization-level log sink aggregates all project audit logs into a single Cloud Storage bucket. This makes compliance audits more manageable because all relevant logs are in one place rather than spread across dozens of project-level Cloud Logging consoles.
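An aggregated sink is created the same way as a project-level sink, with two extra flags. A hedged sketch, assuming a placeholder organization ID and bucket name:

```shell
# --include-children makes the sink capture logs from every project
# under the organization, not just logs at the org level itself:
gcloud logging sinks create org-audit-sink \
  storage.googleapis.com/org-audit-archive-example \
  --organization=123456789012 \
  --include-children \
  --log-filter='logName:"cloudaudit.googleapis.com"'
```

The same pattern works at the folder level with `--folder` instead of `--organization`.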
The storage class choice depends on retrieval expectations. Audit logs that might need to be retrieved for investigations or audits in the next year or two belong in Nearline or Coldline. Logs that are retained purely for regulatory compliance with no expected retrieval belong in Archive storage, which is the cheapest option but has a 365-day minimum storage duration and retrieval costs.
The Associate Cloud Engineer exam tests audit logs in compliance and troubleshooting scenarios. When a question describes a need to track configuration changes across a GCP organization, Admin Activity logs are the answer. When a question describes a regulated company that needs to prove logs were retained for a specific period, a Cloud Storage log sink with a retention lock is the answer. When a question describes a security team that needs to know who accessed a specific database, Data Access logs are the answer, with the caveat that they need to be enabled for that service first.
The exam also tests the distinction between audit logs and regular application logs. Audit logs track administrative and access events at the GCP control plane. Application logs track what your code is doing inside a service. Both live in Cloud Logging, but they serve different purposes and have different retention defaults.
For a complete walkthrough of Cloud Logging, audit logs, and the compliance scenarios on the Associate Cloud Engineer exam, my Associate Cloud Engineer course covers these topics in the depth the exam requires.
Audit logs are queryable through the Logs Explorer in the Cloud Logging console, through gcloud, or through BigQuery if you have set up a log sink to export them. For quick operational lookups, the gcloud approach is often fastest:
gcloud logging read 'logName="projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity"
AND resource.type="gce_instance"' --limit=50 --format=json
This command reads Admin Activity audit logs for Compute Engine instances. The log name format for audit logs follows the pattern cloudaudit.googleapis.com/activity for Admin Activity, cloudaudit.googleapis.com/data_access for Data Access logs, and cloudaudit.googleapis.com/system_event for System Event logs.
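The same approach answers the "who accessed this?" question, assuming Data Access logs were enabled for the service beforehand. A sketch with placeholder project ID and principal email, filtering on the principalEmail field that every audit log entry carries:

```shell
# Read Data Access entries attributed to a specific principal
# (PROJECT_ID and the email address are placeholders):
gcloud logging read \
  'logName="projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Fdata_access"
   AND protoPayload.authenticationInfo.principalEmail="alice@example.com"' \
  --limit=20 --format=json
```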
For compliance investigations that need to analyze large volumes of audit log data across time, a BigQuery export via a log sink makes queries much faster and more flexible. You can join audit log data with other datasets, filter by specific principals, and calculate statistics like how many Admin Activity events occurred per day over a quarter. The Associate Cloud Engineer exam does not test gcloud log query syntax in detail, but understanding that audit logs are queryable and exportable reinforces the broader point that GCP's observability tools are interconnected.
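A sketch of that setup, with placeholder project, dataset, and sink names. The exported table name follows Cloud Logging's export naming convention, but treat the exact name as an assumption to verify in your own dataset:

```shell
# Route audit logs to a BigQuery dataset (all names are placeholders):
gcloud logging sinks create audit-bq-sink \
  bigquery.googleapis.com/projects/my-project/datasets/audit_logs \
  --log-filter='logName:"cloudaudit.googleapis.com"'

# Once entries flow in, count Admin Activity events per day.
# Non-partitioned exports create date-suffixed tables, so a wildcard
# matches them all:
bq query --use_legacy_sql=false '
SELECT DATE(timestamp) AS day, COUNT(*) AS events
FROM `my-project.audit_logs.cloudaudit_googleapis_com_activity_*`
GROUP BY day
ORDER BY day'
```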