Cloud Functions for the PCA Exam: Triggers, Rollback, Secure Connections

GCP Study Hub
Ben Makansi
March 5, 2026

Cloud Functions sits at the most abstracted, lightweight end of GCP's compute spectrum. It is a serverless, event-driven execution environment with automatic scaling and scale-to-zero behavior, and it is GCP's analogue to AWS Lambda. The Professional Cloud Architect exam does not ask you to write functions or memorize runtime limits. It asks you to recognize when Cloud Functions is the right choice, how its triggers work, how to recover from a bad deployment, and how one function should authenticate when calling another.

I will walk through each of those four areas the way they tend to show up on the exam.

When Cloud Functions is the right answer

The exam likes to present a workload and ask which compute service to use. Cloud Functions is the right pick when the workload has these properties:

  • Code needs to run automatically in response to specific events, like a new file in Cloud Storage, a new Pub/Sub message, or an HTTP request.
  • You need a lightweight API or webhook endpoint without the operational weight of a full server.
  • The workload is sporadic or unpredictable, and you want to pay only for the compute you actually use. This is where scale-to-zero matters, in contrast to an App Engine Standard service configured to keep warm instances around.
  • The team wants to focus on writing code, not managing infrastructure or scaling behavior.

If the question describes a long-running stateful service, a containerized application with custom dependencies, or a workload that needs predictable warm capacity, Cloud Functions is usually not the answer. Cloud Run or GKE will fit better. The Professional Cloud Architect exam often hinges on that distinction.

Triggers: built-in versus Eventarc

Triggers are what cause a Cloud Function to execute. They are event-driven, and there are three built-in trigger types you should know cold (deploy sketches follow the list):

  • Cloud Storage triggers fire on object creation, modification, or deletion in a bucket. Common pattern: upload a file, run a function that processes it.
  • Pub/Sub triggers fire when a message lands on a topic. This is the standard pattern for decoupled async workflows.
  • HTTP triggers fire on an HTTP request. This is how you build a lightweight REST endpoint or a webhook.
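
For orientation, here is roughly what deploying each of those trigger types looks like with gcloud. The function names, bucket, topic, and entry points below are placeholders, and exact flags vary by generation and runtime, so treat this as a sketch rather than a reference.

  # Cloud Storage trigger: runs when an object is finalized (created or
  # overwritten) in the bucket; other events need an explicit event filter
  gcloud functions deploy process-upload \
    --runtime=python312 \
    --trigger-bucket=my-upload-bucket \
    --entry-point=handle_upload

  # Pub/Sub trigger: runs when a message lands on the topic
  gcloud functions deploy handle-order \
    --runtime=python312 \
    --trigger-topic=orders \
    --entry-point=handle_order

  # HTTP trigger: exposes an endpoint (authenticated by default)
  gcloud functions deploy webhook \
    --runtime=python312 \
    --trigger-http \
    --entry-point=handle_webhook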

Beyond those three, Eventarc expands the trigger surface to many more event sources across GCP. The exam scenario you should be ready for is a log-based trigger via Eventarc, because it is a standard pattern for reactive automation.

The flow goes like this. Some GCP service generates logs and ships them to Cloud Logging. In Eventarc, you define a filter that describes the specific log events you care about. When a log event matches the filter, Eventarc routes the event to a Cloud Function and invokes it, passing the event payload through.
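
As a sketch of that flow, a 2nd gen function can be deployed with Eventarc event filters that match Cloud Audit Logs entries. The names and filter values below are illustrative (this one reacts to a Compute Engine instance deletion), and the relevant audit logs have to be enabled for the filter to ever match.

  # 2nd gen function invoked via Eventarc when a matching audit log
  # entry is written; filter values are illustrative
  gcloud functions deploy instance-cleanup \
    --gen2 \
    --runtime=python312 \
    --region=us-central1 \
    --entry-point=cleanup \
    --trigger-event-filters="type=google.cloud.audit.log.v1.written" \
    --trigger-event-filters="serviceName=compute.googleapis.com" \
    --trigger-event-filters="methodName=v1.compute.instances.delete"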

Use cases that map cleanly to this pattern:

  • Run a cleanup script when a Compute Engine instance is removed.
  • Update DNS records based on load balancer logs.
  • Back up a database when a schema-update event is detected.

If a question describes "react to a specific log event somewhere in GCP," Eventarc plus Cloud Functions is the canonical answer.

Rolling back to a previous stable version

Every deployment of a Cloud Function is versioned. The function maintains a deployment history that you can inspect in the Cloud Console or via gcloud, and each version captures the source code, environment configuration, and trigger settings that were active at the time.

The Professional Cloud Architect exam tends to phrase this as a scenario: a recent deployment introduced a bug, and you need to restore the previous stable behavior. The mechanic to know is that Cloud Functions does not have a literal "rollback" button that flips the active version pointer to an old one. The standard pattern is to retrieve the configuration of the prior stable version (source, environment, triggers) from the deployment history, then redeploy that exact configuration. The result is a new version number, but functionally it restores the previous stable state.

So if v3 is broken and v2 was the last good version, you pull v2's configuration out of the deployment history and redeploy it. That redeploy becomes v4, and v4 is functionally a clone of v2. Versioning history stays clean and traceable, and the function is back to known-good behavior.
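
A hedged sketch of that redeploy, assuming the last good source was archived somewhere retrievable (version control or a Cloud Storage bucket); the function name, bucket path, runtime, and trigger below are placeholders:

  # Inspect the currently deployed (broken) configuration
  gcloud functions describe my-func --region=us-central1

  # Redeploy the known-good v2 source and settings; the result is a new
  # version (v4) that is functionally identical to v2
  gcloud functions deploy my-func \
    --region=us-central1 \
    --runtime=python312 \
    --trigger-http \
    --entry-point=main \
    --source=gs://my-release-archive/my-func-v2.zip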

Securely connecting one function to another

In real architectures, Cloud Functions frequently call other Cloud Functions. The exam expects you to know how that call should be secured.

By default, Cloud Functions require authentication. Public, unauthenticated invocation only happens if you explicitly opt in, either by passing --allow-unauthenticated at deploy time or by configuring the function as public in the console. In the absence of that flag, only principals that hold the Cloud Functions Invoker role (or an equivalent permission) can call the function.
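
On the IAM side, that typically looks like the sketch below: deploy the target with authentication required (the default) and grant the caller's dedicated service account, and only that account, the Invoker role. The names are placeholders; for 2nd gen functions, the equivalent grant is roles/run.invoker on the underlying Cloud Run service.

  # Deploy Function B without opening it to unauthenticated callers
  gcloud functions deploy function-b \
    --runtime=python312 \
    --trigger-http \
    --no-allow-unauthenticated

  # Allow only Function A's dedicated service account to invoke it
  gcloud functions add-iam-policy-binding function-b \
    --member="serviceAccount:function-a-sa@my-project.iam.gserviceaccount.com" \
    --role="roles/cloudfunctions.invoker"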

The mechanic for an authenticated function-to-function call is:

  1. The caller function runs as a dedicated service account that has the Cloud Functions Invoker role on the target function.
  2. The caller generates an ID token tied to that service account.
  3. The caller sends an HTTP request to the target function with the ID token in the Authorization header.
  4. The target function verifies the token and the caller's identity, then processes the request.
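
Inside Function A, steps 2 and 3 amount to fetching an ID token whose audience is Function B's URL and attaching it as a bearer token. A minimal sketch using the metadata server follows; in real function code you would normally use your runtime's Google auth library instead, and the URL is illustrative.

  # Audience must be the target function's URL
  TARGET_URL="https://us-central1-my-project.cloudfunctions.net/function-b"

  # Ask the metadata server for an ID token minted for that audience,
  # tied to the caller's attached service account
  TOKEN=$(curl -s -H "Metadata-Flavor: Google" \
    "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=${TARGET_URL}")

  # Call Function B with the token in the Authorization header
  curl -s -H "Authorization: Bearer ${TOKEN}" "${TARGET_URL}"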

Two principles to anchor this in your head for the exam. First, give each function its own dedicated service account rather than reusing a generic one, so you can scope IAM precisely. That is the principle of least privilege in practice. Second, never reach for --allow-unauthenticated as a shortcut for internal service-to-service calls. The zero-trust model assumes no implicit trust between services, even inside the same project, and the exam grades that way.

If a question describes Function A calling Function B and asks how to secure the call, the answer is almost always: dedicated service account on A, Cloud Functions Invoker role granted on B, ID token in the Authorization header. Anything that bypasses authentication is wrong.

What to take into the exam

For Cloud Functions on the Professional Cloud Architect exam, four things carry most of the weight:

  • The workload profile that fits Cloud Functions: event-driven, sporadic, lightweight, code-focused.
  • The three built-in trigger types (Cloud Storage, Pub/Sub, HTTP) and the Eventarc extension for everything else, especially log-based triggers.
  • The rollback pattern: deployment history is versioned, and you restore by redeploying the prior version's configuration as a new version.
  • The authentication pattern for function-to-function calls: dedicated service account, Cloud Functions Invoker role, ID token in the Authorization header.

If you can answer scenario questions on those four points, the Cloud Functions surface area on the exam is well-covered.

My Professional Cloud Architect course covers Cloud Functions triggers, rollbacks, and secure invocation alongside the rest of the containers and serverless material.
