
Cloud Functions sits at the most abstracted, lightweight end of GCP's compute spectrum. It is a serverless, event-driven execution environment with automatic scaling and scale-to-zero behavior, and it is GCP's analogue to AWS Lambda. The Professional Cloud Architect exam does not ask you to write functions or memorize runtime limits. It asks you to recognize when Cloud Functions is the right choice, how its triggers work, how to recover from a bad deployment, and how one function should authenticate when calling another.
I will walk through each of those four areas the way they tend to show up on the exam.
The exam likes to present a workload and ask which compute service to use. Cloud Functions is the right pick when the workload has these properties:

- Event-driven: it runs in response to something happening, not continuously.
- Short-lived and stateless: each invocation does a small unit of work and keeps no state between calls.
- Spiky or infrequent traffic: automatic scaling and scale-to-zero mean you pay nothing while idle.
- Lightweight code: a single-purpose function rather than a full application with heavy custom dependencies.
If the question describes a long-running stateful service, a containerized application with custom dependencies, or a workload that needs predictable warm capacity, Cloud Functions is usually not the answer. Cloud Run or GKE will fit better. The Professional Cloud Architect exam often hinges on that distinction.
Triggers are how a Cloud Function knows when to execute. They are event-driven, and there are three built-in trigger types you should know cold:

- HTTP triggers: the function gets an HTTPS endpoint and runs on each request.
- Pub/Sub triggers: the function runs whenever a message is published to a specific topic.
- Cloud Storage triggers: the function runs on object events (create, delete, archive, metadata update) in a specific bucket.
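As a rough sketch of how those three trigger types look at deploy time (the function, topic, and bucket names here are placeholders, and the runtime is just an example):

```shell
# HTTP trigger: the function gets a URL and runs on each request
gcloud functions deploy http-fn \
  --runtime=python312 --entry-point=handler \
  --trigger-http

# Pub/Sub trigger: the function runs on each message published to the topic
gcloud functions deploy pubsub-fn \
  --runtime=python312 --entry-point=handler \
  --trigger-topic=my-topic

# Cloud Storage trigger: the function runs on object changes in the bucket
gcloud functions deploy gcs-fn \
  --runtime=python312 --entry-point=handler \
  --trigger-bucket=my-bucket
```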
Beyond those three, Eventarc expands the trigger surface to many more event sources across GCP. The exam scenario you should be ready for is a log-based trigger via Eventarc, because it is a standard pattern for reactive automation.
The flow goes like this. Some GCP service generates logs and ships them to Cloud Logging. In Eventarc, you create a trigger whose event filters describe the specific log events you care about (for audit-log events, typically the service and method involved). When a log event matches the filters, Eventarc routes the event to a Cloud Function and invokes it, passing the event payload through.
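A minimal sketch of that wiring, assuming a 2nd-gen function and Cloud Audit Logs as the log source (the function name and the specific service/method filter values are illustrative, not prescribed by the exam):

```shell
# Sketch: fire a function whenever a Compute Engine firewall rule is created.
# The event type is the Cloud Audit Logs "log written" event; the
# serviceName/methodName filters narrow it to the log entries we care about.
gcloud functions deploy on-firewall-change \
  --gen2 --runtime=python312 --entry-point=handler \
  --trigger-event-filters="type=google.cloud.audit.log.v1.written" \
  --trigger-event-filters="serviceName=compute.googleapis.com" \
  --trigger-event-filters="methodName=v1.compute.firewalls.insert" \
  --trigger-location=global
```

The function then receives the matching audit-log entry as its event payload and can inspect it to decide what to do.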
Use cases that map cleanly to this pattern:

- Security automation, such as reacting to an audit-log entry for a firewall rule change or an IAM policy update.
- Automated remediation, such as flagging or undoing a non-compliant configuration change as soon as it is logged.
- Operational alerting, such as notifying a team when a specific error or event appears in the logs.
If a question describes "react to a specific log event somewhere in GCP," Eventarc plus Cloud Functions is the canonical answer.
Every deployment of a Cloud Function is versioned. The function maintains a deployment history that you can inspect in the Cloud Console or via gcloud, and each version captures the source code, environment configuration, and trigger settings that were active at the time.
The Professional Cloud Architect exam tends to phrase this as a scenario: a recent deployment introduced a bug, and you need to restore the previous stable behavior. The mechanic to know is that Cloud Functions does not have a literal "rollback" button that flips the active version pointer to an old one. The standard pattern is to retrieve the configuration of the prior stable version (source, environment, triggers) from the deployment history, then redeploy that exact configuration. The result is a new version number, but functionally it restores the previous stable state.
So if v3 is broken and v2 was the last good version, you pull v2's configuration out of the deployment history and redeploy it. That redeploy becomes v4, and v4 is functionally a clone of v2. Versioning history stays clean and traceable, and the function is back to known-good behavior.
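In gcloud terms, that looks roughly like the following (function name, region, source path, and environment variables are placeholders; the point is that the "rollback" is just a redeploy of the old configuration):

```shell
# Inspect the currently deployed configuration: version, source, env vars
gcloud functions describe my-fn --region=us-central1

# "Rollback" = redeploy the last known-good configuration (here, v2's
# source and environment). This creates a new version, v4, that is
# functionally a clone of v2.
gcloud functions deploy my-fn \
  --region=us-central1 \
  --runtime=python312 --entry-point=handler \
  --source=./v2-source \
  --set-env-vars=FLAG=stable
```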
In real architectures, Cloud Functions frequently call other Cloud Functions. The exam expects you to know how that call should be secured.
By default, Cloud Functions require authentication. Public, unauthenticated invocation only happens if you explicitly opt in, either by passing --allow-unauthenticated at deploy time or by configuring the function as public in the console. In the absence of that flag, only principals that hold the Cloud Functions Invoker role (or an equivalent permission) can call the function.
The mechanic for an authenticated function-to-function call is:

1. Give the calling function (A) its own dedicated service account.
2. Grant that service account the Cloud Functions Invoker role on the target function (B).
3. At call time, A fetches an ID token with B's URL as the audience.
4. A sends the token as a Bearer token in the Authorization header; the platform validates it before B runs.
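The steps above can be sketched as follows (function names, the service account, and the project are placeholders; the metadata-server call is the standard way code running on GCP obtains an ID token for a given audience):

```shell
# 1. Deploy function A with its own dedicated service account
gcloud functions deploy fn-a \
  --runtime=python312 --entry-point=handler --trigger-http \
  --service-account=fn-a-sa@my-project.iam.gserviceaccount.com

# 2. Grant that service account the Invoker role on function B
gcloud functions add-iam-policy-binding fn-b \
  --member="serviceAccount:fn-a-sa@my-project.iam.gserviceaccount.com" \
  --role="roles/cloudfunctions.invoker"

# 3. Inside A, fetch an ID token from the metadata server with B's URL
#    as the audience, then call B with it as a Bearer token
FN_B_URL="https://us-central1-my-project.cloudfunctions.net/fn-b"
TOKEN=$(curl -s -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=${FN_B_URL}")
curl -H "Authorization: Bearer ${TOKEN}" "${FN_B_URL}"
```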
Two principles to anchor this in your head for the exam. First, give each function its own dedicated service account rather than reusing a generic one, so you can scope IAM precisely. That is the principle of least privilege in practice. Second, never reach for --allow-unauthenticated as a shortcut for internal service-to-service calls. The zero-trust model assumes no implicit trust between services, even inside the same project, and the exam grades that way.
If a question describes Function A calling Function B and asks how to secure the call, the answer is almost always: dedicated service account on A, Cloud Functions Invoker role granted on B, ID token in the Authorization header. Anything that bypasses authentication is wrong.
For Cloud Functions on the Professional Cloud Architect exam, four things carry most of the weight:

- Workload fit: knowing when Cloud Functions beats Cloud Run or GKE, and when it does not.
- Triggers: the built-in trigger types, plus log-based triggers via Eventarc.
- Rollback: restoring a prior stable version by redeploying its configuration from the deployment history.
- Authentication: securing function-to-function calls with dedicated service accounts, the Invoker role, and ID tokens.
If you can answer scenario questions on those four points, the Cloud Functions surface area on the exam is well-covered.
My Professional Cloud Architect course covers Cloud Functions triggers, rollbacks, and secure invocation alongside the rest of the containers and serverless material.