
Cloud Workflows is one of those services that probably will not anchor a Professional Data Engineer question on its own, but it shows up often enough in distractor answers and "which orchestrator should you pick" scenarios that you need a clean mental model. The exam tests whether you can match a workload to the right orchestration tool. Cloud Workflows, Cloud Functions, and Cloud Composer all coordinate work, and the trap is treating them as interchangeable.
I want to walk through what Cloud Workflows actually is, what its definition language looks like, and how to separate it from Functions and Composer under the time pressure of an exam question.
Cloud Workflows is a fully managed, serverless orchestration service. You describe a sequence of steps, and Workflows executes them in order, calling Google Cloud services or any HTTP-based API along the way. The platform handles retries, failures, branching, and parallel execution without you provisioning anything.
The key properties to remember for the exam:
- Fully managed and serverless: nothing to provision, and it scales to zero when idle.
- Billed per step executed, not per hour of cluster time.
- The conductor, not the worker: it calls the services that do the work.
Workflows are written in YAML or JSON. Each step has a name, a call (an HTTP endpoint or a Google Cloud connector), arguments, and an optional result variable. You can chain steps, branch with conditionals, run steps in parallel, and define retry policies.
A trivial example looks like this:
main:
  steps:
    - read_bucket:
        call: googleapis.storage.v1.objects.list
        args:
          bucket: raw-events
        result: listing
    - call_function:
        call: http.post
        args:
          url: https://us-central1-myproj.cloudfunctions.net/transform
          body:
            files: ${listing.items}
        result: transformed
    - return_result:
        return: ${transformed.body}
The exam will not ask you to write YAML, but recognizing that Workflows uses a declarative config (as opposed to Composer's Python DAGs) is exactly the kind of detail that distinguishes the correct answer.
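The retry policies mentioned earlier look like this in the definition language. This is a minimal sketch: the step name and URL are illustrative, and the backoff numbers are arbitrary.

```yaml
main:
  steps:
    - call_api:
        # Wrap the call in try/retry so transient HTTP failures are retried
        try:
          call: http.get
          args:
            url: https://example.com/api   # hypothetical endpoint
          result: api_response
        retry:
          # Built-in predicate that retries on typical transient status codes
          predicate: ${http.default_retry_predicate}
          max_retries: 5
          backoff:
            initial_delay: 1
            max_delay: 60
            multiplier: 2
    - done:
        return: ${api_response.body}
```

The point for the exam is that retries live in the workflow definition itself, not in the services being called.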
Workflows orchestrates HTTP and gRPC endpoints, which in practice means almost anything in Google Cloud and beyond. Typical step targets:
- Cloud Functions and Cloud Run services, called over HTTP
- Google Cloud APIs such as Cloud Storage, BigQuery, and Pub/Sub, via built-in connectors
- Any external HTTP-based API
The pattern to internalize is this: Workflows is the conductor, not the worker. It does not run your data transformation. It calls the service that runs your transformation, waits for the result, and decides what to call next.
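That "decides what to call next" part is a switch step. Continuing from the example above, a sketch might branch on the transform's HTTP status code; the step names publish and handle_failure are illustrative:

```yaml
    - check_result:
        switch:
          # If the transform call succeeded, continue the happy path
          - condition: ${transformed.code == 200}
            next: publish
        # Default branch when no condition matched
        next: handle_failure
    - publish:
        call: http.post
        args:
          url: https://us-central1-myproj.cloudfunctions.net/publish  # hypothetical
        next: end
    - handle_failure:
        return: "transform failed"
```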
Workflows versus Cloud Functions is the cleanest distinction. Cloud Functions runs a single piece of code in response to an event or HTTP request. Cloud Workflows runs a sequence of steps that often includes calling one or more Cloud Functions.
Use Cloud Functions when:
- A single event or HTTP request triggers a single piece of code
- There is no multi-step sequence to coordinate

Use Cloud Workflows when:
- Several calls must run in order, with the result of one step feeding the next
- You need branching, retries, or parallel steps across multiple services
If an exam question describes a "single trigger fires, single function runs" scenario, the answer is Cloud Functions. If it describes a sequence of API calls with dependencies between them, the answer is Cloud Workflows.
Workflows versus Composer is the harder distinction, because both orchestrate. The split comes down to weight and purpose.
Cloud Composer is managed Apache Airflow. You write DAGs in Python. It is built for data pipelines, complex dependencies, batch jobs, scheduled workloads, and multi-cloud orchestration. Composer runs on a GKE cluster behind the scenes, which means it is heavier and has a baseline cost even when idle.
Cloud Workflows is serverless, YAML-driven, and lightweight. It bills per step executed and scales to zero. It does not have the breadth of operators that Airflow does, and it is not the right tool for a 200-task batch DAG with intricate scheduling.
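That lightweight, serverless posture shows in the operational story: deploying and running a workflow is a couple of gcloud commands, with no cluster to stand up. The workflow name, file name, and region here are assumptions for illustration.

```shell
# Deploy the definition as a workflow named etl-sequence (name and region are hypothetical)
gcloud workflows deploy etl-sequence \
  --source=workflow.yaml \
  --location=us-central1

# Execute it once and wait for the result
gcloud workflows run etl-sequence --location=us-central1
```

Compare that with Composer, where you provision an environment first and pay for it whether or not any DAG is running.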
The exam framing to remember:
- Airflow, DAGs, batch data pipelines, or a migration from on-prem Airflow: the answer is almost always Composer.
- A lightweight sequence of HTTP calls, serverless orchestration, or YAML definitions: the answer is Workflows.
The common distractor pattern is dressing up a Composer scenario in lightweight language to tempt you toward Workflows, or describing a single event-driven function but adding the word "orchestration" to push you away from Functions. Read for these signals:
- Python DAGs, complex scheduling, or large batch dependency graphs point to Composer, however "lightweight" the wording sounds.
- A single event triggering a single piece of code points to Functions, even when the question uses the word "orchestration."
- A short serverless sequence of HTTP calls, defined declaratively, points to Workflows.
The Professional Data Engineer exam rewards crisp mappings. Cloud Workflows fills a real gap between a one-shot function and a full Airflow deployment, and knowing where that gap lives is usually enough to answer the question correctly.
My Professional Data Engineer course covers Cloud Workflows alongside Cloud Functions and Cloud Composer, with side-by-side comparisons and the exam framings you need to pick the right orchestrator under time pressure.