
One of the cleaner trick questions on the Professional Data Engineer exam asks you to pick between the Cloud Composer API and the Airflow REST API for some specific task. The names sound interchangeable, both APIs technically live in the same orchestration story, and if you have not drawn the line between them in your head ahead of time, the answer choices all look plausible. I want to settle that line in this article so you can pattern-match the question type in a few seconds on test day.
The short version is this. The Cloud Composer API manages the environment. The Airflow REST API manages the workflows that run inside the environment. Everything else flows from that split.
The Cloud Composer API is the Google-managed control plane for the Composer environment itself. It is the API you call when you want to create, modify, or tear down the underlying piece of Composer infrastructure that hosts Airflow. You are not touching DAGs with this API. You are touching the box those DAGs live in.
Typical operations the Cloud Composer API handles:

- Creating a new Composer environment
- Updating environment configuration, such as the image version, environment variables, or worker scaling
- Deleting an environment
- Listing and inspecting the environments in a project
If the exam question describes a platform team, a Terraform module, an infrastructure-as-code workflow, or anything that sounds like provisioning, you are in Cloud Composer API territory. A handy mental tag is that this API is the one Cloud Build or a deployment pipeline would call, not the one a DAG author would call.
A typical call looks like this:
gcloud composer environments create my-env \
--location us-central1 \
--image-version composer-2-airflow-2 \
--max-workers 3

Or via the REST surface directly:
POST https://composer.googleapis.com/v1/projects/PROJECT/locations/REGION/environments

The Airflow REST API is a different beast. It is the standard open-source Airflow API that ships with Airflow itself, exposed through the Composer environment's web server. Google does not own this surface. The Airflow community does. What it manages is everything inside Airflow once the environment exists.
Typical operations the Airflow REST API handles:

- Triggering a DAG run, optionally with a configuration payload
- Checking the status of DAGs, DAG runs, and task instances
- Pausing and unpausing DAGs
- Clearing task instances so they rerun
- Managing Airflow variables, connections, and pools
If the exam question describes triggering a pipeline from an external event, checking whether yesterday's run succeeded, rerunning a failed task, or wiring a DAG into a Cloud Function or Cloud Run service, you are in Airflow REST API territory. This is the API a data engineer reaches for during day-to-day pipeline operations.
A typical call looks like this:
POST https://AIRFLOW_URI/api/v1/dags/my_dag/dagRuns
Content-Type: application/json
{
"conf": {"target_date": "2026-02-19"}
}

The Professional Data Engineer exam loves to bury the right answer in a small detail of the scenario. When you see a Composer question, scan for these tells:

- Who is acting. A platform or infrastructure team points to the Cloud Composer API; a data engineer or DAG author points to the Airflow REST API.
- The verbs. Create, provision, scale, upgrade, or delete an environment means Cloud Composer API; trigger, rerun, pause, or check a run means Airflow REST API.
- The caller. Terraform, Cloud Build, or a deployment pipeline suggests the Cloud Composer API; a Cloud Function or Cloud Run service reacting to an event suggests the Airflow REST API.
One thing worth flagging because it catches people out. Both APIs require authentication, but they authenticate differently. The Cloud Composer API uses standard Google Cloud IAM and a service account with the appropriate Composer roles. The Airflow REST API sits behind Identity-Aware Proxy on the Composer web server and uses an IAP-issued ID token. If a question asks how an external Cloud Run service should authenticate to trigger a DAG, the answer involves IAP, not the Composer API.
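To make that concrete, here is a minimal sketch of how an external service such as a Cloud Function might fetch an IAP ID token and trigger a DAG. The URI, DAG ID, and IAP client ID values are placeholders, and the token fetch assumes the google-auth library and an attached service account; treat this as an illustration, not a drop-in implementation.

```python
import json
import urllib.request


def dag_run_request(airflow_uri: str, dag_id: str, conf: dict, id_token_value: str):
    """Build the POST request that asks Airflow to start a DAG run."""
    url = f"{airflow_uri}/api/v1/dags/{dag_id}/dagRuns"
    body = json.dumps({"conf": conf}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            # This must be an IAP-issued ID token, not a plain OAuth
            # access token -- the Composer web server sits behind IAP.
            "Authorization": f"Bearer {id_token_value}",
        },
        method="POST",
    )


def fetch_iap_token(iap_client_id: str) -> str:
    """Fetch an ID token for the IAP client protecting the Airflow web server."""
    # Requires the google-auth package; in Cloud Functions or Cloud Run
    # this uses the service's attached service account.
    import google.auth.transport.requests
    from google.oauth2 import id_token

    return id_token.fetch_id_token(
        google.auth.transport.requests.Request(), iap_client_id
    )
```

Sending the built request with `urllib.request.urlopen(req)` (or the requests library) completes the trigger; the point is that the credential in the Authorization header is an ID token minted for IAP, not an ordinary access token.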
Here is the kind of phrasing the exam uses. A team has a Composer 2 environment running production DAGs. A new upstream data source publishes a file to Cloud Storage at an unpredictable time each day. The team wants to trigger the corresponding DAG as soon as the file lands. Which API should the Cloud Function call?
The temptation is to pick Cloud Composer API because the word Composer is right there in the scenario. The correct answer is the Airflow REST API. The Cloud Function is not creating or scaling the environment. It is asking Airflow to start a DAG run. That is a workflow operation, not an infrastructure operation.
Flip the scenario. A platform team wants to standardize how Composer environments are provisioned across business units, with a Terraform module that creates a new environment, sets the image version, and configures three worker nodes. Which API does Terraform call under the hood? Cloud Composer API. Terraform is not running DAGs. It is provisioning the box.
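For contrast, here is a sketch of the REST call a tool like Terraform effectively issues under the hood. The field names follow the Cloud Composer v1 Environments resource, the project and region are placeholders, and a real call would carry an OAuth access token for an IAM service account rather than the dummy value shown.

```python
import json
import urllib.request


def create_environment_request(project: str, region: str, env_name: str,
                               image_version: str, access_token: str):
    """Build the POST that asks the Cloud Composer API to create an environment."""
    parent = f"projects/{project}/locations/{region}"
    url = f"https://composer.googleapis.com/v1/{parent}/environments"
    body = {
        "name": f"{parent}/environments/{env_name}",
        "config": {
            "softwareConfig": {"imageVersion": image_version},
        },
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            # A plain OAuth access token via standard Google Cloud IAM --
            # contrast with the IAP ID token the Airflow REST API expects.
            "Authorization": f"Bearer {access_token}",
        },
        method="POST",
    )
```

Note that nothing in the request body mentions a DAG: the payload describes infrastructure (image version, environment name), which is exactly the environment-versus-workflows split in action.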
Once you have the environment-versus-workflows split internalized, these questions get fast.
My Professional Data Engineer course covers Cloud Composer end to end, including this exact distinction and the other Composer quirks the exam likes to probe, such as environment versions, worker scaling, and the IAP authentication path for the Airflow REST API.