Cloud Composer API vs Airflow REST API for the PDE Exam

GCP Study Hub
February 19, 2026

One of the cleaner trick questions on the Professional Data Engineer exam asks you to pick between the Cloud Composer API and the Airflow REST API for some specific task. The names sound interchangeable, both APIs technically live in the same orchestration story, and if you have not drawn the line between them in your head ahead of time, the answer choices all look plausible. I want to settle that line in this article so you can pattern-match the question type in a few seconds on test day.

The short version is this. The Cloud Composer API manages the environment. The Airflow REST API manages the workflows that run inside the environment. Everything else flows from that split.

What the Cloud Composer API does

The Cloud Composer API is the Google-managed control plane for the Composer environment itself. It is the API you call when you want to create, modify, or tear down the underlying piece of Composer infrastructure that hosts Airflow. You are not touching DAGs with this API. You are touching the box those DAGs live in.

Typical operations the Cloud Composer API handles:

  • Create a new Composer environment in a given project and region with a chosen image version.
  • Update the Composer or Airflow version on an existing environment.
  • Scale workers up or down to handle changing pipeline volume.
  • Update environment configuration such as node count, machine type, network, or environment variables exposed to Airflow.
  • Delete an environment when you are done with it.

If the exam question describes a platform team, a Terraform module, an infrastructure-as-code workflow, or anything that sounds like provisioning, you are in Cloud Composer API territory. A handy mental tag is that this API is the one Cloud Build or a deployment pipeline would call, not the one a DAG author would call.

A typical call looks like this:

gcloud composer environments create my-env \
  --location us-central1 \
  --image-version composer-2-airflow-2 \
  --environment-size small

Or via the REST surface directly:

POST https://composer.googleapis.com/v1/projects/PROJECT/locations/REGION/environments
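The same create call can be sketched from Python. This is a minimal sketch, not a full client: `build_create_environment_request` is a hypothetical helper, and the project, region, and environment names are placeholders. The URL and body shape match the REST surface above, where the environment's image version lives under `config.softwareConfig.imageVersion`.

```python
import json


def build_create_environment_request(project, region, env_name, image_version):
    """Build the URL and JSON body for a Composer environments.create call."""
    url = (
        "https://composer.googleapis.com/v1/"
        f"projects/{project}/locations/{region}/environments"
    )
    # The Environment resource carries its own full name plus a config block.
    body = {
        "name": f"projects/{project}/locations/{region}/environments/{env_name}",
        "config": {"softwareConfig": {"imageVersion": image_version}},
    }
    return url, body


url, body = build_create_environment_request(
    "my-project", "us-central1", "my-env", "composer-2-airflow-2"
)
print(url)
print(json.dumps(body, indent=2))
```

Note who would run this: a deployment pipeline or platform tool authenticating with IAM, not a DAG author.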

What the Airflow REST API does

The Airflow REST API is a different beast. It is the standard open-source Airflow API that ships with Airflow itself, exposed through the Composer environment's web server. Google does not own this surface. The Airflow community does. What it manages is everything inside Airflow once the environment exists.

Typical operations the Airflow REST API handles:

  • Trigger a DAG run on demand, optionally with a configuration payload.
  • List DAGs currently registered in the environment.
  • List DAG runs and inspect their status.
  • Get task instances for a given run, including their state, start time, end time, and logs.
  • Clear or mark task instances to rerun failed tasks.
  • Pause or unpause a DAG.
  • Manage Airflow variables, connections, and pools.

If the exam question describes triggering a pipeline from an external event, checking whether yesterday's run succeeded, rerunning a failed task, or wiring a DAG into a Cloud Function or Cloud Run service, you are in Airflow REST API territory. This is the API a data engineer reaches for during day-to-day pipeline operations.

A typical call looks like this:

POST https://AIRFLOW_URI/api/v1/dags/my_dag/dagRuns
Content-Type: application/json

{
  "conf": {"target_date": "2026-02-19"}
}
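The trigger call above can be sketched in Python. Again a sketch, not a definitive client: `build_dag_run_request` is a hypothetical helper, the web server URL is a placeholder, and the Bearer token must come from whatever auth scheme protects the web server.

```python
import json


def build_dag_run_request(airflow_uri, dag_id, conf):
    """Build the URL and JSON body for an Airflow REST API dagRuns call."""
    url = f"{airflow_uri}/api/v1/dags/{dag_id}/dagRuns"
    # "conf" is the optional configuration payload the DAG run receives.
    body = {"conf": conf}
    return url, body


url, body = build_dag_run_request(
    "https://example-airflow-web-server", "my_dag", {"target_date": "2026-02-19"}
)
print(url)
print(json.dumps(body))
# To actually send it:
#   requests.post(url, json=body, headers={"Authorization": f"Bearer {token}"})
```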

How to spot which one the question wants

The Professional Data Engineer exam loves to bury the right answer in a small detail of the scenario. When you see a Composer question, scan for these tells:

  • The verb is create, resize, upgrade, scale, or delete, and the noun is environment. That is Cloud Composer API.
  • The verb is trigger, run, monitor, list, pause, rerun, or clear, and the noun is DAG, task, or run. That is Airflow REST API.
  • The actor is a platform engineer or CI/CD pipeline standing up infrastructure. Cloud Composer API.
  • The actor is a data engineer or external service kicking off a workflow on an event. Airflow REST API.
  • The answer choice mentions composer.googleapis.com. Cloud Composer API.
  • The answer choice mentions /api/v1/dags or the Airflow web server URL. Airflow REST API.

One thing worth flagging because it catches people out. Both APIs require authentication, but they authenticate differently. The Cloud Composer API uses standard Google Cloud IAM and a service account with the appropriate Composer roles. The Airflow REST API sits behind Identity-Aware Proxy on the Composer web server and uses an IAP-issued ID token. If a question asks how an external Cloud Run service should authenticate to trigger a DAG, the answer involves IAP, not the Composer API.
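The IAP handshake described above can be sketched with the google-auth library, assuming an IAP-protected web server as the article describes. `IAP_CLIENT_ID` would be the OAuth client ID of the IAP fronting the Airflow web server; every value here is a placeholder, and the function name is hypothetical.

```python
def trigger_dag_via_iap(airflow_uri, iap_client_id, dag_id, conf):
    """Fetch an IAP-scoped ID token and POST a dagRuns request with it."""
    # Lazy imports so the sketch can be read without the deps installed
    # (pip install google-auth requests).
    import google.auth.transport.requests
    import google.oauth2.id_token
    import requests

    auth_request = google.auth.transport.requests.Request()
    # Key detail: the ID token's audience is the IAP OAuth client ID,
    # not the Airflow URL and not a Composer API scope.
    token = google.oauth2.id_token.fetch_id_token(auth_request, iap_client_id)
    return requests.post(
        f"{airflow_uri}/api/v1/dags/{dag_id}/dagRuns",
        json={"conf": conf},
        headers={"Authorization": f"Bearer {token}"},
    )
```

Contrast this with the Composer API, where a plain IAM-authenticated call with the right Composer role is enough.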

A worked example

Here is the kind of phrasing the exam uses. A team has a Composer 2 environment running production DAGs. A new upstream data source publishes a file to Cloud Storage at an unpredictable time each day. The team wants to trigger the corresponding DAG as soon as the file lands. Which API should the Cloud Function call?

The temptation is to pick Cloud Composer API because the word Composer is right there in the scenario. The correct answer is the Airflow REST API. The Cloud Function is not creating or scaling the environment. It is asking Airflow to start a DAG run. That is a workflow operation, not an infrastructure operation.
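The Cloud Function side of that scenario can be sketched in miniature. The helper below is hypothetical: it only maps the Cloud Storage event fields to a DAG run conf payload, and the authenticated Airflow REST API call itself is omitted.

```python
def gcs_event_to_dag_conf(bucket, name):
    """Map a Cloud Storage object event to a DAG run conf payload."""
    # Pass the landed object's URI through to the DAG as run configuration.
    return {"conf": {"source_uri": f"gs://{bucket}/{name}"}}


print(gcs_event_to_dag_conf("landing-bucket", "daily/file.csv"))
# {'conf': {'source_uri': 'gs://landing-bucket/daily/file.csv'}}
```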

Flip the scenario. A platform team wants to standardize how Composer environments are provisioned across business units, with a Terraform module that creates a new environment, sets the image version, and configures three worker nodes. Which API does Terraform call under the hood? Cloud Composer API. Terraform is not running DAGs. It is provisioning the box.

Once you have the environment-versus-workflows split internalized, these questions get fast.

My Professional Data Engineer course covers Cloud Composer end to end, including this exact distinction and the other Composer quirks the exam likes to probe, such as environment versions, worker scaling, and the IAP authentication path for the Airflow REST API.
