
Cloud Composer is one of the orchestration services the Professional Data Engineer exam expects you to know cold, and IAM questions on Composer are a favorite trap. The scenarios sound innocent. A team needs to author DAGs. A platform engineer needs to spin up a new environment. A scheduler service account needs to actually run the workers. Each of those maps to a different role, and picking the wrong one is how exam takers lose points on otherwise easy questions.
I want to walk through the Composer IAM roles you need to recognize on the exam, what each one actually grants, and the least-privilege patterns the exam tends to reward.
Every Composer-related IAM role is granted at the project level. That detail matters because the exam sometimes asks where you bind a role, and Composer roles are not bucket-scoped or environment-scoped through IAM directly. You bind them on the project and they apply across Composer resources in that project.
Here are the six roles to lock in: Composer Admin, Composer Developer, Composer User, Composer Viewer, Composer Worker, and Composer Environment and Storage Object User.
The Composer Worker role is the part of Composer IAM that most exam takers underprepare for. When you create a Composer environment, you specify a service account for the workers. That service account needs the Composer Worker role on the project, plus whatever permissions your DAGs actually need to do their jobs (read from BigQuery, write to a bucket, publish to Pub/Sub, and so on).
A common exam scenario looks like this. A team creates a new Composer 2 environment using a custom service account, the environment fails to start, and you need to pick the most likely cause. The answer is almost always that the service account is missing roles/composer.worker. The default Compute Engine service account has broad permissions, which is why most starter environments come up fine, but the moment you switch to a least-privilege custom service account you have to grant Worker explicitly.
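When you hit this scenario in practice, the first step is confirming which service account the environment actually runs as. A describe call surfaces it; the environment name, location, and output field path below are a sketch based on current gcloud behavior, so verify them against your own environment.

```shell
# Show the service account a Composer environment's workers run as.
# "my-env" and "us-central1" are placeholders for your environment.
gcloud composer environments describe my-env \
    --location=us-central1 \
    --format="value(config.nodeConfig.serviceAccount)"
```

If the output is a custom service account rather than the default Compute Engine one, check that it holds roles/composer.worker before digging anywhere else.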
Granting Worker to the environment service account looks like this.
gcloud projects add-iam-policy-binding my-project \
--member="serviceAccount:composer-env-sa@my-project.iam.gserviceaccount.com" \
--role="roles/composer.worker"

The Professional Data Engineer exam consistently rewards least-privilege answers. If two options both technically work and one grants a narrower role, pick the narrower one. Here is how that plays out for Composer.
Pipeline authors who write DAGs get Composer Developer, not Admin. They need to push DAG code and manage their pipelines, but they should not be able to provision or delete environments. If the scenario says "data engineers who write and deploy DAGs," Developer is the answer.
Operational users who trigger and monitor runs get Composer User, not Developer. If the scenario describes someone who clears failed tasks, reruns DAGs, or kicks off backfills but does not author pipeline code, User is correct. The exam likes this distinction because it mirrors the real split between pipeline development and pipeline operations.
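Binding Composer User follows the same project-level pattern as the Worker grant above. This is a sketch; the project name and user email are placeholders.

```shell
# Grant Composer User to an operations engineer who triggers and
# monitors DAG runs but does not author pipeline code.
gcloud projects add-iam-policy-binding my-project \
    --member="user:ops-engineer@example.com" \
    --role="roles/composer.user"
```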
Auditors and support engineers get Composer Viewer. Anyone who needs to look at run history or environment config but should never be able to change anything ends up here.
External principals who only need DAG source or logs get Composer Environment and Storage Object User. If a question describes a separate service or team that needs to pull log files or sync DAGs from the Composer bucket, this is the right role. It is narrower than Developer because it does not grant any environment-level operations.
The environment service account gets Composer Worker, plus the specific data-access roles its DAGs need. Never grant Worker to a human user. If you see an answer that gives a developer the Worker role, that is a distractor.
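To audit which roles a service account actually holds on the project, you can filter the project's IAM policy by member. The service account email below is a placeholder matching the earlier example.

```shell
# List every project-level role bound to the environment service account.
gcloud projects get-iam-policy my-project \
    --flatten="bindings[].members" \
    --filter="bindings.members:serviceAccount:composer-env-sa@my-project.iam.gserviceaccount.com" \
    --format="table(bindings.role)"
```

If roles/composer.worker is missing from that table on an environment service account, you have found the likely cause of a failed environment start.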
When you see a Composer IAM question on the Professional Data Engineer exam, run through these in order. Is the principal the service account that runs the environment? Worker. Does the person provision or delete environments? Admin. Do they author and deploy DAGs? Developer. Do they trigger, monitor, or rerun DAGs without writing pipeline code? User. Do they only need to view run history or configuration? Viewer. Do they only need DAG source or logs from the Composer bucket? Environment and Storage Object User.
That sequence covers the vast majority of Composer IAM questions you will see.
My Professional Data Engineer course covers Cloud Composer end to end, including environment architecture, DAG authoring patterns, and the IAM scenarios that show up on the exam.