Cloud Storage is not just a place to put files. It is also a starting point for pipelines. When an object is uploaded, deleted, or has its metadata changed, Cloud Storage can fire an event that kicks off work in Cloud Functions, Pub/Sub, or Cloud Run. This article covers the three common trigger patterns, when each one is appropriate, and how the Associate Cloud Engineer exam tests them.
It does not cover the full Eventarc service in depth, every event type Cloud Storage can emit, or the Pub/Sub plumbing under the hood. The goal is to give you what you need for the Associate Cloud Engineer exam, which is the high-level pattern, not the wiring diagram.
An action happens in a Cloud Storage bucket. Most often this is a new object being uploaded. Cloud Storage detects that action and routes an event somewhere. That somewhere can be a Cloud Function, a Pub/Sub topic, or a Cloud Run service. The downstream service then runs whatever code you wrote to handle the event.
The triggering event does not have to be an upload. Object deletion, metadata updates, and object archival can all fire events. In practice, upload is by far the most common trigger and is what almost every exam scenario describes.
This is the most direct pattern. You write a Cloud Function. You configure it with a trigger that points at a specific bucket and a specific event type, like object finalization, which is the technical term for "an object was successfully uploaded." Whenever that event happens, the function runs with information about the object as input.
This is the right pattern for lightweight, on-the-fly work. Resize an image. Extract text from a PDF. Validate a CSV schema. Anything that finishes in seconds and does not need its own server. The function spins up, processes one file, and shuts down.
The Associate Cloud Engineer exam calls out a specific scenario where you place image files in a Cloud Storage bucket, trigger a Cloud Function on upload, and have that function call the Vision API to extract labels and text. That is the canonical exam pattern. If a question describes "automatically process a file when it is uploaded," a Cloud Function is almost always the answer.
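As a sketch of what that handler might look like: assuming a 1st gen Python function and the google-cloud-vision client library, the code below reads the event payload, builds the object's gs:// URI, and asks the Vision API for labels. The function name matches the process_file entry point used in the deploy command at the end of this article; everything else is illustrative.

from google.cloud import vision

vision_client = vision.ImageAnnotatorClient()

def process_file(event, context):
    # Triggered by google.storage.object.finalize; the event dict carries
    # the bucket and object name of the file that was just uploaded.
    gcs_uri = f"gs://{event['bucket']}/{event['name']}"

    # Ask the Vision API for labels on the uploaded image.
    image = vision.Image(source=vision.ImageSource(image_uri=gcs_uri))
    response = vision_client.label_detection(image=image)

    labels = [label.description for label in response.label_annotations]
    print(f"Labels for {gcs_uri}: {labels}")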
This pattern is for cases where the upload is the start of a more elaborate workflow. Cloud Storage publishes a message to a Pub/Sub topic. From there, anything that subscribes to the topic can react. You might have a Dataflow pipeline pulling from the topic. You might have several Cloud Run services, each doing a different thing with the same event. You might have a downstream system entirely outside of GCP.
The reason to route through Pub/Sub instead of going directly to a function is fan-out. One upload, many consumers. Pub/Sub also acts as a durable buffer, so if a consumer is down briefly, the message waits until the consumer comes back and processes it.
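One way to wire this up is with the google-cloud-storage Python client, which can create the notification configuration on the bucket. A minimal sketch, assuming a bucket named my-bucket and a topic named uploads-topic (both placeholders):

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")

# Publish a message to uploads-topic whenever an object is created in the bucket.
notification = bucket.notification(
    topic_name="uploads-topic",
    event_types=["OBJECT_FINALIZE"],
    payload_format="JSON_API_V1",
)
notification.create()

You can set up the same notification from the command line with gsutil notification create; the exam cares about the pattern, not the tooling.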
If a question describes a Cloud Storage upload that needs to kick off multiple downstream processes, or that needs to feed into a Dataflow pipeline, Pub/Sub is the trigger.
Cloud Run is for cases where the work being done is too heavyweight for a Cloud Function but you still want serverless. The trigger goes through Eventarc, which is the service that routes Google Cloud events to Cloud Run endpoints. The Cloud Run service receives an HTTP POST with the event payload and processes it.
This is the right pattern when you need a full container with custom dependencies, more memory than a Cloud Function allows, or longer execution time. A microservice that does heavy file conversion, runs an ML inference pipeline on the uploaded file, or talks to a custom database belongs in Cloud Run.
Eventarc is the connective tissue that routes events from Google Cloud sources to event consumers. For Cloud Storage to Cloud Run, Eventarc handles the routing. You do not need to know Eventarc's full architecture for the Associate Cloud Engineer exam. You do need to recognize that when a question mentions a Cloud Storage event triggering a Cloud Run service, Eventarc is the service in the middle, even if it is not named in the question.
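On the receiving side, the Cloud Run service is just an HTTP endpoint. Here is a minimal sketch, assuming a Python Flask service and the google.cloud.storage.object.v1.finalized event type; Eventarc delivers the event as an HTTP POST whose JSON body describes the object, and the route and field handling below are illustrative, not something the exam tests.

import os
from flask import Flask, request

app = Flask(__name__)

@app.route("/", methods=["POST"])
def handle_storage_event():
    # Eventarc POSTs the event payload; for Cloud Storage events the body
    # includes the bucket and object name of the finalized object.
    event = request.get_json()
    bucket = event.get("bucket")
    name = event.get("name")
    print(f"Processing gs://{bucket}/{name}")
    # Heavyweight, container-sized work goes here.
    return ("", 204)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))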
The exam scenarios for this topic are pretty stylized.
If you see "a file is uploaded to a bucket and we need to process it automatically with serverless code," think Cloud Function. The question is testing whether you know Cloud Storage can trigger a function on object creation.
If you see "an upload should kick off multiple downstream processes" or "feed a streaming pipeline," think Pub/Sub. The question is testing whether you know Cloud Storage can publish to a topic and that Pub/Sub is the right place to fan out.
If you see "the processing logic is too complex for a function" or "we need a containerized service," think Cloud Run, with Eventarc handling the trigger.
The exam rarely makes you choose between Cloud Functions and Cloud Run for the same scenario. The right one is usually clear from the workload size and complexity.
For reference, here is how you deploy a Cloud Function with a Cloud Storage trigger using gcloud:
gcloud functions deploy process-upload \
--runtime=python311 \
--trigger-resource=my-bucket \
--trigger-event=google.storage.object.finalize \
--entry-point=process_file
The trigger event for object creation is google.storage.object.finalize. Other supported events are google.storage.object.delete, google.storage.object.archive, and google.storage.object.metadataUpdate.
Cloud Storage events drive a lot of GCP pipelines. Three patterns matter for the exam. Cloud Function for lightweight per-file processing. Pub/Sub for fan-out and integration with streaming pipelines. Cloud Run for heavier containerized work, routed through Eventarc. The exam usually tells you which one fits by describing the size and shape of the downstream work.
My Associate Cloud Engineer course covers Cloud Storage triggers in the context of the data pipeline section, alongside Pub/Sub, Dataflow, and the rest of the storage and ingestion stack the Associate Cloud Engineer exam tests.