Cloud Storage Classes for the PDE Exam: Standard, Nearline, Coldline, Archive

GCP Study Hub
November 10, 2025

Cloud Storage classes are one of those Professional Data Engineer topics that look simple on the surface and then trip people up under exam pressure. The four classes (Standard, Nearline, Coldline, Archive) each have a minimum storage duration, a target access pattern, and a price-versus-retrieval-cost tradeoff. The exam likes to mix these signals together in scenario questions, so I want to walk through how I think about each one and the gotchas that come up most often.

The four storage classes at a glance

Every object in Google Cloud Storage lives in one of four classes. The class controls how cheap the storage is and how much you pay (in time and money) when you read the data back. Here is the short version I keep in my head:

  • Standard: no minimum storage duration, high availability, low latency. The default for live applications and data you touch regularly. It is the most expensive class per gigabyte stored, but retrievals are essentially free.
  • Nearline: 30-day minimum. Designed for data accessed about once a month or less. Backups and infrequently used files fit well here.
  • Coldline: 90-day minimum. Good for archive-style data that still has to be available for disaster recovery, typically accessed about once a quarter or less.
  • Archive: 365-day minimum. The cheapest storage cost and the highest retrieval cost. Built for compliance retention and data you might pull once a year, if ever.
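The table above can be kept in your head as a small lookup. A minimal sketch, with per-GB prices that are rough, illustrative approximations of us-region list prices (they drift over time, so treat them as placeholders, not a quote):

```python
# Illustrative summary of the four classes.
# class: (min_duration_days, approx $/GB/month storage, approx $/GB retrieval)
# Prices are assumptions for illustration only.
STORAGE_CLASSES = {
    "STANDARD": (0,   0.020,  0.00),
    "NEARLINE": (30,  0.010,  0.01),
    "COLDLINE": (90,  0.004,  0.02),
    "ARCHIVE":  (365, 0.0012, 0.05),
}

def min_duration_days(storage_class: str) -> int:
    """Minimum storage duration you are billed for, even on early delete."""
    return STORAGE_CLASSES[storage_class][0]

print(min_duration_days("COLDLINE"))  # 90
```

Notice the pattern: as storage gets cheaper going down the list, retrieval gets more expensive. That inverse relationship is the whole design.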

Minimum storage duration is the trap

The single concept I see most often on Professional Data Engineer practice questions is minimum storage duration. If you put an object into Nearline and delete it after 10 days, Google Cloud Storage still charges you for the full 30 days. Coldline charges you for the full 90 even if you delete after a week. Archive charges you for the full 365 days regardless.

This is why picking the storage class before you create a bucket (or before you write the object) matters. Moving data to a colder class to save money looks attractive until you realize the new minimum duration clock resets and you are locked in. On the exam, watch for scenarios where someone wants to migrate frequently changing data into Coldline or Archive. That is almost always the wrong answer.

Bucket default versus object-level override

You set a default storage class at the bucket level (at creation time, and you can change the default later), and every new object inherits it unless you specify a different class when you write the object. You can also rewrite an existing object into another class. This is useful when most of your data is cold but a handful of hot objects need to stay in Standard, or vice versa.

For the exam, remember the rule: storage class is per-object, with a bucket-level default. If a question describes mixed access patterns inside one bucket, object-level overrides are usually the answer rather than splitting into multiple buckets.
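The inheritance rule is simple enough to model directly. A toy sketch of how the effective class resolves (this is my illustration of the rule, not a Cloud Storage API):

```python
from typing import Optional

def effective_storage_class(bucket_default: str,
                            object_override: Optional[str] = None) -> str:
    """Per-object class wins; otherwise the object inherits the bucket default."""
    return object_override or bucket_default

print(effective_storage_class("COLDLINE"))              # COLDLINE
print(effective_storage_class("COLDLINE", "STANDARD"))  # STANDARD
```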

The Archive special case

Archive is recommended for data accessed once per year or less, but the actual cost math is more flexible than that rule of thumb suggests. If your priority is minimizing total cost and you access the data only 2-3 times per year, Archive can still come out ahead. The storage savings versus Standard or even Coldline are large enough that paying the retrieval fees a couple of times a year is worth it.

The exam can ask this directly: a workload accesses data three times a year and cost is the top concern. Coldline looks like the right fit from the access frequency table, but Archive is often the cheaper total when you do the math. The judgment call is whether storage cost savings exceed retrieval costs over the year. If yes, Archive wins.
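Here is one hypothetical version of that math. The prices are illustrative approximations (per-operation charges are ignored), and the outcome hinges on how much data each access actually reads: in this scenario each of the three yearly reads pulls back only 100 GB of a 1 TB dataset, which is exactly the kind of case where Archive wins.

```python
# Hypothetical annual cost: 1 TB stored all year, read 3 times, 100 GB per read.
# Prices below are illustrative assumptions, not a quote.
GB_STORED = 1024
MONTHS = 12
READS_PER_YEAR = 3
GB_PER_READ = 100

def annual_cost(storage_per_gb_month: float, retrieval_per_gb: float) -> float:
    storage = GB_STORED * storage_per_gb_month * MONTHS
    retrieval = READS_PER_YEAR * GB_PER_READ * retrieval_per_gb
    return storage + retrieval

coldline = annual_cost(0.004, 0.02)   # ~49.15 storage + 6.00 retrieval
archive = annual_cost(0.0012, 0.05)   # ~14.75 storage + 15.00 retrieval

print(f"Coldline: ${coldline:.2f}, Archive: ${archive:.2f}")
```

If each access instead read the full terabyte, the extra retrieval fees would swamp the storage savings and Coldline would come out ahead, so always check the retrieval volume, not just the access count.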

How I match scenarios to classes

When a question gives me a workload, I run through a short checklist:

  • How often does the data get read? Daily means Standard. Monthly means Nearline. Quarterly means Coldline. Yearly means Archive.
  • What is the minimum time the data will sit untouched? If it is shorter than the class minimum, that class is wrong.
  • Is cost or retrieval speed the priority? Archive trades retrieval cost for storage cost, Standard does the opposite.
  • Is the access pattern uniform across the bucket, or do some objects need different treatment? If mixed, think object-level override.
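The frequency part of that checklist can be sketched as a decision function. The thresholds below are my rough interpretation of the daily/monthly/quarterly/yearly rule of thumb, not official cutoffs, and a real decision would also weigh minimum durations and retrieval fees:

```python
def pick_storage_class(reads_per_year: float) -> str:
    """Rough mapping from access frequency to storage class."""
    if reads_per_year >= 52:   # roughly weekly or more: hot data
        return "STANDARD"
    if reads_per_year >= 12:   # roughly monthly
        return "NEARLINE"
    if reads_per_year >= 4:    # roughly quarterly
        return "COLDLINE"
    return "ARCHIVE"           # yearly or less

print(pick_storage_class(365))  # STANDARD
print(pick_storage_class(12))   # NEARLINE
print(pick_storage_class(4))    # COLDLINE
print(pick_storage_class(1))    # ARCHIVE
```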

That checklist handles most of what the Professional Data Engineer exam throws at you on this topic. The questions that go beyond it usually combine storage classes with lifecycle management (auto-transition rules from Standard down to Coldline after N days) or with location options like dual-region for disaster recovery. Those are separate concepts, but the underlying class rules still apply.
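As a taste of those lifecycle rules, a bucket lifecycle configuration that moves Standard objects down to Coldline after 90 days looks roughly like this (a sketch of the JSON format consumed by `gsutil lifecycle set`; the 90-day age is a placeholder):

```json
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 90, "matchesStorageClass": ["STANDARD"]}
    }
  ]
}
```

Note that a lifecycle transition restarts the minimum-duration clock in the new class, so the same early-deletion rules apply after the move.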

Quick recap for exam day

Standard for hot data. Nearline at 30 days for monthly access. Coldline at 90 days for quarterly access. Archive at 365 days for yearly access or for 2-3x per year when cost rules. Minimum storage duration is a real charge even if you delete early. Storage class is per-object with a bucket default. If you can recall those five things without thinking, you will get the storage class questions right.

My Professional Data Engineer course covers Cloud Storage classes, lifecycle rules, and the broader storage decision framework you need for the exam.
