BigQuery Admin Console and INFORMATION_SCHEMA for the PCA Exam

GCP Study Hub
Ben Makansi
March 13, 2026

The BigQuery admin console is the central hub for visibility into how your warehouse is actually running. For the Professional Cloud Architect exam, you need to know what lives in the admin console, what the INFORMATION_SCHEMA gives you, and how to respond when a job fails or hits a quota. I will walk through each piece in the order it shows up day to day.

What lives in the BigQuery admin console

The admin console pulls four capabilities into one place. The first is resource utilization and jobs monitoring, which shows you how slots and queries are being consumed across your projects and reservations. The second is slot capacity management. BigQuery uses slots as the unit of compute for query execution, and the admin console is where you adjust allocations so workloads have the processing power they need. The third is policy tag taxonomies. You create and manage taxonomies here to classify columns and control access to sensitive data. The fourth is the INFORMATION_SCHEMA itself, which you query to get a sense of job performance and historical activity.
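Slot reservations can also be inspected with SQL. A minimal sketch, assuming the reservations INFORMATION_SCHEMA views are available in your region and that RESERVATIONS_BY_PROJECT and its slot_capacity column match the current documentation (replace REGION_NAME with your region):

SELECT
  reservation_name,
  slot_capacity
FROM
  `region-REGION_NAME`.INFORMATION_SCHEMA.RESERVATIONS_BY_PROJECT;

This is the SQL-side counterpart of the capacity view in the admin console UI, which is handy when you want to script a capacity report rather than eyeball it.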

If you see a Professional Cloud Architect question that asks where to manage slot reservations, configure column-level access, or audit recent job activity, the admin console is the answer.

INFORMATION_SCHEMA

INFORMATION_SCHEMA is a standardized set of read-only views that expose metadata about your datasets, tables, columns, jobs, and other BigQuery resources. It is queryable like any other table, which means you can use SQL to investigate usage patterns, find performance bottlenecks, and decide where to optimize.
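As a quick illustration of "queryable like any other table", here is a sketch that lists the tables in a dataset along with when they were created. The dataset name mydataset is illustrative:

SELECT
  table_name,
  table_type,
  creation_time
FROM
  mydataset.INFORMATION_SCHEMA.TABLES;

The same pattern works for COLUMNS, VIEWS, and the other metadata views: plain SELECT statements, no special tooling.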

The view that shows up most in exam scenarios is INFORMATION_SCHEMA.JOBS_BY_USER. It tracks user job activity, including the queries that ran, when they ran, and the resources they consumed. A common pattern is checking which jobs have not finished:

SELECT
  job_id,
  creation_time,
  query
FROM
  `region-REGION_NAME`.INFORMATION_SCHEMA.JOBS_BY_USER
WHERE
  state != 'DONE';

That returns jobs that are still pending or running, with the SQL text attached. Note that failed jobs will not appear here: a failed job still finishes in the DONE state, with an error_result record attached, so you filter on error_result rather than state to find failures. The view is useful for auditing, debugging long-running queries, and analyzing job activity over time. There are sibling views, JOBS_BY_PROJECT and JOBS_BY_ORGANIZATION, that widen the scope, but JOBS_BY_USER is the one to anchor on.

Viewing errors related to jobs

BigQuery jobs can fail for many reasons: a query might have a syntax error, an ingest might hit malformed data, or a scheduled query might run into a resource limit. The BigQuery interface lets you track and view errors for all of these job types, including ingests, ad-hoc queries, and scheduled queries.

The Jobs explorer is the first place to look when something fails. It surfaces the job status, the error message, and enough context to point you at the cause. The interface highlights errors visually in both the job list and the query results pane, so a failure does not stay hidden. On the Professional Cloud Architect exam, if a question asks where to start troubleshooting a failed BigQuery job, the answer is the Jobs explorer.
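The same investigation can be done in SQL. A sketch: because failed jobs finish in the DONE state with an error_result record populated, you can pull recent failures and their messages from JOBS_BY_USER (replace REGION_NAME with your region):

SELECT
  job_id,
  creation_time,
  error_result.reason,
  error_result.message
FROM
  `region-REGION_NAME`.INFORMATION_SCHEMA.JOBS_BY_USER
WHERE
  error_result IS NOT NULL
ORDER BY
  creation_time DESC;

This gives you the same error text the Jobs explorer shows, in a form you can aggregate or join against other metadata.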

Handling the quotaExceeded error

Even with careful planning, you can hit a quotaExceeded error. That means a specific resource or operation has gone over its limit, which could be queries per day, concurrent queries, load size, or any of the other quotas BigQuery enforces. The response has three steps.

First, identify the specific quota limit that was exceeded. The error message itself includes the quota name, so read it carefully rather than guessing. Second, review current quota usage in the GCP Console. The BigQuery quotas section shows a breakdown of where you stand against each limit, which tells you whether you are bumping up against the ceiling repeatedly or just had a spike. Third, leverage INFORMATION_SCHEMA views to investigate. JOBS_BY_USER and its siblings let you correlate the quota event with recent jobs and resource usage, which is how you figure out whether one runaway query caused it or your normal workload has outgrown the current quota.
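The third step can be sketched in SQL. Quota failures surface in the JOBS views with error_result.reason set to quotaExceeded, so a query like the following (REGION_NAME is a placeholder, and the one-day window is an illustrative choice) shows which recent jobs tripped the limit and how much slot time they consumed:

SELECT
  job_id,
  creation_time,
  total_slot_ms,
  error_result.message
FROM
  `region-REGION_NAME`.INFORMATION_SCHEMA.JOBS_BY_USER
WHERE
  error_result.reason = 'quotaExceeded'
  AND creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY);

If one job dominates total_slot_ms, you are likely looking at a runaway query; if the failures are spread evenly, the workload may have outgrown the quota.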

This three-step response is exactly the kind of thing the Professional Cloud Architect exam tests when it asks how to react to a quota error. The trap answers usually skip straight to requesting a quota increase. The right move is to diagnose first using INFORMATION_SCHEMA, then decide whether the fix is a query change, a workload shift, or an actual quota bump.

My Professional Cloud Architect course covers the BigQuery admin console and INFORMATION_SCHEMA alongside the rest of the storage and analytics material.
