The GCP Professional Cloud Architect exam has four case studies: Mountkirk Games, EHR Healthcare, Helicopter Racing League, and TerramEarth. Google publishes them, so candidates know them before exam day. The trap is that most candidates read the case studies once, memorize the company names, and then are surprised when the exam questions test something they did not notice in the business requirements.
Each case study is testing a specific architectural theme. Once you name the theme, the questions about it become much more predictable.
Mountkirk Games — real-time multiplayer, global players
The testable theme: low-latency global data distribution, plus real-time analytics on player behavior.
What the stem is actually telling you:
- Players in 20 countries → multi-region architecture, latency-critical
- Mobile and HTML5 → back-end cannot assume reliable connections
- Real-time metrics on player behavior → streaming analytics pipeline
- Legacy on-prem game with session state → migration plan with stateful-to-stateless conversion
- Competitive tournaments → consistency-critical in some areas, eventually-consistent in others
What they will ask:
- When to use Spanner (strong, globally consistent SQL) vs Firestore (low-latency document data with mobile offline support)
- When to use Pub/Sub + Dataflow for streaming analytics
- How to handle session state (do not put it in the VM)
- Multi-region load balancing with Global HTTP(S) Load Balancer
- GameServer-style workloads on GKE with regional clusters
If a PCA question mentions gaming, tournaments, or low-latency global access, the Mountkirk architecture toolkit is the answer set.
EHR Healthcare — regulated data, hybrid, multiple tenants
The testable theme: compliance (HIPAA-coded scenarios), data residency, hybrid connectivity, and multi-tenant isolation.
What the stem is actually telling you:
- Customer clinic data must not mix → VPC-level isolation, project-level isolation
- Existing on-prem systems remain → hybrid connectivity (Interconnect or VPN)
- Regulated data (PHI) → encryption at rest with CMEK, VPC Service Controls
- Audit trails required → Cloud Audit Logs, data access logs enabled
- Customers span multiple US states → per-state data residency is not required, but per-state retention and audit requirements are
What they will ask:
- Customer-managed encryption keys (CMEK) vs Google-managed
- VPC Service Controls as a compliance perimeter
- Cloud Interconnect (Dedicated vs Partner) for hybrid
- Shared VPC vs separate projects per tenant
- Organization policies to enforce tenant isolation
- Data Loss Prevention API use cases
If a question mentions compliance, hybrid, or multi-tenant healthcare, EHR's toolkit applies.
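The DLP API bullet is the one most candidates under-study. The real Cloud DLP API inspects and de-identifies content server-side using built-in infoType detectors; the local regex below is only a stand-in to make the redaction pattern concrete (the detector names mirror real infoTypes, but the regexes are illustrative, not the API's logic):

```python
import re

# Illustrative stand-ins for DLP infoType detectors. The real Cloud DLP API
# ships hundreds of built-in detectors and runs server-side; this local
# sketch just shows the shape of a redaction pass over PHI-bearing text.
DETECTORS = {
    "US_SOCIAL_SECURITY_NUMBER": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL_ADDRESS": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text: str) -> str:
    """Replace each detected finding with its infoType name in brackets."""
    for info_type, pattern in DETECTORS.items():
        text = pattern.sub(f"[{info_type}]", text)
    return text
```

Exam-wise, the trigger phrase is "discover or mask sensitive data before it leaves the perimeter" → DLP API, usually alongside VPC Service Controls.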
Helicopter Racing League (HRL) — media, streaming, AI inference
The testable theme: global content distribution plus real-time AI/ML inference at the edge.
What the stem is actually telling you:
- Global streaming video → Cloud CDN, Media CDN
- Real-time predictive race data → streaming ML inference
- Video archive → object storage with lifecycle policies
- Low latency for viewers → edge caching and global LB
What they will ask:
- Cloud CDN vs Media CDN (Media CDN is purpose-built for video)
- Cloud Storage classes for video archive (Nearline vs Coldline vs Archive)
- Streaming analytics with Pub/Sub + Dataflow
- Vertex AI for model training, online prediction endpoints for real-time inference
- Global load balancing for viewer traffic
If a question mentions streaming, video, or "real-time prediction during an event," HRL applies.
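The storage-class tiering for the video archive maps directly onto a bucket lifecycle configuration. A sketch in the JSON shape Cloud Storage accepts (the age thresholds are arbitrary study-note choices, not numbers from the case study):

```python
import json

# Bucket lifecycle config: hot video for 30 days, Nearline to 90,
# Coldline to 365, Archive after that. Thresholds are illustrative.
lifecycle = {
    "rule": [
        {"action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
         "condition": {"age": 30}},
        {"action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
         "condition": {"age": 90}},
        {"action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
         "condition": {"age": 365}},
    ]
}

print(json.dumps(lifecycle, indent=2))
```

The exam angle: lifecycle rules beat manual tiering in every cost question, and the class choice hinges on retrieval frequency (Nearline ~monthly, Coldline ~quarterly, Archive ~yearly).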
TerramEarth — IoT scale, edge, analytics
The testable theme: industrial IoT at scale, edge computing, batch and streaming analytics.
What the stem is actually telling you:
- 20 million vehicles, each reporting 200 data points per second → truly massive ingestion
- Vehicles in remote regions with intermittent connectivity → edge buffering needed
- Dealer network needs reports → serving layer required
- Engineering needs analytics → data warehouse + ML
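It is worth doing the ingestion arithmetic once, because the raw number rules out whole classes of answers. Assuming a modest 100 bytes per data point (an assumption for illustration, not a case-study figure):

```python
vehicles = 20_000_000
points_per_vehicle_per_sec = 200
bytes_per_point = 100  # assumed payload size, not from the case study

points_per_sec = vehicles * points_per_vehicle_per_sec
bytes_per_sec = points_per_sec * bytes_per_point

print(f"{points_per_sec:,} points/sec")   # 4,000,000,000 points/sec
print(f"{bytes_per_sec / 1e9:,.0f} GB/sec")  # 400 GB/sec at 100 B/point
```

Four billion points per second is why the answer set leans on edge pre-aggregation and batching before Pub/Sub, rather than one publish per data point per vehicle.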
What they will ask:
- Pub/Sub as the ingestion backbone for millions of IoT devices
- Dataflow for streaming transforms
- BigQuery as the analytics warehouse
- Cloud IoT Core (note: deprecated in 2023, so answers should lean on Pub/Sub + direct ingestion or third-party IoT platforms, not IoT Core)
- Edge ML with Coral or pre-processed inference
- Cold storage for long-term archive
If a question is about IoT, fleet data, or massive ingestion, TerramEarth applies.
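The intermittent-connectivity constraint implies a store-and-forward buffer on the vehicle itself. A minimal sketch of that pattern, where the `publish` callable is a placeholder for a batched Pub/Sub publish (the class and its parameters are illustrative, not a GCP SDK):

```python
from collections import deque

class EdgeBuffer:
    """Store-and-forward buffer for a vehicle with intermittent connectivity.
    Readings accumulate locally; when a link is available they flush in
    batches. The publish callable stands in for a Pub/Sub publisher."""

    def __init__(self, publish, max_batch: int = 500, capacity: int = 100_000):
        self.publish = publish
        self.queue = deque(maxlen=capacity)  # oldest readings drop when full
        self.max_batch = max_batch

    def record(self, reading: dict) -> None:
        """Called continuously on the vehicle, connected or not."""
        self.queue.append(reading)

    def flush(self) -> int:
        """Call when connectivity is available; returns readings sent."""
        sent = 0
        while self.queue:
            batch = [self.queue.popleft()
                     for _ in range(min(self.max_batch, len(self.queue)))]
            self.publish(batch)
            sent += len(batch)
        return sent
```

The bounded `deque` encodes a real design decision the exam likes: with finite edge storage, you choose what to drop, and "oldest telemetry first" is usually the defensible answer.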
The meta-pattern
Every case study is about three variables:
- Where does the data come from (ingestion pattern)?
- Where does the data need to go (serving / access pattern)?
- What are the constraints (compliance, cost, latency)?
If you can answer those three for each case study in 30 seconds, you have the architectural skeleton in your head. Exam questions then become "which service fills slot X in the skeleton."
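One way to drill the three variables is to write the skeleton down as a literal data structure and check it is complete. Filled in for Mountkirk as an example (the entries are my paraphrase of the case study, not official wording):

```python
# The three-variable skeleton, filled in for Mountkirk as an example.
# Entries are study notes paraphrasing the case study, not official text.
skeleton = {
    "Mountkirk Games": {
        "ingestion": "player events -> Pub/Sub -> Dataflow",
        "serving": "global LB -> regional GKE clusters; Spanner/Firestore",
        "constraints": ["low latency worldwide", "tournament consistency",
                        "streaming analytics"],
    },
}

def ready(notes: dict) -> bool:
    """The drill passes only when all three slots are filled per case study."""
    slots = {"ingestion", "serving", "constraints"}
    return all(slots <= set(entry) for entry in notes.values())
```

Repeat the dict for the other three case studies; if `ready` fails on one of them, that is your weak case study.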
A drill for exam week
Take a blank sheet of paper and, for each of the four case studies, write:
- One sentence on the business model
- The three biggest technical constraints
- The three GCP services that anchor the architecture
- The one compliance or risk constraint that overrides
If you can do this from memory for all four, you are ready. If you cannot, you have revealed your weak case study. Spend 45 minutes on that one and repeat the drill.
The 2023 IoT Core caveat
Google deprecated Cloud IoT Core in 2023. If your practice bank was written before that, it may include questions where IoT Core is the right answer. On the live exam, it is not. The replacement pattern is direct device → Pub/Sub ingestion or a partner IoT platform feeding Pub/Sub. If you see IoT Core in an answer option in a current exam, it is almost certainly a distractor for TerramEarth-shaped questions.
The case studies are the most predictable part of GCP PCA. Read them as test-design artifacts, not as marketing text, and the questions built on top of them stop feeling random.