Most cert candidates do not fail from lack of knowledge. They fail because the exam author wrote four answers that are all technically correct, and only one answers the question that was actually asked. I have spent the last six years generating and auditing cert questions for a living — the Pruvos trap library is at 28,373 items across 34 certifications as of this week — and the traps keep landing in the same nine buckets. Here is the full list, with a worked example for each.
1. Keyword misdirection
The stem contains a loaded word that matches the wrong answer. You read "highly available" and your eyes go to Multi-AZ, even though the question is about cost-optimization and the right answer is a single-AZ instance with a backup plan.
How to defeat it: underline the words that define the goal in the last sentence of the stem, not the words scattered through the scenario. Cost-optimized? The answer is not the fault-tolerant one.
2. Right answer, wrong scope
Three options solve the technical problem. One option also solves the organizational constraint buried in the scenario ("the security team cannot manage instances"). The other two are disqualified by that constraint, not by the tech.
How to defeat it: after you pick an answer, re-read the stem and make sure every non-technical constraint is still satisfied. "No operational overhead" quietly eliminates anything self-managed.
3. Most-correct vs. technically-correct
Both A and C would work. A is 3x more expensive but works in more edge cases. C is the service AWS/Microsoft/Google actually built for this exact pattern. The "most correct" answer is C.
How to defeat it: if two answers both solve the problem, pick the one whose marketing page says exactly what the stem describes.
4. Same word, different meaning
"Encryption" in a KMS question is not the same "encryption" in a TLS-termination question. "Scaling" in an Auto Scaling group is not the same "scaling" in a DynamoDB on-demand table.
How to defeat it: read the noun before the ambiguous word. "Encryption key rotation" vs "encryption in transit" are two different conversations.
5. The distractor that was correct three years ago
Retired services make great distractors because they were once the right answer. SimpleDB, Data Pipeline, CodeCommit (on Azure: Classic VMs, Azure Logic Apps v1) — these lived long enough to end up in a lot of study notes.
How to defeat it: if an answer surprises you, ask two things — is this the service my study guide mentioned, and is it still the modern pattern?
6. The number trap
Question says "75 GB of data, growing 5 GB per day." Options include storage tiers that would hit a limit in six months and tiers that scale indefinitely. The trap is whether you did the arithmetic at all.
How to defeat it: spend ten seconds on the back-of-envelope math the first time you read any scenario with numbers. You will usually eliminate two options immediately.
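The ten-second math above can be sketched as a tiny helper. The 75 GB / 5 GB-per-day figures come from the example stem; the 1 TB tier limit is a hypothetical cutoff for illustration, not a real service quota.

```python
def days_until_limit(current_gb: float, growth_gb_per_day: float, limit_gb: float) -> float:
    """Days until the data set outgrows a storage tier's capacity limit."""
    return (limit_gb - current_gb) / growth_gb_per_day

# 75 GB today, growing 5 GB/day, against a hypothetical 1 TB (1024 GB) tier:
print(round(days_until_limit(75, 5, 1024)))  # 190 -- the tier fails in about six months
```

Any option with a hard cap inside the scenario's stated horizon is eliminated before you read the rest of its description.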
7. Negation buried mid-sentence
"Which of the following would NOT be appropriate for..." — scanned quickly, that NOT vanishes and you pick the best-fit answer, which is exactly wrong.
How to defeat it: circle the NOT / EXCEPT / LEAST on paper before reading the options. That one word has cost candidates a pass on more than a dozen exams.
8. The plausible new-service trap
Cloud providers launched 200+ services in the past two years. If an option names something you have never heard of and it sounds perfectly on-point, it might be a real service — or it might be made up for this exact test. Candidates who haven't studied broadly pick it out of fear that it is the modern answer they missed.
How to defeat it: if you don't recognize a service name, it is almost certainly either (a) real but outside the exam blueprint, or (b) fabricated. In either case, it is not the answer. Trust the services you know.
9. The five-percent-edge-case trap
The question describes a common scenario, but one option is correct only in the edge case where the customer has a specific compliance profile or latency requirement that the stem never mentioned. It sounds smart to pick the edge case. It is wrong.
How to defeat it: if your answer depends on a constraint the stem did not give you, you are solving a different question.
How to practice this
Recognizing traps is not about memorizing the list above. It is about running enough scenarios that the shape of each trap becomes familiar. I review 50–100 exam-grade scenarios a week across certs — even now, 21 years in — because trap patterns evolve as services change. The SAA-C03 refresh introduced new "GenAI" traps in late 2024 that did not exist in SAA-C02 banks. CISSP's April 2024 format reweighted toward scenario-based reasoning, which made the most-correct trap (#3) more common.
If you have never reviewed exam explanations in detail, try this: take any 10 questions you have already answered, cover the options, and write out the category of the intended trap. You will be wrong on half of them. Do that ten times. Now you are reading questions the way the authors wrote them.
Pruvos is built around this. Every question in our library is tagged with the trap category it exercises, so you can drill "keyword misdirection" or "right-answer-wrong-scope" specifically — something no other platform does, because nobody else tags at that level. If you want to test your trap-recognition on a real question bank, pick any cert from the library; the first 10 questions are free.
The nine patterns above are the recurring ones. Once you can name the trap mid-read, you stop falling for it.