How to Audit Your Website Content With Google Search Console — Step by Step
Most SEO audits start with a crawl tool and a list of technical fixes. Meta descriptions too short. Images missing alt text. Pages not in sitemap. These things matter, but they rarely move the needle on their own.
The audits that actually change traffic trajectories start somewhere different: with the question of whether the content on each page deserves to rank at all.
This post walks through the methodology I use for content audits on established websites, particularly those in YMYL niches where accuracy and trust signals carry extra weight. The process is repeatable, data-led, and produces clear decisions for every page on the site.
Why Most Content Audits Fail
The typical content audit produces a spreadsheet with word counts, backlink counts, and traffic numbers. Someone looks at the low-traffic pages and decides to “optimise” them. New keywords get added, headings get restructured, internal links get dropped in. Traffic stays flat.
The problem is that the audit diagnosed the symptom (low traffic) without identifying the cause. There are four distinct reasons a page might underperform, and each requires a different response:
- The page targets the wrong query
- The page targets the right query but answers it poorly
- The page cannibalises another page on the same site
- The page should not exist at all
You cannot fix all four problems the same way. Rewriting a page that should be deleted wastes time and leaves an E-E-A-T liability on the domain. Deleting a page that just needs a title fix loses ranking history for no reason.
The methodology below forces a decision for each page before any writing starts.
Step 1: Pull the Real Data Before Touching Anything
Open Google Search Console and export two reports: queries and pages. You want impressions, clicks, CTR, and average position for both.
This is your ground truth. Not your assumptions about what people search for. Not keyword research tools showing estimated volumes. The actual queries that are already triggering your pages, and the actual pages Google has decided are relevant to those queries.
For each page, note which pattern its metrics form:

- Impressions with low CTR signal that Google is already ranking the page but searchers are not clicking. This is a title and meta description problem, not a content problem.
- Impressions with zero clicks at a weak average position (worse than 20) mean the page surfaces occasionally but is not compelling enough to earn clicks. Content quality and title alignment both need work.
- Zero impressions means Google has either not indexed the page or has assessed it as low quality and is not surfacing it for any query. This is your first deletion candidate.
- High impressions with reasonable CTR but poor position mean the content is directionally right but not authoritative enough. This is where a substantive rewrite pays off.
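The triage above can be sketched as a small function over the exported metrics. This is an illustrative sketch, not Google's logic: the thresholds (a 1% CTR floor, position 20) are assumptions to tune against your own traffic levels.

```python
def diagnose(impressions, clicks, avg_position):
    """Rough triage of one page from its GSC metrics.

    The thresholds here (CTR under 1%, position worse than 20)
    are illustrative assumptions -- tune them to your site.
    """
    if impressions == 0:
        return "deletion candidate"           # not surfacing for any query
    ctr = clicks / impressions
    if avg_position > 20:
        return "content and title alignment"  # surfacing, but not competitive
    if ctr < 0.01:
        return "title/meta fix"               # ranking, but the snippet is not clicked
    return "substantive rewrite"              # directionally right, not authoritative enough

# diagnose(3000, 15, 9) -> "title/meta fix"
```

Running every URL from the pages export through a function like this turns the raw spreadsheet into a first-pass worklist before any human reading starts.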
Step 2: Match Every Page to Its Actual Query
This is the most important step and the one most audits skip.
Take each page URL and find the queries driving its impressions. Not the queries you think it should rank for. The queries it is actually appearing for right now.
Then ask: does the page content match what someone searching that query actually wants?
This mismatch between page content and query intent is responsible for more underperforming pages than any technical issue. A page titled “Complete Guide to X” might be getting impressions for “how to fix X quickly” because Google is trying to be helpful. But the searcher wants a quick fix, not a complete guide. They bounce. CTR stays low. Rankings slip.
When the dominant query is a task (fix this, cancel that, change this detail) the page should lead with the answer, not with background context. When the dominant query is research-oriented (what is X, how does X work) the page has room to be more comprehensive.
The query data tells you which is which.
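If you export the Performance report with both page and query dimensions, the matching step can be automated. A minimal sketch, assuming rows of `(page, query, impressions)` tuples:

```python
from collections import defaultdict

def dominant_queries(rows, top_n=3):
    """For each page, return its top queries by impressions.

    `rows` is an iterable of (page, query, impressions) tuples,
    i.e. the shape of a GSC Performance export broken down by
    page and query.
    """
    by_page = defaultdict(list)
    for page, query, impressions in rows:
        by_page[page].append((impressions, query))
    return {
        page: [query for _, query in sorted(pairs, reverse=True)[:top_n]]
        for page, pairs in by_page.items()
    }
```

A page titled "Complete Guide to X" whose top query comes back as "how to fix x quickly" is exactly the intent mismatch described above, surfaced mechanically instead of by guesswork.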
Step 3: Make a Decision for Every Page
With the query data and a read of the content, each page gets one of four decisions:
Keep and rewrite. The page has impressions or ranking history worth preserving, but the content has intent mismatches, factual issues, or poor structure. Rewrite around the exact queries driving impressions. Lead with the answer. Cut everything that does not serve the searcher’s specific need.
Fix title and meta only. The page has solid impressions but low CTR and the content is fundamentally sound. The problem is the snippet in search results is not compelling enough. Rewrite the title to match the exact search term. Make the meta description specific: include numbers, dates, or the direct answer where possible.
Merge and redirect. Two pages cover the same topic and compete against each other. Combine the best content from both into one stronger page. 301 redirect the deleted URL to the consolidated page to preserve any link equity and ranking signals built up over time.
Delete. The page has zero impressions, covers a topic with no search demand on this site, or carries factual errors that create trust liability across the domain. On YMYL sites in particular, one page with fabricated statistics or wrong information affects how Google assesses the entire domain, not just that URL. The page costs more to keep than to remove.
The delete decision is where most people hesitate. It feels like losing something. But a page that is not ranking, not getting clicks, and containing inaccurate information is not an asset. It is a liability that is actively working against every other page on the site.
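The four-way decision can be written down as a rule, which keeps the audit consistent across hundreds of URLs. A hedged sketch: `content_sound` and `trust_liability` stand in for the human read of the page, and the 1% CTR threshold is an illustrative assumption, not a fixed rule.

```python
def decide(impressions, ctr, content_sound, duplicate_of=None, trust_liability=False):
    """Assign one of the four audit decisions to a page.

    `content_sound` and `trust_liability` come from a human read
    of the page; the 1% CTR threshold is an assumption to tune.
    """
    if duplicate_of is not None:
        return "merge and redirect"   # consolidate into the stronger page
    if impressions == 0 or trust_liability:
        return "delete"               # no demand, or a domain-level risk
    if content_sound and ctr < 0.01:
        return "fix title and meta"   # content is fine; the snippet is not
    return "keep and rewrite"         # worth preserving, but needs real work
```

The point of encoding it is not automation for its own sake; it forces every page into exactly one bucket, which is what stops the audit from drifting back into vague "optimisation".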
Step 4: Fix Factual Issues Before Anything Else
On YMYL sites covering health, finance, legal matters, or government services, factual accuracy is not optional. Google’s quality rater guidelines are explicit that pages with wrong information in these categories are assessed as low quality regardless of how well they are written or structured.
Before rewriting any page, verify every specific claim. Numbers, dates, deadlines, contact details, regulatory thresholds. If a claim cannot be verified against an official source, remove it.
This matters especially for statistics. AI-generated content and poorly researched articles frequently include specific-sounding percentages and survey figures that have no source. “30% of applicants experience delays.” “85% of cases are processed automatically.” These fabrications read as credible but they fail any fact-check, and their presence signals to quality reviewers that the site does not verify its information.
Remove them all. Replace with verifiable facts or remove the claim entirely. A page that says less but says it accurately outperforms a page that says more but cannot be trusted.
Step 5: Rewrite Around the Exact Query
Once the decision is made and facts are verified, write the new version of the page with the dominant query as the anchor.
The title should contain the exact phrase people are searching. Not a variation of it. Not a cleverly rephrased version. The exact phrase. This is not about being unimaginative. It is about giving Google and the searcher an immediate, unambiguous signal that this page answers their specific question.
The opening paragraph should answer the query directly. Not introduce the topic. Not explain why the topic matters. Answer the question. Everything after that is supporting detail for people who need more context.
Length should match intent. Task-oriented queries (how to do something, what to do when something goes wrong) are usually served well by focused pages under 400 words where every sentence has a job. Research queries and comparison queries benefit from more depth because the searcher is evaluating options, not just looking for a quick answer.
The question to ask after writing is: could someone complete their task or answer their question without clicking away? If yes, the page is doing its job.
Step 6: Address Cannibalization Systematically
Keyword cannibalization happens when multiple pages on the same site target the same query. Google has to pick one to rank. It often picks the wrong one, or ranks both poorly because neither looks authoritative on its own.
The GSC data surfaces this clearly. If two pages are getting impressions for the same query cluster, you have a cannibalization problem.
The fix is almost always to consolidate. Pick the page with more ranking history (usually the one with higher impressions), merge the best content from both into it, and redirect the other URL. One strong page consistently outperforms two weak ones competing against each other.
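Given the same `(page, query, impressions)` rows from the export, flagging cannibalisation is a grouping exercise. A minimal sketch, where `min_impressions` is an assumed noise filter for stray one-off impressions:

```python
from collections import defaultdict

def find_cannibalisation(rows, min_impressions=10):
    """Return queries for which two or more pages earn impressions.

    `rows` is (page, query, impressions) tuples from the GSC export;
    `min_impressions` is an assumed threshold to ignore noise.
    """
    pages_per_query = defaultdict(set)
    for page, query, impressions in rows:
        if impressions >= min_impressions:
            pages_per_query[query].add(page)
    return {q: sorted(pages) for q, pages in pages_per_query.items() if len(pages) > 1}
```

Every query this returns is a consolidation candidate: pick the page with more ranking history, merge, and redirect the other.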
Step 7: Update Titles and Metas Across Underperforming Pages
After rewrites and deletions, go back to the pages with solid impressions but poor CTR and fix the titles and meta descriptions.
A page at position 9 with 3,000 impressions and 0.5% CTR is getting roughly 15 clicks per month. The same page with a 3% CTR gets 90 clicks. Nothing about the page changed. The ranking did not change. Just the snippet in search results became more compelling.
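The arithmetic in that comparison is just impressions multiplied by CTR:

```python
def monthly_clicks(impressions, ctr):
    """Expected clicks from a month's impressions at a given CTR."""
    return round(impressions * ctr)

# The comparison from the example above:
low = monthly_clicks(3000, 0.005)   # 15 clicks at 0.5% CTR
high = monthly_clicks(3000, 0.03)   # 90 clicks at 3% CTR
```

A sixfold change in clicks from a snippet rewrite alone is why this step comes last but pays off fastest.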
Specific beats generic every time. “SASSA Payment Dates — March 2026: Old Age 3 March, Disability 4 March” outperforms “SASSA Payment Dates 2026: Complete Guide.” The first one answers the question in the snippet. The second one promises an answer somewhere inside the page.
The Repeatable Process in Summary
Pull GSC data at page and query level. Match every page to its actual driving queries. Assess each page against four possible decisions: rewrite, fix title only, merge and redirect, or delete. Verify all factual claims before writing. Rewrite around the exact query phrase. Consolidate cannibalised pages. Fix titles and meta descriptions on high-impression, low-CTR pages.
That is the full loop. It works on sites with 20 pages and sites with 2,000 pages. The only difference is the time it takes to work through each URL.
The principle underneath all of it is simple: every page on a site should either be earning its place through traffic and conversions, or it should be gone. There is no neutral. A page that is not performing is either fixable or it is costing you.
If you found this useful and want to talk through how it applies to your site, get in touch.


