Content pruning + refreshing

Most sites publish content and never touch it again. Over time, that content rots into Google's "low quality" bucket and drags the rest of the site down with it. Pruning and refreshing (auditing what's working, updating what's close, and removing what isn't) can lift site-wide rankings within months. This page walks through the audit workflow, the four buckets every page falls into, what a real refresh looks like versus a fake one, and why removal is underrated.

Why content pruning matters

Google weighs content quality across the whole site, not just page by page, which is why a growing pool of stale, low-value pages drags down even your best work. Pruning and refreshing reverses that drag: the strong pages stop sharing the site with dead weight and start inheriting the authority it was absorbing.

The audit workflow

  1. Inventory every URL. Crawl the site with Screaming Frog, then export rankings and traffic from Google Search Console plus Ahrefs or SEMrush.
  2. Pull data per URL. Impressions, clicks, average position, word count, last-updated date, backlink count, and internal links pointing in.
  3. Categorize every URL into one of four buckets (detailed in the next section). A minimal way to assemble the audit table is sketched below.
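
The sketch below shows one way to build that per-URL table by joining a Screaming Frog crawl export with a Google Search Console page export in pandas. The filenames and column names are assumptions about how your exports are named; adjust them to match your own files.

```python
# Minimal audit-inventory sketch: join a crawl export with a GSC page export.
# File and column names below are placeholders; real exports vary.
import pandas as pd

crawl = pd.read_csv("internal_html.csv")   # Screaming Frog: URLs, word counts, etc.
gsc = pd.read_csv("pages.csv")             # Search Console: clicks, impressions, position

crawl = crawl.rename(columns={"Address": "url", "Word Count": "word_count"})
gsc = gsc.rename(columns={"Top pages": "url", "Clicks": "clicks",
                          "Impressions": "impressions", "Position": "avg_position"})

audit = crawl.merge(gsc, on="url", how="left")
audit[["clicks", "impressions"]] = audit[["clicks", "impressions"]].fillna(0)

audit.to_csv("content_audit.csv", index=False)
print(audit.sort_values("impressions").head(20))   # weakest pages first
```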

The four buckets

  1. Keep and leave alone: ranking well, getting traffic, still current. Don't touch it.
  2. Refresh: ranks close to the top 10, or gets impressions but not clicks; the content is outdated but worth keeping.
  3. Consolidate: a thin page that overlaps another page's intent. Merge the content and redirect the URL.
  4. Remove: zero traffic, zero value, nothing worth salvaging. Serve a 410 (or noindex the page if it must remain accessible).

A rough rule-of-thumb version of this triage is sketched below; the judgment calls (is it still current, does it overlap another page) stay manual.

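As a companion to the list above, here is a rough translation of the four buckets into code. The numeric thresholds and the flags are illustrative assumptions, not rules from this page; "still current" and "overlaps another page" need a human decision and are simply passed in.

```python
# Illustrative bucket triage for one row of the audit table built earlier.
# Thresholds are assumptions; the two boolean flags come from manual review.
def bucket(row, still_current: bool, overlaps_another_page: bool) -> str:
    if row["clicks"] > 0 and row["avg_position"] <= 10 and still_current:
        return "keep"         # ranking well, getting traffic, nothing to do
    if row["avg_position"] <= 20 or (row["impressions"] > 500 and row["clicks"] == 0):
        return "refresh"      # close to the top 10, or impressions without clicks
    if overlaps_another_page:
        return "consolidate"  # thin page duplicating another page's intent
    return "remove"           # no traffic, no value: 410 or noindex
```
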
Refresh: what a real update looks like

A real refresh changes the substance of the page: outdated facts and examples get corrected, sections that no longer reflect how the topic works get rewritten, and gaps the page should cover get filled. The last-updated date changes because the content actually did.

What doesn't count as a refresh

Changing the published date, swapping a few synonyms, or tacking a short paragraph onto an otherwise untouched page doesn't count. Google detects these patterns, and they don't earn freshness signals.

Consolidation

You have two pages: "Best CRM for Insurance" and "Top Insurance CRM Tools." Identical intent, split authority. The fix:

  1. Merge the best content from both into one strong page
  2. Choose the better URL (higher traffic, more backlinks, or cleaner slug)
  3. 301 redirect the weaker URL to the winner (a small helper for generating the rules is sketched after this list)
  4. Update internal links to point at the winner
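
One lightweight way to implement steps 3 and 4 is to keep the loser-to-winner mapping in one place and generate the redirect rules from it. The sketch below emits Apache "Redirect 301" directives; the URL paths and output filename are placeholders based on the example pages above, and the same mapping can drive an internal-link find-and-replace.

```python
# Generate 301 redirect rules from consolidation decisions.
# Paths and the output filename are placeholders; the directive format
# shown is Apache's, but any web server or CDN can express the same rule.
redirects = {
    "/top-insurance-crm-tools": "/best-crm-for-insurance",   # weaker URL -> winner
}

with open("redirects.conf", "w") as f:
    for old_path, new_path in redirects.items():
        f.write(f"Redirect 301 {old_path} {new_path}\n")
```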

Removal

For content that shouldn't exist, serve a 410 Gone so crawlers drop it quickly, or apply a noindex if the page must remain accessible to users.
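
If you control the application layer, returning the 410 can be as simple as a request hook. This is a sketch assuming a Flask-served site with a hypothetical list of pruned paths; most teams would do the same thing at the web server or CDN instead.

```python
# Sketch: answer 410 Gone for pruned URLs on a Flask-served site.
# The path set is a placeholder; handling this at the server or CDN
# layer is an equally valid way to do it.
from flask import Flask, abort, request

app = Flask(__name__)
PRUNED_PATHS = {"/old-thin-post", "/2019-duplicate-guide"}

@app.before_request
def return_gone_for_pruned_urls():
    if request.path in PRUNED_PATHS:
        abort(410)   # tells crawlers the page is intentionally gone
```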

Expected impact

A well-executed pruning cycle often lifts site-wide rankings within 4 to 12 weeks. The remaining strong content inherits the authority that was spread across the weak pages, and the Helpful Content Update effect reverses. The pattern is well documented: a site with thousands of thin pages prunes aggressively, dropping 50-80% of its URLs, and over the next 90 days the rankings of what remains rise until total traffic often exceeds the original level, from far fewer pages.

What to do with this

Open Search Console. Sort pages by impressions over the last 90 days. The bottom 20% of your library (lowest impressions) is your pruning candidate pool. Go through them one by one: refresh the salvageable ones, consolidate the overlaps, and remove the rest. Put the audit on a recurring cadence so the backlog never builds up again. Your top content will thank you.
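
If you'd rather pull the candidate pool programmatically, the sketch below applies the same bottom-20%-by-impressions cut to a Search Console page export. The filename and column names are assumptions about the export format; adjust them to whatever your export actually contains.

```python
# Bottom 20% of pages by impressions over the export window (e.g. 90 days).
# Filename and column names are placeholders for a GSC page export.
import pandas as pd

pages = pd.read_csv("pages.csv").rename(
    columns={"Top pages": "url", "Impressions": "impressions"})

cutoff = pages["impressions"].quantile(0.20)
candidates = pages[pages["impressions"] <= cutoff].sort_values("impressions")

print(f"{len(candidates)} pruning candidates (impressions <= {cutoff:.0f})")
candidates.to_csv("pruning_candidates.csv", index=False)
```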

Next: editorial calendars, the operational layer that keeps publishing consistent.
