Content pruning + refreshing

Most sites publish content and never touch it again. Over time, that neglect compounds the content into Google's "low quality" bucket, dragging down the whole domain. Pruning and refreshing (auditing what's working, updating what's close, removing what isn't) can lift site-wide rankings within months.

Why content pruning matters

The audit workflow

Step 1: Inventory every URL

Crawl the site (Screaming Frog), then export rankings and traffic data (GSC plus Ahrefs/SEMrush) and join everything on URL.
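The join can be sketched in a few lines of stdlib Python. The CSV column names here are assumptions (real Screaming Frog and GSC exports use their own headers), and the inline CSV strings stand in for the exported files:

```python
import csv
from io import StringIO

# Stand-ins for the two exports; in practice, open the real CSV files.
# Column names ("url", "clicks", "impressions") are assumptions.
crawl_csv = "url\nhttps://example.com/a\nhttps://example.com/b\n"
gsc_csv = "url,clicks,impressions\nhttps://example.com/a,120,4000\n"

crawled = [row["url"] for row in csv.DictReader(StringIO(crawl_csv))]
metrics = {row["url"]: row for row in csv.DictReader(StringIO(gsc_csv))}

# Left-join on URL: every crawled page keeps a row, and pages with no
# GSC data default to zero -- those are prime pruning candidates.
inventory = {
    url: metrics.get(url, {"url": url, "clicks": "0", "impressions": "0"})
    for url in crawled
}
```

Pages that appear in the crawl but carry zeroed metrics surface immediately as candidates for the later removal bucket.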

Step 2: Pull data per URL

For each URL: impressions, clicks, average position, word count, last-updated date, number of backlinks, and number of internal links pointing to it.

Step 3: Categorize

Four buckets:

  1. Keep + leave alone: ranking well, getting traffic, still current. Don't touch.
  2. Refresh: ranks close to the top 10, or gets impressions but not clicks; content is outdated but worth keeping.
  3. Consolidate: thin page that overlaps another. Merge the content and redirect the URL.
  4. Remove: zero traffic, zero value, nothing worth salvaging. Serve a 410 (or noindex if it must remain accessible).
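The bucket rules above can be expressed as a small decision function. The thresholds (300 words, positions 11-20, 500 impressions, 10 clicks) are illustrative assumptions, not fixed rules; tune them to your site's data:

```python
def categorize(page: dict) -> str:
    """Assign a page to one of the four audit buckets.

    Expects the per-URL metrics from step 2. All thresholds are
    illustrative assumptions to be tuned per site.
    """
    # Remove: no traffic, no impressions, no backlinks worth preserving.
    if page["clicks"] == 0 and page["impressions"] == 0 and page["backlinks"] == 0:
        return "remove"
    # Consolidate: thin content that overlaps another page.
    if page["word_count"] < 300 and page.get("overlaps_with"):
        return "consolidate"
    # Refresh: just outside the top 10, or visible but not clicked.
    if 11 <= page["avg_position"] <= 20 or (page["impressions"] > 500 and page["clicks"] < 10):
        return "refresh"
    # Everything else is performing: leave it alone.
    return "keep"
```

Running every URL from the inventory through this function gives you four working lists to act on.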

Refresh: what a real update looks like

Refresh: what doesn't count

Google detects these patterns. Fake refreshes don't earn freshness signals.

Consolidation

You have two pages: "Best CRM for Insurance" and "Top Insurance CRM Tools." Identical intent, split authority. Action:

  1. Merge the best content from both into one strong page
  2. Choose the better URL (higher traffic, more backlinks, or cleaner slug)
  3. 301 redirect the weaker URL to the winner
  4. Update internal links to point at the winner
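Steps 3 and 4 can be driven from a single redirect map. A minimal sketch, assuming hypothetical slugs for the two CRM pages and a plain dict as the map:

```python
# Consolidation plan: weaker URL -> winning URL (slugs are hypothetical).
redirects = {
    "/top-insurance-crm-tools": "/best-crm-for-insurance",
}

def resolve(url: str) -> str:
    """Follow the redirect map so internal links point straight at the
    winner, avoiding chained 301s. Cycle-safe via the `seen` set."""
    seen = set()
    while url in redirects and url not in seen:
        seen.add(url)
        url = redirects[url]
    return url

# Rewrite internal links before shipping the redirects.
internal_links = ["/top-insurance-crm-tools", "/pricing"]
updated = [resolve(u) for u in internal_links]
```

The same map then feeds whatever layer serves the actual 301s (server config, CMS plugin, or edge rules), so the redirect rules and the internal-link rewrite never drift apart.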

Removal

For content that shouldn't exist, serve a 410 Gone; use noindex only if the page must remain accessible to users.

Expected timeline

Pruning-driven traffic recovery

A well-documented pattern: a site with thousands of thin pages prunes aggressively, dropping 50-80% of its URLs. Over the next 90 days, rankings for the remaining content rise, and total traffic often exceeds the original level despite the smaller page count.

Cadence