HTTP response codes tell browsers, users, and search engines what happened when they requested a URL. Using the wrong code for a situation confuses Google and can drop pages out of the index or leak ranking signal. Using the right code preserves link equity and keeps crawling and indexing predictable. This page walks through the codes that matter for SEO, when to use each, and the common mistakes that silently break sites.
A response code is your server's one-sentence answer to "what happened?" 200 means "here's the page." 301 means "that page moved permanently, here's the new home." 404 means "that page doesn't exist." Each code sends a specific signal to Google. Wrong code = wrong signal.
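This code-to-phrase mapping ships in Python's standard library, which makes a handy lookup when you're reading crawl logs. A quick sketch covering the codes discussed below:

```python
from http import HTTPStatus

# The status codes this guide covers, with their official reason phrases.
SEO_CODES = [200, 301, 302, 404, 410, 429, 500, 503]

for code in SEO_CODES:
    status = HTTPStatus(code)
    print(f"{code} {status.phrase}: {status.description}")
```

`HTTPStatus` also works in reverse: `HTTPStatus(410).phrase` gives you `"Gone"` without memorizing the table.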
Everything worked. Serve this for all normal pages. If a page serves 200, Google treats it as indexable (unless something else, such as a noindex directive or a canonical tag pointing elsewhere, says otherwise).
The URL has permanently moved. Google transfers ~95-100% of ranking signals to the new URL. Use for site migrations, restructured URLs, HTTP-to-HTTPS moves, and consolidating duplicate pages.
The URL has temporarily moved. Google does not reliably transfer ranking signals to the destination (though a long-lived 302 may eventually be treated as permanent).
Use for A/B tests, short-lived promotions, and geographic or device-based redirects.
Common mistake: using 302 for permanent changes. If the move is permanent, always use a 301.
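The difference between the two is literally one status line in the response. A minimal sketch with Python's built-in `http.server` (the URL mapping is hypothetical):

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of moved URLs to their new homes.
PERMANENT_MOVES = {"/old-pricing": "/pricing"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in PERMANENT_MOVES:
            self.send_response(301)  # permanent move: ranking signals transfer
            self.send_header("Location", PERMANENT_MOVES[self.path])
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence request logging for the demo
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client doesn't follow redirects, so we can inspect the raw response.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/old-pricing")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # 301 /pricing
server.shutdown()
```

Swap `301` for `302` in `send_response` and the page behaves identically for users, but Google reads a completely different signal.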
307 and 308 work like 302 and 301 respectively, but require the client to repeat the request with the same HTTP method. They rarely matter for SEO; Google treats them like their 302/301 counterparts.
A → B → C → D is a chain. Each hop loses some signal (and slows users). Keep redirects to ONE hop. If you find chains in a migration, clean them up.
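If you export a site's redirect map, collapsing chains to one hop is mechanical. A minimal Python sketch (not any particular crawler's implementation):

```python
def collapse_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Rewrite every source URL to point at its final destination,
    so A -> B -> C -> D becomes A -> D, B -> D, C -> D (one hop each)."""
    collapsed = {}
    for src in redirects:
        seen = {src}
        dest = redirects[src]
        # Follow the chain until we hit a URL that doesn't redirect.
        while dest in redirects:
            if dest in seen:  # guard against redirect loops
                raise ValueError(f"Redirect loop involving {dest}")
            seen.add(dest)
            dest = redirects[dest]
        collapsed[src] = dest
    return collapsed

chain = {"/a": "/b", "/b": "/c", "/c": "/d"}
print(collapse_redirects(chain))  # {'/a': '/d', '/b': '/d', '/c': '/d'}
```

The loop guard matters: redirect loops are a real migration failure mode, and a naive resolver will hang on them.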
The URL doesn't exist. Serve for URLs that truly don't and won't exist. Google eventually drops 404s from the index. Don't block 404s in robots.txt; let Google see they're 404.
404 is a valid answer. Don't 301-redirect every 404 to the homepage; that's "soft 404" territory, and Google treats it as suspicious.
Like 404, but stronger: "This URL is permanently gone and never coming back." Google drops 410 pages from the index faster than 404s.
Use for deleted products that won't return, removed user-generated content, and retired site sections.
Not a real HTTP code, but a classification by Google. Happens when a page returns 200 but serves little or no real content, displays a "not found" message to users, or redirects to an irrelevant page such as the homepage.
Google flags these as soft 404s in Search Console and removes them from the index, usually faster than the site owner notices.
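A crawl script can flag likely soft-404 candidates before Google does. The phrases and word-count threshold below are illustrative guesses, not Google's actual classifier:

```python
# Rough heuristic for soft-404 candidates: pages that return 200 but
# look empty or tell the user nothing was found.
NOT_FOUND_PHRASES = ("page not found", "no longer available", "0 results")

def looks_like_soft_404(status: int, body_text: str, min_words: int = 50) -> bool:
    if status != 200:
        return False  # real error codes aren't "soft" anything
    text = body_text.lower()
    if any(phrase in text for phrase in NOT_FOUND_PHRASES):
        return True
    # Very thin pages are the other common soft-404 trigger.
    return len(text.split()) < min_words

print(looks_like_soft_404(200, "Sorry, page not found."))  # True
print(looks_like_soft_404(404, "Sorry, page not found."))  # False
```

Anything this flags deserves a human look: either add real content, or serve an honest 404/410.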
Access denied. Google can't crawl the URL, so it won't be indexed unless it has strong inbound signals from elsewhere. Generally not an SEO problem; these codes are meant for protected content.
Rate limiting. Googlebot backs off when it sees these. Occasional 429s are fine; persistent 429s hurt crawl budget.
Your server broke. Google logs it and backs off. If 500s persist, Google reduces crawl rate and may drop the affected pages from the index over time.
The server is temporarily down. Combined with a Retry-After header, Google will return later without dropping pages.
Use for planned maintenance windows and short, known outages.
503 is the polite way to say "come back later." Don't hide real server issues behind 503 indefinitely; Google eventually treats a persistent 503 like any other server error and starts dropping pages.
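A maintenance response is easy to get right. A sketch with Python's built-in `http.server`; the 3600-second Retry-After is an arbitrary example value:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request with 503 + Retry-After during a maintenance window."""
    def do_GET(self):
        self.send_response(503)
        self.send_header("Retry-After", "3600")  # ask crawlers to come back in an hour
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Down for maintenance, back soon.\n")

    def log_message(self, *args):  # silence request logging for the demo
        pass

server = HTTPServer(("127.0.0.1", 0), MaintenanceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/any-page")
resp = conn.getresponse()
print(resp.status, resp.getheader("Retry-After"))  # 503 3600
server.shutdown()
```

Note that Retry-After also accepts an HTTP-date instead of delta-seconds; either form tells crawlers when to retry.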
Crawl your site with Screaming Frog. Sort by status code. Anything not 200 or a deliberate 301 is worth investigating. 404s with traffic are priority (can you redirect to something better?). Redirect chains are priority (collapse them). 500s are priority (they're breaking pages).
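The same triage can be scripted against any crawler's export of (URL, status) pairs; a toy sketch with bucket names of my own choosing:

```python
def triage(crawl_results: list[tuple[str, int]]) -> dict[str, list[str]]:
    """Bucket (url, status) pairs from a crawl export into audit priorities."""
    buckets: dict[str, list[str]] = {
        "ok": [], "redirect": [], "missing": [], "server_error": [], "other": [],
    }
    for url, status in crawl_results:
        if status == 200:
            buckets["ok"].append(url)
        elif status in (301, 302, 307, 308):
            buckets["redirect"].append(url)      # verify each is deliberate and one hop
        elif status in (404, 410):
            buckets["missing"].append(url)       # redirect if a close equivalent exists
        elif 500 <= status <= 599:
            buckets["server_error"].append(url)  # broken pages: fix these first
        else:
            buckets["other"].append(url)
    return buckets

results = [("/", 200), ("/old", 301), ("/gone", 404), ("/api", 500)]
print(triage(results))
```

From here, cross-reference the "missing" bucket against analytics traffic and the "redirect" bucket against your redirect map.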
Next: technical audit checklist, the full pass you should run on any site before committing to a content plan.