Single-page apps (React, Vue, Angular) and heavy JS sites work for SEO, with effort. Googlebot can render JavaScript, but it's slower, pickier, and more error-prone than serving HTML. Understand the pipeline or ship broken SEO.
Key issue: the render step is delayed and uses more resources than plain HTML crawling. Google processes static HTML sites faster and more thoroughly than JS-rendered ones.
Client-side rendering (CSR): the browser runs JS to build the page. The initial HTML is a skeleton (often an empty <div id="root">), so Googlebot must execute JS before it sees any content.
Issues: render delay, fragility, and content that is invisible until JS succeeds. Not recommended for content-driven SEO.
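To make the problem concrete, here is a sketch of what a crawler receives from a typical CSR app before any JavaScript executes (file names are illustrative):

```html
<!-- The entire first response: no content, just a mount point and a bundle. -->
<!DOCTYPE html>
<html>
  <head><title>Loading...</title></head>
  <body>
    <div id="root"></div>           <!-- empty until the bundle runs -->
    <script src="/bundle.js"></script>
  </body>
</html>
```

Everything the user eventually sees depends on `/bundle.js` downloading, parsing, and running without error.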
Server-side rendering (SSR): the server generates full HTML on each request. Browser and Googlebot both see complete content immediately.
Pros: best for SEO; content arrives in the first response, so first paint is fast. Cons: higher server cost.
Frameworks: Next.js, Nuxt, SvelteKit, all support SSR natively.
Static site generation (SSG): HTML is pre-generated at build time and deployed as static files.
Pros: fastest possible delivery, cheapest hosting, best SEO. Cons: requires a rebuild when content changes.
Frameworks: Astro, Next.js (SSG mode), Hugo, Jekyll.
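The core of any SSG, reduced to a sketch (the `buildSite` function, path scheme, and post fields are all hypothetical): content goes in once at build time, HTML files come out, and nothing runs per request.

```javascript
// Turn a content list into static pages: one HTML file per post.
function buildSite(posts) {
  return posts.map(post => ({
    path: `/blog/${post.slug}/index.html`,
    html: `<!DOCTYPE html>
<html>
  <head><title>${post.title}</title></head>
  <body><h1>${post.title}</h1><article>${post.body}</article></body>
</html>`,
  }));
}

// A real build step would then write each entry to disk, e.g.:
// for (const page of buildSite(posts)) fs.writeFileSync(outDir + page.path, page.html);
```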
Incremental static regeneration (ISR): a hybrid in which pages are served static but can be regenerated on demand. Combines SSG speed with some dynamic flexibility.
Next.js does this well.
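A simplified sketch of the ISR caching idea (names are illustrative, not the Next.js API, and real ISR serves the stale copy while regenerating in the background rather than regenerating inline as this does): a page is served from cache until it is older than a revalidation window, then rebuilt.

```javascript
// path -> { html, builtAt } cache with time-based revalidation.
function createIsrCache(render, revalidateMs, now = Date.now) {
  const cache = new Map();
  return function get(path) {
    const entry = cache.get(path);
    if (entry && now() - entry.builtAt < revalidateMs) {
      return entry.html;              // fresh: serve the static copy
    }
    const html = render(path);        // stale or missing: regenerate
    cache.set(path, { html, builtAt: now() });
    return html;
  };
}
```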
Dynamic rendering: serve plain HTML to bots and JS-rendered content to users. Historically a workaround; now considered a last resort, and it can be viewed as cloaking if not implemented carefully.
Content that exists only after JS runs disappears whenever JS fails.
Fix: SSR or SSG. If CSR is the only option, server-render at least the critical content (title, h1, meta description, hero text, nav).
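One way to read "server-render the critical content" is a hybrid shell like this sketch (page content and file names are hypothetical): the indexable essentials ship in the initial HTML, and the interactive app mounts alongside them.

```html
<head>
  <title>Blue Widget – Acme</title>
  <meta name="description" content="Specs and pricing for the Acme Blue Widget.">
</head>
<body>
  <nav><a href="/">Home</a> <a href="/widgets">Widgets</a></nav>
  <h1>Blue Widget</h1>
  <p>Hand-built widget, ships worldwide.</p>  <!-- hero text, crawlable as-is -->
  <div id="root"></div>                        <!-- interactive app mounts here -->
  <script src="/bundle.js"></script>
</body>
```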
// Bad (Google can't follow):
<div onclick="navigate('/page')">Link</div>
// Good:
<a href="/page">Link</a>
Hashbang URLs (/#!/page) are an ancient SPA pattern that never indexed cleanly. Always use clean URLs with the HTML5 History API.
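Both pitfalls above have the same fix: progressive enhancement. Ship a real `<a href>` with a clean path that Googlebot can follow, then upgrade it to a client-side transition in the browser. In this sketch, `renderRoute` stands in for the app's own router and is hypothetical:

```html
<a href="/products/shoes" class="spa-link">Shoes</a>
<script>
  document.querySelectorAll('a.spa-link').forEach(a => {
    a.addEventListener('click', e => {
      e.preventDefault();
      history.pushState({}, '', a.getAttribute('href')); // HTML5 History API
      renderRoute(location.pathname); // hypothetical client-side router
    });
  });
</script>
```

Crawlers and users with JS disabled get a working link; everyone else gets the SPA transition.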
If content lazy-loads on scroll events, Googlebot may never see it: it renders the page in a tall viewport rather than scrolling the way a user does. Use IntersectionObserver-based lazy loading (or native loading="lazy"), which fires when content enters the rendered viewport, and pre-render anything critical.
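For images, the simplest safe pattern is the native attribute (paths here are illustrative): the browser defers the fetch, but the element and its `src` are in the HTML, so nothing depends on scroll events firing.

```html
<img src="/photos/hero.jpg" loading="lazy" alt="Product photo" width="800" height="600">
```

The explicit width and height also prevent layout shift when the image loads.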
Title and meta description set via JS after the initial load: Googlebot often indexes the pre-JS values, which may be empty or placeholders.
Fix: set meta tags server-side or in the initial HTML. Supplement with JS updates if needed for client-side UX.
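A sketch of building the head tags into the initial HTML server-side, so the crawler never sees placeholder values (`renderHead` and the page fields are hypothetical):

```javascript
// Build <head> tags from page data at render time, with basic HTML escaping.
function renderHead(page) {
  const esc = s => s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/"/g, '&quot;');
  return [
    `<title>${esc(page.title)}</title>`,
    `<meta name="description" content="${esc(page.description)}">`,
    `<link rel="canonical" href="${esc(page.url)}">`,
  ].join('\n');
}
```

Client-side JS can still update `document.title` after navigation for UX; the point is that the first response already carries the real values.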
With infinite scroll, old posts are reachable only by scrolling, and Googlebot doesn't scroll.
Fix: paginate under the hood (/blog/page/2) so Googlebot can follow plain links, or offer an "Older posts" link.
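A sketch of backing an infinite-scroll feed with real paginated URLs (the `/blog/page/N` scheme matches the example above; the `paginate` helper itself is hypothetical): every post lands on a page reachable through a chain of plain links.

```javascript
// Split posts into pages, each linking to the next older page.
function paginate(posts, perPage) {
  const pages = [];
  for (let i = 0; i < posts.length; i += perPage) {
    const n = pages.length + 1;
    pages.push({
      url: n === 1 ? '/blog/' : `/blog/page/${n}`,
      posts: posts.slice(i, i + perPage),
      older: i + perPage < posts.length ? `/blog/page/${n + 1}` : null, // "Older posts" link
    });
  }
  return pages;
}
```

The UI can still fetch and append pages as the user scrolls; the crawler just follows the `older` links instead.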
If your site is primarily content (blog, documentation, marketing), go static. Fewer moving parts, better SEO, faster. Use an SSG framework, deploy to a CDN, and don't look back.