Search engines do three things: crawl the web, index what they find, and rank pages in response to queries. Every SEO tactic touches at least one of these stages.
Search engines use bots (Googlebot, Bingbot, etc.) to discover pages. Bots follow links from known pages to new ones. Pages a bot can't reach (orphaned by poor site architecture, blocked by robots.txt) never get crawled; pages marked noindex get crawled but still never enter the index.
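You can check the robots.txt side of this programmatically. A minimal sketch using Python's standard-library `urllib.robotparser`; the site and page URLs are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Placeholder URLs for illustration.
SITE = "https://example.com"
PAGE = f"{SITE}/products/widget"

# RobotFileParser fetches and parses robots.txt, then answers
# "may this user agent crawl this URL?"
rp = RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()

for bot in ("Googlebot", "Bingbot"):
    print(f"{bot} may crawl {PAGE}: {rp.can_fetch(bot, PAGE)}")
```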
Your job in SEO: make sure every page you want ranked is discoverable via links from already-indexed pages.
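One way to verify that is to simulate discovery yourself: breadth-first crawl from the homepage and see whether the target page turns up. A toy sketch using only the standard library, with placeholder URLs; a real crawler would also respect robots.txt, rate limits, and sitemaps:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href" and v]

def is_reachable(start, target, limit=200):
    """Breadth-first crawl from `start`; True if `target` is found
    within `limit` discovered URLs. A toy model of bot discovery."""
    seen, queue = {start}, deque([start])
    while queue and len(seen) <= limit:
        url = queue.popleft()
        if url == target:
            return True
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable pages contribute no links
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute, _ = urldefrag(urljoin(url, href))  # drop #fragments
            if absolute.startswith(start) and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return False

# Replace with your homepage and the page you want indexed.
print(is_reachable("https://example.com/", "https://example.com/deep/page"))
```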
Once a page is crawled, the search engine parses its content, extracts key signals (headings, keywords, links, schema), and stores the representation in an index. An unindexed page cannot rank, period.
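Conceptually, that stored representation is an inverted index: a map from terms to the pages containing them. A simplified sketch, with an invented `SignalExtractor` that looks only at headings, body text, and links:

```python
import re
from collections import defaultdict
from html.parser import HTMLParser

class SignalExtractor(HTMLParser):
    """Pulls out rough indexing signals: heading text, body text, links."""
    def __init__(self):
        super().__init__()
        self.in_heading = False
        self.headings, self.text, self.links = [], [], []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self.in_heading = True
        elif tag == "a":
            self.links += [v for k, v in attrs if k == "href"]

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3"):
            self.in_heading = False

    def handle_data(self, data):
        (self.headings if self.in_heading else self.text).append(data.strip())

def index_page(url, html, inverted_index):
    """Tokenize the page and record url under every term it contains."""
    extractor = SignalExtractor()
    extractor.feed(html)
    words = re.findall(r"[a-z0-9]+",
                       " ".join(extractor.headings + extractor.text).lower())
    for word in set(words):
        inverted_index[word].add(url)

index = defaultdict(set)
index_page("https://example.com/", "<h1>Blue widgets</h1><p>Cheap widgets here.</p>", index)
print(index["widgets"])  # {'https://example.com/'}
```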
Watch out for duplicate content, thin content, canonical conflicts, and server errors: each can keep a page out of the index, or get the wrong version indexed in its place.
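A quick audit sketch for two of these, exact duplicates and canonicals pointing away from the page itself; the `audit` function and its output format are invented for illustration:

```python
import hashlib
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of <link rel="canonical"> if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def audit(pages):
    """pages: {url: html}. Flags exact-duplicate bodies and
    canonical tags that point to a different URL."""
    seen_hashes = {}
    for url, html in pages.items():
        digest = hashlib.sha256(html.encode()).hexdigest()
        if digest in seen_hashes:
            print(f"DUPLICATE: {url} matches {seen_hashes[digest]}")
        seen_hashes.setdefault(digest, url)
        finder = CanonicalFinder()
        finder.feed(html)
        if finder.canonical and finder.canonical != url:
            print(f"CANONICAL CONFLICT: {url} -> {finder.canonical}")

audit({
    "https://example.com/a": '<link rel="canonical" href="https://example.com/b"><p>same body</p>',
    "https://example.com/a?ref=nav": '<link rel="canonical" href="https://example.com/b"><p>same body</p>',
})
```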
When a user searches, the engine queries its index and returns a ranked list. Ranking combines hundreds of signals: relevance, quality (E-E-A-T), user behavior, speed, freshness, backlinks, and more.
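A toy model of that combination: score each candidate page as a weighted sum of normalized signals. The signal names and weights below are invented; real engines combine far more signals with learned, query-dependent weights:

```python
# Each signal scored 0..1; weights are made up for illustration.
WEIGHTS = {"relevance": 0.4, "quality": 0.2, "backlinks": 0.2,
           "speed": 0.1, "freshness": 0.1}

def score(signals):
    """Weighted sum of whatever signals the page has; missing ones count 0."""
    return sum(WEIGHTS[s] * signals.get(s, 0.0) for s in WEIGHTS)

pages = {
    "/guide": {"relevance": 0.9, "quality": 0.8, "backlinks": 0.6, "speed": 0.7, "freshness": 0.4},
    "/news":  {"relevance": 0.7, "quality": 0.6, "backlinks": 0.3, "speed": 0.9, "freshness": 1.0},
}
ranked = sorted(pages, key=lambda p: score(pages[p]), reverse=True)
print(ranked)  # highest combined score first
```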
Beyond the traditional 10 blue links, modern SERPs include featured snippets, People Also Ask, knowledge panels, AI Overviews, image carousels, and local packs. Each slot has different qualifying signals.
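Structured data is one of the qualifying signals for several of these rich-result slots. A minimal schema.org FAQPage object built in Python for illustration; the serialized JSON belongs in the page inside a `<script type="application/ld+json">` tag:

```python
import json

# Minimal FAQPage markup using the schema.org vocabulary.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        "name": "What is crawling?",
        "acceptedAnswer": {
            "@type": "Answer",
            "text": "How search engine bots discover pages by following links.",
        },
    }],
}

print(json.dumps(faq, indent=2))
```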
Diagnose in this order: crawling first, then indexing, then ranking. A page that isn't crawled can't be indexed, and a page that isn't indexed can't rank.