Technical SEO Checklist
Crawlability & Indexation
- robots.txt exists and does not block critical pages
- XML sitemap exists, is valid, and submitted to GSC
- Sitemap includes only canonical, indexable URLs
- No important pages blocked by noindex meta tag
- Canonical tags point to the correct URLs (self-referencing canonicals are fine; avoid canonicals that point to redirects, 404s, or noindexed pages)
- No orphan pages (all pages reachable from internal links)
- Crawl depth ≤ 3 clicks from homepage for important pages
- Pagination crawlable (Google no longer uses rel=next/prev; ensure each paginated page has a unique, indexable URL, and that infinite scroll offers a paginated fallback)
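A quick way to test the robots.txt items above is Python's stdlib parser. The rules and URLs below are illustrative assumptions, not from a real site:

```python
# Spot-check that robots.txt does not block critical pages,
# using the stdlib robots.txt parser.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Critical pages should be fetchable; admin areas should not.
print(parser.can_fetch("*", "https://example.com/products/widget"))  # True
print(parser.can_fetch("*", "https://example.com/admin/login"))      # False
```

Running the same check against the live robots.txt (via `set_url()` and `read()`) over the full sitemap URL list catches accidental blocks before they hit production.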
Core Web Vitals
| Metric | Good | Needs Improvement | Poor |
|---|---|---|---|
| LCP (Largest Contentful Paint) | ≤ 2.5s | 2.5–4.0s | > 4.0s |
| INP (Interaction to Next Paint) | ≤ 200ms | 200–500ms | > 500ms |
| CLS (Cumulative Layout Shift) | ≤ 0.1 | 0.1–0.25 | > 0.25 |
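The thresholds in the table can be sketched as a small classifier for bucketing field data (LCP/INP in milliseconds, CLS unitless):

```python
# Core Web Vitals rating per the good / needs-improvement / poor thresholds.
THRESHOLDS = {
    "LCP": (2500, 4000),   # milliseconds
    "INP": (200, 500),     # milliseconds
    "CLS": (0.1, 0.25),    # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate("LCP", 2100))  # good
print(rate("INP", 350))   # needs improvement
print(rate("CLS", 0.3))   # poor
```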
HTTPS & Security
- All pages served over HTTPS
- HTTP automatically redirects to HTTPS (301)
- SSL certificate valid and not expiring soon
- No mixed content (HTTP resources on HTTPS pages)
- HSTS header set
- Security headers present (CSP, X-Frame-Options, etc.)
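A minimal sketch for auditing the header items above, given a response's header dict (the required set below reflects this checklist; real audits may add Referrer-Policy and others):

```python
# Flag security headers that are missing from a response.
REQUIRED = {
    "Strict-Transport-Security",  # HSTS
    "Content-Security-Policy",
    "X-Frame-Options",
    "X-Content-Type-Options",
}

def missing_security_headers(headers: dict) -> set:
    # Header names are case-insensitive, so normalize before comparing.
    present = {name.title() for name in headers}
    return {h for h in REQUIRED if h.title() not in present}

headers = {
    "content-security-policy": "default-src 'self'",
    "strict-transport-security": "max-age=31536000",
}
print(missing_security_headers(headers))  # the two X-* headers are missing
```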
Mobile & Structured Data
- Mobile-friendly (responsive design or separate mobile URL)
- Viewport meta tag present
- No intrusive interstitials on mobile
- Structured data (JSON-LD) for key page types
- No structured data errors in Rich Results Test
- Open Graph and Twitter Card meta tags set
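For the structured-data item, JSON-LD is easiest to generate programmatically. A hypothetical helper for schema.org's Article type (field values are placeholders):

```python
import json

def article_jsonld(headline: str, author: str, date_published: str) -> str:
    """Build Article structured data following schema.org's Article type."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
    }
    # Embed the output in a <script type="application/ld+json"> tag.
    return json.dumps(data, indent=2)

print(article_jsonld("Technical SEO Checklist", "Jane Doe", "2024-01-15"))
```

Validate the emitted markup in the Rich Results Test before shipping, since a syntactically valid JSON-LD block can still fail Google's per-type required-field rules.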
URL & Site Architecture
- URLs are clean, descriptive, and lowercase
- No URL parameters causing duplicate content
- 301 redirects for all changed/removed URLs
- No redirect chains (A→B→C; use A→C)
- Custom 404 page returns HTTP 404 (not 200)
- Hreflang tags correct for multilingual sites
- Consistent URL format (trailing slash or not)
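The URL conventions above can be enforced with a normalizer: lowercase scheme, host, and path, drop common tracking parameters, and pick one trailing-slash convention (no slash here). The tracking-parameter list is an assumption; adjust it per site:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of tracking parameters to strip; extend as needed.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "fbclid", "gclid"}

def normalize(url: str) -> str:
    parts = urlsplit(url)
    # Lowercase the path and drop the trailing slash (keep bare "/").
    path = parts.path.lower().rstrip("/") or "/"
    # Keep only non-tracking query parameters, preserving their order.
    query = urlencode(
        [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    )
    # Fragments never reach the server, so drop them too.
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(), path, query, ""))

print(normalize("HTTPS://Example.com/Blog/Post/?utm_source=x&page=2"))
# https://example.com/blog/post?page=2
```

The same function doubles as a duplicate-content check: any two URLs that normalize to the same string should resolve to one canonical page.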
robots.txt Example
```
User-agent: *
Allow: /
# Block admin and private areas
Disallow: /admin/
Disallow: /api/
Disallow: /private/

# Allow Googlebot full access. Note: crawlers obey only the most specific
# matching group, so this grants Googlebot access even to /admin/ etc.
User-agent: Googlebot
Allow: /

Sitemap: https://example.com/sitemap.xml
```
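The Sitemap directive above should point to a valid XML sitemap; a minimal single-URL sketch (URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Per the checklist, list only canonical, indexable URLs here, and keep each file under the protocol limits (50,000 URLs / 50 MB uncompressed) before splitting into a sitemap index.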