SSR Testing Tool
Run a free server-side rendering test on any URL. Compare the SSR HTML Googlebot sees against the fully rendered DOM and surface every meta tag, image, and link that's missing from your server-side render.
What is SSR testing?
SSR testing — short for server-side rendering testing — is the practice of comparing the raw HTML a server returns against what users (and search engines) actually see after JavaScript executes in the browser. A good SSR test shows you, line for line, what content depends on client-side rendering and what content is delivered up-front.
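The core idea can be sketched in a few lines. The snippet below is an illustration of the comparison, not this tool's implementation: it extracts visible text from a server-rendered HTML string and a fully rendered one, then reports what only exists after JavaScript runs. The sample markup and function names are invented for the example.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping script/style contents."""
    def __init__(self):
        super().__init__()
        self.skip = 0
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1
    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

def visible_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return parser.chunks

def client_only_text(ssr_html, rendered_html):
    """Text present after JS rendering but absent from the server HTML."""
    ssr = set(visible_text(ssr_html))
    return [t for t in visible_text(rendered_html) if t not in ssr]

ssr = "<html><body><h1>Store</h1><div id=app></div></body></html>"
rendered = ("<html><body><h1>Store</h1>"
            "<div id=app><p>Best prices on widgets</p></div></body></html>")
print(client_only_text(ssr, rendered))  # → ['Best prices on widgets']
```

Anything that shows up in that output is content a first-pass crawler never saw.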
Why SSR matters for SEO
Googlebot processes pages in two phases: a first pass that reads only the server-side HTML, then a deferred render pass that executes JavaScript once the page reaches the render queue. If your meta tags, internal links, or primary content only exist after JS runs, you're betting your SSR SEO on a queue whose delay you don't control and that can stretch from minutes to days. A server-side rendering test lets you see what Google sees on pass one.
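That first pass only matters for a handful of SEO-critical elements. As a rough illustration (the element list and function names are our own, not part of this tool), you can audit a raw server response for them with nothing but the standard library:

```python
from html.parser import HTMLParser

class FirstPassAudit(HTMLParser):
    """Record the SEO-critical elements present in raw server HTML."""
    def __init__(self):
        super().__init__()
        self.found = {"title": False, "meta_description": False,
                      "canonical": False, "internal_links": 0}
        self._in_title = False
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name") == "description":
            self.found["meta_description"] = True
        elif tag == "link" and a.get("rel") == "canonical":
            self.found["canonical"] = True
        elif tag == "a" and a.get("href", "").startswith("/"):
            self.found["internal_links"] += 1
    def handle_data(self, data):
        if self._in_title and data.strip():
            self.found["title"] = True
    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

def audit_first_pass(server_html):
    parser = FirstPassAudit()
    parser.feed(server_html)
    return parser.found

sample = ('<head><title>Shop</title>'
          '<meta name="description" content="Widgets for less"></head>'
          '<body><a href="/sale">Sale</a></body>')
print(audit_first_pass(sample))
# → {'title': True, 'meta_description': True, 'canonical': False, 'internal_links': 1}
```

If any of those flags come back false on your raw HTML, that element is invisible until the deferred render pass.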
How this SSR test works
Paste any URL. We make two requests: a raw HTTP fetch with a Googlebot user agent, and a full headless Chromium render with a mobile viewport. Each version is captured as a full-page screenshot of the entire scrollable page so you can compare them section by section. We then diff every meta tag, image, and link between the two: anything present in the rendered DOM but missing from the server-side HTML is flagged as an SSR SEO risk.
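The diff step can be sketched as a set difference over extracted tags. This is a simplified stand-in for what the tool does, with invented function names and sample markup, assuming both HTML strings are already fetched:

```python
from html.parser import HTMLParser

class AssetCollector(HTMLParser):
    """Collect meta tags, image sources, and link targets from one document."""
    def __init__(self):
        super().__init__()
        self.assets = set()
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta":
            self.assets.add(("meta", a.get("name") or a.get("property"),
                             a.get("content")))
        elif tag == "img" and a.get("src"):
            self.assets.add(("img", a["src"]))
        elif tag == "a" and a.get("href"):
            self.assets.add(("link", a["href"]))

def collect(html):
    parser = AssetCollector()
    parser.feed(html)
    return parser.assets

def ssr_gaps(ssr_html, rendered_html):
    """Everything the rendered DOM has that the server HTML lacks."""
    return collect(rendered_html) - collect(ssr_html)

ssr = '<meta property="og:title" content="Widgets"><img src="/hero.jpg">'
rendered = ssr + '<a href="/pricing">Pricing</a><img src="/lazy.jpg">'
for gap in sorted(ssr_gaps(ssr, rendered)):
    print(gap)
# → ('img', '/lazy.jpg')
# → ('link', '/pricing')
```

Every tuple in the result is an element search engines only see after JavaScript executes.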
When to run an SSR test
Run an SSR test whenever you ship a new SPA, hydrate a static site, migrate frameworks (Next.js, Nuxt, Remix, Astro), or audit a site that's struggling with indexation. If your SSR website is missing critical SEO elements after a release, this tool will tell you exactly which ones and where.
Limitations of automated SSR testing
No external SSR test can perfectly reproduce what Googlebot sees. Results may be incomplete or misleading when a site enforces bot protection (Cloudflare challenges, hCaptcha, reCAPTCHA), requires authentication or a paywall, geo-blocks visitors outside a specific region, or aggressively rate-limits non-browser traffic. We render with a real headless Chromium and a Googlebot user agent, but we deliberately don't bypass these defenses, since doing so wouldn't reflect what a search crawler actually experiences either. Always test against the public, unauthenticated version of the URL, and treat the diff as a strong signal rather than a guarantee. If the SSR HTML looks empty or far smaller than expected, the origin is likely blocking automated requests.