Why Your React Site Isn't Showing Up in Google Search (And How to Fix It)
React apps are invisible to Google by default. Here's why — and a step-by-step fix covering SSR, meta tags, crawlable links, sitemaps, and structured data.
You built a React app with Create React App or Vite. You deployed it. You searched Google for your site. Nothing.
You're not alone. This is the single most common SEO problem for React developers, and the cause is simple: Google receives an empty page when it visits your site.
The Root Cause: Client-Side Rendering
A standard React app (CRA or Vite) ships a single HTML file with an empty <div id="root"></div>. All content — your headings, text, images, everything — is injected by JavaScript after the page loads in a browser.
Here's what Google receives when it crawls your site:
```html
<!doctype html>
<html>
  <head>
    <title>React App</title>
  </head>
  <body>
    <div id="root"></div>
    <script src="/static/js/main.abc123.js"></script>
  </body>
</html>
```
No content. No meta description. No headings. Just an empty div and a script tag.
Google's Web Rendering Service (WRS) can execute JavaScript to see your content — but it queues pages for rendering, which takes hours to weeks. And it frequently fails on complex React apps with heavy API calls, authentication, or dynamic routing.
The bottom line: If Google can't see your content in the raw HTML, your site won't rank.
How to Fix It: 5 Steps in Priority Order
Step 1: Add Server-Side Rendering (Move to Next.js or Remix)
Why this matters: SSR sends fully rendered HTML to Google on every request. Instead of an empty div, Google receives your actual page content — headings, text, meta tags, everything.
You have two realistic options, plus a fallback if you can't migrate:
Option A: Migrate to Next.js (recommended)
Next.js is the most popular React SSR framework. If you're building a new project, start here. If you have an existing CRA/Vite app, Next.js has a migration guide.
```tsx
// app/page.tsx — Server Component, rendered on the server by default
export default async function HomePage() {
  const data = await fetch('https://api.example.com/featured').then((r) => r.json());
  return (
    <main>
      <h1>Welcome to Our Site</h1>
      <p>{data.description}</p>
    </main>
  );
}
```
Option B: Use React Router v7 with SSR
If you're already using React Router, v7 supports server-side rendering with loaders:
```tsx
// app/routes/home.tsx
import { useLoaderData } from 'react-router';

export async function loader() {
  const data = await fetch('https://api.example.com/featured').then((r) => r.json());
  return { data };
}

export default function Home() {
  const { data } = useLoaderData<typeof loader>();
  return (
    <main>
      <h1>Welcome to Our Site</h1>
      <p>{data.description}</p>
    </main>
  );
}
```
Option C: If you absolutely can't migrate — use prerendering
Tools like react-snap or prerender.io can generate static HTML snapshots of your React app. This is a band-aid, not a cure — but it's better than serving empty pages to Google.
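With react-snap, for example, prerendering hooks into your existing build via a postbuild script. A minimal package.json sketch (the build command and version number are illustrative; check react-snap's README for your setup):

```json
{
  "scripts": {
    "build": "react-scripts build",
    "postbuild": "react-snap"
  },
  "devDependencies": {
    "react-snap": "^1.23.0"
  }
}
```

After the build finishes, react-snap crawls the local build output and writes a static HTML snapshot for each route it discovers.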
Verify SSR is working:
```shell
curl -s https://your-domain.com | head -50
```
You should see your actual page content in the HTML output.
Step 2: Add Unique Meta Tags to Every Page
Why this matters: Google uses your <title> tag as the blue link in search results and your <meta name="description"> as the snippet below it. Without these, Google guesses — and it guesses badly.
In Next.js (App Router):
```tsx
// app/about/page.tsx
import type { Metadata } from 'next';

export const metadata: Metadata = {
  title: 'About Us | Your Site',
  description: 'Learn about our team, mission, and what makes us different.',
  alternates: {
    canonical: 'https://your-domain.com/about',
  },
};

export default function AboutPage() {
  return <main><h1>About Us</h1></main>;
}
```
In a client-rendered React app (if you can't migrate):
```tsx
import { Helmet } from 'react-helmet-async';

function AboutPage() {
  return (
    <>
      <Helmet>
        <title>About Us | Your Site</title>
        <meta name="description" content="Learn about our team and mission." />
        <link rel="canonical" href="https://your-domain.com/about" />
      </Helmet>
      <main><h1>About Us</h1></main>
    </>
  );
}
```
Note: react-helmet-async only works for users who execute JavaScript. Google's WRS may or may not pick up Helmet tags — SSR is far more reliable.
Step 3: Use <a> Tags for All Internal Links
Why this matters: Googlebot discovers your pages by following <a href> links in the HTML. It does not click buttons, so any route reachable only through an onClick handler is never discovered.
Before (invisible to Google):

```tsx
<button onClick={() => navigate('/products')}>View Products</button>
```
After (crawlable by Google):

```tsx
import Link from 'next/link';
// or: import { Link } from 'react-router-dom';

<Link href="/products">View Products</Link>
```

Both Next.js Link and React Router Link render actual <a> elements in the HTML. Note that React Router's Link takes a to prop rather than href: <Link to="/products">View Products</Link>.
Search your codebase for navigate( and onClick.*navigate. Replace every instance with a Link component.
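A quick way to audit this from the command line (a sketch: the demo file just stands in for your codebase, and you should tune the pattern to your project's conventions):

```shell
# Demo file standing in for your real source tree (hypothetical path)
mkdir -p src
printf '%s\n' "<button onClick={() => navigate('/products')}>View Products</button>" > src/ProductsButton.tsx

# Flag click handlers that navigate programmatically; each hit needs a Link instead
grep -rnE "onClick=\{[^}]*navigate\(" src/
```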
Step 4: Create and Submit a Sitemap
Why this matters: A sitemap is a list of every URL on your site, submitted directly to Google. It ensures Google can discover all your pages — even those with no inbound links.
In Next.js:
```ts
// app/sitemap.ts
import type { MetadataRoute } from 'next';

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const pages = await fetch('https://api.example.com/pages').then((r) => r.json());
  return [
    { url: 'https://your-domain.com', lastModified: new Date(), priority: 1 },
    ...pages.map((p: any) => ({
      url: `https://your-domain.com/${p.slug}`,
      lastModified: new Date(p.updatedAt),
      priority: 0.8,
    })),
  ];
}
```
For CRA/Vite apps: Generate a static sitemap.xml during your build process and place it in your public/ folder.
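A minimal sketch of such a build script (the route list, base URL, and script path are illustrative placeholders; in a real app, derive the routes from your router config):

```typescript
// scripts/generate-sitemap.ts: build-time sitemap generator for a CRA/Vite app.
// The route list below is a hypothetical placeholder.
import { mkdirSync, writeFileSync } from 'node:fs';

const BASE_URL = 'https://your-domain.com';
const routes = ['/', '/about', '/products'];

const today = new Date().toISOString().slice(0, 10);

const entries = routes
  .map((path) => {
    const loc = path === '/' ? BASE_URL : `${BASE_URL}${path}`;
    return `  <url>\n    <loc>${loc}</loc>\n    <lastmod>${today}</lastmod>\n  </url>`;
  })
  .join('\n');

const xml = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${entries}
</urlset>
`;

// Anything in public/ is served from the site root after build
mkdirSync('public', { recursive: true });
writeFileSync('public/sitemap.xml', xml);
```

Run it as a build step with something like npx tsx scripts/generate-sitemap.ts.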
Submit your sitemap at https://search.google.com/search-console → Sitemaps.
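It also helps to point crawlers at the sitemap from robots.txt, served from your public/ folder (the URL here assumes the sitemap location from the previous step):

```
User-agent: *
Allow: /
Sitemap: https://your-domain.com/sitemap.xml
```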
Step 5: Add JSON-LD Structured Data
Why this matters: Structured data enables rich results in Google — star ratings, FAQ accordions, product prices, breadcrumbs. These dramatically increase your click-through rate.
```tsx
// In a Next.js Server Component
export default function BlogPost({ post }: { post: any }) {
  const jsonLd = {
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline: post.title,
    author: { '@type': 'Person', name: post.author },
    datePublished: post.publishedAt,
    description: post.excerpt,
  };

  return (
    <>
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
      />
      <article>
        <h1>{post.title}</h1>
        <p>{post.content}</p>
      </article>
    </>
  );
}
```
How to Check If Google Can See Your Site
1. Google Search Console URL Inspection: Paste any URL from your site. Google will show you exactly what it sees — the rendered HTML, any errors, and whether the page is indexed.
2. site: search:
Search site:your-domain.com on Google. If zero results appear, your site isn't indexed at all.
3. curl test:
```shell
curl -s https://your-domain.com
```
If the HTML output is empty (just a div and script tags), Google can't see your content.
Summary
React apps don't show up in Google because they serve empty HTML by default. The fix:
- Add SSR — migrate to Next.js or React Router v7 with server rendering
- Add meta tags — unique title, description, and canonical on every page
- Use Link components — <a> tags, not onClick handlers, for navigation
- Submit a sitemap — tell Google about every URL explicitly
- Add structured data — qualify for rich results that get more clicks
The most impactful change is #1. If you do nothing else, add server-side rendering. Everything else builds on that foundation.
Want to see exactly what Google sees when it crawls your site? Scan What's Ranking to run a full analysis.
FAQ
Can Google index a client-side rendered React app?
Technically yes — Google's Web Rendering Service (WRS) can execute JavaScript. But it's unreliable. WRS queues pages for rendering with delays of hours to weeks, and complex apps with heavy API calls frequently fail to render completely. SSR is the only reliable path to indexing.
Is Create React App bad for SEO?
CRA generates a client-rendered SPA with no server-side rendering capability. This makes SEO extremely difficult. If SEO matters for your project, migrate to Next.js, Remix, or React Router v7 with SSR. CRA has not been actively maintained since around 2023, and the React team officially deprecated it in early 2025.
Does react-helmet fix React SEO?
Partially. react-helmet-async can set meta tags, but only after JavaScript executes. In a client-rendered app, Google may not see these tags. With SSR (Next.js or Remix), meta tags are in the initial HTML and react-helmet becomes unnecessary — use the framework's built-in metadata API instead.
How long after adding SSR will my React site appear on Google?
After deploying SSR and submitting a sitemap, most sites begin appearing in Google within 1-2 weeks. You can speed this up by using Google Search Console's URL Inspection tool to request indexing of specific pages.
Should I use Next.js or Remix for React SEO?
Both are excellent for SEO. Next.js has a larger ecosystem, built-in sitemap generation, and static export options. Remix (now part of React Router v7) has a simpler mental model and excellent streaming SSR. Choose based on your project needs — both solve the core SEO problem of empty HTML.