Soft 404 Errors on Next.js Dynamic Routes: Real Fix

Google Search Console flagging Next.js dynamic routes as soft 404 in 2026? Fix 200 status fallbacks, missing notFound() throws, and thin streaming renders.
May 2, 2026 · 6 min read · 1012 words

The Problem

I logged into Google Search Console for a client's e-commerce store last Wednesday and saw 1,400 URLs flagged as "Soft 404," all of them legitimate product pages on a Next.js 16 App Router build. Each URL returned a 200 status, the page rendered, and there was visible content. Google still classified them as soft 404s, and they had quietly dropped out of the index over the previous two weeks.

If you are seeing the same pattern (product or article pages indexed fine last month, now showing up under Page indexing → Soft 404 in GSC, with no recent code changes that should affect them), the cause is almost always one of three things: a fallback page returning 200 with empty content, notFound() not actually setting a 404 status, or a thin-content threshold tripped by streaming render.

Why It Happens

Soft 404 is Google's classifier saying "this page returned 200, but it looks like it should not exist." It is not a server error code. Google's crawler hits the URL, evaluates the rendered HTML, and decides if the page has enough unique content to be a real page. Three patterns trigger it on Next.js dynamic routes.

200 fallback for missing items. If your app/products/[slug]/page.tsx fetches a product and that product is deleted, the function might still return a JSX shell with "Product not available" or a generic placeholder. Status code 200, almost no unique content. Google fingerprints this as a soft 404 across every deleted product URL, then carries the classification to similar shells.

notFound() not returning 404. Calling notFound() from a server component triggers not-found.tsx to render. By default that page returns a 404 status, but only if you let Next.js handle the response. If you wrap the fetch in a try/catch and return a <NotFoundComponent /> JSX fallback instead of throwing notFound(), you render the same UI but with a 200 status. Google sees a 200 plus a "page not found" headline and flags it as soft 404: a perfect mismatch.
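The mismatch is easiest to see stripped of framework details. This is a minimal plain-TypeScript model of the two code paths (the NotFoundError class, renderRoute boundary, and Res shape are illustrative stand-ins, not Next.js internals):

```typescript
// Illustrative model: a thrown not-found error crosses the route
// boundary and maps to a real 404; a handler that swallows the miss
// and returns fallback markup keeps the 200 status.
class NotFoundError extends Error {}

type Res = { status: number; body: string }

function renderRoute(handler: () => string): Res {
  try {
    return { status: 200, body: handler() }
  } catch (e) {
    if (e instanceof NotFoundError) {
      // Framework boundary: the thrown error becomes a real 404
      return { status: 404, body: 'not-found page' }
    }
    throw e
  }
}

// Anti-pattern: the handler returns a shell instead of throwing
const soft404 = renderRoute(() => 'Product not available')

// Correct: the handler throws, so the boundary sets the status
const hard404 = renderRoute(() => {
  throw new NotFoundError('missing product')
})

console.log(soft404.status) // 200 despite "not found" content
console.log(hard404.status) // 404
```

Google only ever sees the status and the rendered body, so the soft404 branch above is exactly the 200-plus-"not found" combination the classifier flags.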

Streaming rendering shipping empty HTML. Next.js 16's default streaming renderer flushes the shell before async data resolves. Googlebot fetches the page and reads the initial shell, which on slow data can contain only the layout chrome plus skeletons. The crawler scores it as thin content. If the data later fails (a network blip during crawl), the page never gets the real content and Google records the empty version.

The Fix

Step 1: Audit which URLs are flagged. Open GSC → Indexing → Pages → Soft 404. Export the list. Group by URL pattern. If they are all under one dynamic route prefix (e.g. /products/*), the fix is in that route's page.tsx. If they span multiple prefixes, suspect a shared layout or proxy issue.
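Grouping the export by prefix is a one-function job. A sketch (the URL list is a made-up sample, not real export data):

```typescript
// Group soft-404 URLs from a GSC export by their first path segment.
// One dominant prefix points at a single dynamic route's page.tsx;
// many prefixes suggest a shared layout or proxy issue.
function groupByPrefix(urls: string[]): Map<string, number> {
  const counts = new Map<string, number>()
  for (const url of urls) {
    const path = new URL(url).pathname
    const prefix = '/' + (path.split('/')[1] ?? '')
    counts.set(prefix, (counts.get(prefix) ?? 0) + 1)
  }
  return counts
}

const flagged = [
  'https://example.com/products/old-widget',
  'https://example.com/products/deleted-item',
  'https://example.com/blog/stale-post',
]

console.log(groupByPrefix(flagged))
// Map { '/products' => 2, '/blog' => 1 }
```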

Step 2: Throw notFound() for missing data, do not render a fallback. This is the most common cause. Replace any try/catch fallback with:

// app/products/[slug]/page.tsx
import { notFound } from 'next/navigation'

type Props = { params: Promise<{ slug: string }> }

export default async function ProductPage({ params }: Props) {
  const { slug } = await params
  const product = await getProduct(slug)

  if (!product) {
    notFound()
  }

  return <ProductDetail product={product} />
}

notFound() throws a special error that Next.js catches at the route boundary, renders not-found.tsx, and sets the response status to 404. Do not try/catch around it. Do not return a JSX fallback. Throwing is the only way to get the correct status code.

Step 3: Verify status codes with curl. Do not trust browser DevTools alone, since service workers and middleware caches can mask the real response:

curl -I https://example.com/products/deleted-item
# expect: HTTP/2 404
curl -I https://example.com/products/real-item
# expect: HTTP/2 200

If a deleted item still returns 200, the fix in Step 2 did not apply. Check that not-found.tsx exists at the right scope (app/products/[slug]/not-found.tsx or app/not-found.tsx).
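For more than a handful of URLs, curl gets tedious. A small verification sketch with the fetcher injected so the check logic runs without a network (in real use, pass something like `(url) => fetch(url, { method: 'HEAD' }).then((r) => r.status)` on Node 18+; the stub below is hypothetical):

```typescript
// Check that each URL returns the status you expect and collect
// mismatches. The fetcher is injected for testability.
type Fetcher = (url: string) => Promise<number>

async function verifyStatuses(
  expectations: Record<string, number>,
  fetchStatus: Fetcher,
): Promise<string[]> {
  const failures: string[] = []
  for (const [url, expected] of Object.entries(expectations)) {
    const actual = await fetchStatus(url)
    if (actual !== expected) {
      failures.push(`${url}: expected ${expected}, got ${actual}`)
    }
  }
  return failures
}

// Stub fetcher modeling a site where deleted items still serve 200,
// i.e. the Step 2 fix has not applied yet
const stub: Fetcher = async () => 200

verifyStatuses(
  {
    'https://example.com/products/deleted-item': 404,
    'https://example.com/products/real-item': 200,
  },
  stub,
).then((failures) => console.log(failures))
// Any output here means a URL is serving the wrong status
```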

Step 4: Pre-render real content for indexable routes. For thin-content soft 404s, force static generation of pages with full content using generateStaticParams and dynamic = 'force-static':

export const dynamic = 'force-static'
export const revalidate = 3600

export async function generateStaticParams() {
  const products = await getActiveProducts()
  return products.map((p) => ({ slug: p.slug }))
}

Pre-rendered pages ship complete HTML in the first byte. Googlebot sees the real content immediately, with no streaming gap. Use ISR via revalidate so updates flow through, but the initial render is always full.

Step 5: Validate with URL Inspection. Pick a flagged URL in GSC and click "Test Live URL." Look at the rendered HTML and screenshot. If Googlebot still sees thin content after Step 4 ships, the issue is bot-rendering specific (geo-blocked API, JS error during SSR). The Search Central soft 404 docs cover the classifier rules in detail.

Step 6: Request indexing on a sample. After deploying fixes, request indexing on 10 representative URLs in GSC. Google processes them within 24 hours and updates the Soft 404 report within 7 to 14 days. Do not request bulk indexing. Google rate-limits this hard and bulk requests get queued for days.
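"Representative" matters here: ten URLs from the same route prefix only validate one code path. A sketch for spreading the sample across prefixes via round-robin (the helper name and sample URLs are my own, not from GSC):

```typescript
// Pick up to `limit` URLs for manual "Request indexing", rotating
// across route prefixes instead of taking the first N export rows.
function sampleForReindexing(urls: string[], limit = 10): string[] {
  const byPrefix = new Map<string, string[]>()
  for (const url of urls) {
    const prefix = '/' + (new URL(url).pathname.split('/')[1] ?? '')
    const bucket = byPrefix.get(prefix) ?? []
    bucket.push(url)
    byPrefix.set(prefix, bucket)
  }
  const buckets = [...byPrefix.values()]
  if (buckets.length === 0) return []

  // Round-robin across prefixes until the limit is reached
  const sample: string[] = []
  for (let i = 0; sample.length < limit; i++) {
    const next = buckets[i % buckets.length].shift()
    if (next) sample.push(next)
    if (buckets.every((b) => b.length === 0)) break
  }
  return sample
}

const picked = sampleForReindexing(
  [
    'https://example.com/products/a',
    'https://example.com/products/b',
    'https://example.com/products/c',
    'https://example.com/blog/x',
  ],
  2,
)
console.log(picked)
// ['https://example.com/products/a', 'https://example.com/blog/x']
```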

The Lesson

Soft 404 is content classification, not status code. Throw notFound() for missing data, return a real 404 status, and ship complete pre-rendered HTML for everything indexable. If Search Console flagged thousands of URLs at once, they share a code path. Fix the code path, the report clears.

If you are losing organic traffic to soft 404 right now, this is the kind of thing I run on technical SEO retainers. See my services, or if you want a broader audit framework I wrote up the technical SEO audit checklist 2026 recently.
