Crawled Currently Not Indexed in GSC: The Real Fix

Google Search Console shows "Crawled – currently not indexed" on your pages? Here is why it happens and the fix that actually gets URLs indexed quickly.
SEO · Google Search Console · Indexing
April 22, 2026 · 7 min read · 1,227 words

The Problem

A client shipped thirty new service pages last month. Three weeks later, Search Console showed a familiar pattern on half of them:

URL is on Google? No
Crawling: Crawled – currently not indexed
Last crawl: 14 Apr 2026

Google hit the page, read it, and decided not to index it. No noindex. No canonical pointing somewhere else. No robots.txt block. The URL was technically fine, and Google simply walked away. This is the single most frustrating status in Search Console because nothing is obviously wrong.

I ran into this on a Next.js build recently, and the same pattern shows up on WordPress sites all the time. The fix is not "request indexing" on every URL and wait. The fix is figuring out which of four causes is yours.

Why It Happens

"Crawled – currently not indexed" means Google chose not to add the page. It is a quality and priority signal, not a technical error. In my experience, it comes down to one of these:

1. The page looks like a thin duplicate. Not a literal duplicate, just similar enough in structure, headings, and boilerplate that Google sees it as a near-copy of a page already indexed. Programmatic pages, location pages, and tag archives fall into this trap constantly.

2. Internal linking is too weak. The URL is in the sitemap but is not linked from anywhere Google trusts. If the only path to the page is the sitemap, Google treats it as low priority. Crawl budget is finite; orphan-ish pages get skipped.

3. The page loads slowly or ships a broken render to Googlebot. Soft failures (a hydration error that blanks the body, a hero image that 404s, a JS error that stops the rest of the page from rendering) make Google bounce off. The crawl log will show a 200, but the rendered version is thin.

4. The domain's overall quality is low. New sites, sites that were spammed, or sites with a long tail of thin pages all get indexed slowly. Google is protecting its index from noise. You fix this with cleanup, not requests.

Most sites with this problem have a mix of 1, 2, and 3. The fix is a short, repeatable audit.

The Fix

Run through these in order. Do not jump to "request indexing" until the rest is clean.

1. Confirm the page renders fully for Google

In Search Console, open the URL Inspection tool and click Test live URL → View tested page → Screenshot. If the screenshot looks blank, truncated, or like a loading skeleton, Google is not seeing your content. Fix that first.
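You can make a rough version of that check scriptable: fetch the page's server-delivered HTML and count how much visible text it contains before any JavaScript runs. The function name and regexes below are my own sketch, not a real tool:

```typescript
// Rough, scriptable version of the screenshot check: how much visible
// text does the *server-delivered* HTML actually contain?
function visibleWordCount(html: string): number {
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, " ") // drop inline JS
    .replace(/<style[\s\S]*?<\/style>/gi, " ")   // drop inline CSS
    .replace(/<[^>]+>/g, " ")                    // drop remaining tags
  return text.split(/\s+/).filter(Boolean).length
}

// A loading skeleton scores near zero; a real article scores in the hundreds.
```

Pipe the raw HTML from curl (or fetch) through this and compare against what you see in the browser. A big gap means the content only exists after client-side JS runs, which is exactly the bucket Google's renderer tends to miss.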

If you are on Next.js, make sure the main content is rendered on the server, not behind a client effect:

// app/services/[slug]/page.tsx
export default async function ServicePage({
  params,
}: {
  params: Promise<{ slug: string }>
}) {
  const { slug } = await params
  const service = await getService(slug)

  return (
    <article>
      <h1>{service.title}</h1>
      <p>{service.summary}</p>
      <div dangerouslySetInnerHTML={{ __html: service.bodyHtml }} />
    </article>
  )
}

If any of that content is fetched in a useEffect or rendered behind an intersection observer, Google's rendering pass often misses it. Server rendering the core copy is the single biggest lever for this bucket of pages. If your hero is blocking LCP and the rest of the page is delayed behind it, my post on Next.js App Router CSS LCP fix is a related sanity check.

2. Fix the thin-duplicate problem

Pull up five "Crawled – currently not indexed" URLs and one indexed URL side-by-side. If the new ones share 70% of the visible content with the indexed one, the pages are too similar.

What I do on client sites:

  • Rewrite the top 120 words so they are genuinely unique, not just swapped city names
  • Replace any boilerplate testimonial block with a version specific to the page
  • Vary the H2 structure; Google notices templated outlines
  • Add one piece of content that literally cannot appear on other pages (a local FAQ, a specific client case study, a data point)

You do not need a thousand extra words. You need the first screen to read like it was written for that page.
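If you would rather quantify the overlap than eyeball it, a quick sketch: shingle each page's visible text into word 3-grams and compare. The function names, the shingle size, and the idea of treating ~0.7 as "too similar" are all my own heuristic, not anything Google publishes:

```typescript
// Break text into word 3-gram shingles for a near-duplicate check.
function shingles(text: string, size = 3): Set<string> {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean)
  const out = new Set<string>()
  for (let i = 0; i <= words.length - size; i++) {
    out.add(words.slice(i, i + size).join(" "))
  }
  return out
}

// Jaccard similarity: shared shingles over total distinct shingles.
function similarity(a: string, b: string): number {
  const sa = shingles(a)
  const sb = shingles(b)
  let shared = 0
  for (const s of sa) if (sb.has(s)) shared++
  const union = sa.size + sb.size - shared
  return union === 0 ? 0 : shared / union
}
```

Run it on the visible copy of a flagged page versus its nearest indexed sibling; city-swapped templates score high even though no sentence is a literal copy.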

3. Strengthen internal linking

Every page you want indexed needs at least two contextual links from already indexed pages. Not sitemap links. Not footer mega-menu links. In-body, relevant, followed links.

Audit this quickly:

# from the repo root, find which pages link to the orphaned URL
grep -rn "services/analytics-setup" content/ components/ app/

If that returns only the sitemap file or nothing, the page is effectively orphaned. Add a link from a high-traffic post, the parent service page, and the homepage or a hub page. Links from indexed pages carry the signal; links from unindexed pages do not.
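When the site is too big for grep, the same audit works as a link-count pass. Here `links` is an assumed map from each crawled page to the in-body URLs it links out to; how you build it (from rendered HTML, your CMS, or a crawler export) is up to you:

```typescript
// Assumption: `links` maps each crawled page to the in-body URLs it links to.
function inboundCounts(links: Map<string, string[]>): Map<string, number> {
  const counts = new Map<string, number>()
  for (const targets of links.values()) {
    for (const t of targets) counts.set(t, (counts.get(t) ?? 0) + 1)
  }
  return counts
}

// Flag sitemap URLs with fewer than `min` in-body inbound links.
function underLinked(
  sitemapUrls: string[],
  links: Map<string, string[]>,
  min = 2,
): string[] {
  const counts = inboundCounts(links)
  return sitemapUrls.filter((u) => (counts.get(u) ?? 0) < min)
}
```

Anything this flags is a candidate for the "two contextual links from indexed pages" treatment above.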

4. Clean the low-quality tail

If your site has thousands of thin tag archives, paginated comment pages, or author pages with two posts each, Google's overall impression of the domain is "lots of low-value URLs." The faster you prune, the faster priority pages index.

Practical cleanup:

  • noindex tag, author, and date archives unless they rank
  • 410 old staging URLs that leaked
  • Merge thin blog posts under 300 words into stronger pillar pages
  • Remove attachment pages from the sitemap
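On a Next.js App Router site, noindexing an archive route is a one-liner with the Metadata API. The `app/tag/[tag]/` path below is an assumption about where your tag archives live; adjust to your route:

```typescript
// app/tag/[tag]/page.tsx — assumption: tag archives live at this route
import type { Metadata } from "next"

export const metadata: Metadata = {
  // Keep Google out of the index but let it follow links to real content.
  robots: { index: false, follow: true },
}
```

On WordPress, the equivalent toggle lives in your SEO plugin's archive settings; the point is the same either way: stop asking Google to index pages you would not send a visitor to.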

The goal is a smaller, denser sitemap where every listed URL is worth Google's crawl time. The official Google indexing issues documentation covers the signals Google uses if you want the canonical source.

5. Only then, request indexing

Once you have fixed render, content, links, and pruned the tail, then click Request indexing in URL Inspection for your top 10 priority pages. Not all of them. Not in bulk. Ten.

Bulk requesting indexing on a site with unfixed quality issues does not help — Google re-crawls, still decides the page is low priority, and the status does not change.

The Practical Rule

"Crawled – currently not indexed" is a decision, not a bug. Google saw the page and chose to skip it. You fix it by making the page obviously worth indexing: rendered on the server, distinct from its siblings, linked from pages Google already trusts, and shipped on a domain that has pruned its low-value tail.

Small changes compound. A site I cleaned up in February moved from 40% indexed to 82% indexed in six weeks without any outreach, just render fixes and internal linking. That is the usual curve.

Want Your Pages Actually Indexed?

I audit and fix indexing issues on production sites: render problems, thin content, internal linking, and the crawl-budget cleanup that moves the needle. If Search Console is full of "Crawled – currently not indexed" and your new pages never land, see my SEO and performance services and I will get the site back in the index.
