The Problem
A content-heavy Next.js 16.2 site I work with publishes 30 to 80 articles a week. Its sitemap at /sitemap.xml was generated by app/sitemap.ts using the App Router conventions, and three weeks after launch Google Search Console started flagging the lastmod field as stale. Every URL in the sitemap carried the same lastmod date (the date of the last full deploy) even though articles were being edited daily. Bing dropped fresh articles from the index entirely after a week because IndexNow saw no lastmod change.
If your sitemap.ts uses ISR or generateSitemaps and every URL ships with the build timestamp instead of the per-article modification date, this is the bug. It silently destroys freshness signals to Google, Bing, and any other crawler that respects lastmod. Editorial teams keep publishing, search rankings keep sliding, and nobody notices until "Last read" in Search Console drifts a month behind.
Why It Happens
The new sitemap conventions in Next.js 16 cache aggressively. app/sitemap.ts is treated as a static route by default, which means three things go wrong:
- The sitemap is generated once at build time. Without an explicit revalidate export, Next.js renders the sitemap to a static asset at next build and caches it forever in the static output. Your editorial team can update articles all day; the sitemap never re-renders.
- generateSitemaps for chunked sitemaps caches each chunk independently with the same default. If you split into sitemap-0.xml, sitemap-1.xml, etc., each chunk inherits the static behaviour and gets its own stale snapshot. Touching one article does not bust any chunk.
- new Date() inside the route runs once at build time, not at request time. If you wrote lastModified: new Date() to "use the current date", Next.js evaluated that during prerender and the value is now frozen forever. Every subsequent request returns that build-time stamp.
The third one is the specific gotcha that hit my client. They thought new Date() would always reflect now. It only ran once, then got serialised into the static output.
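A minimal, framework-free sketch of the failure mode. The buildOnce helper below is a stand-in for Next.js's prerender step, not a real API: any value computed inside the render function is captured once at "build" and replayed on every request.

```typescript
// Sketch of the frozen-new Date() bug. buildOnce simulates static rendering:
// it calls the render function once and serves that snapshot forever.
type Entry = { url: string; lastModified: Date };

function buildOnce(render: () => Entry[]): () => Entry[] {
  const snapshot = render(); // runs exactly once, at "build time"
  return () => snapshot;     // every "request" serves the same snapshot
}

// The buggy pattern: new Date() looks dynamic but is evaluated inside
// that single build-time call.
const buggySitemap = () => [
  { url: 'https://example.com/a', lastModified: new Date() }, // frozen here
];

const serve = buildOnce(buggySitemap);
const first = serve()[0].lastModified;
const second = serve()[0].lastModified; // the identical object, not "now"
console.log(first === second); // true: the build-time stamp is reused
```

Once you see the sitemap function as something that runs at build, not per request, the fix is obvious: the date has to come from data, not from the clock.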
The Fix
Step 1: Use the article's actual modification date, not new Date(). Pull the updatedAt from your CMS or filesystem, never inject the current time:
```typescript
// app/sitemap.ts
import type { MetadataRoute } from 'next';
import { getAllPosts } from '@/lib/content';

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const posts = await getAllPosts();
  const baseUrl = 'https://qasimcode.com';

  return [
    {
      url: baseUrl,
      lastModified: new Date('2026-05-01'),
      changeFrequency: 'weekly',
      priority: 1,
    },
    ...posts.map((post) => ({
      url: `${baseUrl}/blog/${post.slug}`,
      lastModified: new Date(post.updatedAt),
      changeFrequency: 'monthly' as const,
      priority: 0.7,
    })),
  ];
}
```
Now each URL gets its own lastmod based on real edit history. The home page gets a manual date you control, which is fine for a low-change-frequency root.
Step 2: Make the sitemap revalidate on a schedule, not at build only. Add a revalidate export so Next.js regenerates the sitemap periodically:
```typescript
export const revalidate = 3600; // 1 hour
```
The sitemap rebuilds every hour against your live data. If you publish or update an article, the sitemap reflects it within an hour, which is fast enough for Google's typical crawl rhythm on mid-traffic sites.
Step 3: For event-driven freshness, revalidate on publish. Schedule revalidation does not catch a Friday-night editorial push that needs to land in Bing now. Wire your CMS webhook to a route handler that calls revalidatePath:
```typescript
// app/api/cms-webhook/route.ts
import { revalidatePath } from 'next/cache';
import { headers } from 'next/headers';
import { NextResponse } from 'next/server';

export async function POST(request: Request) {
  const headersList = await headers();
  const token = headersList.get('x-webhook-secret');

  if (token !== process.env.CMS_WEBHOOK_SECRET) {
    return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
  }

  revalidatePath('/sitemap.xml');

  // If you use chunked sitemaps, invalidate each chunk path as well.
  for (let i = 0; i < 5; i++) {
    revalidatePath(`/sitemap/${i}.xml`);
  }

  return NextResponse.json({ revalidated: true });
}
```
When your CMS calls this endpoint after a publish, the sitemap is invalidated and the next crawler request triggers a fresh render with the latest updatedAt values.
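On the CMS side, the hook is a single authenticated POST. A hypothetical sketch (the function name, base URL, and secret value are illustrative, not part of any CMS API):

```typescript
// Hypothetical publish hook: after a post is saved, fire a POST at the
// Next.js webhook so the sitemap cache is invalidated immediately instead
// of waiting for the hourly revalidate window.
function buildWebhookRequest(baseUrl: string, secret: string): Request {
  return new Request(`${baseUrl}/api/cms-webhook`, {
    method: 'POST',
    headers: { 'x-webhook-secret': secret },
  });
}

// In a real publish pipeline you would send it:
//   await fetch(buildWebhookRequest(siteUrl, process.env.CMS_WEBHOOK_SECRET!));
const req = buildWebhookRequest('https://example.com', 'dev-secret');
console.log(req.method, req.headers.get('x-webhook-secret')); // POST dev-secret
```

Keeping the request construction separate from the fetch makes the hook trivial to test without hitting the live endpoint.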
Step 4: For chunked sitemaps with generateSitemaps, revalidate every chunk. The chunked variant is fiddlier because each chunk is its own cache key:
```typescript
// app/sitemap.ts
import type { MetadataRoute } from 'next';
import { countPosts, getPostsRange } from '@/lib/content';

export async function generateSitemaps() {
  const total = await countPosts();
  const chunks = Math.ceil(total / 5000);
  return Array.from({ length: chunks }, (_, id) => ({ id }));
}

export const revalidate = 3600;

export default async function sitemap({
  id,
}: {
  id: number;
}): Promise<MetadataRoute.Sitemap> {
  const start = id * 5000;
  const posts = await getPostsRange(start, 5000);
  return posts.map((post) => ({
    url: `https://qasimcode.com/blog/${post.slug}`,
    lastModified: new Date(post.updatedAt),
  }));
}
```
The sitemap protocol caps each file at 50,000 URLs, so a real publication needs chunking once it crosses that line. Even before that, chunking helps because Google reads sub-sitemaps in parallel.
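The chunk count is just a ceiling division, which is worth sanity-checking once since an off-by-one here silently drops your last page of URLs:

```typescript
// Chunk arithmetic: 5,000 URLs per chunk, well under the protocol's
// 50,000-per-file limit. The number of sub-sitemaps is a ceiling division.
const perChunk = 5000;
const chunkCount = (totalUrls: number) => Math.ceil(totalUrls / perChunk);

console.log(chunkCount(4200));   // 1
console.log(chunkCount(123456)); // 25
```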
Step 5: Verify with curl and Google Search Console. Hit the sitemap directly and confirm two things: the dates differ across URLs, and the response includes a recent Date header:
```shell
curl -sI https://qasimcode.com/sitemap.xml
curl -s https://qasimcode.com/sitemap.xml | grep -m 5 lastmod
```
In Google Search Console, resubmit the sitemap and check "Last read" after 24 hours. Fresh dates mean fresh crawls. The Sitemaps protocol spec defines lastmod formatting, and Google documents its interpretation in the sitemap build guide.
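The "dates differ" check is easy to automate. A small sketch that works on the raw XML string (no network, no XML library; the sample data is made up):

```typescript
// Sanity check: given sitemap XML, confirm the <lastmod> values are not all
// identical. All-identical values are the signature of the frozen-build bug.
function lastmodsVary(xml: string): boolean {
  const dates = [...xml.matchAll(/<lastmod>([^<]+)<\/lastmod>/g)].map((m) => m[1]);
  return new Set(dates).size > 1;
}

const sample = `
<urlset>
  <url><loc>https://example.com/a</loc><lastmod>2026-04-01</lastmod></url>
  <url><loc>https://example.com/b</loc><lastmod>2026-04-15</lastmod></url>
</urlset>`;
console.log(lastmodsVary(sample)); // true
```

Pipe `curl -s .../sitemap.xml` into a script like this in CI and the regression can never ship silently again.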
The Lesson
Sitemap lastmod dates that never change are a Next.js static-rendering gotcha. You wrote new Date() and Next.js evaluated it at build time. Pull real updatedAt values per URL, set revalidate on the sitemap route, and trigger revalidatePath from your CMS webhook for instant freshness signals. Crawlers reward sites whose sitemap dates match real edits, and they punish sites whose sitemap looks frozen.
If your search rankings are sliding because Google sees a stale sitemap and you need a technical SEO pass on your Next.js site, this is the kind of work I do; see my services. For another freshness-related indexing bug I covered, see my fix for Crawled - Currently Not Indexed in GSC.