The Problem
A SaaS dashboard I audit had its mobile INP creep from 180ms to 340ms over two sprints. Field data from CrUX showed the regression clearly, but Lighthouse on the same pages came back "Good" every time. The team rolled back three suspect commits and the field number did not move. They needed to know which interaction was slow, on which page, for which users, with the call stack.
If your INP regressed, you cannot reproduce it locally, and the web-vitals attribution payload just points at a generic <button>, you are sitting in the gap the Long Animation Frames API fills. LoAF entries give you the work that blocked the main thread around the interaction, including the script source and the function name.
Why It Happens
INP measures the time from a user interaction to the next paint and reports roughly the worst interaction on the page; for busy pages it ignores one outlier per 50 interactions, which makes it effectively a high percentile (about p98). Three things conspire to make it hard to debug:
- Lighthouse INP is synthetic. Lighthouse fires scripted clicks on a freshly loaded page in a clean profile. It does not click your filter panel after the user has scrolled, hovered, and triggered six analytics events. The interactions that break INP in the field never happen in the lab.
- Real users have third-party scripts running. GTM, Intercom, and Hotjar fire setTimeout callbacks that occupy the main thread between an interaction and the next frame. The handler may be 20ms of your code plus 200ms of a Hotjar flush, and field tooling cannot tell you that.
- The Long Tasks API is too coarse. It tells you "a task over 50ms happened somewhere" without naming the script or function. The Long Animation Frames API (LoAF), shipped in Chromium since Chrome 123, gives you a frame-level breakdown with sourceURL, sourceFunctionName, and sourceCharPosition.
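To see what that breakdown buys you, here is a minimal sketch of pulling the worst script out of a single long-animation-frame entry. The field names follow the LoAF spec; the entry values and the Hotjar function name are invented for illustration:

```typescript
// Field names follow the LoAF spec; all values below are invented for illustration.
interface LoafScript {
  sourceURL: string;
  sourceFunctionName: string;
  invokerType: string;
  duration: number;
}

const frame: { startTime: number; duration: number; scripts: LoafScript[] } = {
  startTime: 4120,
  duration: 260, // the Long Tasks API would stop here: "a 260ms task happened"
  scripts: [
    { sourceURL: '/static/app.js', sourceFunctionName: 'applyFilters',
      invokerType: 'event-listener', duration: 24 },
    { sourceURL: 'https://script.hotjar.com/modules.js', sourceFunctionName: 'flush',
      invokerType: 'user-callback', duration: 201 },
  ],
};

// LoAF names the offender: the script with the largest share of the frame.
const worst = frame.scripts.reduce((a, b) => (b.duration > a.duration ? b : a));
console.log(`${worst.sourceURL} ${worst.sourceFunctionName}: ${worst.duration}ms`);
```

Twenty-four milliseconds of first-party code, two hundred of third-party: that is the distinction long tasks cannot make and LoAF can.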
If you wire LoAF into your existing INP attribution beacon, every slow interaction comes back with the exact script and function that ran during the blocking frame. That is the signal you need.
The Fix
You need three pieces: a PerformanceObserver that captures both event and long-animation-frame entries, a correlation step that ties LoAF frames to the specific interaction, and a beacon that ships the payload to your logging endpoint without hurting the very metric you are measuring.
Step 1: Subscribe to both observers. Put this in a small inp-attribution.ts module loaded after hydration:
type LoafEntry = PerformanceEntry & {
  renderStart: number;
  styleAndLayoutStart: number;
  scripts: Array<{
    name: string;
    duration: number;
    sourceURL: string;
    sourceFunctionName: string;
    invoker: string;
    invokerType: string;
  }>;
};

const loafBuffer: LoafEntry[] = [];

function setupLoaf() {
  if (typeof PerformanceObserver === 'undefined') return;
  // Feature-detect: LoAF is not available in every engine.
  if (!PerformanceObserver.supportedEntryTypes?.includes('long-animation-frame')) return;
  const obs = new PerformanceObserver((list) => {
    for (const entry of list.getEntries() as LoafEntry[]) {
      loafBuffer.push(entry);
      // Bound the buffer so it cannot grow without limit on long-lived pages.
      if (loafBuffer.length > 50) loafBuffer.shift();
    }
  });
  obs.observe({ type: 'long-animation-frame', buffered: true });
}
setupLoaf();
The buffered: true flag is important. It replays frames that already happened before the observer attached, which catches the worst case where the slowest frame is the one right after page load.
Step 2: Correlate the slow interaction to the LoAF frames it overlaps. Hook into the existing web-vitals/attribution callback for INP and look up frames inside the interaction window:
import { onINP } from 'web-vitals/attribution';

onINP((metric) => {
  // Only beacon interactions that miss the 200ms "good" threshold.
  if (metric.value < 200) return;
  const attr = metric.attribution;
  const interactionStart = attr.interactionTime;
  const interactionEnd = interactionStart + metric.value;
  // Keep every LoAF frame that overlaps the interaction window.
  const overlapping = loafBuffer
    .filter((f) => f.startTime + f.duration >= interactionStart && f.startTime <= interactionEnd)
    .map((f) => ({
      start: Math.round(f.startTime),
      duration: Math.round(f.duration),
      renderStart: Math.round(f.renderStart),
      scripts: f.scripts.map((s) => ({
        url: s.sourceURL,
        fn: s.sourceFunctionName,
        invoker: s.invoker,
        invokerType: s.invokerType,
        duration: Math.round(s.duration),
      })),
    }));
  navigator.sendBeacon('/api/vitals/inp', JSON.stringify({
    value: metric.value,
    rating: metric.rating,
    interactionType: attr.interactionType,
    target: attr.interactionTarget,
    url: location.pathname,
    frames: overlapping,
  }));
});
The shape of the beacon payload matters because you are going to query it in your logging tool. Each script entry carries sourceURL (which file), sourceFunctionName (which function), and duration (how long it ran). Sort by duration descending and the top entry is almost always your culprit. The invokerType field tells you how the blocking script was invoked: event-listener, user-callback (the bucket setTimeout and requestAnimationFrame handlers land in), resolve-promise, or classic-script. That is equally useful when you are chasing down a third party that schedules work after the click.
Step 3: Aggregate and sort by impact. Once a few thousand beacons arrive, the pattern jumps out. Typical findings:
- A single Hotjar __hjBoot accounts for 60% of slow INP frames on a marketing site. Defer it and load it after the load event.
- An analytics.track in vendor/segment-snippet.js runs synchronously on every click and adds 80ms. Wrap it in requestIdleCallback.
- A React 19.2 render path goes through a Context provider with a 'use no memo' directive a previous dev added. Remove it.
Without LoAF you guess. With LoAF you have the call site.
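A minimal aggregation sketch, assuming the beacon payload shape from Step 2 lands in your logging store as JSON rows. rankCulprits and the key format are my naming for illustration, not part of any library:

```typescript
interface ScriptHit { url: string; fn: string; duration: number; }
interface Beacon { value: number; frames: { scripts: ScriptHit[] }[]; }

// Group blocking time by (sourceURL, functionName) across all beacons,
// then sort by total duration so the biggest offender surfaces first.
function rankCulprits(beacons: Beacon[]): Array<{ key: string; totalMs: number; hits: number }> {
  const totals = new Map<string, { totalMs: number; hits: number }>();
  for (const b of beacons) {
    for (const frame of b.frames) {
      for (const s of frame.scripts) {
        const key = `${s.url} :: ${s.fn}`;
        const agg = totals.get(key) ?? { totalMs: 0, hits: 0 };
        agg.totalMs += s.duration;
        agg.hits += 1;
        totals.set(key, agg);
      }
    }
  }
  return [...totals.entries()]
    .map(([key, v]) => ({ key, ...v }))
    .sort((a, b) => b.totalMs - a.totalMs);
}
```

Run it over a day of beacons and the top few keys are your work queue, already ordered by impact.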
Step 4: Do not let the beacon become a long task. sendBeacon is fire-and-forget: the browser queues the request and delivers it without blocking the main thread, even if the page is unloading, which is why I use it instead of fetch. Do not send beacons inside the interaction handler itself. Capture, queue, and flush during idle time.
The Lesson
INP regressions hide in the field because the slow code path runs on real users with real third-party scripts, not in your dev tools. The Long Animation Frames API closes the gap by naming the exact script and function that occupied the frame around the slow interaction. Wire it once, ship to production, sort by script.duration, and the culprit reveals itself within a day of traffic.
If your INP is flapping above 200ms and you need an attribution pipeline that points at the real cause, that is the kind of work I do. See my services. For another INP attribution pattern I covered, see INP regression from GTM third-party tags.