Why Google Lighthouse Doesn’t Include INP, A Core Web Vital
Google’s Lighthouse doesn’t use the Interaction to Next Paint (INP) metric in its standard tests, despite INP being one of the Core Web Vitals.
Barry Pollard, Web Performance Developer Advocate on Google Chrome, explained the reasoning behind this and offered insights into measuring INP.
Lighthouse Measures Page Loads, Not Interactions
Lighthouse measures a simple page load and captures various characteristics during that process.
It can estimate the Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS) under specific load conditions, identify issues, and advise on improving these metrics.
INP is different: it depends on user interactions, which an unattended page load never produces.
Pollard explained:
“The problem is that Lighthouse, again like many web perf tools, typically just loads the page and does not interact with it. No interactions = No INP to measure!”
Custom User Flows Enable INP Measurement
While a standard Lighthouse run can’t measure INP, site owners who know their common user journeys can script them as “user flows,” which do measure it.
Pollard added:
“If you as a site-owner know your common user journeys then you can measure these in Lighthouse using ‘user flows’ which then WILL measure INP.”
These common user journeys can be automated in a continuous integration environment, allowing developers to test INP on each commit and spot potential regressions.
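Pollard’s suggestion can be scripted with Lighthouse’s user flow API. A minimal sketch, assuming Node.js with the `lighthouse` and `puppeteer` packages installed; the URL and the `#menu-button` selector are placeholders for your own journey:

```javascript
import puppeteer from 'puppeteer';
import {startFlow} from 'lighthouse';

const browser = await puppeteer.launch();
const page = await browser.newPage();
const flow = await startFlow(page);

// Cold navigation: captures load metrics such as LCP and CLS.
await flow.navigate('https://example.com');

// Timespan: records user interactions, so the report can include INP.
await flow.startTimespan({name: 'Open menu'});
await page.click('#menu-button'); // placeholder interaction
await flow.endTimespan();

// The generated HTML report includes an INP value for the timespan.
const report = await flow.generateReport();
await browser.close();
```

Run on each commit in CI, a script like this turns an INP regression into a failed build instead of a surprise in field data.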
Total Blocking Time As An INP Proxy
Although Lighthouse can’t measure INP without interactions, it can measure likely causes, particularly long, blocking JavaScript tasks.
This is where the Total Blocking Time (TBT) metric comes into play.
According to Pollard:
“TBT (Total Blocking Time) measures the sum time of all tasks greater 50ms. The theory being:
- Lots of long, blocking tasks = high risk of INP!
- Few long, blocking tasks = low risk of INP!”
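To make the arithmetic concrete: Lighthouse defines TBT as the sum of each long task’s time *beyond* the 50 ms threshold, not the full task durations. A minimal sketch (the function name is illustrative):

```javascript
// Total Blocking Time: for each task longer than 50 ms, only the portion
// beyond the 50 ms threshold counts as "blocking". TBT sums those portions.
function totalBlockingTime(taskDurationsMs) {
  return taskDurationsMs
    .filter((duration) => duration > 50)
    .reduce((sum, duration) => sum + (duration - 50), 0);
}

// Example: tasks of 30 ms, 80 ms, and 200 ms.
// 30 ms is under the threshold; 80 ms contributes 30 ms; 200 ms contributes 150 ms.
console.log(totalBlockingTime([30, 80, 200])); // 180
```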
Limitations Of TBT As An INP Substitute
TBT has limitations as an INP substitute.
Pollard noted:
“If you don’t interact during long tasks, then you might not have any INP issues. Also interactions might load MORE JavaScript that is not measure by Lighthouse.”
He added:
“So it’s a clue, but not a substitute for actually measuring INP.”
Optimizing For Lighthouse Scores vs. User Experience
Some developers optimize for Lighthouse scores without considering the user impact.
Pollard cautions against this, stating:
“A common pattern I see is to delay ALL JS until the user interacts with a page: Great for Lighthouse scores! Often terrible for users 😢:
- Sometimes nothing loads until you move the mouse.
- Often your first interaction gets a bigger delay.”
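The anti-pattern Pollard describes often looks something like the following browser-side sketch; `loadAllScripts` and `/bundle.js` are placeholders for whatever bootstraps the page’s JavaScript:

```javascript
// Anti-pattern: defer ALL JavaScript until the first user interaction.
// Lighthouse's unattended load executes almost no script, so TBT looks
// excellent, but nothing works until the user moves the mouse, and the
// first interaction pays the full script download-and-execute cost.
function loadAllScripts() {
  const script = document.createElement('script');
  script.src = '/bundle.js'; // placeholder for the real application bundle
  document.head.appendChild(script);
}

['mousemove', 'touchstart', 'keydown'].forEach((type) =>
  window.addEventListener(type, loadAllScripts, {once: true})
);
```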
Why This Matters
Understanding how Lighthouse, INP, and TBT relate helps you optimize for real user experience rather than for tool scores.
Recognizing the limits of lab-based INP measurement helps you avoid misguided optimizations.
Pollard’s advice is to measure INP against real user interactions so that performance improvements actually improve UX.
Because INP is a Core Web Vital, grasping these nuances is essential to keeping it within an acceptable threshold.
Practical Applications
To monitor site performance and INP:
- Use Lighthouse’s “user flows” for INP measurement in common journeys.
- Automate user flows in CI to monitor INP and catch regressions.
- Use TBT as an INP proxy, but understand its limitations.
- Prioritize field (real-user) measurements for accurate INP data.
- Balance performance optimizations with UX considerations.
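For the field-measurement point above, the `web-vitals` library exposes an `onINP` callback that reports a page’s INP from real interactions. A browser-side sketch; the `/analytics` endpoint is a placeholder:

```javascript
import {onINP} from 'web-vitals';

// Report the page's INP to an analytics endpoint. The callback fires
// with updated values as worse interactions occur during the visit.
onINP(({name, value, rating}) => {
  navigator.sendBeacon('/analytics', JSON.stringify({name, value, rating}));
});
```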
Source: Searchenginejournal.com