Situation
A high-volume fashion retailer noticed a sustained drop in organic search traffic starting in Q2. Google Search Console was showing increasing numbers of pages failing Core Web Vitals (CWV) assessments — specifically LCP (Largest Contentful Paint) — but no one on the analytics team could connect those failures to actual business impact.
The problem: their GA4 implementation had no performance tracking. They could see that product pages had terrible bounce rates, but not why. They suspected page speed but had no data to prove it or prioritise which pages to fix first.
The e-commerce team was making guesses. The engineering team wanted data before committing sprint capacity to performance work. Neither team had what they needed.
We were brought in to build a measurement layer that would close that gap — and make the case internally for the engineering investment.
Task
Build a complete performance monitoring pipeline in GA4 and GTM that would:
- Capture all three Core Web Vitals (LCP, CLS, FID/INP) as GA4 custom events
- Tie performance data to revenue — i.e., can we prove slow pages lose sales?
- Create a Looker Studio dashboard the engineering team can actually use to prioritise work
- Run before/after measurement to quantify the impact of any fixes made
The secondary goal was to demonstrate the SEO impact clearly enough to secure budget for a sustained performance engineering programme.
Action
Step 1 — Instrumenting Core Web Vitals in GTM
The web-vitals library from Google is the most reliable way to capture CWV metrics in real user sessions. We injected it via GTM using a Custom HTML tag.
```html
<!-- GTM Custom HTML Tag: Load web-vitals and push to dataLayer -->
<script>
  (function() {
    var script = document.createElement('script');
    script.src = 'https://unpkg.com/web-vitals@3/dist/web-vitals.iife.js';
    script.onload = function() {
      function sendToGA4(metric) {
        window.dataLayer = window.dataLayer || [];
        window.dataLayer.push({
          event: 'web_vitals',
          metric_name: metric.name,     // LCP, CLS, FID, INP, TTFB
          // CLS is a unitless score; scale it so GA4 stores a whole number
          metric_value: Math.round(metric.name === 'CLS' ? metric.value * 1000 : metric.value),
          metric_rating: metric.rating, // 'good', 'needs-improvement', 'poor'
          metric_delta: Math.round(metric.delta),
          page_path: window.location.pathname,
          page_template: document.body.dataset.template || 'unknown'
        });
      }
      webVitals.onCLS(sendToGA4);
      webVitals.onFID(sendToGA4);
      webVitals.onLCP(sendToGA4);
      webVitals.onINP(sendToGA4);
      webVitals.onTTFB(sendToGA4);
    };
    document.head.appendChild(script);
  })();
</script>
```
Trigger: Window Loaded — injecting after full load keeps the script off the critical rendering path. This is safe because web-vitals reads buffered PerformanceObserver entries, so paint events that happened before injection are still captured.
Step 2 — GTM Variables for Performance Data
We created GTM Data Layer Variables to extract each metric:
| Variable Name | DL Key | Used For |
|---|---|---|
| dlv - metric_name | metric_name | Filter by LCP/CLS/etc |
| dlv - metric_value | metric_value | The actual ms/score |
| dlv - metric_rating | metric_rating | good/needs-improvement/poor |
| dlv - page_template | page_template | product/collection/home |
Step 3 — GA4 Custom Event & Dimensions
We registered the web_vitals event in GA4 and created custom dimensions:
- `metric_name` (event-scoped) — which vital
- `metric_rating` (event-scoped) — good / needs-improvement / poor
- `page_template` (event-scoped) — product, collection, home, etc.
This meant we could now query: “What is the add-to-cart rate on product pages where LCP was ‘poor’ vs ‘good’?”
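For reference, the `metric_rating` buckets the library assigns follow the published Core Web Vitals thresholds. A minimal sketch of that classification — the function name and `THRESHOLDS` table here are ours for illustration, not part of the web-vitals API:

```javascript
// Core Web Vitals rating thresholds (good / needs-improvement boundaries)
const THRESHOLDS = {
  LCP: [2500, 4000], // ms
  INP: [200, 500],   // ms
  CLS: [0.1, 0.25]   // unitless layout-shift score
};

// Classify a raw metric value into the same three buckets web-vitals reports
function rateMetric(name, value) {
  const [good, poor] = THRESHOLDS[name];
  if (value <= good) return 'good';
  if (value <= poor) return 'needs-improvement';
  return 'poor';
}
```

This is why the GA4 dimension only ever holds three values — the bucketing happens client-side before the event is pushed.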
Step 4 — The Revenue Correlation Analysis
This was the key piece of evidence. Using GA4 Explorations with the new dimensions:
| LCP Rating | Sessions | Add-to-Cart Rate | Purchase Rate |
|---|---|---|---|
| Good (< 2.5s) | 124,800 | 8.4% | 2.9% |
| Needs Improvement (2.5–4s) | 89,200 | 5.1% | 1.6% |
| Poor (> 4s) | 61,400 | 2.8% | 0.7% |
The data was unambiguous. Product pages with poor LCP converted at 0.7% vs 2.9% for fast pages — a 4.1x difference in conversion rate. Extrapolating from monthly session volumes, poor LCP was costing an estimated £1.8m annually in lost conversions.
That number got the engineering team’s attention.
Step 5 — Prioritising Fixes with a Performance Dashboard
We built a Looker Studio dashboard connected to the GA4 data that showed:
- CWV ratings by page template (product/collection/home)
- LCP distribution histogram — where are the worst outliers?
- Revenue-at-risk per page template (sessions × conversion gap × AOV)
- Before/after tracking table for pages in active development
The engineering team used this to build their sprint backlog. Pages with the highest revenue-at-risk and the worst LCP went first.
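The revenue-at-risk figure that drove the prioritisation is a simple calculation (sessions × conversion gap × AOV, as above). A sketch — the £80 AOV below is an illustrative placeholder, not the client's actual figure:

```javascript
// Revenue at risk for a slow cohort: how much would these sessions have
// generated if they converted at the fast-cohort rate instead?
function revenueAtRisk(sessions, slowConvRate, fastConvRate, aov) {
  const conversionGap = fastConvRate - slowConvRate;
  return sessions * conversionGap * aov;
}

// The "poor LCP" cohort from Step 4, with a hypothetical £80 AOV:
const monthlyAtRisk = revenueAtRisk(61400, 0.007, 0.029, 80);
```

Ranking templates by this number, rather than by raw LCP alone, is what kept the backlog focused on pages where speed actually moved revenue.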
The biggest culprits:
- **Hero product images not preloaded** — the LCP element was an `<img>` loaded lazily. Adding `<link rel="preload">` reduced LCP on product pages by ~1.4s.
- **Third-party review widget blocking render** — the review aggregator JS was loading synchronously in `<head>`. Switching it to async took it off the render-blocking path.
- **Unoptimised filter panel JavaScript** — the collection page JS bundle included the entire filter component on page load. Code-splitting reduced initial JS parse time by 800ms.
Result
Over the 90 days following implementation and engineering fixes:
LCP on product pages: 4.8s → 1.9s at p75 — the percentile Google uses for CWV assessment
All 340+ product pages previously flagged as “poor” in Search Console moved into “good” or “needs improvement.” Google’s CWV assessment for the site went from 22% “good URLs” to 81%.
Organic traffic recovered 23% from its trough, directly tracking the CWV improvement timeline. The client’s SEO agency confirmed no significant content or backlink changes in the same period — the recovery was attributed to the Page Experience signal improvement.
Product page bounce rate: 68% → 37%. Customers who previously left within seconds because the page hadn’t loaded were now staying to browse.
The Looker Studio dashboard remains the engineering team’s primary tool for performance regression monitoring. Every deploy now gets a performance check before being marked complete.
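The case study doesn't specify the tooling behind the per-deploy check; one common way to implement it is a Lighthouse CI budget that fails the build when a page regresses past the CWV thresholds. A hedged sketch of a `lighthouserc.json` — the URL is a placeholder:

```json
{
  "ci": {
    "collect": {
      "url": ["https://www.example.com/product/sample"],
      "numberOfRuns": 3
    },
    "assert": {
      "assertions": {
        "largest-contentful-paint": ["error", { "maxNumericValue": 2500 }],
        "cumulative-layout-shift": ["error", { "maxNumericValue": 0.1 }]
      }
    }
  }
}
```

Lab checks like this catch regressions before deploy; the GA4 pipeline above remains the source of truth for what real users experience.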
The web-vitals GTM setup costs nothing to run and takes half a day to configure. The business case for fixing slow pages almost always pays back in weeks. If your GA4 has no performance data, you’re missing the single clearest signal connecting technical work to revenue.