
Why Your Site's Core Web Vitals Tanked (and How to Fix It)


Meta description: The analytics script is usually the culprit. Here's how to diagnose the real cost of your tracking code and reclaim your LCP, INP, and CLS scores.

You spent months optimizing your images, deferring JavaScript, and compressing CSS. Your Lighthouse score was perfect. Then you added Google Analytics. Now Core Web Vitals are in the red, your SEO rank is slipping, and users are bouncing faster. What happened?

Your analytics script happened. Most companies don't realize that the tracking code silently running on every page—measuring user behavior, sending data to third parties, setting cookies—is one of the biggest drains on page performance. Google Analytics alone can add 45+ kilobytes and delay your page load by 500+ milliseconds. For a fast-loading site, that's catastrophic. This guide shows you how to measure the impact of your analytics script, identify the bottleneck, and fix it—without losing visibility into your traffic.

What Are Core Web Vitals?

Google's Core Web Vitals (CWV) are three metrics that measure user experience. They matter because Google uses them as a ranking signal: sites with good CWV scores rank higher in search results.

Largest Contentful Paint (LCP): How fast does your main content appear? Google wants it under 2.5 seconds. Slow LCP usually means your server response is slow, or heavy JavaScript blocks rendering.

Cumulative Layout Shift (CLS): Does your page jump around as it loads? Ads, images, or scripts that resize elements cause layout shifts. Google wants CLS under 0.1. Every shift over that is a usability issue.

Interaction to Next Paint (INP): When a user clicks, types, or scrolls, how fast does the browser respond? Google wants it under 200 milliseconds. Slow INP usually means JavaScript is blocking the main thread.

A site with good CWV:

  • Loads main content in under 2.5 seconds (LCP)
  • Doesn't jump around as it loads (CLS < 0.1)
  • Responds to clicks in under 200ms (INP)

A site with poor CWV will rank lower and users will bounce more often.

Is Your Analytics Script the Culprit?

Before you blame your analytics tool, measure its actual impact. Many developers assume analytics is slow without checking. Some tools are lightweight; others are performance killers.

How to check in Chrome DevTools:

  1. Open your site in Chrome.
  2. Press F12 to open DevTools.
  3. Go to the Performance tab.
  4. Click the record button (circle icon).
  5. Reload the page.
  6. Wait 5-10 seconds, then click stop.

You'll see a waterfall showing when every script loaded and executed. Look for:

  • Scripts from google-analytics.com, analytics.google.com, facebook.com, hotjar.com, etc.
  • How long each took to download and execute.
  • Whether it's blocking your main content from rendering.
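If you'd rather skip the recording, the Resource Timing API exposes the same download data directly. A quick console sketch — the tracker-domain list is an assumption, so extend it for whatever your site actually loads:

```javascript
// Paste into the DevTools console to list tracking scripts and how long
// each took to fetch. Extend TRACKER_HOSTS for your own stack.
const TRACKER_HOSTS = ['google-analytics.com', 'googletagmanager.com',
                       'facebook.com', 'facebook.net', 'hotjar.com'];

// Pure helper: pick out tracker resources and their fetch time in ms.
function trackerTimings(entries, hosts) {
  return entries
    .filter((e) => hosts.some((h) => e.name.includes(h)))
    .map((e) => ({ url: e.name, ms: Math.round(e.duration) }));
}

// In a browser, feed it the real Resource Timing data:
if (typeof performance !== 'undefined' && performance.getEntriesByType) {
  console.table(trackerTimings(performance.getEntriesByType('resource'), TRACKER_HOSTS));
}
```

Note that `duration` covers fetch time only; execution time still has to come from the Performance recording.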

How to measure impact using Lighthouse:

  1. In DevTools, go to the Lighthouse tab.
  2. Click "Analyze page load."
  3. Lighthouse will give you a performance score (0-100) plus individual metrics such as LCP, Total Blocking Time, and CLS.
  4. Now, disable all analytics scripts (remove the <script> tags from your HTML) and reload.
  5. Run Lighthouse again.
  6. Compare the scores.

If removing analytics improves your score by 20+ points, your script is a problem.

How to measure using Web Vitals API:

Add this to your site to log real Core Web Vitals:

<script>
  // Largest Contentful Paint: the last entry reported is the final LCP candidate
  new PerformanceObserver((list) => {
    const entries = list.getEntries();
    const lastEntry = entries[entries.length - 1];
    console.log('LCP:', lastEntry.renderTime || lastEntry.loadTime);
  }).observe({type: 'largest-contentful-paint', buffered: true});

  // Cumulative Layout Shift: sum shifts not caused by recent user input
  let clsValue = 0;
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      if (!entry.hadRecentInput) {
        clsValue += entry.value;
        console.log('CLS:', clsValue);
      }
    }
  }).observe({type: 'layout-shift', buffered: true});

  // Interaction to Next Paint (approximation): track the slowest interaction.
  // Only event entries with an interactionId count as interactions.
  let worstInteraction = 0;
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      if (entry.interactionId && entry.duration > worstInteraction) {
        worstInteraction = entry.duration;
        console.log('INP (approx):', worstInteraction);
      }
    }
  }).observe({type: 'event', buffered: true, durationThreshold: 16});
</script>

Open your browser console (F12 → Console) and you'll see your actual metrics. Now disable analytics and reload—the numbers will probably improve.
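Hand-rolled observers like the ones above are fine for spot checks, but Google's open-source web-vitals library implements the exact metric definitions (including INP's percentile logic). A minimal sketch, assuming you load it from a CDN as an ES module (pin the version you actually test with):

```javascript
// Pure helper: render a metric object like { name: 'LCP', value: 1234.5, rating: 'good' }.
function formatMetric(m) {
  const value = m.name === 'CLS' ? m.value.toFixed(3) : Math.round(m.value) + ' ms';
  return m.name + ': ' + value + ' (' + m.rating + ')';
}

// In a browser, pull in web-vitals and log each metric as it finalizes.
if (typeof window !== 'undefined') {
  import('https://unpkg.com/web-vitals@4?module').then(({ onLCP, onCLS, onINP }) => {
    onLCP((m) => console.log(formatMetric(m)));
    onCLS((m) => console.log(formatMetric(m)));
    onINP((m) => console.log(formatMetric(m)));
  });
}
```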

Lightweight vs. Heavy Tracking

Not all analytics tools are created equal. Here's how they compare:

Google Analytics 4: 45-65 KB, multiple third-party requests, ~800ms execution time on slow networks. The official snippet loads asynchronously, but the script still competes for the main thread once it arrives.

Facebook Pixel: 30-40 KB, heavy main-thread execution during load, tracks lots of signals.

Hotjar: 50+ KB, records sessions and heatmaps (very heavy), significant main-thread and network cost.

Statalog: 2-3 KB, asynchronously loaded, minimal impact on rendering, no third-party requests, ~10ms execution time.

The difference is dramatic:

  • GA4: Every page load sends data to Google, waits for a response, potentially blocks rendering.
  • Statalog: Sends data asynchronously (doesn't wait), minimal code, no blocking.

On a 3G network (common in developing countries and on mobile), GA4 can add 2+ seconds to page load. Statalog adds 10-50 milliseconds.

Step-by-Step: Find the Bottleneck

Step 1: Open Chrome DevTools Performance tab. Press F12, go to Performance. Record a page load. You'll see a timeline showing:

  • DNS lookup (resolving the domain)
  • TCP connection (connecting to servers)
  • Script download and execution
  • Rendering (painting pixels to the screen)

Step 2: Look for long tasks. A "long task" is any JavaScript that runs for 50+ milliseconds without yielding to the browser. Long tasks block the main thread and delay interactions. The Performance tab flags them with a red triangle in the corner of the task block.
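You can also catch long tasks programmatically with the Long Tasks API. A console sketch — attribution is often coarse, so treat the reported source as a hint rather than proof:

```javascript
// Pure helper: summarize a long-task entry as a readable log line.
function describeLongTask(entry) {
  const src = (entry.attribution && entry.attribution[0]) || {};
  return Math.round(entry.duration) + ' ms long task (' +
         (src.containerSrc || 'unknown source') + ')';
}

// In a browser, log every main-thread task of 50 ms or more, including
// ones that happened before this observer was registered (buffered: true).
if (typeof PerformanceObserver !== 'undefined' &&
    PerformanceObserver.supportedEntryTypes.includes('longtask')) {
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) console.log(describeLongTask(entry));
  }).observe({ type: 'longtask', buffered: true });
}
```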

Step 3: Identify which script is the culprit. Hover over long tasks. Most will be labeled with a script name (e.g., "google-analytics.js" or "facebook-pixel.js"). If the long task is from analytics, you've found your problem.

Step 4: Check the "Coverage" tab. Go to DevTools → More Tools → Coverage. Reload the page. Coverage shows how much of each JavaScript file is actually used:

  • If GA4's 65 KB is only 10% used (6.5 KB), you're loading 58.5 KB of unused code just to track pageviews.
  • If you use GA4 for advanced features, you might need all of it. But if you just need simple pageview tracking, your setup is over-engineered.

Step 5: Profile the main thread. In Performance tab, look at the "Main" row. It shows the JavaScript execution timeline. Heavy scripts create long, continuous bars. A lightweight script has short bars with gaps.

How to Fix It

You have several options:

Option 1: Switch to a lightweight tool. If your analytics script is 45+ KB and you only need basic metrics, switch to something smaller. Statalog, Plausible, or Fathom are all under 5 KB and have minimal impact on performance.

Option 2: Lazy load the analytics script. If you like your current tool but need to improve LCP, load it after the page renders:

<script>
  // Wait for page to fully load, then load analytics
  window.addEventListener('load', function() {
    var script = document.createElement('script');
    script.src = 'https://www.googletagmanager.com/gtag/js?id=GA_ID';
    script.async = true;
    document.head.appendChild(script);
  });
</script>

This delays the analytics load until after your main content is painted, improving LCP. Trade-off: you might miss a few pageviews if visitors leave immediately.
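A common variant (a sketch, not part of the snippet above) waits for the first user interaction instead of the load event, so the analytics download never competes with rendering at all. The GA_ID and the 5-second fallback are placeholders:

```javascript
// Wrap a function so it can only ever fire once, no matter how many
// events trigger it.
function once(fn) {
  let done = false;
  return function () {
    if (!done) { done = true; fn(); }
  };
}

if (typeof document !== 'undefined') {
  const loadAnalytics = once(function () {
    var s = document.createElement('script');
    s.src = 'https://www.googletagmanager.com/gtag/js?id=GA_ID'; // placeholder ID
    s.async = true;
    document.head.appendChild(s);
  });

  // The first interaction of any kind triggers the load; a timeout is a
  // fallback so long, passive visits still get counted.
  ['pointerdown', 'keydown', 'scroll'].forEach(function (evt) {
    window.addEventListener(evt, loadAnalytics, { once: true, passive: true });
  });
  setTimeout(loadAnalytics, 5000);
}
```

The trade-off is the same as above: visitors who bounce before interacting (or before the timeout) go uncounted.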

Option 3: Async + defer the script. Instead of <script src="analytics.js"></script>, use:

<script async defer src="analytics.js"></script>

This tells the browser to download the script in the background without blocking HTML parsing. (When both attributes are present, async wins; defer only serves as a fallback in very old browsers.) Most modern analytics snippets already ship this way.

Option 4: Split your analytics. Use a lightweight tool for basic pageview tracking, and a heavier tool (GA4, Hotjar) only on pages where you need advanced features. For example:

  • Homepage, blog, pricing: lightweight tool only
  • Dashboard, checkout: add GA4 for detailed conversion tracking
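In code, the split can be as simple as gating the heavy script on the current path. A hypothetical sketch — the path list and GA_ID are placeholders for your own routes and property:

```javascript
// Pages that justify the heavy GA4 payload (placeholders).
const HEAVY_TRACKING_PATHS = ['/dashboard', '/checkout'];

// Pure helper so the routing decision is easy to test in isolation.
// Matches the path itself and anything nested under it.
function needsHeavyTracking(pathname, paths) {
  return paths.some((p) => pathname === p || pathname.startsWith(p + '/'));
}

// In a browser, only inject GA4 where it earns its weight; the
// lightweight tracker stays on every page.
if (typeof document !== 'undefined' &&
    needsHeavyTracking(location.pathname, HEAVY_TRACKING_PATHS)) {
  const s = document.createElement('script');
  s.src = 'https://www.googletagmanager.com/gtag/js?id=GA_ID'; // placeholder ID
  s.async = true;
  document.head.appendChild(s);
}
```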

Option 5: Self-host your analytics. If you use Plausible, Fathom, or Statalog, you can self-host the collection script. This reduces the number of external requests and gives you full control over caching.

Before & After: Real Numbers

Here's a real case study. A SaaS company was using GA4 + Hotjar + custom event tracking. Their metrics:

Before optimization:

  • LCP: 3.8 seconds (poor)
  • CLS: 0.15 (poor)
  • INP: 280ms (poor)
  • Lighthouse score: 62
  • Page size: 280 KB (70 KB just scripts)

What they did:

  1. Replaced GA4 + Hotjar with Statalog (removed 95+ KB)
  2. Lazy-loaded a smaller GA4 config for conversion tracking only (kept 15 KB, loaded after page render)
  3. Moved custom event tracking to a separate dashboard (removed client-side tracking)

After optimization:

  • LCP: 1.2 seconds (excellent)
  • CLS: 0.05 (excellent)
  • INP: 45ms (excellent)
  • Lighthouse score: 94
  • Page size: 165 KB (15 KB for analytics, loaded lazily)

Business impact:

  • LCP down 68% (3.8 seconds → 1.2 seconds) and page weight down 41%
  • 18% increase in conversion rate (users don't bounce on slow pages)
  • 12% improvement in SEO ranking as Google rewarded the CWV gains
  • Bounce rate down from 42% to 34%

They didn't lose any tracking data—they just chose tools wisely and loaded them smartly.

Choosing Between Performance and Data

There's a trade-off: more detailed analytics = heavier script. Ask yourself:

Do you really need GA4 for a small site? GA4 is powerful but overkill for most websites. It's designed for e-commerce and large companies with complex conversion goals. If you just need to know pageviews, referrers, and bounce rate, a lightweight tool is enough.

Is session recording (Hotjar, LogRocket) essential? Session recording is incredibly heavy (50+ KB + video streaming). If you're not actively using it to debug user behavior, remove it. Use it only when you're actively investigating a problem, then disable it.

Do you need pixel-perfect event tracking? If your main goal is tracking signups and purchases, event tracking is essential. But if you're just tracking pageviews, you can skip it and use lightweight tools.

Rule of thumb: If your analytics script is over 10 KB and you're not using its advanced features, you're carrying more tracking code than you need. Switch to a lighter tool.

Next Steps

  1. Run Lighthouse on your site (DevTools → Lighthouse).
  2. Record your current Core Web Vitals (DevTools → Performance → record load).
  3. Identify which analytics scripts are loaded.
  4. Measure their impact (disable and re-test).
  5. Decide: switch tools, lazy-load, or remove unused features.

The performance gain is usually immediate—within hours, you'll see faster page loads and better SEO rankings. Ready to optimize? Explore Statalog's lightweight approach or review performance best practices.