
How to Implement A/B Testing UX in 2026


Introduction

A/B testing UX is a scientific method to compare two versions of a user interface (variants A and B) and see which performs better in terms of conversions, clicks, or time spent. In 2026, with mature frontend tools, you no longer need complex paid services: a simple JavaScript script randomizes the user experience and tracks key metrics.

This beginner tutorial guides you step-by-step to create an A/B test on a fictional newsletter signup landing page. We'll test two CTA buttons: a green optimistic one (variant A) vs a blue confident one (variant B). Think of it like a lab experiment: 50% of visitors see A, 50% see B; we measure clicks to pick the winner.

Why it matters: A poorly chosen button can halve your signups. By the end, you'll have a working prototype with localStorage for persisting variants and basic console stats. Ready to level up your data-driven UX?

Prerequisites

  • Basic knowledge of HTML, CSS, and JavaScript (beginner level).
  • A code editor like VS Code.
  • A modern browser (Chrome/Firefox) with devtools console (F12).
  • No server needed: just open index.html locally.

Step 1: Base HTML Page

index.html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>A/B Testing UX - Newsletter</title>
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <div class="container">
    <h1>Sign up for our newsletter</h1>
    <p>Get weekly UX tips.</p>
    <form id="newsletter-form">
      <input type="email" id="email" placeholder="Your email" required>
      <button type="submit" id="cta-button">Sign up</button>
    </form>
    <p id="stats"></p>
  </div>
  <script src="script.js"></script>
</body>
</html>

This HTML file creates a minimalist landing page with a signup form. The #cta-button will be dynamically styled by JS based on the A/B variant. The #stats paragraph displays metrics in real-time. Copy-paste as-is to get started.

Understanding the HTML Structure

This structure is responsive and accessible: viewport meta for mobile, required on input for native validation, and clear IDs for JS targeting. Analogy: it's the neutral skeleton of your test, unbiased for any variant. Open index.html in your browser to verify.

Step 2: CSS Styles for Variants

styles.css
body {
  font-family: Arial, sans-serif;
  background: linear-gradient(135deg, #f5f7fa 0%, #c3cfe2 100%);
  display: flex;
  justify-content: center;
  align-items: center;
  min-height: 100vh;
  margin: 0;
}

.container {
  background: white;
  padding: 2rem;
  border-radius: 10px;
  box-shadow: 0 10px 30px rgba(0,0,0,0.1);
  text-align: center;
  max-width: 400px;
}

h1 { color: #333; margin-bottom: 1rem; }

#email {
  width: 100%;
  padding: 0.75rem;
  margin-bottom: 1rem;
  border: 1px solid #ddd;
  border-radius: 5px;
  box-sizing: border-box;
}

#cta-button {
  width: 100%;
  padding: 0.75rem;
  border: none;
  border-radius: 5px;
  font-size: 1rem;
  cursor: pointer;
  transition: transform 0.2s;
}

#cta-button:hover { transform: scale(1.02); }

#cta-button.variant-a {
  background: #4CAF50;
  color: white;
}

#cta-button.variant-b {
  background: #2196F3;
  color: white;
}

#stats { margin-top: 1rem; font-size: 0.9rem; color: #666; }

Base styles make the page attractive and mobile-first. The .variant-a (green) and .variant-b (blue) classes define the two UX looks to test: green for positive urgency, blue for trust. Hover adds interactive feedback without JS.

Why These UX Styles?

The gradient background sets a modern vibe; the shadow adds Material Design-like depth. Variants test color psychology: green = quick action (A), blue = reliability (B). Reload the page: the button is neutral for now.

Step 3: JS for Assigning the Variant

script.js
(function() {
  // Function to assign variant A or B (50/50)
  function assignVariant() {
    if (localStorage.getItem('ab_variant')) {
      return localStorage.getItem('ab_variant');
    }
    const variant = Math.random() < 0.5 ? 'a' : 'b';
    localStorage.setItem('ab_variant', variant);
    return variant;
  }

  const variant = assignVariant();
  document.getElementById('cta-button').classList.add(`variant-${variant}`);

  console.log(`Variant assigned: ${variant.toUpperCase()}`);

  // Initialize stats
  let clicksA = 0;
  let clicksB = 0;
  updateStats();

  function updateStats() {
    document.getElementById('stats').textContent = 
      `Stats: A=${clicksA} clicks, B=${clicksB} clicks`;
  }
})();

This script uses localStorage to persist the variant per user, so the same visitor keeps seeing the same version instead of being re-randomized on every load. Math.random() gives the 50/50 split, and the matching CSS class is applied to the DOM right away. updateStats() sets up tracking. Test with a hard refresh (Ctrl+F5): the variant persists.

Randomization Logic

Like a digital coin flip, the 50/50 split is fair. LocalStorage mimics a user-ID cookie: the same visitor always sees the same variant. Open the console (F12): you'll see 'Variant assigned: A' or 'B'. Perfect for consistent tests.
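To convince yourself the coin flip is fair, you can simulate the assignment many times. This is a standalone sketch using the same Math.random() rule as assignVariant(); the function name is illustrative, not part of the tutorial's script:

```javascript
// Simulate n variant assignments with the same 50/50 rule as assignVariant()
function simulateSplit(n) {
  const counts = { a: 0, b: 0 };
  for (let i = 0; i < n; i++) {
    counts[Math.random() < 0.5 ? 'a' : 'b']++;
  }
  return counts;
}

const counts = simulateSplit(10000);
console.log(counts); // each side lands near 5000
```

Run it a few times in the console: the split hovers around 50/50, with small random fluctuations that shrink as n grows.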

Step 4: Track Clicks

script.js
// Add this inside the IIFE, after updateStats() (clicksA and clicksB are scoped there)

document.getElementById('cta-button').addEventListener('click', function(e) {
  e.preventDefault(); // block the real form submit for local testing
  const email = document.getElementById('email').value;
  if (!email) {
    alert('Email required!');
    return;
  }

  const variant = localStorage.getItem('ab_variant');
  if (variant === 'a') {
    clicksA++;
  } else {
    clicksB++;
  }
  updateStats();

  console.log(`Click on variant ${variant.toUpperCase()} - Email: ${email}`);
  // In production, send to a backend instead:
  // fetch('/track', { method: 'POST', body: JSON.stringify({ variant, event: 'click' }) });

  alert('Signup simulated. Thanks!');
});

The event listener validates the email, tracks real clicks, and increments the right counter. preventDefault() stops the real submit for local testing. Everything is logged to the console; swap that for fetch in production. Click a few times to watch the stats update live.

Measuring Impact

Stats display under the form in real time. To simulate a new visitor, open a private/incognito window: normal tabs share localStorage per origin, so they all see the same variant. Click rate = clicks / visits; simulate visits with refreshes. Blue is often associated with trust in UX psychology, but don't assume B will win; let the data decide.
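The click-rate formula above is simple enough to capture in a tiny helper (a sketch; the function name is hypothetical, not part of the tutorial's script):

```javascript
// Conversion rate = clicks / visits, as a percentage (0 when there are no visits yet)
function clickRate(clicks, visits) {
  return visits === 0 ? 0 : (clicks / visits) * 100;
}

console.log(clickRate(12, 100)); // 12
console.log(clickRate(0, 0));    // 0
```

Guarding the zero-visit case matters: without it, a fresh page load would display NaN instead of 0.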

Step 5: Reset and Analysis

script.js
// Add this inside the IIFE, at the end of script.js

function resetTest() {
  localStorage.removeItem('ab_variant');
  clicksA = 0;
  clicksB = 0;
  updateStats();
  location.reload();
}
// Expose it globally so an inline handler works
// (add to HTML: <button onclick="resetTest()">Reset Test</button>)
window.resetTest = resetTest;

console.log('A/B test active. To reset: localStorage.clear() or the reset button.');

// Log stats as a table (call again after some clicks)
const total = clicksA + clicksB;
console.table({
  'Variant A': clicksA,
  'Variant B': clicksB,
  'Rate A (%)': total ? +(clicksA / total * 100).toFixed(1) : 0,
  'Rate B (%)': total ? +(clicksB / total * 100).toFixed(1) : 0
});

Add a reset button in HTML to simulate new users. console.table gives an analyzable table with percentage rates. The script is now fully functional. Analyze: if A beats B by a clear margin on a large enough sample, roll out A!
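If you want an actual CSV string rather than a console table, a small formatter will do. This is a sketch with a hypothetical helper name, reusing the same rate logic as the console.table call:

```javascript
// Turn the two click counters into a CSV string: header + one row per variant
function statsToCSV(clicksA, clicksB) {
  const total = clicksA + clicksB;
  const rate = (c) => total === 0 ? '0' : (c / total * 100).toFixed(1);
  return [
    'variant,clicks,rate_pct',
    `A,${clicksA},${rate(clicksA)}`,
    `B,${clicksB},${rate(clicksB)}`
  ].join('\n');
}

console.log(statsToCSV(12, 8));
// variant,clicks,rate_pct
// A,12,60.0
// B,8,40.0
```

Copy the output into a .csv file and any spreadsheet tool will open it for further analysis.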

Finalize and Scale

Your A/B test is live! Try it with 20-50 'visits' (refresh + click). For production, swap the console calls for Google Analytics: gtag('event', 'ab_click', {variant}). The pattern extends to many variants (A/B/n testing) with Math.random() buckets.
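The "Math.random() buckets" idea generalizes the coin flip to N variants. A sketch, with illustrative names and a plain object standing in for localStorage so it runs anywhere:

```javascript
// Assign one of n variants, persisting the choice like the two-variant version
function assignBucket(n, store) {
  const key = 'ab_bucket';
  let bucket = store.getItem(key);
  if (bucket === null) {
    bucket = String(Math.floor(Math.random() * n));
    store.setItem(key, bucket);
  }
  return Number(bucket);
}

// Usage with an in-memory stand-in for localStorage:
const mem = {
  data: {},
  getItem(k) { return k in this.data ? this.data[k] : null; },
  setItem(k, v) { this.data[k] = v; }
};
const bucket = assignBucket(5, mem);
console.log(bucket, assignBucket(5, mem) === bucket); // same bucket on repeat calls
```

In the real page you would pass localStorage as the store; the visitor then keeps the same bucket across visits, exactly like the two-variant test.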

Step 6: Version with Simulated Backend Fetch

script.js
// Replace the console.log with this (simulates API)
async function trackClick(variant, email) {
  try {
    const response = await fetch('https://jsonplaceholder.typicode.com/posts', {
      method: 'POST',
      body: JSON.stringify({ variant, email, event: 'ab_click', timestamp: Date.now() }),
      headers: { 'Content-Type': 'application/json' }
    });
    console.log('Tracked:', await response.json());
  } catch (e) {
    console.error('Tracking failed:', e);
  }
}

// In the click handler, call trackClick(variant, email);
// (mark the handler async if you want to await the result)

JSONPlaceholder serves as a mock API for persisting clicks. async/await handles errors without blocking the UX. In production, point to your own endpoint. Test it: each click fires a POST request; watch it in the Network tab.

Best Practices

  • Sample split: Always 50/50 or segments (e.g., mobile/desktop) for reliable stats.
  • Persistence: localStorage keeps the variant stable per browser; true cross-device consistency requires a server-side user ID.
  • Multiple metrics: Track clicks + scroll + time (IntersectionObserver).
  • Significance: Min 100 clicks/variant; use Evan Miller calculator.
  • No-flicker: Apply variant before paint with document.documentElement.classList.
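The significance check from the list above can also be done in code. This is a sketch of the standard two-proportion z-test (|z| > 1.96 roughly corresponds to p < 0.05 two-sided); the function name is illustrative:

```javascript
// Two-proportion z-test: clicksX successes out of visitsX trials per variant
function zScore(clicksA, visitsA, clicksB, visitsB) {
  const p1 = clicksA / visitsA;
  const p2 = clicksB / visitsB;
  const pooled = (clicksA + clicksB) / (visitsA + visitsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return (p1 - p2) / se;
}

// 10% vs 20% conversion over 1000 visits each: clearly significant
const z = zScore(100, 1000, 200, 1000);
console.log(Math.abs(z) > 1.96); // true
```

Note that this needs visit counts per variant, not just clicks, so in practice you would also track an 'impression' event when the page loads.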

Common Mistakes to Avoid

  • Forgetting persistence: Refresh changes variant → biased stats (use localStorage).
  • No validation: Clicks without email inflate metrics (add if (!email)).
  • Sample too small: 10 clicks = high variance; wait for 100+.
  • UX flicker: if the script runs late, the default button flashes before the variant applies → assign the variant class before first paint, e.g. from a small inline script in <head>.
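The flicker fix from the list above can be sketched as an inline snippet for <head>. The store parameter is an assumption added here to make the logic testable; in the page you would pass localStorage:

```javascript
// Read or assign the variant as early as possible, before first paint
function readOrAssignVariant(store) {
  let v = store.getItem('ab_variant');
  if (v !== 'a' && v !== 'b') {
    v = Math.random() < 0.5 ? 'a' : 'b';
    store.setItem('ab_variant', v);
  }
  return v;
}

// In an inline <script> in <head> (guard lets the logic run outside a browser too):
if (typeof document !== 'undefined') {
  document.documentElement.classList.add('variant-' + readOrAssignVariant(localStorage));
}
```

With the class on <html> before the body renders, a selector like html.variant-a #cta-button can style the button with zero flash of the default look.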

Next Steps