How to Do A/B Testing with Google Analytics

Cody Schneider · 8 min read

A/B testing is one of the most powerful tools in a marketer's or website owner's toolkit. It takes the guesswork out of optimizing your site by letting you scientifically test changes - from a new headline to a complete page redesign - to see what truly moves the needle. This guide will walk you through exactly how to set up, run, and analyze A/B tests using Google Analytics 4 so you can make data-driven decisions that improve your performance.

What Exactly is A/B Testing?

A/B testing, also known as split testing, is a straightforward method for comparing two versions of a webpage to determine which one performs better. In a typical A/B test, you create two versions of a page: version 'A' (the control) and version 'B' (the variation). Traffic to the page is then split between these two versions.

By tracking user interactions and conversions on each version, you can gather data to see if the changes made in version 'B' led to a positive, negative, or insignificant change in behavior. For example, did changing the color of your main call-to-action (CTA) button from blue to orange result in more clicks? A/B testing gives you a definitive answer.
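The traffic split itself is usually handled by your testing tool, but the underlying idea is simple: assign each visitor to a bucket deterministically so they see the same version on every visit. Here is a minimal sketch of that idea; the function name `assign_variant` and the 50/50 split are illustrative assumptions, not any particular tool's API.

```python
import hashlib

def assign_variant(user_id: str, experiment_name: str, split: float = 0.5) -> str:
    """Deterministically bucket a user into 'control' or 'variation'.

    Hashing the user ID together with the experiment name keeps each
    user in the same bucket on every visit, while different experiments
    get independent splits.
    """
    key = f"{experiment_name}:{user_id}".encode()
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % 10_000
    return "control" if bucket < split * 10_000 else "variation"

# The same user always lands in the same bucket for a given experiment.
print(assign_variant("user-123", "Homepage_Headline_Test"))
```

Because the assignment is a pure function of the user and experiment IDs, no per-user state needs to be stored to keep the experience consistent.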

Common Elements to Test

You can test almost any element on your website. The key is to start with changes you believe will have the biggest impact on your goals. Some popular elements to test include:

  • Headlines and Subheadings: Does a benefit-oriented headline perform better than a question?
  • Call-to-Action (CTA) Text: Does "Get a Free Demo" convert better than "Request A Quote"?
  • Button Colors and Design: Does a higher-contrast button attract more clicks?
  • Page Layout and Design: Should the contact form be above the fold or at the bottom of the page?
  • Images and Videos: Does a video of the product perform better than a carousel of static images?
  • Copy and Content: Does a shorter, scannable product description lead to more "Add to Cart" clicks than a long, detailed one?

Planning Your A/B Test: From Idea to Hypothesis

A successful A/B test starts long before you touch any testing software. A clear plan ensures your test is meaningful and that the results you get are actionable.

1. Identify Your Goal and Key Metric

First, decide what you want to improve. What is the single most important action you want a user to take on this page? This will be your primary goal. Then, find the metric in Google Analytics that measures this goal.

Good goals could be:

  • Increasing newsletter sign-ups (measured by a generate_lead conversion event).
  • Boosting demo requests (measured by a form submission event).
  • Driving more sales on a product page (measured by the purchase event and revenue).
  • Reducing a high bounce rate on a landing page (measured by Engagement Rate).

Your entire test will be designed to influence this single key metric.

2. Formulate a Strong Hypothesis

A hypothesis is an educated guess about the outcome of your test. It provides structure and forces you to think about why you are making a particular change. A good hypothesis follows a simple format: "If I [change X], then [outcome Y] will happen, because [reason Z]."

Here’s an example:

  • Poor Hypothesis: "Let's test a red button instead of a blue one."
  • Strong Hypothesis: "If we change the 'Add to Cart' button color to bright orange, then we will increase clicks by 15%, because the new color has higher contrast against the page background, making it more visible and compelling."

The "because" part is crucial. It connects your proposed change to an expected user behavior, which helps you learn from your tests, even if they don't produce the desired outcome.

Choosing a Tool to Run the Test

It's important to understand a key change in the Google ecosystem: Google Analytics 4 is used to analyze test results, not to run the test itself. GA4 doesn’t have a built-in feature to split traffic and show different versions of your page. Google previously offered a companion tool for exactly this, Google Optimize, but that platform was sunset in September 2023.

To run your A/B test, you'll need a third-party A/B testing tool. Luckily, most modern tools integrate smoothly with GA4. Popular options include:

  • HubSpot
  • Optimizely
  • VWO (Visual Website Optimizer)
  • Convert.com

These tools handle the technical side of splitting your website traffic and showing the correct version to each user. Your job is to set them up to properly send data to GA4 for analysis.

How to Configure Your Test to Send Data to GA4

This is the most critical technical step. To analyze your results in GA4, your testing tool needs to tell GA4 which version of the page each user saw. Most A/B testing platforms do this by automatically firing an event to GA4 when a user is exposed to an experiment.

The standard event is often called experiment_impression or similar, and it includes two key parameters:

  • experiment_name (or experiment_id): The name you gave your test (e.g., "Homepage_Headline_Test").
  • variant_id: An identifier for the version shown (e.g., "control" or "variation_1").
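If your testing tool supports sending events server-side, GA4's Measurement Protocol accepts a JSON payload POSTed to its `/mp/collect` endpoint. The sketch below only builds that payload; the `MEASUREMENT_ID` and `API_SECRET` values are placeholders, and the `experiment_impression` event name and parameter names are assumptions that must match whatever your tool actually sends (and whatever you register in GA4).

```python
import json

MEASUREMENT_ID = "G-XXXXXXXXXX"  # placeholder: your GA4 Measurement ID
API_SECRET = "your_api_secret"   # placeholder: created under Admin > Data Streams

def build_experiment_event(client_id: str, experiment_name: str, variant_id: str) -> dict:
    """Build a GA4 Measurement Protocol payload for an experiment exposure."""
    return {
        "client_id": client_id,  # the GA4 client ID for this visitor
        "events": [
            {
                "name": "experiment_impression",  # assumed event name
                "params": {
                    "experiment_name": experiment_name,
                    "variant_id": variant_id,
                },
            }
        ],
    }

payload = build_experiment_event("555.123", "Homepage_Headline_Test", "variation_1")
print(json.dumps(payload, indent=2))
# To actually send it, POST this JSON to:
# https://www.google-analytics.com/mp/collect?measurement_id=...&api_secret=...
```

Whatever names you use here must match the custom dimensions you register in GA4 in the next section, or the data won't be reportable.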

Step-By-Step: Registering Custom Dimensions in GA4

Out of the box, GA4 doesn't know what to do with these custom experiment parameters. You need to "teach" it to recognize them by registering them as Custom Dimensions. Without this step, you won't be able to use your experiment data in your reports.

Here’s how to set it up:

  1. Navigate to the Admin section of your GA4 property (the gear icon in the bottom-left).
  2. Under the 'Property' column, click on Custom definitions.
  3. Click the Create custom dimension button.
  4. Now, create a dimension for your experiment name:
      • Dimension name: Experiment Name
      • Scope: Event
      • Event parameter: experiment_name (or whatever parameter name your testing tool sends)
  5. Click Save.
  6. Click Create custom dimension again to create the second dimension for the variant:
      • Dimension name: Variant ID
      • Scope: Event
      • Event parameter: variant_id

That’s it! It might take up to 48 hours for data to begin flowing into these new dimensions, but once it does, you’re ready to analyze your results.

Analyzing A/B Test Results in GA4 Explorations

Once your test has been running long enough to gather a sufficient amount of data (enough to reach statistical significance), it's time to see which version won. The best place to do this in GA4 is with a Free Form Exploration report.

Here is how you build the report:

1. Create a New Exploration

In the left-hand GA4 menu, click on Explore and select Free form to start with a blank canvas.

2. Import Your Dimensions and Metrics

In the Variables column on the left, you'll need to import the data you want to analyze. Click the '+' sign next to Dimensions and import:

  • The two custom dimensions you just created: Experiment Name and Variant ID.

Next, click the '+' sign next to Metrics and import the metrics that correspond to your goal:

  • Sessions
  • Total users
  • Conversions (if your goal is a specific event, select that one, e.g., Conversions: generate_lead)
  • Total Revenue (if it's an e-commerce test)

3. Build the Report Canvas

Now, drag and drop the imported variables onto the Tab Settings canvas to build your analysis table:

  • Drag Variant ID into the Rows section. This will show your "control" and "variation" as different rows in your table.
  • Drag your key metrics (like Sessions, Conversions) into the Values section.
  • Drag Experiment Name into the Filters section. Configure the filter to exactly match the name of the test you want to analyze (e.g., "Experiment Name" matches "Homepage_Headline_Test").

4. Read the Results and Calculate Conversion Rate

You'll now see a table showing the performance of each variant side-by-side, with Variant ID as the rows and your metrics as the columns. With the example numbers used below, it would look something like this:

  Variant ID    Sessions    Conversions
  control       10,240      512
  variation     9,980       648

One small limitation of GA4 Explorations is that it won’t automatically calculate a conversion rate for you. You have to do this small bit of math yourself: Conversion Rate = Conversions / Sessions.

  • Control Conversion Rate: 512 / 10,240 = 5.00%
  • Variation Conversion Rate: 648 / 9,980 = 6.49%

In this example, the variation clearly outperformed the control, delivering a substantially higher conversion rate. You now have a data-backed reason to implement the new headline permanently!
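The arithmetic above is easy to script, and while you're at it you can add a quick two-proportion z-test to sanity-check that the difference is statistically significant rather than noise. This is a standalone sketch using only the Python standard library, with the example numbers from the table above; it is not part of GA4 itself.

```python
from math import sqrt, erf

def conversion_rate(conversions: int, sessions: int) -> float:
    return conversions / sessions

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

print(f"Control:   {conversion_rate(512, 10_240):.2%}")  # prints "Control:   5.00%"
print(f"Variation: {conversion_rate(648, 9_980):.2%}")   # prints "Variation: 6.49%"
z, p = two_proportion_z(512, 10_240, 648, 9_980)
print(f"z = {z:.2f}, p = {p:.5f}")
```

With these numbers the z statistic is well above the conventional 1.96 threshold, so the lift is very unlikely to be random variation.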

Final Thoughts

A/B testing is a foundational practice for any data-informed business looking to grow. By connecting your favorite testing platform to GA4 and using the power of Explorations, you can move beyond simple click tracking and measure the real impact of your changes on key business outcomes.

Setting up reports in GA4 can feel tedious, especially when you need answers quickly and don’t want to rebuild explorations for every test. That’s why we built Graphed. After easily connecting your GA4 account, you can skip the manual report building and just ask questions like, "Compare conversions for my 'control' and 'variation' variants from the CTA test this month." Graphed instantly gives you the visualization and data you need, turning hours of analysis into a 30-second conversation and letting you get back to making smart, data-driven decisions for your business.
