How to Exclude Bot Traffic from Google Analytics

Cody Schneider - 8 min read

Seeing strange spikes in your website traffic can be exciting until you realize computers, not potential customers, are behind them. Bot traffic inflates your metrics, skews your conversion rates, and leads to poor marketing decisions based on bad data. This tutorial will walk you through a few simple but effective methods to identify and exclude bot traffic, giving you a much cleaner and more accurate view of your performance in Google Analytics 4.

Why You Should Care About Bot Traffic

At first glance, more traffic seems like a good thing. But bot traffic isn't harmless noise; it actively damages your data integrity. When bots crawl your site, they trigger pageviews and sessions just like a real person, but their behavior is completely different, leading to several problems:

  • Skewed success metrics: Simple bots can dramatically inflate your session and user counts, making it look like your marketing efforts are more effective than they really are. This makes it impossible to gauge the true reach of your campaigns.
  • Misleading engagement data: Most bots visit a single page and leave immediately. In GA4, this translates to sessions with extremely low engagement rates. If a significant portion of your traffic comes from bots, your overall engagement rate will plummet, making it seem like your content isn't resonating with real users.
  • Inaccurate conversion rates: Bots don't buy products, fill out forms, or sign up for newsletters. When you have thousands of bot-driven sessions with zero conversions, your overall conversion rate gets artificially pulled down, hiding the true performance of your sales funnels.
  • Poor strategic decisions: Every marketing decision - from which ad campaigns to scale, to which blog posts to write next - relies on accurate data. If your reports are contaminated by bots, you might cut funding to a valuable campaign or ditch a content strategy that’s actually working, all because the numbers are misleading.

Cleaning up your data isn't just a technical exercise; it's a fundamental step toward understanding what truly drives your business growth.

How to Spot Bot Traffic in Your Google Analytics 4 Reports

While Google Analytics has gotten better at filtering some bots automatically, determined spammers and crawlers still get through. Before you can filter them out, you need to learn how to spot them. Here are a few telltale signs to look for in your reports:

  • Sudden, Unexplained Traffic Spikes: Did your traffic double overnight for no apparent reason? While it could be a viral post, it's often a sign of a bot attack. Look at your real-time reports or your Traffic Acquisition report and see if a single source or location is driving the spike.
  • Odd Geographic Sources: If you're a local bakery in Chicago but suddenly see thousands of sessions from a small city in a foreign country you don't serve, you likely have a bot problem. Navigate to Reports > User > User attributes > Demographic details and check the Country dimension to investigate.
  • Suspicious Referral Traffic: A huge red flag for bot activity is referral traffic from spammy-looking domains you don't recognize. These often have names that include terms like "buttons-for-your-website" or "get-free-traffic." Check your sources in the Reports > Acquisition > Traffic acquisition report.
  • Extremely Low Engagement Rate: Go to the Traffic acquisition report and look at the "Average engagement time" and "Engagement rate" for different channels. Sources with near-zero engagement are highly suspicious, as it indicates a visitor landed and immediately left without interacting.
  • "Not Set" Values: While "not set" can appear for various technical reasons, a large amount of traffic where the language, screen resolution, or city is "not set" can sometimes be an indicator of less-sophisticated bots.

If you see one or more of these symptoms, it's time to take action and start filtering.
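
If you'd rather audit this programmatically than click through reports, the GA4 Data API can pull the same numbers. Below is a minimal sketch using Google's official @google-analytics/data Node client - the property ID, the 100-session volume floor, and the 5% engagement cutoff are placeholder assumptions you'd tune for your own site, and it assumes your Google Cloud credentials are already configured.

```typescript
// Minimal audit sketch using the official GA4 Data API Node client
// (@google-analytics/data). Assumes Application Default Credentials and
// a GA4_PROPERTY_ID environment variable holding your numeric property ID.
import { BetaAnalyticsDataClient } from '@google-analytics/data';

const client = new BetaAnalyticsDataClient();
const propertyId = process.env.GA4_PROPERTY_ID; // e.g. '123456789' (placeholder)

async function auditSources(): Promise<void> {
  const [response] = await client.runReport({
    property: `properties/${propertyId}`,
    dateRanges: [{ startDate: '28daysAgo', endDate: 'today' }],
    dimensions: [{ name: 'sessionSource' }, { name: 'country' }],
    metrics: [{ name: 'sessions' }, { name: 'engagementRate' }],
  });

  for (const row of response.rows ?? []) {
    const [source, country] = (row.dimensionValues ?? []).map((d) => d.value);
    const sessions = Number(row.metricValues?.[0]?.value ?? 0);
    const engagementRate = Number(row.metricValues?.[1]?.value ?? 0);

    // Flag high-volume sources with near-zero engagement - a common bot signature.
    if (sessions > 100 && engagementRate < 0.05) {
      console.log(
        `Suspicious: ${source} (${country}) - ${sessions} sessions, ` +
          `${(engagementRate * 100).toFixed(1)}% engagement rate`
      );
    }
  }
}

auditSources().catch(console.error);
```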

Your First Line of Defense: Google's Default Bot Filtering

Unlike its predecessor, Universal Analytics, Google Analytics 4 filters bot and spider traffic by default. GA4 identifies traffic coming from known bots and spiders on the IAB/ABC International Spiders & Bots List and excludes it from your reports automatically. There is no longer a checkbox you need to tick to turn this on.

While this is a great starting point and handles a lot of generic crawler traffic, it’s not a complete solution. It won't catch traffic from more sophisticated bots, targeted spammers, or other unwanted sources. For that, you need to create your own custom data filters.

Creating Custom Data Filters to Block Unwanted Traffic

Data filters are your most powerful tool for cleaning your GA4 reports. They allow you to permanently exclude traffic based on specific criteria, like an IP address or a referral source. Let's cover the most common filters you'll need.

1. Filter Out Your Team's Traffic (Internal IP Exclusion)

Before you hunt down external bots, the first thing you should filter is your own activity. Every time you, your employees, or your marketing agency visit the website, those visits get counted as sessions. This can significantly skew your data, especially for smaller businesses.

Excluding internal traffic is a two-step process in GA4: first, you define what counts as internal traffic, and second, you activate a filter to exclude it.

Step 1: Define Your Internal IP Address

  • First, find your public IP address by searching "what is my IP address" on Google. Copy it.
  • In Google Analytics, go to the Admin panel (the gear icon in the bottom-left).
  • In the Property column, click on Data Streams and select your website's data stream.
  • Under Google Tag, click on Configure tag settings.
  • Click Show all, then select Define internal traffic.
  • Click the Create button.
  • Give your rule a name, like "Main Office IP Address."
  • Leave the traffic_type value as internal.
  • Under IP address > Match type, choose IP address equals and paste the IP address you copied earlier.
  • Click Create to save the rule.
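
An IP rule works well if your team shares a static office IP, but it's fragile for remote teammates on home or mobile connections. GA4's internal traffic filter keys off a traffic_type event parameter, so an alternative some teams use is to set that parameter directly in the tag for devices or environments they control. Here's a minimal sketch, assuming you deploy gtag.js yourself; the isInternalUser() check is a hypothetical placeholder for however you identify internal users (a staging hostname, an employee cookie, and so on).

```typescript
// Hypothetical sketch: mark traffic as internal before events are sent.
// isInternalUser() is a placeholder for your own detection logic.
declare function gtag(...args: unknown[]): void;

function isInternalUser(): boolean {
  // Placeholder example: treat the staging hostname as internal.
  return window.location.hostname.endsWith('staging.example.com');
}

if (isInternalUser()) {
  // Sets the traffic_type parameter on subsequent events so the
  // "Internal Traffic" data filter can match and exclude them.
  gtag('set', { traffic_type: 'internal' });
}
```

Whichever way you set traffic_type, the data filter in Step 2 below still has to be activated before anything is actually excluded.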

Step 2: Activate the Internal Traffic Filter

Defining your IP address doesn't do anything by itself. You now have to tell GA4 to exclude any traffic that matches that definition.

  • In the Admin panel, under the Property column, navigate to Data Settings > Data Filters.
  • You will see a filter for "Internal Traffic" with a "Testing" state. Click on the three dots to the right and select Activate filter.
  • Confirm your choice in the pop-up. The state will change to "Active."

That's it! From this point forward, traffic from that IP address will be excluded from your reports. Note that filters do not apply retroactively; they only affect data collected after they are activated.

2. Exclude Known Bad Referral Domains

Referral spam exists to plant junk domains in your reports in the hope that you'll visit them out of curiosity. These domains show up as if they're sending you traffic, but the visits are entirely fake.

You can create a list of domains to block within your Google Tag settings.

  • Again, navigate to Admin > Data Streams and click your web stream.
  • Click Configure tag settings.
  • Click Show all, then click List unwanted referrals.
  • Under Match type, choose a condition. "Referral domain contains" is usually the most flexible option.
  • In the Domain field, enter the spammy domain you identified in your reports (e.g., spam-domain.com).
  • Click Add condition to add more domains to your exclusion list.
  • Once you're done, click Save.

Going forward, traffic claiming to come from these spam domains will no longer be attributed as referral traffic in your reports.
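
Because new spam domains crop up constantly, it's worth repeating this review every so often. If you're comfortable with the Data API, here's a sketch (using the same @google-analytics/data Node client and placeholder thresholds as the earlier example) that surfaces referral sources with almost no engagement as candidates for your unwanted-referrals list.

```typescript
import { BetaAnalyticsDataClient } from '@google-analytics/data';

const client = new BetaAnalyticsDataClient();

// Lists referral sources with suspiciously low engagement as candidates for
// the "List unwanted referrals" setting. Volume and engagement thresholds
// are placeholder assumptions - tune them for your own traffic levels.
async function findSpammyReferrers(propertyId: string): Promise<string[]> {
  const [response] = await client.runReport({
    property: `properties/${propertyId}`,
    dateRanges: [{ startDate: '90daysAgo', endDate: 'today' }],
    dimensions: [{ name: 'sessionSource' }],
    metrics: [{ name: 'sessions' }, { name: 'engagementRate' }],
    dimensionFilter: {
      filter: {
        fieldName: 'sessionMedium',
        stringFilter: { value: 'referral', matchType: 'EXACT' },
      },
    },
  });

  return (response.rows ?? [])
    .filter(
      (row) =>
        Number(row.metricValues?.[0]?.value ?? 0) > 50 && // enough volume to matter
        Number(row.metricValues?.[1]?.value ?? 0) < 0.02  // almost no engagement
    )
    .map((row) => row.dimensionValues?.[0]?.value ?? '');
}

findSpammyReferrers(process.env.GA4_PROPERTY_ID ?? '').then(console.log);
```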

Best Practices for Ongoing Maintenance

Filtering bots isn't a "set it and forget it" task. New spammy domains and bots appear all the time. Keeping your data clean requires a bit of ongoing attention.

  • Monthly Data Review: At least once a month, spend 15 minutes reviewing your Traffic acquisition report. Sort by sessions or users and look for any new, suspicious referral sources. Add any you find to your unwanted referrals list.
  • Use Annotations: When you activate a new filter, create an annotation in GA4. This leaves a note right on your reports, so if you see a traffic drop after a certain date, you'll remember it was because you successfully applied a filter, not because of a real-world event.
  • Look Beyond GA: Use a tool at the server level, like Cloudflare or a web application firewall (WAF), to block bad bots before they even reach your website (a simplified sketch of the idea follows this list). This is a much more robust solution that also enhances your site's security and performance.
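
A dedicated service like Cloudflare will always do this better, but to make the idea concrete, here's a deliberately simplified sketch of server-level filtering written as Express middleware. The user-agent substrings are examples only, and plenty of bots fake their user agent entirely - which is exactly why a managed WAF or bot-management service is the more robust option.

```typescript
import express, { NextFunction, Request, Response } from 'express';

// Example substrings only - real bot management relies on many more signals
// (IP reputation, behaviour, TLS fingerprints) than the User-Agent header,
// and whether a given crawler is unwanted is your call.
const BLOCKED_UA_SUBSTRINGS = ['ahrefsbot', 'semrushbot', 'petalbot'];

function blockKnownBots(req: Request, res: Response, next: NextFunction): void {
  const userAgent = (req.headers['user-agent'] ?? '').toLowerCase();
  if (BLOCKED_UA_SUBSTRINGS.some((bot) => userAgent.includes(bot))) {
    res.status(403).send('Forbidden');
    return;
  }
  next();
}

const app = express();
app.use(blockKnownBots);
app.get('/', (_req, res) => res.send('Hello'));
app.listen(3000);
```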

Final Thoughts

By regularly following these steps - leveraging Google's default filtering, excluding internal traffic, and managing a list of unwanted referrers - you can significantly improve the quality and accuracy of your Google Analytics data. Clean data is the foundation of smart business decisions, enabling you to confidently analyze what's truly working and invest your resources where they matter most.

Getting your data clean is one thing; analyzing it is another. Instead of manually digging through reports inside the Google Analytics interface every week, Graphed simplifies the whole process. By connecting your GA4 account to Graphed, you can use plain English to ask questions and instantly get real-time dashboards and reports. You can just ask things like, "Show me our top traffic sources this month," and get a live, automated report, without ever having to worry about configuring filters or building charts from scratch.
