Identifying Bots in Your Analytics: How It Happens & What to Do About It

Your organization recently published a blog post, then noticed the website’s bounce rate skyrocketed. Intuition tells you the new post is driving the increased bounces, so you should delete it, right?

Your boss congratulates you for driving 100 percent more users to a specific page over the previous month. However, you know that your SEO work couldn’t have drawn that many clicks—but who’s to argue?

Then, upon opening the weekly report, you find that conversion rates for your e-commerce store dropped 17 percent week-over-week. Looking at the overall revenue number, you notice sales increased 2 percent. How is this possible?

There are many situations that can cause your website’s KPIs to drastically improve or deteriorate over a short period, but one of the most common is bot traffic.

What is a Bot, and What is Bot Traffic?

Automation allows organizations to become increasingly efficient with their processes and solutions. Once automating simple, repetitive tasks is checked off a company’s to-do list, employees are free to focus on the more important or complicated projects that benefit from human judgment. Automations for knowledge workers often take the form of a robot, or “bot” for short.

A bot is any software application with the sole purpose of completing repetitive online tasks. When we refer to bot traffic, we mean any website visit and subsequent page views from bots. While many bots are innocuous, there will always be malicious bots to counter. According to Imperva Threat Research’s 2022 Bad Bot Report, only 58 percent of all traffic on the internet in 2021 came from humans. Good bots constituted another 14 percent, so the remaining 28 percent of all 2021 traffic came from malicious bots. Whether good or bad, they all distort your website’s analytics.

The most common forms of bot traffic include data scrapers, research bots, and SEO crawlers. Most bots disguise themselves as normal human traffic, which makes them challenging to spot when assessing website performance. While most bots do not seek to disrupt your website, they will interfere with your analytics and affect website performance. 
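To make that distinction concrete, here is a minimal Python sketch of the simplest possible check: flagging visits whose user-agent string openly declares automation. The pattern list is purely illustrative (an assumption, not a maintained list), and this check will never catch the sophisticated bots that disguise themselves as human traffic:

```python
import re

# Illustrative substrings that commonly appear in declared-bot user-agents.
# This list is a placeholder; real setups rely on curated bot lists or a
# bot-management vendor.
BOT_PATTERN = re.compile(
    r"bot|crawler|spider|scraper|curl|python-requests|headless",
    re.IGNORECASE,
)

def looks_like_declared_bot(user_agent: str) -> bool:
    """Flag traffic whose user-agent openly identifies itself as automated."""
    return bool(BOT_PATTERN.search(user_agent or ""))

# A declared crawler is caught; a disguised bot sending a normal
# browser user-agent sails right through.
print(looks_like_declared_bot(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))  # True
print(looks_like_declared_bot(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
))  # False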

What are the Business Impacts of Bot Traffic?

If your company makes business decisions based on website analytics, then bot traffic can have detrimental impacts on your organization’s digital strategy. As previously mentioned, bot traffic is often difficult to spot in high-level reporting. This can cause metrics such as conversion rate, bounce rate, and overall sessions to inflate or deflate without a clear reason.

Bot traffic can also reduce website speed and security by overloading servers, resulting in slow page loads or, in severe cases, overall inaccessibility for users. Further, allowing unwanted traffic to your site can expose unknown security vulnerabilities. As an extreme example, malicious bots may look for susceptible areas of application code to inject malware. This malware can result in authentication bypass or even sensitive data loss.

How do You Identify Bot Traffic?

Bot traffic is not easily identifiable, but with these general principles, bots can be located and filtered out of analyses: 

  1. Unusual Increases in Page Views, Visits, and Unique Visitors

Bot traffic usually arrives as hundreds or thousands of different bots coming to your site, and each bot is typically counted as a unique visitor. A significant increase in any of the above metrics may indicate the presence of bots on your website.

  2. Abnormal Increases or Decreases in Page Views per Visit

Bots often crawl your entire website, visiting many different pages in the same visit. Often this is to gather large amounts of information for research. In other scenarios, the bots may simply be looking for a particular item. For example, a bot could be set up to scrape your site for email addresses. In turn, this bot may reach most of your site’s pages, increasing the pages per visit in the process. It’s also possible that this type of bot activity abnormally decreases your bounce rate due to increases in non-bounce bot traffic.

Another bot behavior is the “single-page visit”: thousands of bots come to your site, land on a specific page, and then bounce. Often these visits have a time-on-site equal to 0 seconds. In your reporting, this dramatically decreases page views per visit and greatly increases your bounce rate. Although this type of bot activity is more common, seeing either pattern is indicative of bot traffic.

  3. Sharp Decreases in Orders per Visit (OpV)

Bots will not usually place orders. If your site has e-commerce capabilities, orders per visit (OpV) should be a familiar metric. A sharp decrease in this metric indicates mass traffic that did not transact, which can be a tell-tale sign of bot traffic to your site. A quick way to scan for all three of these patterns is sketched below.
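As a starting point, here is a minimal Python sketch of that scan. It assumes a hypothetical daily export named daily_metrics.csv with date, visits, unique_visitors, page_views, and orders columns; the column names, the trailing 28-day window, and the 3-standard-deviation threshold are all assumptions to adapt to your own data:

```python
import pandas as pd

# Hypothetical daily export: date, visits, unique_visitors, page_views, orders.
daily = pd.read_csv("daily_metrics.csv", parse_dates=["date"]).set_index("date")

# Derive the ratio metrics discussed above.
daily["pages_per_visit"] = daily["page_views"] / daily["visits"]
daily["orders_per_visit"] = daily["orders"] / daily["visits"]

# Compare each day to a trailing 28-day window and flag days where any
# metric sits more than 3 standard deviations from its rolling mean.
window = daily.rolling(28, min_periods=14)
zscores = (daily - window.mean()) / window.std()
suspicious = zscores.abs() > 3

print(daily[suspicious.any(axis=1)])
```

Days flagged here are not proof of bots, only a cue to dig into the dimension-level identifiers covered next.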

Google Analytics and Adobe Analytics Specific Identifiers

Some identifiers in your website’s performance take a little more creativity to find. Below you will find five indicators that can be identified in both Google Analytics and Adobe Analytics.

  1. Increases in Direct Traffic without a Referrer

Bot traffic often travels directly to your site. If you see unusual spikes in traffic carrying these dimension values, it could indicate bot traffic:

Google Analytics: Channel = “Direct” 

Adobe Analytics: Referring Domain = “Typed/Bookmarked”

  2. Large Spikes in Traffic from One Area

The geographic location of website traffic is frequently an obvious indicator of bot traffic. Bots are often deployed from the same city, leading to a disproportionate amount of your site’s traffic coming from a seemingly random city or country. For example, after opening a geo-location report, you may be shocked to find that 25 percent of all traffic came from Ashburn, Virginia (a well-known data center hub for Amazon Web Services).

By consistently looking into geo-location reports, you can determine the origins of your page’s typical traffic. If you see spikes in traffic from seemingly random cities or countries, investigate to understand why that traffic occurred. While not always the case, bot traffic may be your first suspicion. You can find geo-location reports here:

Google Analytics: Audience > Geo > Location (GA4: Reports > User > Demographics > Demographic details)

Adobe Analytics: Cities dimension

  3. Increases in Traffic with Unspecified Operating System

Bot traffic tends to hide information about the operating systems used to crawl your site. Sometimes, Google and Adobe Analytics are unable to detect an operating system, but if that occurs at an unusually high rate, it could indicate bot traffic. Use these dimensions and values to dive deeper:

Google Analytics: Operating System = “(not set)”

Adobe Analytics: Operating Systems = “Not Specified”

  4. Increases in Traffic with Low Monitor Resolution

Bots often appear in analytics reporting with resolutions that are irregular, unset, or simply impossible. The lower the resolution, the higher the likelihood of a bot. If you see a spike in traffic from users with low-resolution screens, this could be an indicator of bot traffic. Find these resolutions in Google and Adobe Analytics using these dimensions:

Google Analytics: Screen Resolution

Adobe Analytics: Monitor Resolution

  5. High Volumes of Strange Browser Traffic

Bots often use custom user-agents that aren’t classified or that show up as unknown. Identify this by looking at user browsers. These dimensions can help you find strange user-agents in your website’s traffic (a sketch for pulling all five of these identifiers programmatically follows this list):

Google Analytics: Browser

Adobe Analytics: Browser
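If you would rather pull these checks programmatically than click through reports, here is a hedged sketch using the GA4 Data API via the google-analytics-data Python client. The property ID is a placeholder, and you’ll need credentials configured via GOOGLE_APPLICATION_CREDENTIALS; Adobe’s Analytics 2.0 reporting API can serve the same purpose:

```python
# pip install google-analytics-data
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

client = BetaAnalyticsDataClient()  # reads GOOGLE_APPLICATION_CREDENTIALS

# Pull the five identifier dimensions above for the last week.
request = RunReportRequest(
    property="properties/123456789",  # placeholder property ID
    dimensions=[
        Dimension(name="sessionDefaultChannelGroup"),  # "Direct" traffic
        Dimension(name="city"),                        # geo spikes
        Dimension(name="operatingSystem"),             # "(not set)"
        Dimension(name="screenResolution"),            # odd/low resolutions
        Dimension(name="browser"),                     # strange user-agents
    ],
    metrics=[Metric(name="sessions")],
    date_ranges=[DateRange(start_date="7daysAgo", end_date="today")],
)

for row in client.run_report(request).rows:
    channel, city, os_name, resolution, browser = (
        d.value for d in row.dimension_values
    )
    sessions = int(row.metric_values[0].value)
    # Surface rows matching the red flags from the checklist above.
    if "(not set)" in (os_name, resolution) or channel == "Direct":
        print(sessions, channel, city, os_name, resolution, browser)
```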

How Do You Stop Bot Traffic? 

Unfortunately, the possibility of bot traffic won’t vanish any time soon. The share of website traffic attributable to bots has held steady between 35 and 45 percent since 2015. However, you can act.

The first action item, and a best practice, is to set up a dashboard that identifies bot traffic. In this dashboard, set baselines for the metrics and dimensions discussed above. Baselines should consist of either a full year’s data or a specific, meaningful time period’s worth of data. For instance, if your e-commerce site regularly sees large peaks in visitors during the holiday season, set your baseline values for just the holiday season. That way, your baselines will reflect what normal traffic for the period actually looks like.

This step is imperative, as you must know what “normal” traffic looks like. Measuring the difference between current metrics and the baseline values will tell you whether your website is seeing the huge spikes or valleys across many metrics that are indicative of bot traffic. In Adobe Analytics, alerts can be set up when metrics reach a certain threshold, effectively giving you an ad-hoc, real-time bot detection dashboard. A simple offline version of this check is sketched below.
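As an illustration, here is a small Python sketch of that baseline-and-threshold check, reusing the hypothetical daily_metrics.csv export from earlier; the holiday-season date window and the 50 percent tolerance are assumptions, not recommendations:

```python
import pandas as pd

daily = pd.read_csv("daily_metrics.csv", parse_dates=["date"]).set_index("date")

# Baseline built from a meaningful comparison window: here, a
# hypothetical prior holiday season, per the example above.
baseline = daily.loc["2021-11-15":"2021-12-31"].mean()

def metrics_out_of_range(today: pd.Series, tolerance: float = 0.5) -> pd.Series:
    """Flag metrics deviating more than `tolerance` (50%) from the baseline."""
    return (today - baseline).abs() / baseline > tolerance

latest = daily.iloc[-1]
alerts = metrics_out_of_range(latest)
for metric in alerts[alerts].index:
    print(f"ALERT: {metric} = {latest[metric]:,.0f} "
          f"vs. baseline {baseline[metric]:,.0f}")
```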

If you notice bot traffic on your website, filtering out the automated visits is the next step in the process. While very hands-on, this step will allow your organization to get back to making business decisions based on consistent and correct data. In the end, this step will save you money and time. 

“Exclude filters” are your friend during this step. Find all the common pieces of information the bots share, such as location, browser, screen resolution, and operating system. Pair those with the metrics they share, such as time-on-site, landing page, and bounce rate. Put all the common attributes of the bots together into a filter and exclude those visits. The trick is finding enough commonalities to target the bots precisely, without making the filter so greedy that it also excludes legitimate traffic.

Here’s an example: You noticed a spike in traffic from Ashburn, Virginia. The spike all happened on a single day, with 500 percent more users from Ashburn compared to the day prior. Digging into the data, you notice that almost all of the traffic had a monitor/screen resolution of 600×800, and the time-on-site for each visit was 0 seconds.

A great Bot Exclusion Filter would exclude all traffic where:

  • City = ‘Ashburn, VA’

AND

  • Screen Resolution = ‘600×800’

AND

  • Time on Site < 1 second

And voila! Your Bot-Exclusion Filter is created. Make sure to apply this filter to all important website performance reports and analyses. Your organization will thank you for your creative problem-solving skills. 
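Inside your analytics tool this is typically implemented as a segment or exclude filter, but the same logic is easy to apply to exported data. Here is a minimal pandas sketch, assuming a hypothetical visits.csv export with city, screen_resolution, and time_on_site_seconds columns:

```python
import pandas as pd

# Hypothetical visit-level export: one row per visit.
visits = pd.read_csv("visits.csv")

# The three conditions from the filter above, ANDed together.
is_bot = (
    (visits["city"] == "Ashburn")
    & (visits["screen_resolution"] == "600x800")  # match your export's encoding
    & (visits["time_on_site_seconds"] < 1)
)

clean = visits[~is_bot]
print(f"Excluded {is_bot.sum()} suspected bot visits out of {len(visits)} total.")
```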

How Evolytics Can Help with Your Bot Issues

While the above steps will take you a long way in detecting bot traffic, suspicious internet traffic is far from uniform and can benefit from a tailored analysis. If you have any questions or intriguing use cases, don’t hesitate to contact Evolytics for a deeper dive!

Let’s talk about how we can reduce or get rid of bot traffic in your analytics.

Sources Cited for this Blog Post

Imperva Threat Research. 2022 Imperva Bad Bot Report. 2022. Accessed 24 Jan. 2023.

Websites Used to Inform this Blog Post

https://www.imperva.com/blog/evasive-bots-drive-online-fraud-2022-imperva-bad-bot-report/


https://www.imperva.com/learn/application-security/vulnerability-management/

https://datadome.co/bot-management-protection/exclude-bot-traffic-from-google-analytics/

https://www.cloudflare.com/learning/bots/what-is-bot-traffic/

https://yoast.com/what-to-know-about-bot-traffic/

Written By


Joe Connelly

Joe Connelly is an Analyst II, Data Science, who assists in everyday analysis and creative strategic planning for his clients, leveraging a background in statistics, business, and sports analytics. He supports analysis for organizations including Sephora, Outreach International, and Vail Resorts.