How Much of Your Traffic Is From Bots?

When we analyzed random samples from the top 1 million websites ranked by Alexa, our internal data showed that bots make up about 50% of all web traffic, most of them with malicious intent. After categorizing these websites by monthly page views, we observed the following during periods of peak traffic:



The Hourly Impact of Bot Traffic

Websites with more than 100 million monthly pageviews ─ 120,000 bot hits per hour

Websites with 10 million to 100 million monthly pageviews ─ 20,000 bot hits per hour

Websites with fewer than 10 million monthly pageviews ─ about 3,000 bot hits per hour


These numbers are only indicative, but they give you an idea of how much bot traffic your website can attract if left unprotected. If we assume that each of these bot hits registers as a fake page view, and that your website gets about 300 million pageviews a month:


120,000 (bot page views/hour) × 720 (hours/month) = 86,400,000 bot page views/month


In this example, almost 29% of your website’s traffic comes from bots, wasting server resources and slowing down your site. In practice, bot hits may not arrive at this rate around the clock, but when these surges occur they hurt your website’s competitive advantage and your brand perception.
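If you want to plug in your own numbers, here is a minimal sketch of the same arithmetic. The figures below are the illustrative ones from this example, not measured data, so treat the output as a rough estimate rather than a measurement.

```typescript
// Rough estimate of monthly bot page views and their share of total traffic.
// All inputs are the illustrative figures from the example above, not measured data.
const botHitsPerHour = 120_000;            // large-site tier from the list above
const hoursPerMonth = 720;                 // 30 days x 24 hours
const totalMonthlyPageViews = 300_000_000; // your site's total monthly page views

const botPageViewsPerMonth = botHitsPerHour * hoursPerMonth;   // 86,400,000
const botShare = botPageViewsPerMonth / totalMonthlyPageViews; // 0.288

console.log(`Bot page views per month: ${botPageViewsPerMonth.toLocaleString()}`);
console.log(`Share of total traffic: ${(botShare * 100).toFixed(1)}%`); // ~28.8%
```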

With the cloud computing resources and automated scraping tools available today, a scraper can easily pull thousands of pages per minute. When an e-commerce site is preparing for the shopping season, its marketing team invests thousands of dollars in campaigns to acquire customers. If the team sees a huge spike in traffic when the sale begins, it may be oblivious to bot traffic and assume the campaigns are working because site traffic has increased two- or threefold. The real problem surfaces when these visits don’t convert into sales or, in many cases, end up ruining the shopping experience for real customers.

How to Identify Bot Traffic in Google and Adobe Analytics

It’s important for website managers and marketing teams to know how much of their traffic comes from real humans. Knowing this allows you to react quickly, course-correct, and implement workarounds to attract the customers you’re looking for. Google Analytics and Adobe Analytics, two of the most commonly used website analytics tools, provide insights into traffic composition, conversions, and other campaign metrics that help fine-tune product and marketing strategy. Unchecked bot traffic greatly skews these metrics and hinders conversion funnel analysis and KPI tracking.
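Neither tool will flag every bot for you out of the box, but you can get a first-pass estimate from exported session data. The sketch below is purely illustrative: the row fields and thresholds are assumptions (not the Google Analytics or Adobe Analytics APIs), and sophisticated bots that spoof real browser user agents will evade simple heuristics like these.

```typescript
// Illustrative first-pass heuristics over exported analytics rows.
// The field names below are assumptions about what your export contains.
interface SessionRow {
  userAgent: string;          // visitor's reported user agent
  sessionDurationSec: number; // total time on site for the session
  pagesViewed: number;        // page views in the session
}

function looksLikeBot(row: SessionRow): boolean {
  // Self-declared crawlers, plus sessions that view many pages with near-zero dwell time.
  const declaredBot = /bot|crawler|spider|scraper/i.test(row.userAgent);
  const implausiblyFast = row.pagesViewed > 20 && row.sessionDurationSec < 5;
  return declaredBot || implausiblyFast;
}

function estimateBotShare(rows: SessionRow[]): number {
  if (rows.length === 0) return 0;
  return rows.filter(looksLikeBot).length / rows.length;
}

// Tiny example sample
const sample: SessionRow[] = [
  { userAgent: "Mozilla/5.0 (Windows NT 10.0; Win64; x64)", sessionDurationSec: 340, pagesViewed: 6 },
  { userAgent: "ExampleCrawler/2.1", sessionDurationSec: 1, pagesViewed: 45 },
];
console.log(`Estimated bot share: ${(estimateBotShare(sample) * 100).toFixed(0)}%`); // 50%
```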

How to Prevent Skewed Analytics

In addition to shielding your website, app, and APIs from major bot threats such as Account Takeover and Application DDoS, ShieldSquare helps you prevent skewed analytics ─ it’s as easy as adding our JavaScript code snippet to your web page headers. Alternatively, the JS snippet can be inserted through analytics dashboards such as Google Analytics and Adobe Analytics. Implementing our solution gives marketers clean, accurate website analytics, helping them optimize their campaigns for better results.
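For context, inserting such a snippet usually follows the pattern sketched below. This is not ShieldSquare’s actual code (the URL and function name are placeholders), so use the exact snippet provided in your ShieldSquare dashboard; the same idea applies when you paste a snippet through a tag manager or an analytics dashboard.

```typescript
// Generic pattern for loading a vendor detection snippet in the page <head>.
// The URL below is a placeholder, not ShieldSquare's actual snippet; use the exact
// code supplied in your vendor dashboard.
function injectDetectionSnippet(src: string): void {
  const script = document.createElement("script");
  script.src = src;     // vendor-supplied snippet URL (placeholder here)
  script.async = true;  // load without blocking page rendering
  document.head.appendChild(script);
}

injectDetectionSnippet("https://cdn.example-vendor.com/collector.js");
```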

To learn how much of your traffic is made up of bots, try our free Bad Bot Analyser, which gives you a detailed breakdown of your web traffic.

