
How to determine whether your web traffic is genuine


What are bots?

Traffic from genuine users is one of the key factors for any online business. While traffic plays an important role in revenue generation and helps the business expand, unmonitored traffic carries real risks. Almost half of all web traffic comes from non-human sources: automated programs known as bots scrape websites and generate additional page views. Good bots, such as search engine crawlers, uptime monitors, and ping bots, create traffic that is favourable. Bad bots, on the other hand, generate bogus traffic, and this can cause serious damage to your business.

Some of the negative impacts of bogus traffic are:

  • Damaged website reputation and search engine credibility: Search engines can stop indexing your website when it receives millions of fake views generated by malicious bots, which damages your SEO and the reputation you have built for your website.

  • Poor sales conversion: Bots generate bogus traffic, and this hurts your sales conversion when they spam your form pages with fake details, generating fake leads.

  • Termination from Google AdSense: Advertising networks treat fake views as a form of fraud and can end up penalizing your business. If this pattern repeats, advertising networks may blacklist your website.

  • Increased server and bandwidth costs: Server and bandwidth costs rise when bots hit your website with millions of unwanted requests within a short time frame.


Given these risks, it becomes crucial to know the various ways to assess the credibility of your traffic.

Uneven traffic:
When you see an abnormal increase in your page views, bots are the most probable cause. Bots crawl your website and scour its pages for information, which inflates your traffic with bogus page views. This bogus traffic can trick you into believing that the views come from genuine prospects. A simple first check is to look for sudden spikes in request counts, as in the sketch below.
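The following is a minimal sketch of that check: it counts requests per hour from access-log lines and flags hours that sit well above the average. The log format (an ISO timestamp as the first field) and the 3-sigma threshold are assumptions for illustration, not ShieldSquare's actual detection logic.

    from collections import Counter
    from datetime import datetime
    from statistics import mean, stdev

    def hourly_counts(access_log_lines):
        # Assumes each log line starts with an ISO-8601 timestamp (illustrative assumption).
        counts = Counter()
        for line in access_log_lines:
            ts = datetime.fromisoformat(line.split()[0])
            counts[ts.replace(minute=0, second=0, microsecond=0)] += 1
        return counts

    def flag_spikes(counts, sigmas=3.0):
        # Flag hours whose request count sits more than `sigmas` standard deviations above the mean.
        values = list(counts.values())
        if len(values) < 2:
            return []
        mu, sd = mean(values), stdev(values)
        return [hour for hour, count in counts.items() if sd and (count - mu) / sd > sigmas]

Running flag_spikes over a few days of logs quickly surfaces the hours in which bot bursts, rather than genuine visitors, drove the page views.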


Page duration:
Automated bots are programmed to perform repetitive tasks at high speed, crawling a huge number of web pages within a small time frame. If you see page durations that span only a few seconds, you may well be looking at bot activity. Some bots, however, traverse pages unusually slowly, remaining on a page far too long in an attempt to mimic a genuine user. Looking at the dwell time between consecutive page views, as sketched below, exposes both patterns.
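A minimal sketch of that dwell-time check follows. The input shape (session ID and Unix timestamp pairs) and the 2-second and 30-minute bounds are illustrative assumptions.

    from collections import defaultdict
    from statistics import median

    def median_dwell_times(page_views):
        # page_views: iterable of (session_id, unix_timestamp) pairs, in any order.
        by_session = defaultdict(list)
        for session_id, ts in page_views:
            by_session[session_id].append(ts)
        medians = {}
        for session_id, stamps in by_session.items():
            stamps.sort()
            gaps = [later - earlier for earlier, later in zip(stamps, stamps[1:])]
            if gaps:
                medians[session_id] = median(gaps)
        return medians

    def suspicious_sessions(page_views, too_fast=2, too_slow=1800):
        # Flag sessions that flip through pages in seconds or linger implausibly long.
        return [sid for sid, m in median_dwell_times(page_views).items()
                if m < too_fast or m > too_slow]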


Bounce rates:
An increase in bounce rate can be caused by bots traversing a large number of pages at very high speed. Because bots do not stay on any single page for a meaningful amount of time, they drive the bounce rate up. This bot-driven increase can trick you into believing that the page or its content is not working, while the real reason goes unnoticed. Breaking the bounce rate down by user agent or traffic source, as in the sketch below, helps separate the two.
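A minimal sketch of that breakdown follows. The input shape (user agent plus pages viewed per session), the 0.9 bounce-rate threshold, and the 50-session minimum are assumptions chosen for illustration.

    from collections import defaultdict

    def high_bounce_agents(sessions, threshold=0.9, min_sessions=50):
        # sessions: iterable of (user_agent, pages_viewed_in_session) pairs.
        # Returns user agents whose bounce rate exceeds `threshold` on a meaningful sample.
        totals = defaultdict(lambda: [0, 0])   # user_agent -> [session_count, bounce_count]
        for agent, pages in sessions:
            totals[agent][0] += 1
            if pages <= 1:
                totals[agent][1] += 1
        return {agent: bounces / count
                for agent, (count, bounces) in totals.items()
                if count >= min_sessions and bounces / count > threshold}

A user agent with hundreds of sessions and a near-total bounce rate is far more likely to be a bot than a sudden loss of interest from genuine visitors.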


These are some of the ways in which you can assess the credibility of your web traffic. ShieldSquare provides a more comprehensive approach to identifying and blocking malicious bot traffic.


