The Difference Between Good Bots and Bad Bots

Bots are programs created to automate tasks that are often repetitive. Because those tasks can be useful or harmful, bots are generally described as either good bots or bad bots. Several studies have shown that bots generate over 50% of all internet traffic. However, the malicious ones get the most attention, as they must be detected and blocked before they cause any harm.


What Are Good Bots?

Good bots are beneficial to businesses as well as individuals. When you search for a website, or for phrases related to a website's products or services, you get relevant results on the search results page. This is made possible by search engine spiders, also known as crawler bots (such as Googlebot, Bingbot, and Baidu Spider, to name a few). Good bots are generally deployed by reputable companies, and for the most part they respect the rules that webmasters define in a website's robots.txt file to regulate crawling activity and indexing rate. Crawlers that are not useful or needed by the business can also be prevented from indexing a website. For example, the Baidu crawler can be blocked if a business does not operate in China and/or does not cater to the Chinese market.
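
To make this concrete, here is a minimal sketch of how a well-behaved crawler consults robots.txt before fetching a page, using Python's standard urllib.robotparser module. The rules and URLs below are illustrative assumptions (a hypothetical site that blocks Baidu's crawler, whose user agent is Baiduspider), not any real site's configuration.

    # A minimal sketch, assuming a hypothetical site that blocks Baidu's crawler.
    # Uses Python's standard-library robots.txt parser (urllib.robotparser).
    from urllib import robotparser

    # Example robots.txt: disallow Baiduspider everywhere, allow other crawlers.
    ROBOTS_TXT = """\
    User-agent: Baiduspider
    Disallow: /

    User-agent: *
    Disallow: /private/
    """

    parser = robotparser.RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    # A well-behaved crawler checks these rules before fetching a page.
    print(parser.can_fetch("Baiduspider", "https://example.com/products"))  # False
    print(parser.can_fetch("Googlebot", "https://example.com/products"))    # True
    print(parser.can_fetch("Googlebot", "https://example.com/private/x"))   # False

The important caveat is that robots.txt is purely advisory: good bots choose to honor it, while bad bots simply ignore it, which is why dedicated detection is still needed.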

Apart from search engine crawlers, good bots also include partner bots (e.g. Slackbot), social network bots (e.g. Facebook Bot), website monitoring bots (such as Pingdom), backlink checker bots (e.g. SEMRushBot), aggregator bots (like Feedly), and more. Even good bots can cause problems at times, such as when crawler traffic spikes toward the limits of server capacity, or when their volumes skew analytics.

Our research report 'Inside Good Bots' details the most common types of good bots, the functions they carry out, and recommendations on how to optimally manage them according to your specific objectives.

What Are Bad Bots?

Bad bots are programmed to perform a variety of malicious jobs. They operate evasively and are mostly used by fraudsters, cybercriminals, and other nefarious parties engaged in illegal activities. They can be sent by third-party scrapers or your competitors to steal content from your website, such as product reviews, breaking news, product pricing information and catalogs, user-generated content on community forums, and so on. When they make thousands of page visits within a very short span of time, they strain web servers and choke the available bandwidth, slowing down the site for genuine users.
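
To illustrate the "thousands of page visits in a short span" point, below is a sketch of the kind of naive, rate-based check an in-house setup might use: it flags a client that exceeds an assumed request budget within a sliding time window. The threshold, window length, and per-IP keying are all illustrative assumptions, not a recommended configuration.

    # A minimal sketch of naive rate-based bot flagging (illustrative thresholds).
    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 10   # assumed sliding-window length
    MAX_REQUESTS = 50     # assumed per-client request budget within the window

    _history = defaultdict(deque)  # client key (e.g. IP address) -> request timestamps

    def looks_like_bot(client_ip: str, now: float | None = None) -> bool:
        """Record one request and return True if the client exceeds the rate limit."""
        now = time.monotonic() if now is None else now
        hits = _history[client_ip]
        hits.append(now)
        # Drop timestamps that have fallen out of the sliding window.
        while hits and now - hits[0] > WINDOW_SECONDS:
            hits.popleft()
        return len(hits) > MAX_REQUESTS

A bot hammering a site from one address trips this check quickly, but a scraper that spreads the same volume across hundreds of IPs, or throttles itself just under the limit, never does, which previews why such rules alone fall short against sophisticated bots.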

Bots can scrape your content and publish it elsewhere, which can hurt your website's search engine rankings; there have been instances of stolen content outranking the original on Google search pages, directly impacting the bottom line of websites that have invested large sums in creating original content. Bad bots can also spam community forums with intrusive ads or messages, create millions of fake leads on real-estate and classifieds portals, carry out cart abandonment on e-commerce portals, distort marketing analytics, and take over accounts to steal stored value in the form of store credits, loyalty points, prepaid wallets, and so on.

Bad bots have evolved to become highly sophisticated and human-like in their behavior. They can leverage vast computing resources at cloud data centers to carry out their malicious activities, largely evading conventional and in-house bot detection methods. They can even be used to influence political elections.

How to Stop Bad Bots

First, we recommend using our free Bad Bot Scanner to find out the extent of the bot traffic on your website, apps, and APIs. You could also try our free Bad Bot Analyser, which shows you the types of bot traffic your site receives. If you're thinking of developing an in-house bot manager, you should know that such tools have several downsides and are ineffective against sophisticated bots that mimic human behavior. In addition, the rapid evolution of bots and their attack techniques makes it virtually impossible to reliably detect advanced bots in real time, before they do their damage. A dedicated bot management solution provides enterprises with specialized bot identification and detection capabilities that are regularly updated with the latest bot signatures. Moreover, solutions such as ShieldSquare leverage machine learning and advanced heuristics to detect sophisticated non-human traffic even in the absence of a recognized bot signature, ensuring protection from future automated threats.
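
As a toy illustration of the gap between the two approaches described above, the sketch below contrasts a signature check (matching the User-Agent header against a known-bad list) with one simple behavioral feature (unnaturally regular spacing between requests). The signature list, feature, and threshold are assumptions chosen for clarity; production systems combine many such signals with machine-learned models rather than any single rule.

    # Toy contrast between signature-based and behavior-based checks (assumptions only).
    from statistics import pstdev

    KNOWN_BAD_SIGNATURES = ("python-requests", "scrapy", "curl")  # illustrative list

    def matches_signature(user_agent: str) -> bool:
        """Signature check: only catches bots that announce themselves."""
        ua = user_agent.lower()
        return any(sig in ua for sig in KNOWN_BAD_SIGNATURES)

    def looks_scripted(request_times: list[float]) -> bool:
        """Behavioral heuristic: near-constant request spacing suggests automation.

        Humans browse in bursts and pauses; a simple loop fires like a metronome.
        """
        if len(request_times) < 5:
            return False  # not enough data to judge
        gaps = [b - a for a, b in zip(request_times, request_times[1:])]
        return pstdev(gaps) < 0.05  # assumed threshold: gaps vary by under 50 ms

    # A bot spoofing a browser User-Agent passes the signature check...
    print(matches_signature("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))  # False
    # ...but its clockwork timing still betrays it.
    print(looks_scripted([0.0, 1.0, 2.0, 3.0, 4.0, 5.0]))  # True

The design point is that signatures only catch bots already seen in the wild, while behavioral signals can flag automation that has never been cataloged, which is why the two are used together.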

In its ‘New Wave™: Bot Management, Q3 2018’ report, Forrester Research recommends that “Bot management tools must determine the intent of automated traffic in real time to distinguish between good bots and bad bots.” Before choosing a bot management solution, learn about the key evaluation criteria for selecting a solution that best meets your needs. Also, our Resources section contains useful research reports that detail the ways in which bad bots attack websites in various industries, and how they can be managed.


