ShieldSquare named a Leader in The Forrester New Wave™: Bot Management Report, Q3 2018. Click to know more.

How Cleaning Analytics Dashboards Can Course-Correct Marketing Tactics

November 19, 2018 | ShieldSquare Research

[Image: A dedicated bot mitigation solution can restore accuracy to analytics data]

Website analytics data are crucial for marketers and product managers, giving them insights into visits, conversions, product metrics and campaigns to help fine-tune product and marketing strategy. Unchecked bot traffic greatly skews these metrics and hinders conversion funnel analysis and accurate KPI tracking. Invalid traffic inflates metrics, campaign data, and traffic analytics, giving marketers data that are considerably off the mark, which negatively impacts decision-making.


We recently examined how bot traffic skewed the traffic analytics on a website that caters to professional service providers. The portal provides information on a wide range of consumer services, and also functions as a classified ad site for service providers based on their location and specialization.


The portal earns revenue by providing leads, advertising, turnkey websites and search engine marketing services to professional service providers, to help professionals reach potential clients.

Why are clean analytics important?

The success of the website’s lead generation efforts was gauged by monitoring traffic and inflows from traffic acquisition campaigns. It was crucial to have accurate information on visitors to the portal as well as to the turnkey websites it hosted, and to determine whether searches and forum queries by those seeking information were bringing in good leads for professionals registered on the site. Though its business model relied on high-quality leads being generated for professionals, unchecked bot traffic was severely skewing its visitor metrics and preventing marketing and product teams from gaining insights into genuine users’ journeys.


Steadily escalating bot traffic caused a range of problems: noise in metrics, form spam, comment spam, and fake leads delivered to their clients. It was a frustrating challenge for product teams to determine which pages were effective at boosting engagement, and which traffic-sourcing channels were bringing in good leads for professionals listed on the site. Moreover, the high volume of fake signups made by bots was a major irritant to the professionals and firms paying to be listed on the portal, especially premium customers who had invested in turnkey websites and web hosting.

[Figure: Conversion funnel — bot traffic percentage by page category]

The graph above illustrates the bot traffic percentages across page categories on the website. While roughly one out of seven visitors on the home page was a bot, six out of ten visitors on the search results page were bots that were scraping information. Also troubling was the fact that nearly one out of five contact form submissions was made by bots, which meant that a large number of inquiries were nothing but spam. This not only frustrated professionals listed on the site but also considerably skewed visitor analytics and made it hard for the marketing team to chart and analyze how genuine users navigated through the site.
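Using the bot shares cited above purely as illustrative assumptions (the raw visit counts below are hypothetical, not the client's actual numbers), a quick sketch of how a bot share of this size inflates headline metrics:

```python
# Illustrative only: hypothetical raw counts combined with the bot shares
# cited in the article (~1/7 on the home page, ~6/10 on search results,
# ~1/5 of contact form submissions).
bot_share = {
    "home": 1 / 7,
    "search_results": 6 / 10,
    "contact_form": 1 / 5,
}

raw_counts = {  # hypothetical totals as reported by an analytics dashboard
    "home": 70_000,
    "search_results": 50_000,
    "contact_form": 5_000,
}

# Genuine (human) traffic after discounting the estimated bot share.
human_counts = {
    page: round(raw_counts[page] * (1 - share))
    for page, share in bot_share.items()
}

for page, raw in raw_counts.items():
    human = human_counts[page]
    inflation = (raw - human) / human * 100
    print(f"{page}: raw={raw}, human={human}, inflated by {inflation:.0f}%")
```

At a 60% bot share, the search results page's raw visit count overstates genuine traffic by 150% — the dashboard reports two and a half times the real audience.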


While some of the symptoms of bot traffic (such as unexpected and unexplainable variations in traffic metrics) were obvious to the client's team, it was the skewing of conversion path metrics that proved to be the biggest issue with unchecked bot traffic. Though overall traffic was steadily increasing, conversions and sign-ups were not. It was crucial for the business to clean up its website analytics to give its marketing and product teams a clear picture of traffic breakdown, sources, and conversion paths.


The website’s marketing team regularly conducted A/B and multivariate tests to optimize the user experience and boost conversions, using an in-house A/B testing tool to perform segmentation and traffic analysis and to run experiments. The margin of error in the results obtained during these experiments made it essential to implement a solution that would let them collect clean, trustworthy, and actionable metrics to fine-tune their marketing and operational strategies.
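One concrete way bot traffic corrupts such experiments: bots that never convert bias the measured rate in both arms downward and shrink the absolute difference the test must detect, so experiments need far more genuine traffic (or time) to reach significance. A hypothetical arithmetic sketch, not the client's actual test data:

```python
# Hypothetical sketch: non-converting bots bias both arms of an A/B test
# downward and halve the absolute gap between variants when bots make up
# half of the measured visits.
def measured_rate(human_visits, bot_visits, true_rate):
    """Conversion rate as an analytics tool would report it."""
    conversions = human_visits * true_rate
    return conversions / (human_visits + bot_visits)

humans = 5_000
for bots in (0, 5_000):
    a = measured_rate(humans, bots, true_rate=0.030)
    b = measured_rate(humans, bots, true_rate=0.036)
    print(f"bots per arm={bots}: A={a:.4f}, B={b:.4f}, gap={b - a:.4f}")
```

And when bots are not split evenly between variants (scrapers tend to hammer specific pages rather than respect random assignment), the comparison itself is biased, not just noisier.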


The client chose to integrate ShieldSquare through our JavaScript code snippet added to their web page headers, one of the most popular of the integration options we provide. The code snippet can also be added through popular tag management systems such as Google Tag Manager or Adobe Dynamic Tag Manager. Implementing our solution helps marketers obtain clean and accurate website analytics to optimize their campaigns, while simultaneously eliminating a range of business problems caused by bots, such as form spam, content scraping, infrastructure abuse, and poor user experience.

