Website analytics data are crucial for marketers and product managers, offering insights into visits, conversions, product metrics, and campaigns that help fine-tune product and marketing strategy. Unchecked bot traffic greatly skews these metrics and hinders conversion funnel analysis and accurate KPI tracking. Invalid traffic inflates visitor counts, campaign data, and traffic analytics, leaving marketers with data that are considerably off the mark and undermining decision-making.
We recently examined how bot traffic skewed the traffic analytics on a website that caters to professional service providers. The portal provides information on a wide range of consumer services, and also functions as a classified ad site for service providers based on their location and specialization.
The portal earns revenue by providing leads, advertising, turnkey websites, and search engine marketing services that help professional service providers reach potential clients.
Why is clean analytics important?
The success of the website’s lead generation efforts was gauged by monitoring traffic and inflows from traffic acquisition campaigns. It was crucial to have accurate information on visitors to the portal as well as to the turnkey websites it hosted, and to determine whether searches and forum queries by those seeking information were bringing in good leads for professionals registered on the site. Though its business model relied on generating high-quality leads for professionals, unchecked bot traffic was severely skewing visitor metrics and preventing the marketing and product teams from gaining insight into genuine users’ journeys.
Steadily escalating bot traffic caused a range of problems — noise in metrics, form spam, comment spam, and fake leads sent to its clients. It was a frustrating challenge for product teams to determine which pages were effective at boosting engagement, and which traffic sourcing channels were bringing in good leads for professionals listed on the site. Moreover, the high volume of fake signups made by bots was a major irritant to the professionals and firms paying to be listed on the portal, especially premium customers who had invested in turnkey websites and web hosting.
The graph above illustrates bot traffic percentages by page category on the website. While roughly one out of seven visitors on the home page was a bot, six out of ten visitors on the search results page were bots scraping information. Also troubling was the fact that nearly one out of five contact form submissions was made by bots, which meant that a large number of inquiries were nothing but spam. This not only frustrated professionals listed on the site but also considerably skewed visitor analytics, making it hard for the marketing team to chart and analyze how genuine users navigated the site.
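To see how shares like these distort a headline metric, consider a minimal sketch of cleaning a conversion rate. The visit totals below are purely illustrative; only the bot shares (~14% of home page visits, ~20% of form submissions) come from the figures quoted above.

```python
# Sketch: how bot traffic distorts a measured conversion rate.
# Visit and submission totals are illustrative; the bot shares
# (~14% of visits, ~20% of form submissions) follow the article.

def clean_conversion_rate(visits, submissions, visit_bot_share, submission_bot_share):
    """Conversion rate after removing estimated bot visits and bot submissions."""
    human_visits = visits * (1 - visit_bot_share)
    human_submissions = submissions * (1 - submission_bot_share)
    return human_submissions / human_visits

visits, submissions = 70_000, 2_800
raw_rate = submissions / visits                                  # rate as reported
clean_rate = clean_conversion_rate(visits, submissions, 0.14, 0.20)

print(f"reported: {raw_rate:.2%}, cleaned: {clean_rate:.2%}")
# → reported: 4.00%, cleaned: 3.72%
```

Because bots here make up a larger share of submissions than of visits, the cleaned rate comes out lower than the reported one; with different shares the distortion can just as easily run the other way, which is exactly why the raw numbers cannot be trusted.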
While some of the symptoms of bot traffic (such as unexpected and unexplainable variations in traffic metrics) were obvious to the team, it was the skewing of conversion path metrics that proved to be the biggest issue with unchecked bot traffic. Though overall traffic was steadily increasing, conversions and sign-ups were not. It was crucial for the business to clean up its website analytics to give its marketing and product teams a clear picture of traffic breakdown, sources, and conversion paths.
The website’s marketing team regularly conducted A/B and multivariate tests to optimize the user experience and boost conversions, using an in-house A/B testing tool for segmentation, traffic analysis, and running experiments. The margin of error bot traffic introduced into these results made it essential to implement a solution that would give the team clean, trustworthy, and actionable metrics for fine-tuning its marketing and operational strategies.
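The effect of bots on experiment results can be sketched with illustrative numbers. Assuming bots are split evenly across variants and essentially never convert (a simplifying assumption; the visit and conversion counts are hypothetical, while the 60% bot share mirrors the search results pages above), they dilute the observed lift between variants:

```python
# Sketch: bots diluting an A/B test's observed lift.
# Assumes bots are split evenly across variants and never convert.
# All counts are illustrative; the 60% bot share mirrors the article's
# figure for the search results pages.

def observed_rate(human_visits, human_conversions, bot_visits, bot_conversions=0):
    """Conversion rate as the analytics tool would report it, bots included."""
    return (human_conversions + bot_conversions) / (human_visits + bot_visits)

# True human behaviour: variant B lifts conversions from 4% to 5%.
a_true = observed_rate(10_000, 400, bot_visits=0)        # 4.0%
b_true = observed_rate(10_000, 500, bot_visits=0)        # 5.0%

# With 60% bot traffic in each arm (15,000 bots per 25,000 total visits):
a_seen = observed_rate(10_000, 400, bot_visits=15_000)   # 1.6%
b_seen = observed_rate(10_000, 500, bot_visits=15_000)   # 2.0%

print(f"true lift: {b_true - a_true:.2%}, observed lift: {b_seen - a_seen:.2%}")
# → true lift: 1.00%, observed lift: 0.40%
```

With the real 1-point lift compressed to 0.4 points, an experiment needs far more traffic (or far longer run times) to reach significance, and marginal-but-real wins can be dismissed as noise.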