What are bots & spiders, and why should we filter them?

Looking for a cleaner count of traffic and activity on your website? Google Analytics recently implemented a bot and spider filtering option, helping to improve the accuracy of your metrics and to remove non-human traffic from your analytics.

Bots and spiders are automated computer programs, not people, that hit your website. A bot may be a search engine or aggregator crawling your content to list it on their site, or it may even be a monitoring service you have hired to check that your server is up and that its loading speed is normal.


Including traffic from bots and spiders artificially skews your data, and until now, filtering this kind of traffic out was a manual, time-consuming and highly imprecise job. Google’s new filter utilises the IAB Spiders & Bots List to recognise known non-human traffic, excluding hits that would previously have been counted in your Analytics. Membership to view that list can cost up to $14,000 a year, but by ticking the box on your view you don’t get to see the list, you get to use it, for free.
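If you look after many views, the same setting can also be toggled programmatically rather than through the interface. Below is a minimal sketch, assuming you use the Google Analytics Management API (v3) with the google-api-python-client library; the account, property and view IDs are placeholders, and your authentication setup may differ.

```python
# Minimal sketch: enabling bot filtering on a view (profile) via the
# Google Analytics Management API v3. IDs below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/analytics.edit"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)

analytics = build("analytics", "v3", credentials=creds)

# Fetch the view, flip the bot-filtering flag, then save it back.
view = analytics.management().profiles().get(
    accountId="12345678",
    webPropertyId="UA-12345678-1",
    profileId="98765432").execute()

view["botFilteringEnabled"] = True

analytics.management().profiles().update(
    accountId="12345678",
    webPropertyId="UA-12345678-1",
    profileId="98765432",
    body=view).execute()
```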

In doing so, some of your traffic numbers may drop slightly, but importantly, the new count gives a more accurate picture of what’s happening rather than artificially inflated figures. Several metrics become more meaningful once the filter is enabled. For instance, the filter does not affect the number of conversions, but it typically decreases total site visits, so your conversion rate goes up. Likewise, your site’s bounce rate will likely decrease: each page view from a bot is usually its own single-page session, so bot and spider traffic bounces at close to 100%, and removing it pulls the average down.
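As a rough illustration of the arithmetic, here is a short sketch with entirely hypothetical figures showing why the conversion rate rises and the bounce rate falls once single-page bot sessions are excluded; the numbers are invented purely for the example.

```python
# Hypothetical figures purely for illustration: 10,000 total sessions,
# of which 1,500 are bot sessions that each view one page and "bounce".
total_sessions = 10_000
bot_sessions = 1_500          # every bot session is a single-page bounce
human_bounces = 3_400         # bounces from real visitors
conversions = 200             # bots never convert, so this is unchanged

# Before filtering: bots inflate both sessions and bounces.
rate_before = conversions / total_sessions
bounce_before = (human_bounces + bot_sessions) / total_sessions

# After filtering: the same conversions over fewer, human-only sessions.
human_sessions = total_sessions - bot_sessions
rate_after = conversions / human_sessions
bounce_after = human_bounces / human_sessions

print(f"Conversion rate: {rate_before:.1%} -> {rate_after:.1%}")   # 2.0% -> 2.4%
print(f"Bounce rate:     {bounce_before:.1%} -> {bounce_after:.1%}")  # 49.0% -> 40.0%
```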

Bot filtering is certainly something we recommend. To learn more about it, or for help configuring an optimal Google Analytics profile, please get in touch.