Until now, Google Analytics was not able to filter out bot visits from your data.
But, what is a bot?
A bot (short for “robot”) is an automated program that runs over the Internet.
There are many different types of bots, but some common examples include web crawlers, chat room bots, and malicious bots. Web crawlers are used by search engines to scan websites on a regular basis.
These bots “crawl” websites by following the links on each page. The crawler saves the contents of each page in the search index. By using complex algorithms, search engines can display the most relevant pages discovered by web crawlers for specific search queries.
(Source: TechTerms.com) In your reports, it will look something like this: (Audience > Technology > Network)
Why should I exclude this data?
Traffic from bots and spiders is essentially fake traffic: no real customers, no real data, no insights. If you don’t exclude it, it will skew your metrics, making it hard to know whether you are basing decisions on real data or fake. Bot traffic will:
- Increase your bounce rate
- Decrease your pages/visit ratio, average session duration, and other engagement metrics
How do I exclude them from my traffic?
Google Analytics has introduced a new checkbox option in the Admin panel that lets you do exactly that.
What bots will be excluded? Google will filter out only the known bots and spiders listed in the IAB International Spiders and Bots List.
Let’s set this up!
Sign in to Google Analytics and click on “Admin”:
Select the view you want to apply this setting to and click on “View Settings”:
Scroll down until you find the “Bot Filtering” setting, check the “Exclude all hits from known bots and spiders” box, and click “Save”:
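If you manage many views, the same checkbox can also be toggled programmatically: the Google Analytics Management API (v3) exposes it as the `botFilteringEnabled` field on the profile (view) resource. The sketch below is a hypothetical illustration, not part of the original walkthrough; the account, property, and view IDs are placeholders, and the commented client call assumes the `google-api-python-client` library.

```python
# Hedged sketch: toggling Bot Filtering via the Management API v3.
# The view (profile) resource has a boolean field `botFilteringEnabled`,
# which mirrors the checkbox shown in View Settings.

def bot_filtering_patch(enabled=True):
    """Build the PATCH request body that flips the Bot Filtering checkbox."""
    return {"botFilteringEnabled": enabled}

# With an authorized service object (built with google-api-python-client
# and OAuth credentials), the call would look roughly like this --
# the IDs here are placeholders, not real values:
#
# analytics.management().profiles().patch(
#     accountId="12345",
#     webPropertyId="UA-12345-1",
#     profileId="67890",
#     body=bot_filtering_patch(True),
# ).execute()

print(bot_filtering_patch(True))
```

This only changes the view configuration going forward; like the checkbox itself, it does not retroactively clean historical data.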