Visitor statistics collect information about at least two sorts of visitor:
People, whose browsers usually request the file favicon.ico;
Robots, which request robots.txt or sitemap.xml, at least if they are well-mannered robots.
Both sorts are welcome, because I depend on the search engines' robots to let people find the site. Ultimately, though, I really want to know which pages get seen by people.
I am interested in statistics for both sorts of visit, but I would very much like to separate them. At the moment, the only way I know to discover which content pages are seen by each sort of visitor is to look through the daily detailed logs.
Does anyone know of a way to configure visitor statistics to accumulate numbers for people and robots separately? Something like a utility to process the existing, behind-the-scenes logs would be ideal, because I am wary of scaring visitors away with anything that is visibly collecting data.
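In case it helps anyone with the same question, here is a minimal sketch of the kind of log-processing utility described above. It assumes the server writes logs in the common Apache/Nginx "combined" format, and classifies each request as a robot or a person by looking for a few illustrative keywords in the user-agent string; the keyword list, the `tally` function name, and the log format are all assumptions, not anything confirmed by the hosting provider.

```python
import re
from collections import Counter

# Matches one line of Apache/Nginx "combined" log format (an assumption;
# check your own server's log format before relying on this).
LOG_LINE = re.compile(
    r'^\S+ \S+ \S+ \[[^\]]+\] '
    r'"(?:GET|POST|HEAD) (?P<path>\S+)[^"]*" '
    r'\d+ \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

# Illustrative keywords only; real robot detection usually needs a
# maintained list of known crawler user agents.
BOT_HINTS = ("bot", "crawler", "spider", "slurp")

def classify(agent: str) -> str:
    """Crudely label a request as 'robot' or 'person' from its user agent."""
    return "robot" if any(h in agent.lower() for h in BOT_HINTS) else "person"

def tally(lines):
    """Count hits per page, separately for people and robots."""
    counts = {"person": Counter(), "robot": Counter()}
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue  # skip lines that don't match the expected format
        counts[classify(m.group("agent"))][m.group("path")] += 1
    return counts
```

Run over an existing log file (e.g. `tally(open("access.log"))`), this yields two page-hit tallies, one per sort of visitor, without adding anything visible to the site itself.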
Many thanks for any suggestions.
Nigel.
Hi
Can you please supply the domain name so we can look into this for you?
Kind Regards
Fiona
BT Business Forum Moderator