Not sure if this is the correct board for this...
I host a client's site on a shared server with a 50GB monthly bandwidth allowance. At the moment Googlebot is using the entire 50GB and more just crawling the site. Total bandwidth for my shared server this month is about 130GB, and most of that is robots crawling the various sites.
I have a robots.txt file that allows only the major search engines and keeps even them out of all of the site's directories.
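For reference, the rules are roughly along these lines (a simplified sketch; the directory names are just placeholders, not the actual paths):

User-agent: Googlebot
User-agent: Bingbot
# Keep even the allowed crawlers out of every directory (placeholder paths)
Disallow: /images/
Disallow: /scripts/
Disallow: /private/

# Block all other robots entirely
User-agent: *
Disallow: /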
I have also used Google's Webmaster Tools to set a slower crawl rate for Googlebot, but nothing seems to be working.
Does anyone have any suggestions on how to dramatically slow Googlebot's crawling?
Thanks!
Barry