Google Webmaster Tools Reporting non-problems

Status: Not open for further replies.

mneylon

Administrator
Staff member
Not sure if anyone else has come across this ..

As part of a site move and general clean-up, I updated the robots.txt to block access to a subdirectory that should have been blocked all along and no longer exists on the site's new backend (I changed CMS).
Google Webmaster Tools, however, is now reporting "health issues", as the blocked URL(s) had been indexed ..
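For reference, the rule I added is just the usual directory block; something like this, with the directory name changed (the real path isn't important here):

    User-agent: *
    Disallow: /old-cms-dir/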

It's not an issue - I want them removed from the index, but I'd prefer not to have the annoying "health warning" every time I log in ..

Any ideas?
 

link8r

New Member
Yeah, it's happened to me a few times. I've had clients ring me in complete horror and panic: "Google says we have a health warning!" Take two paracetamol and call me in the morning!

I reckon it will go away if you tell Google to remove the content at those URLs - not sure if that's feasible, as it can be time-consuming - but you might be better off using a redirect for the whole subfolder than blocking it in robots.txt and getting the "yellow card", so to speak.
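If you're on Apache, the whole-subfolder redirect is a one-liner with mod_alias; a rough sketch, assuming the old folder was /old-cms-dir/ and there's a sensible new home for it (both paths made up for the example):

    # Permanently redirect everything under the old folder to the new section
    RedirectMatch 301 ^/old-cms-dir/(.*)$ /new-section/$1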

Once Google dumps the URLs from its cache, the warning stops, but URLs can stay cached for 12 months...
 

mneylon

Administrator
Staff member
The warnings vanished a couple of days after I posted this, but the stupid thing is that using robots.txt is supposedly a "proper" way of removing URLs from the index *sigh* (or at least I thought it was .. )
 

link8r

New Member
Um, ~ish. Robots.txt is the "proper" way to prevent files from getting indexed, and it's the prerequisite for getting them removed: you can't do a manual removal without a 404 or a robots.txt block, and a manual removal request is the only way to externally remove a URL from the index.
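If you'd rather drop the robots.txt block, the other way to qualify for a manual removal is to have the old URLs return a 404 (or 410). On Apache that can also be a one-liner with mod_alias - again, the folder name is just for the example:

    # Return "410 Gone" for the old folder and everything under it
    Redirect gone /old-cms-dir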

That's exactly the reason for the yellow flag: it's for noobs. You know that WordPress "feature", "Privacy"? It's probably responsible for getting 1,000 sites blocked from Google a week. Seriously, it's the number one cause of sites being dropped from Google - and not just to page 15 but entirely - for one simple reason: it blocks the root of the site in robots.txt, and thus the entire site. For a robot that understands 200 signals and probably makes 4,000 decisions, it only takes one single line to guarantee complete exclusion within 15 days. That's why the huge yellow warning sign!
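The single line in question is the classic site-wide block - if I remember right, this is exactly what the WordPress "Privacy" setting writes into its (virtual) robots.txt:

    User-agent: *
    Disallow: /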
 