Yup, I've been eavesdropping on her Twitter stream again
Donna Fontenot @DonnaFontenot
There's nothing more disheartening to an SEO than to see another "pro" highly-paid SEO ignore SEO 101...robots.txt blocking search engines.
Now, I don't have any idea of the context of this tweet. Given that it is Donna, I presume someone made an accidental, unintended, disastrous booboo. Regardless, it touches on a topic not much mentioned: the control of SE bots. If you are one of the multitude who are wide open to every visitor, bot and human, that comes a-calling, the following is likely an unfathomable mystery, as much the why bother as the how... and if that sounds like too much effort, feel free to bail at this point, I don't mind at all. I've always preferred quality over quantity.
I block SEs each and every day from all my sites. Actually, only a very few are allowed in, and of those only their main indexing bots; no media bots, no fetcher bots, no no no. And then there are large sections of my sites, roughly 30% to 40% of pages, that are simply off limits even to those allowed indexing bots. Yes, they still try, and at least occasionally get in by various means, but so long as they then honour the meta noindex the leak is moot; if not, they hear from my law-type-person.
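For the curious, here is a minimal sketch of the sort of selective robots.txt I mean. The paths and the choice of Googlebot are illustration only, not my actual setup; the point is that a crawler obeys the most specific User-agent group that matches it, so the named bot follows its own rules and ignores the catch-all:

# main indexing bot allowed in, but not everywhere
User-agent: Googlebot
Disallow: /private/
Disallow: /drafts/

# media bot: no thanks
User-agent: Googlebot-Image
Disallow: /

# everyone else: stay out entirely
User-agent: *
Disallow: /

And for the pages that leak through anyway, the belt to that suspender is a meta robots tag on the page itself: <meta name="robots" content="noindex">.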
I'm among the most extreme of the selective SE bot access types that I know. That is not what is important; what is important is that there may well be good business reasons for blocking, in whole or in part, some or all SE bots. If that is the case then good on ya; if, however, you inadvertently made the following robots.txt:
User-agent: *
Disallow: /
Please accept my heartfelt sympathy in your bereavement; where should I send the flowers?
Yes, it is bad of me to be rolling on the floor laughing hysterically, but the relief of rather-you-than-me was simply too much...
Is it too too much to hope that you are a competitor? Oh, please, with whipped cream and a cherry on top, oh please...
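And for the record, purely as a sketch: that two-liner tells every crawler to stay out of everything, which is how a site falls out of the index overnight. If the intent was only to bar one troublesome bot, the non-fatal version looks more like this (BadBot being a stand-in name, not a real crawler):

# the one bot you actually want gone
User-agent: BadBot
Disallow: /

# everyone else: nothing disallowed, crawl away
User-agent: *
Disallow:

The empty Disallow is the whole trick: it disallows nothing, whereas that lone slash above disallows everything. One character, one bereavement.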