Final word on cloaking?
Posted 19 April 2006 - 08:21 AM
[...] but if you’re deliberately showing different links to Googlebot than to users, that’s cloaking and very high-risk behavior, because we want to score the same page that the user would see. My best advice is to figure out links that are useful both to users and for crawling purposes, and show that set of links.
The closest I ever got from a Google employee (via email):
Isn't that more or less the same? Show the Googlebot what you're showing users "in a different way"
I wouldn't use any type of cloaking, even if it's small scale and only intended to show search engines what you're showing users in a different way.
Or do you read something else into Matt's texts (or between the lines)?
The most interesting part of it all seems to be that Google still has to do the comparison manually; they don't appear to have an automated system that checks pages for cloaking (otherwise they would need an "evil twin" of the Googlebot that acts like a user from who-knows-where). To me, that just opens the door to misuse: what's the chance that your site will get manually caught for cloaking if you're not high-profile? Plus, with things like "shadow domains" that use cloaking and point to other sites (which don't use cloaking themselves, and, based on the old mantra that "what links to you can't harm you", might be exempted from bans / penalties), it's hard to really make people fear the use of cloaking...
Posted 19 April 2006 - 08:41 AM
Here’s the short answer from Google’s perspective:
IP delivery: delivering results to users based on IP address.
Cloaking: showing different pages to users than to search engines.
IP delivery includes things like “users from Britain get sent to the co.uk, users from France get sent to the .fr”. This is fine–even Google does this.
It’s when you do something *special* or out-of-the-ordinary for Googlebot that you start to get in trouble, because that’s cloaking. In the example above, cloaking would be “if a user is from Googlelandia, they get sent to our Google-only optimized text pages.”
So IP delivery is fine, but don’t do anything special for Googlebot. Just treat it like a typical user visiting the site.
At the very least, it explicitly states that IP-based delivery of content is OK.
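To make the distinction above concrete, here is a minimal sketch of "IP delivery done right" in Python. The IP-to-country table, domain names, and lookup function are all made up for illustration (a real site would use a geolocation database); the point is simply that the routing decision depends only on the visitor's IP, never on whether the user-agent is Googlebot.

```python
# Hypothetical regional mirrors, as in Matt's .co.uk / .fr example.
COUNTRY_SITES = {"GB": "example.co.uk", "FR": "example.fr"}
DEFAULT_SITE = "example.com"


def lookup_country(ip):
    """Hypothetical geo lookup; stands in for a real geolocation database."""
    fake_geo_db = {"81.2.69.142": "GB", "92.154.0.1": "FR"}
    return fake_geo_db.get(ip, "US")


def choose_site(ip, user_agent):
    """IP delivery: pick the regional mirror from the visitor's IP alone.

    Crucially, user_agent is ignored. Googlebot gets exactly what any
    other visitor from the same IP would get. Adding a branch like
    `if "Googlebot" in user_agent: ...` here is what would turn this
    legitimate IP delivery into cloaking.
    """
    return COUNTRY_SITES.get(lookup_country(ip), DEFAULT_SITE)


# A crawler and a browser coming from the same IP see the same site:
assert choose_site("81.2.69.142", "Googlebot/2.1") == \
       choose_site("81.2.69.142", "Mozilla/5.0") == "example.co.uk"
```

The test at the bottom captures the rule in one line: swap the user-agent, and nothing about the response may change.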
Posted 19 April 2006 - 10:31 AM
The IP delivery you mention is perhaps better described as mirroring. The .co.uk version of Google is a mirror of the main .com version.
Posted 19 April 2006 - 10:36 AM
Once again, it's the intent, not the technology that makes it against the rules.
Posted 19 April 2006 - 01:49 PM
What do you serve Googlebot (which crawls from a US-based IP)?
Do you look at where most of your visitors are and serve Googlebot the corresponding content? Or do you serve Googlebot a mixture? Do you strip out the table layout, menus, news feeds, footers, and styles to serve it the pure content? All of this would fall under the same heading... but so would lots of black-hat cloaking (e.g. serving Googlebot lots of content while the visitor only gets a pile of AdSense on top of that content).
It's hard to draw a line (unless it's really bad). Remember the French newspaper that Nadir found, which had some "cloaked", redirecting pages? Is that OK with Matt? It's just the "same" content served up slightly "differently" for search engines.