
Cre8asiteforums Internet Marketing
and Conversion Web Design


The state of Cloaking today.

11 replies to this topic

#1 Black_Knight


    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 9417 posts

Posted 28 October 2003 - 11:17 AM

We don't often deal with this topic, though we have discussed it a few times. We have a pretty open attitude to techniques and technologies here, with no technique itself ever being 'good' or 'bad' per se, just more optimal for a given situation or less so.

However, cloaking is always a contentious issue, and one surrounded by considerable misunderstanding. What is, and what is not, cloaking tends to be debated at the same time.

Whatever is and is not cloaking, this post should still apply as a point of interest to all, because what I'm looking at is the way that search engines can detect end-user content above and beyond what their spiders are given.

I was inspired to make this 'update' on the state of cloaking after recent discussions in a thread at the IHY/Best Practices forum (note: a huge thread, only the latter few pages of discussion are applicable).

Those engaged in cloaking need to be aware of 'decloaking' hazards, of what the engines can and cannot see. Of all decloaking hazards, the most effective, and quite probably the one that will finally see an end to deceptive cloaking, is the Google toolbar.

The Google Toolbar is plugged right into the browser of the end user. It 'sees' the human page, not one built for the spider. As I pointed out in the discussion I referenced, it is pure child's play for the toolbar to compare the size of the page the viewer is looking at to the size of the page Google spidered at the same URL. Not only would this help detect cloaking, but more usefully to Google in general, it could help detect when pages had been updated and needed respidering.
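The size-comparison check described above could be sketched roughly as follows. This is purely illustrative: the tolerance value, the `spider_index` lookup, and the function names are my own assumptions, not anything Google has published.

```python
# Hypothetical sketch of the toolbar decloaking check: compare the
# byte size of the page a toolbar user actually sees against the
# size recorded when the spider last fetched the same URL.

TOLERANCE = 0.10  # allow 10% drift for ads, dates, hit counters, etc.

def looks_cloaked(url, toolbar_size, spider_index):
    """Return True if the user-visible page size differs sharply
    from the size recorded at spidering time."""
    spidered_size = spider_index.get(url)
    if spidered_size is None:
        return False  # never spidered, nothing to compare against
    drift = abs(toolbar_size - spidered_size) / spidered_size
    return drift > TOLERANCE

# Example: the spider saw ~40 KB, the toolbar reports ~6 KB -> suspicious.
index = {"http://example.com/page": 40_000}
print(looks_cloaked("http://example.com/page", 6_000, index))   # True
print(looks_cloaked("http://example.com/page", 41_000, index))  # False
```

A small drift check like this also covers the second use Ammon mentions: a modest size change at the same URL is a cheap signal that the page has been updated and is worth respidering.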

Of course, the toolbar could easily do a lot more, but I was looking at systems that would involve minimal extra bandwidth in communication between the toolbar and data centers.

There is absolutely no way, at this time, to serve the toolbar a different version of the page from the one seen in the browser the toolbar is installed in.

There is no longer any such thing as undetectable cloaking where Google is concerned. It is all detectable. The only question is whether, or more likely, when, they will start to use the data they already have available.

For anyone who thinks blocking the cache will have any effect, I would of course point out that Google is a full-text indexing engine. What Google indexes is your full page. The cache is what Google has indexed. The NOCACHE stuff just lets you ask Google not to make that cached copy public. The only way Google will not have a copy of your page is if you forbid Google to index that page.

Google isn't the only problem of course.

Distributed spidering systems are being examined by several engines, as you probably know. Looksmart recently bought up such a system, and if it fails as a directory listings provider, could always set itself up as a spam detector, by using those distributed spidering agents. :lol:

The Wayback Machine (part of Alexa) may or may not use the Alexa toolbar to grab copies of pages. If it does, then this is exactly the same decloaking hazard as the Google Toolbar (or the AllTheWeb toolbar, the Teoma toolbar, etc, etc).

Cloaking as a means to prevent search engines from seeing something is no longer safely effective. There are far more unbeatable decloaking hazards, far more ways that the engines can (and do) receive the exact data sent to the ordinary user's browser, and this number is still growing.

I'm not one of those who go around announcing 'the death' of techniques and technologies. However, cloaking is no longer effective as a means to fully control what engines see. If you really need to hide something from a search engine (whatever your reasons), the only truly safe way is to make it completely inaccessible to browsers of all kinds.

#2 BillSlawski


    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 15667 posts

Posted 28 October 2003 - 09:44 PM

Excellent post, Ammon.

#3 Guest_Lots0_*

  • Guests

Posted 29 October 2003 - 01:09 AM

Ammon you make some good points but I think you also miss the boat on a few.

The days of deceptive cloaking are fast coming to an end. The end of deceptive cloaking was not brought about by the SEs; deceptive cloaking went the way of the dinosaurs because it did not work: it did not improve sales.

People were not happy to have their computer screen hijacked by any site. Deceptive cloaking is a good way NOT to make sales. People who clicked on “Thomas Jefferson Quotes” and found themselves at a porn site with a broken back button just don't want to buy adult toys or videos.

In order for a search engine to ban a site for cloaking, it should try to figure out the intent of the cloaking. Was this person's IP-cloaked site cloaked to help the user, or was it cloaked to deceive the user?

Currently, IP cloaking is used far more often to help the user have a better online experience than it is used to deceive the user.

Most of the big sites use IP cloaking (Google uses IP cloaking). Is Google (or any SE) going to ban those sites that use IP cloaking to enhance the user experience? I think the answer is out there, and it's no, they are not. So I do think it is safe to say that IP cloaking is going to be with us (like it or not) for quite some time.

<added> Bill, you better stay out of the moon light, it seems to have an effect on you. :wink: </added>

#4 Guest_scottiecl_*

  • Guests

Posted 29 October 2003 - 02:05 PM

I think Google probably doesn't care much about your intent... if you are showing different pages to users than their spider is being shown, they are going to pull the plug.

They've clearly stated that they don't like that practice in their Webmaster Guidelines.

I still haven't seen a valid use of cloaking to improve user experience. If your site is all-Flash or all-graphics, for the same amount of work as cloaking, you can simply build a text-rich HTML version of the site that users can see (if they wish) as well as SE Spiders. Obviously, these become your landing pages for the SE's as well and users can then select the graphic- or flash-version of the site if they choose, or browse the HTML version you have so graciously provided for them.

It would be an interesting test to track how many users prefer which version...

#5 Guest_Lots0_*

  • Guests

Posted 29 October 2003 - 04:15 PM

I still haven't seen a valid use of cloaking to improve user experience.

I'll give you one good example of IP cloaking to improve user experience. When a person logs onto Google in France they get google.fr; when a different person logs onto Google in Italy they get google.it, and so on.
That is just one use of IP cloaking to improve user experience that comes to mind right off the bat: give the user content based on their location and language.
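The geo-based delivery Lots0 describes can be sketched in a few lines. The country-to-homepage table and the idea of a country code already looked up from the visitor's IP are illustrative assumptions; real sites would use a GeoIP database for that lookup.

```python
# Minimal sketch of geo-based IP delivery: map the visitor's country
# (assumed already resolved from their IP address) to a localized
# front page. The table below is illustrative, not exhaustive.

COUNTRY_HOMEPAGES = {
    "FR": "http://www.google.fr/",
    "IT": "http://www.google.it/",
    "DE": "http://www.google.de/",
}
DEFAULT_HOMEPAGE = "http://www.google.com/"

def homepage_for(country_code):
    """Pick the localized homepage for a two-letter country code,
    falling back to the default when no localized site exists."""
    return COUNTRY_HOMEPAGES.get(country_code, DEFAULT_HOMEPAGE)

print(homepage_for("FR"))  # http://www.google.fr/
print(homepage_for("BR"))  # http://www.google.com/
```

Note the content served to each visitor is still representative of the site; only the locale differs, which is why later posts in the thread treat this as IP delivery rather than cloaking.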

Of course, I understand that to some people IP cloaking will always be evil no matter how it is used. :wink:

#6 Guest_scottiecl_*

  • Guests

Posted 30 October 2003 - 12:41 AM


To me, that is IP delivery which has many valid uses. I agree it's not going away.

Cloaking (IMO) is showing the search engine something different than the user will see when they visit the same page. Cloaking is a word that means hiding, shrouding in darkness.

#7 Guest_Lots0_*

  • Guests

Posted 30 October 2003 - 02:20 AM

You're right, IP cloaking and IP delivery are both the exact same thing. The only difference would be intent... and for that argument you can re-read my first post in this thread.

And what was your argument anyway scottie?? That IP cloaking/IP Delivery is evil and only evil bad people use IP cloaking/IP delivery?? I think that argument is a little beneath you.

#8 Black_Knight


    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 9417 posts

Posted 30 October 2003 - 02:43 AM

No, I think her point was clearly divided into deceptive cloaking versus contextual delivery (IP, useragent or browser, used to serve appropriate, but representative content).

As a rough (very) rule of thumb, where IP detection is used to determine geography, then it is IP delivery. Most other cases would be cloaking.

When I first heard the term cloaking, from some of the guys who first coined the phrase, it was used specifically in the context of hiding search-engine-optimized pages from users, serving them only to search engines, and was used precisely as a more thorough method of doorway redirection than JavaScript allows.

People have, I admit, used (or rather misused) the word 'cloaking' many times over the years since then. I don't think that makes them right, any more than people who refer to JavaScript as Java are right.

Cloaking is a specific use of IP or User-agent based delivery.
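The mechanism underlying both terms in this exchange can be sketched as a simple user-agent switch. The spider names were real crawler signatures of the era; the page-picking logic and filenames are hypothetical, shown only to make the distinction concrete: the code is identical either way, and only how different the two pages are decides whether it counts as cloaking.

```python
# Illustrative sketch of user-agent based delivery, the mechanism
# that cloaking is a specific (deceptive) use of: known crawler
# signatures get one page, everyone else gets another.

SPIDER_SIGNATURES = ("Googlebot", "Slurp", "Teoma", "FAST-WebCrawler")

def pick_page(user_agent):
    """Serve the spider-targeted page to known crawlers and the
    normal page to everyone else. Whether this is cloaking or
    legitimate delivery depends on how the two pages differ."""
    if any(sig in user_agent for sig in SPIDER_SIGNATURES):
        return "spider_version.html"
    return "human_version.html"

print(pick_page("Googlebot/2.1 (+http://www.googlebot.com/bot.html)"))  # spider_version.html
print(pick_page("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))  # human_version.html
```

This also shows why the thread's decloaking point bites: any agent that does not announce itself with a known signature (a toolbar, a distributed spider) falls through to the human page, and the difference becomes visible.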

#9 Guest_Lots0_*

  • Guests

Posted 30 October 2003 - 03:37 AM

Cloaking is a specific use of IP or User-agent based delivery.

Point taken.

#10 gravelsack


    Mach 1 Member

  • Members
  • 401 posts

Posted 30 October 2003 - 05:14 AM

I think Google probably doesn't care much about your intent...

Can't really agree with you there.

Google care about intent in so far as it can make the SERPs look good or bad. There are plenty of large sites using IP Delivery without problem.

IMO the difference between cloaking and IP Delivery is down to intent, as Ammon says.

#11 Guest_scottiecl_*

  • Guests

Posted 30 October 2003 - 10:53 AM

If you show something contextually different to search engine spiders than you show to users, I don't believe Google cares that your intent was to improve the user experience. They want the page they spider to be the same page they deliver the searcher to. They can't tell your intent- they can only determine whether the page matches or not.

I'm sorry if I wasn't clear Lotso- Ammon cleared it up though. The technique of IP delivery in itself isn't a problem- it's only when it is used to hide things that it becomes cloaking in my book.

#12 gravelsack


    Mach 1 Member

  • Members
  • 401 posts

Posted 30 October 2003 - 02:54 PM

They can't tell your intent- they can only determine whether the page matches or not.

Despite the FUD spread by Google, they do not practice this even-handedly.

A site using geo-targeting IP delivery is presenting very different information to the user than it is to the spider. If they had 100% automatic detection in place they would ban all these sites, and they don't.

Don't want to name names, but there are a few large on-line retailers who spend a fortune on Google Adwords and seem to get a 'get out of jail free' card on their cloaking.

IMO it's not even-handed at all; it's applied in a way that best suits their business objectives.
