Cre8asiteforums Internet Marketing and Conversion Web Design


How Effective Is Google At...



#1 iamlost

    The Wind Master

  • Site Administrators
  • 4633 posts

Posted 27 April 2012 - 11:47 AM

I've been seeing references to this video from GoogleWebmasterHelp (aka Matt Cutts): How effective is Google now at handling content supplied via Ajax?

Short version: Google's parsing/understanding of Ajax is still somewhere down the road (at least publicly).

Moral of the fable: blocking JavaScript and CSS via robots.txt is (undefinably) bad.
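
For anyone who hasn't seen this blocking in the wild, here's a rough sketch of what it amounts to - the robots.txt lines and the example.com paths are invented for illustration, and the check uses Python's standard urllib.robotparser:

from urllib.robotparser import RobotFileParser

# An invented robots.txt of the sort being complained about: the scripting
# and presentation directories are disallowed for every crawler.
robots_lines = [
    "User-agent: *",
    "Disallow: /js/",
    "Disallow: /css/",
]

parser = RobotFileParser()
parser.parse(robots_lines)

# Googlebot falls under the wildcard group, so the .js and .css fetches are
# refused while the HTML page itself is still allowed.
for url in ("http://www.example.com/js/menu.js",
            "http://www.example.com/css/layout.css",
            "http://www.example.com/index.html"):
    print(url, "->", parser.can_fetch("Googlebot", url))

Run that and the script and stylesheet URLs come back False while the page itself is True - which is the situation Matt keeps grumbling about.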

Why did I bother with this same old, same old story?
Because Matt felt the need to once again find an excuse to cry over Google being officially blocked - on what is implicitly a large number/percentage of sites - from presentation (CSS) and behaviour (scripting, most commonly JavaScript).

I find this plea rather disingenuous, as Google doesn't crawl a site only by following page links; it also follows backlinks and made-up or guessed links, including those to - or likely to be to - robots.txt-blocked directories et al. Is G feeling uncomfortable about the use of this increasingly prevalent 'bypass workaround'? Or are enough of those sites blocking such information also blocking direct access? Mmmm...

#2 Michael_Martinez

    Time Traveler Member

  • 1000 Post Club
  • 1354 posts

Posted 29 April 2012 - 01:30 AM

These kinds of videos don't strike me as being desperate pleas for help from Google. More often, they seem to presage eventual algorithmic adjustments -- or they explain algorithmic features/adjustments that have already been rolled out into the field.

Google is certainly doing more to render pages virtually now. Blocking CSS and JavaScript impairs that judgement, so you have to take that into consideration.
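
If you have been disallowing those directories, the straightforward adjustment is to carve out exceptions for them (or drop the Disallow lines entirely). A rough sketch of one way to do that, again with invented paths, checked with Python's urllib.robotparser:

from urllib.robotparser import RobotFileParser

# Invented example: /assets/ stays disallowed in general, but the Allow lines
# carve out the CSS and JavaScript subdirectories so a crawler can fetch what
# it needs to render the page. Note that urllib.robotparser applies the first
# matching rule, so the Allow lines are listed before the broader Disallow.
robots_lines = [
    "User-agent: *",
    "Allow: /assets/css/",
    "Allow: /assets/js/",
    "Disallow: /assets/",
]

parser = RobotFileParser()
parser.parse(robots_lines)

for url in ("http://www.example.com/assets/css/site.css",
            "http://www.example.com/assets/js/app.js",
            "http://www.example.com/assets/private/report.pdf"):
    print(url, "->", parser.can_fetch("Googlebot", url))

The stylesheet and script come back allowed while the rest of /assets/ stays blocked.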


