
Cre8asiteforums Internet Marketing
and Conversion Web Design



the -noblog option or blog tab: can it be done?


41 replies to this topic

#1 Ruud (Hall of Fame, 4887 posts)

Posted 11 February 2004 - 02:15 PM

For some time I've been reading rumours about Google's attempts to get rid of blog posts showing up very high in searches. A -noblog option is mentioned, as is a separate blog tab, not unlike the Images or Groups tabs.

Now images and Usenet groups are uniquely separate content. Filtering them out into a tab, page, bucket or whatever is as easy as pie. But filtering out *blogs*?

How would they identify a blog? Some pages have tell-tale signs. Standard b2 templates will contain a good number of comment lines about this being the start of the b2 engine. Some might identify themselves through meta tags. But that's about it.
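A crude version of this kind of fingerprint check can be sketched in a few lines. The pattern list below is purely illustrative (invented for this example); a real engine would need far richer signals than template strings:

```python
import re

# Purely illustrative fingerprint patterns -- invented for this sketch.
FINGERPRINTS = [
    r"powered by (movable ?type|wordpress|b2|blogger|pmachine)",
    r'<meta\s+name="generator"\s+content="(blogger|wordpress|movable type)',
    r"trackback",
]

def looks_like_blog(html: str) -> bool:
    """Return True if any known blog-template fingerprint appears in the source."""
    lowered = html.lower()
    return any(re.search(pattern, lowered) for pattern in FINGERPRINTS)

print(looks_like_blog('<p>Powered by WordPress</p>'))   # True
print(looks_like_blog('<p>Hand-coded in Notepad</p>'))  # False
```

Of course, anyone who pastes a hand-coded template into a blogging script defeats exactly this kind of check.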

I use blogging scripts because they're essentially the leanest, meanest CMS you can have for small to medium sites. Some content remains the same for a long time but I also have sections I update daily. How in the world would Google differentiate between a personal bla-bla blog and a valid resource?

Apart from a purely theoretical interest I have a practical one as well; see my mention above that I *use* blogging scripts :-)

ps: what is the likelihood that what we're observing at Google is in fact an attempt to get rid of blognoise while at the same time remaining relevant?

Ruud

#2 rcjordan (Gravity Master Member, 189 posts)

Posted 11 February 2004 - 05:09 PM

>what is the likelihood that what we're observing at Google is in fact an attempt to get rid of blognoise while at the same time remaining relevant?

Currently, I can't say that I've seen any concrete attempts to limit blognoise though AOL did recently put pressure on Google to do something in the algo (AOL went in and hand-edited some infamous examples involving G. Bush). Soooo, I would say that blogs are definitely on the yellow-flag list.

Particularly if I were using Movable Type, and largely because of its (A) popularity and (B) vulnerability to comment spam, I'd start eliminating fingerprints -- and there are a LOT of them. I don't use a blog, but I do use one or two stock (and quite a few custom) CMS scripts. Before I launch the first page using a stock script, I spend some time renaming and recoding the key phrases: CGI names, file calls, etc.

#3 cre8pc (Dream Catcher Forums Founder, 13557 posts)

Posted 11 February 2004 - 05:14 PM

Seems a tad hypocritical and not so cost effective to ban the thing you promote for the company you bought?

Google Options

Kim

#4 rcjordan (Gravity Master Member, 189 posts)

Posted 11 February 2004 - 05:32 PM

>to ban the thing you promote for the company you bought?

Note that I didn't use blogger.com as an example. ...But conspiracy theories aside, blogs are drawing a lot of negative attention and SEO and negative attention don't mix over the long term.

#5 cre8pc (Dream Catcher Forums Founder, 13557 posts)

Posted 11 February 2004 - 05:48 PM

I'm not into conspiracy theories myself. I'm a blogger.com user who got a free Blogger sweatshirt from Google when they bought Blogger.

I've been bribed to believe. 8)

Kim

#6 rcjordan (Gravity Master Member, 189 posts)

Posted 11 February 2004 - 05:51 PM

>I've been bribed to believe.

So have I; AdSense.

#7 bragadocchio (Honored One Who Served Moderator Alumni, 15634 posts)

Posted 11 February 2004 - 06:43 PM

Hi Ruud,

The genesis of that rumor is in a misinterpretation of a statement made by Google CEO Eric Schmidt. The quote was supposedly:

"Soon the company will also offer a service for searching Web logs, known as blogs."


A reporter for The Register, Andrew Orlowski, reported that this meant Google would remove blogs from the general search population: Google to fix blog noise problem

A nice retort to the Register Article is here: Andrew Orlowski is a lousy blogger...

How in the world would Google differentiate between a personal bla-bla blog and a valid resource?


Exactly. How would it know?

Would it remove a site from its main index? One that has been indexed on Google since the beta days, after realizing that the site added a blog page to keep visitors informed of changes and updates, and to share some expertise on the topic of the site?

Funny, some of the most useful information I find on the web these days is on blogs. The web would lose a great amount of relevancy if blogs were isolated from the main index. One example: when I do a search for:

web standards css

and I notice that there are blogs such as http://zeldman.com in the top ten, I count that result as a particularly good one. I'd much rather rely upon the author of that blog for information about web standards than the World Wide Web Consortium. Zeldman is much better at explaining the intricacies of standards.

Just because it's a blog doesn't mean it's noise.


Disclaimer: I've been using blogger for free since July, 2001. :)

#8 rcjordan (Gravity Master Member, 189 posts)

Posted 11 February 2004 - 08:28 PM

AOL: "Given the increase in link spam and the attention on it, we will focus our efforts on working directly with our partner Google on the larger issue rather than attempt to enforce it one link at a time,"

Direct Marketer News

Given a choice, I'd wipe my blog prints.

As to how to identify them, I'd guess that would not be too difficult (assuming we're going above the simplistic example of simply skimming off 'powered by moveabletype' pages).

Though I haven't delved into the details (so this may very well turn out to be a bad example), I did notice recently that Technorati seems to be identifying blog links as a subset of inbound links. If Technorati can do it....

Disclaimer: I installed and tested blosxom once.

#9 bragadocchio (Honored One Who Served Moderator Alumni, 15634 posts)

Posted 11 February 2004 - 08:45 PM

The focus of that article is on Google bombing and "link spamming", which may happen on blogs but is really a failure of a link-popularity indexing system.

One solution is to devalue link text, and increase a value based upon themes and links from themed sites. Another might be to remove blogs from general listings.

But I sort of see today's blogs as tomorrow's content management systems for most sites. It makes some sense to build sites that clients have more control over managing. Will Google have a role in that? What would you do with Blogger besides context-sensitive ads on Blogspot?

Are the changes currently taking place in Google's latest updates of the type that makes its index less prone to link spamming? Localrank or Hilltop, or Latent Semantic Indexing, or some similar indexing might help remove that problem.

#10 rcjordan (Gravity Master Member, 189 posts)

Posted 11 February 2004 - 09:14 PM

>focus

The delivery mechanism isn't the real problem, so it's of little concern whether we're talking about linkfarms, guestbooks, or blogs. The question for the concerned webmaster is whether any one of the major search engines will someday decide that the damage being done to their serps outweighs the collateral damage they will cause when that SE decides to take defensive action.

Do I think blogs will be eliminated from the serps? No. Do I think that blogs (and backlinks from blogs) are strong candidates for a negatively weighted filter? Yes.

#11 bragadocchio (Honored One Who Served Moderator Alumni, 15634 posts)

Posted 11 February 2004 - 10:37 PM

Good points, nicely phrased. We are concerned here about what this might mean to web site owners.

But, I think we need to consider a range of potential approaches.

I can see how you could envision a negatively weighted filter on links from blogs and link farms and guestbooks. That is a reasonable possibility. But, is it the ideal approach? I'm not so sure.

Is the ideal approach obtainable? Maybe.

Do we improve the present system by creating a large series of negative filters? Probably. Until at some point another method is discovered that ends up being better, or at least having the potential to be better, and is possibly less expensive.

A favorite description of what might be called the pre-Florida-update Google, from a Google patent application on using usage statistics to retrieve documents:

[0005] People generally surf the web based on its link graph structure, often starting with high quality human-maintained indices or search engines. Human-maintained lists cover popular topics effectively but are subjective, expensive to build and maintain, slow to improve, and do not cover all esoteric topics. 

[0006] Automated search engines, in contrast, locate web sites by matching search terms entered by the user to an indexed corpus of web pages. Generally, the search engine returns a list of web sites sorted based on relevance to the user's search terms. Determining the correct relevance, or importance, of a web page to a user, however, can be a difficult task. For one thing, the importance of a web page to the user is inherently subjective and depends on the user's interests, knowledge, and attitudes. There is, however, much that can be determined objectively about the relative importance of a web page. 

[0007] Conventional methods of determining relevance are based on matching a user's search terms to terms indexed from web pages. More advanced techniques determine the importance of a web page based on more than the content of the web page. For example, one known method, described in the article entitled "The Anatomy of a Large-Scale Hypertextual Search Engine," by Sergey Brin and Lawrence Page, assigns a degree of importance to a web page based on the link structure of the web page. 

[0008] Each of these conventional methods has shortcomings, however. Term-based methods are biased towards pages whose content or display is carefully chosen towards the given term-based method. Thus, they can be easily manipulated by the designers of the web page. Link-based methods have the problem that relatively new pages have usually fewer hyperlinks pointing to them than older pages, which tends to give a lower score to newer pages. 

[0009] There exists, therefore, a need to develop other techniques for determining the importance of documents.



The ideal approach is creating a system that can provide material and relevant pages without having a human editor index them. The more determination that can be done before a search is conducted, without significantly increasing the requirements for memory and processing power, the better.

That approach may require that the words used in a link become less important, and the context of the link more important. It really shouldn't matter whether a link is from a blog or from a more static page. The more material and relevant result is the one that should show up first. There are times when that result will be on a blog.

A negative filter on links from a blog has the potential to cause more harm than good to search results, especially if the link from the blog is the better one.

#12 Ruud (Hall of Fame, 4887 posts)

Posted 12 February 2004 - 12:09 AM

Very interesting stuff.

I sort of see today's blogs as tomorrow's content management systems for most sites. It makes some sense to build sites that clients have more control over managing.


Absolutely. B2 for one has moved towards that. MT does it. What starts as a simple blogging system is hacked up until in the end CMS is a better description. With some major, relevant resources already running purely off a blogging script the delivery mechanism by itself cannot be the way to determine that relevancy.

Do I think that blogs (and backlinks from blogs) are strong candidates for a negatively weighted filter? Yes.


There might be my answer. Not having the mind for algorithms I can't wrap my mind around this as thoroughly as I would like to, but we all know, realize, feel that the way bla-bla blogs are (inter)linking is very different from the way content blogs are linking. Therefore it is conceivable to come up with a weighting system, right?

As to how to identify them, I'd guess that would not be too difficult


But it is. If I paste my Notepad hand-coded, XHTML-valid, CSS-powered template into a blogging system, you can't see where it comes from. Who could tell from a webpage whether it was coded in Notepad or WordPad?

A nice retort to the Register Article is here: Andrew Orlowski is a lousy blogger...


Very good! Thank you!

Funny, some of the most useful information I find on the web these days are on blogs. The web would lose a great amount of relevancy if blogs were isolated from the main index.


Still, my feeling is they have changed something. "Back then" I wasn't "into" SEO so much, so I have nothing to back this up, but I clearly remember a time when several searches, "current issues" ones for example, would return loads of bla-bla blogs. I can't find bla-bla blogs in the first 20 results of searches for current issues anymore.

Seems a tad hypocritical and not so cost effective to ban the thing you promote for the company you bought?


Not at all. Usenet groups in a Groups tab is not banning, it is relevant. Product searches in Froogle is not banning, it is relevant. Books in a books search is not banning, it is relevant. Personal blogs in a blog search is not banning, it is relevant. And relevant is the name of the game, right?

Now which community has, or would have, its own search powered by the market's dominant player? Not PHPNuke hack sites, not DVD-ripping communities - we're talking personal bla-bla blogs here. That is almost an honour!

So why buy the company?

Powered by Movable Type: 52,700
Powered by B2: 319,000
Powered by WordPress: 101,000
Powered by pMachine: 71,000

Makes for a total of 543,700

Powered by Blogger: 889,000

By buying Blogger/Blogspot, Google has acquired over 60% of identified blog-ware user sites.
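Checking the arithmetic with the figures as posted:

```python
# "Powered by ..." counts as posted above
counts = {
    "Movable Type": 52_700,
    "B2": 319_000,
    "WordPress": 101_000,
    "pMachine": 71_000,
}
blogger = 889_000

others = sum(counts.values())
share = blogger / (blogger + others)

print(others)              # 543700
print(round(share * 100))  # 62, i.e. "over 60%"
```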

Now.... if Google wanted to put personal blogs into a separate search, which a lot of those *users* would find terribly interesting, then it has a major base to start with. From day one that search would be highly relevant. Then offer other bloggers out there a meta tag like <meta name="googleblog" content="index"/> and for sure complete tribes will follow.
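The googleblog meta tag above is Ruud's own hypothetical, not a real Google feature, but detecting such an opt-in tag would be trivial for a crawler. A minimal sketch using Python's standard-library HTML parser (the tag name and content value are assumptions carried over from the post):

```python
from html.parser import HTMLParser

class BlogOptInParser(HTMLParser):
    """Looks for the hypothetical <meta name="googleblog" content="index"/> tag."""

    def __init__(self):
        super().__init__()
        self.opted_in = False

    # HTMLParser's default handle_startendtag delegates self-closing tags here.
    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if (tag == "meta"
                and attributes.get("name") == "googleblog"
                and attributes.get("content") == "index"):
            self.opted_in = True

parser = BlogOptInParser()
parser.feed('<html><head><meta name="googleblog" content="index"/></head></html>')
print(parser.opted_in)  # True
```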

I love the brainstorming sessions on the board!

Ruud

#13 wayne h (Ready To Fly Member, 25 posts)

Posted 12 February 2004 - 07:09 AM

Blog comment spam is not the fault of bloggers. It is a weakness and flaw in the commenting systems. To blame blogs is to misplace anger that belongs with all spamming. I would hope that the same people upset with blog comment spammers are also concerned about all spammy black-hat SEO too.

Blogs are doing what the search engines, particularly Google in its own recommendations, suggest.

The guidelines say to update regularly. Blogs certainly do that, with daily and even multiple updates per day.

The guidelines suggest good content is important. Blogs that have been in existence for any length of time have tons of content, neatly archived and linkable from other sites.

The guidelines express the importance of incoming links. Bloggers are free and generous linkers. Blogs constantly link to one another and to static website content without asking for reciprocal links.

Blogs are doing what the search engines want. To add a "blog filter" would only penalize blogs for doing what is recommended by Google and other search engines. Such a filter would probably drive out good informational blogs along with the much-maligned "personal journal or diary" and its "cheese sandwich for lunch" content.

On the other hand, blogs that show up highly in the SERPs tend to have strong theme related content. Isn't that what the search engine user is seeking?

Wayne Hurlbert

#14 bragadocchio (Honored One Who Served Moderator Alumni, 15634 posts)

Posted 12 February 2004 - 08:21 AM

On the other hand, blogs that show up highly in the SERPs tend to have strong theme related content. Isn't that what the search engine user is seeking?


Excellent point. Topical blogs can be some of the best sources for information on the subjects that they cover. Not only are they wonderful places to find links and the latest news, but the person who runs the site is often an expert on the subject, and can add a lot to it.

Google is a failed attempt at building a system to annotate the web. It's very successful as a search engine, but the original intent from Brin and Page was to build some way of allowing people to find the best of the best. Blogs are a much better avenue to that destination than Google. Interestingly, blogs have the potential to help the search engine by allowing "authorities" to point to material and relevant sites on specific subjects.

#15 Guest_rustybrick_* (Guest)

Posted 12 February 2004 - 09:51 AM

Interestingly, blogs have the potential to help the search engine by allowing "authorities" to point to material and relevant sites on specific subjects.


Are you saying that blogs are the authority or that authorities will point to blogs?

Thanks.

#16 rcjordan (Gravity Master Member, 189 posts)

Posted 12 February 2004 - 10:06 AM

>authority

The Blogging Iceberg - Of 4.12 Million Hosted Weblogs, Most Little Seen, Quickly Abandoned

#17 Guest_rustybrick_* (Guest)

Posted 12 February 2004 - 10:08 AM

rcjordan, is that a response to my question?

#18 rcjordan (Gravity Master Member, 189 posts)

Posted 12 February 2004 - 10:58 AM

>response

Yes, but pretty much a summation of the 'why' part of this thread as well. Are there useful blogs out there? Sure. There are, ...well were, some good guestbooks out there, too. Same for (mostly on-site) link directories. I can distinctly recall ferreting out good research data from both, along with those old threaded board posts (what was the name of that script? can't recall). At any rate, there were some great individual sites, but their genre as a whole began to destabilize the serps.

>can it be done?

Absolutely. Will the SEs spend a great deal of effort fine-tuning the algo in order to carefully tip-toe through the blogs and examine them on their individual merits? Looking at their existing SEO stop-words and the scroogle hit-list, my guess would be "no."

#19 Ruud (Hall of Fame, 4887 posts)

Posted 12 February 2004 - 11:35 AM

In my opinion they are the authority and point to relevant, authoritative material and resources.

Ruud

#20 Guest_rustybrick_* (Guest)

Posted 12 February 2004 - 11:45 AM

i posted this elsewhere.

I am pretty sure there is a big difference between sites that link to other sites and sites that are authorities.

A blog might be an authority because it has a lot of links to it, but it must be within a "hub" to obtain a certain theme.

I have seen tons of confusion at all the forums between authorities and hubs.

An authority is http://www.searchenginewatch.com/

A hub is (this)* Yahoo Directory Link

So with blogs, they might be linked to by an authority or might link to an authority. They might also be linked to by a hub. All these links, in reality make a theme.

By blogs pointing to relevant and authoritative materials, it just expands the hub. I don't see how that will help a blog rank better.

*<moderator's note - I made the URL shorter by making it a link. It lessens the impact of Barry's point a bit because you can no longer see the directory structure, but it makes it easier to read this thread on a small monitor; the length of the URL was forcing the whole page to scroll horizontally. Thanks for your understanding. WJS (Bragadocchio)>
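The hub/authority distinction Barry draws above closely matches Kleinberg's HITS idea, in which hub and authority scores reinforce each other iteratively. A toy iteration on an invented three-page link graph (all page names and links are made up for illustration; "sew" stands in for a searchenginewatch-like authority):

```python
# Invented link graph: a directory hub, a blog, and a heavily linked-to page.
links = {
    "directory": ["sew", "blog"],  # a hub: links out to topical pages
    "blog":      ["sew"],          # a blog linking to the authority
    "sew":       [],               # an authority: heavily linked-to
}

hub = {page: 1.0 for page in links}
auth = {page: 1.0 for page in links}

for _ in range(20):
    # a page's authority score sums the hub scores of pages linking to it
    auth = {p: sum(hub[q] for q in links if p in links[q]) for p in links}
    # a page's hub score sums the authority scores of pages it links to
    hub = {p: sum(auth[t] for t in links[p]) for p in links}
    # normalise both score vectors so the values stay bounded
    for scores in (hub, auth):
        total = sum(scores.values()) or 1.0
        for page in scores:
            scores[page] /= total

print(max(auth, key=auth.get))  # sew: the most-linked page wins authority
print(max(hub, key=hub.get))    # directory: the best out-linker wins hub
```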

#21 Ruud (Hall of Fame, 4887 posts)

Posted 12 February 2004 - 12:56 PM

I am pretty sure there is a big difference between sites that link to other sites and sites that are authorities


Great example.

Fora and blogs are often outstanding and authoritative sources of information because they're written by people who do or love the thing they write about - but even better, they know what they're talking about; they're an authority of some kind in that area.

Take bragadocchio's search for web standards css as an example. The blogs on the first SERP contain very valid, very authoritive information.

Likewise a search for css 3 column layout produced relevant blogs.

In both instances I know that these are referenced throughout the community. We know that they are authoritative. And they're not afraid to link out - not to other blogs or to bla-bla blogs but to relevant, on-topic content.

Compare this search for microsoft projects with the results for the search microsoft projects blog. Same goes for "microsoft development" repeated with "blog".

If you want to read about Microsoft's [ongoing] projects you're better off reading the blogs....

So: it is not the fact that certain blogs link to other content (theme), it is the fact that they provide highly specialised, on-topic information for certain searches. This makes them an authority, imo.

As to the number of blogs published/updated/read/abandoned... that information is useless for this purpose. It doesn't tell you anything about the relevance of a blog. One blog with one post about one very specific solution to IE's box model bug (CSS) is and remains more relevant than a daily updated multi-page blog about someone's daily wanderings on this planet.

If you scour the web you will find thousands of abandoned websites and website projects, and an equally high number of well-run websites that are read by "even less". That in no way means there are no good websites, or that websites as a whole are not visited, read and referenced.

Ruud

#22 rcjordan (Gravity Master Member, 189 posts)

Posted 12 February 2004 - 01:06 PM

>it just expands the hub. I don't see how that will help a blog rank better.

It's not the blogs' rankings that are the problem, it's the use of their prolific link-making capabilities to falsely elevate their targets as hubs.

Here blogdex admits/explains why you'll see porn sites rise through their rankings. A similar process works with Google's pagerank.
http://blogdex.net/news/

But, when the SE devalues the links generated by blogs (incoming as well as outgoing) it will have the effect of reducing a blog's individual position in the serps, too.
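Nobody outside the engines knows how such a devaluation would actually be weighted, but the "negatively weighted filter" idea can be sketched abstractly: scale down the vote of any link whose source page is classified as a blog. All weights and link data below are invented for illustration:

```python
# Hypothetical damping factors -- real values, if any exist, are unknown.
LINK_WEIGHT = {"blog": 0.25, "static": 1.0}

inbound_links = [
    ("blog", "blogroll entry"),
    ("blog", "comment signature"),
    ("static", "hand-picked resource link"),
]

def link_score(links):
    """Sum per-link votes, scaling down links that originate on blogs."""
    return sum(LINK_WEIGHT[source_type] for source_type, _ in links)

print(link_score(inbound_links))  # 1.5: three links, but the blog links count for little
```

The same damping, applied to a blog's own inbound links, is what would drag its individual position down too.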

#23 Guest_rustybrick_* (Guest)

Posted 12 February 2004 - 01:13 PM

when the SE devalues the links generated by blogs


Will they?

#24 Ruud (Hall of Fame, 4887 posts)

Posted 12 February 2004 - 01:29 PM

The only thing I see that can be done with regard to blog-style linking is filtering on "trackback". And from what I see in Google, they must already have done that. No links lead to the trackback pages themselves. Searching on trackback in the URL still produces relevant links about trackback itself, with the occasional trackback link.

I really think that an optional meta tag to have your blog included in a special blog index is the only way SEs can pull the bla-bla blogs out of the main index. For the posters it would be good news, because the chance of getting found on a search is so much better when searched only within "blogs" than within the whole main index. A similar approach could (should) be used for fora. Win-win: very specific results for the searchers, and a better chance of targeted traffic for the owner.

Ruud

#25 rcjordan (Gravity Master Member, 189 posts)

Posted 12 February 2004 - 05:06 PM

Now for those that love a good conspiracy theory....

Google goes Atom


Changing Blogger's fingerprints? Nah! Well, maybe....

#26 wayne h (Ready To Fly Member, 25 posts)

Posted 13 February 2004 - 04:14 AM

I am always fascinated by the hostility that many static website owners (not all of course... :) ) have towards blogs.

The argument is often made that many blogs are abandoned. So are many websites.

I find a ton of old, abandoned, never-updated (in the memory of anyone now living) websites in the SERPs. Is that hostility extended to them? Are websites suddenly to be excluded too because some were...gasp...abandoned?

One of the main problems is the failure to distinguish blogs that discuss a specific topic, often from an expert's perspective, from the online diary. There is a huge difference between a blog that discusses what is happening at a company and a diary that discusses the cheese sandwich eaten for lunch.

While both are referred to as "blogs", they are entirely different. I never discuss lunch on my blogs, and neither do many of the leading blogs in their topic areas. Maybe, I'll discuss my lunch some day, and become part of the evil. :evil:

Wayne Hurlbert

#27 rcjordan (Gravity Master Member, 189 posts)

Posted 23 February 2004 - 01:36 PM

Feb 23 - Quote from a certain Google rep:

...over the last few weeks, Google has started deploying better technology that negates the effect of blog comment spamming. The changes haven't fully rolled out yet...



#28 Ruud (Hall of Fame, 4887 posts)

Posted 23 February 2004 - 01:44 PM

Where blog comment spamming would refer to those sites that spam blogs on purpose to increase PageRank?

Ruud

#29 rcjordan (Gravity Master Member, 189 posts)

Posted 23 February 2004 - 01:50 PM

>would refer to those sites that on purpose spam blogs to increase pagerank?


Yes. But search engines, Google included, usually aren't too careful with the scalpel when they go in to remove a cancer. Collateral damage tends to be high.

#30 Everyman (Whirl Wind Member, 64 posts)

Posted 26 February 2004 - 12:13 PM

Whether or not Google should suppress blogs is an interesting question. If Google decides to do so, they'll probably use a sledgehammer. Instead, what they should try first is to solve the Google bomb problem, and then see what the situation looks like.

An easy fix for many bombs: Google should not use terms in external links to boost the rank of a page for those terms unless those terms are on the page itself. This is a no-brainer. But it means another CPU cycle per link, which is why Google won't do it.
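That proposed fix can be sketched as a filter on anchor-text terms: only terms that also occur on the target page are allowed to contribute a boost. All data below is invented for illustration:

```python
import re

def tokenize(text):
    """Lowercase word set; hyphens and punctuation split words apart."""
    return set(re.findall(r"[a-z]+", text.lower()))

def counted_anchor_terms(anchor_text, target_page_text):
    """Keep only the anchor words that the target page itself contains."""
    return tokenize(anchor_text) & tokenize(target_page_text)

page = "Google's corporate information and management team."
boosted = counted_anchor_terms("out-of-touch executives", page)
print(sorted(boosted))  # []: neither "touch" nor "executives" is on the page, so no boost
```

Under this rule, the bomb described in this thread would fizzle, since (as noted later in the thread) neither "touch" nor "executives" appears on the targeted page.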

Google Watch has started what we hope will be The Last Google Bomb. We want to show certain out-of-touch executives that, contrary to what Craig Silverstein has suggested, Google bombs are not cute and harmless.

You know what to do....

#31 Everyman (Whirl Wind Member, 64 posts)

Posted 07 March 2004 - 06:05 PM

Google Bomb progress report:

As far as I can tell, only one of my three "out-of-touch executives" links to Google's corporate executives page has kicked in. This is from the PR 7 scroogle.org page. Two more are in place on PR 6 pages that have yet to get counted.

Even with this one link, the Google Bomb "out-of-touch executives" is now forcing Google's page to show up in a search at position 21 out of about 53. It's still a very weak Google Bomb, because the quotation marks are required in the search box. But considering that Google's page is PR 10, that's at least evidence that the bomb will work in theory.

Now all I need is more co-conspirators to add links to their pages....

#32 Everyman (Whirl Wind Member, 64 posts)

Posted 13 March 2004 - 11:27 AM

Darn, I'm not doing all that well in Google. But in Yahoo, I'm already at number one for "out-of-touch executives" -- and this is true whether or not you use the quotation marks, and whether or not you use the hyphens.

It looks like Yahoo-bombing will be all the rage now, because it's so easy. Yahoo's only saving grace is that unlike Google, Yahoo isn't stupid enough to feature an "I'm feeling lucky" button.

#33 Everyman (Whirl Wind Member, 64 posts)

Posted 24 March 2004 - 10:30 PM

Success. Number one for "out-of-touch executives"

-- On Google it's number one ("I'm Feeling Lucky" honors)
-- On Yahoo it's number one
-- On MSN it's number one

#34 Everyman (Whirl Wind Member, 64 posts)

Posted 26 March 2004 - 12:11 PM

-- and number one on Alltheweb

Alltheweb just switched over to Yahoo results. Will AltaVista be next?
Are there no more mountains for my Google Bomb to climb?

#35 Grumpus (Honored One Who Served Moderator Alumni, 6298 posts)

Posted 27 March 2004 - 09:34 AM

Cool. This is the first time Cre8asite has been on the first page for a GoogleBomb, not to mention being a part of the planning that started the "out of touch executives" bomb. Thanks for the fun, Everyman. :P

G.

#36 Everyman (Whirl Wind Member, 64 posts)

Posted 31 March 2004 - 08:38 PM

Now number one in Google, Yahoo, MSN, Alltheweb, Altavista. This is even true without quotation marks or without hyphens.

It was interesting that the first week or two, this forum and my domains pushing this Google bomb were riding right below the actual Google execs.html page that I was targeting. That was the special boost for fresh pages that we're all so familiar with. Now these pages have dropped below the fold, where they belong in a search for "out of touch executives."

But the bomb itself is still firmly planted at number one, and I think it will stay there a good long time.

This entire experiment demonstrates that Google and Yahoo are ranking pages on the cheap. The word "touch" is not on Google's page, and it's not rocket science to figure this out. They don't want to invest the CPU cycles required to do a decent job of analyzing content on the page. Any additional CPU cycles, if they're spent on the fly after the search terms are received from the searcher, mean a slowdown in response time.

And any slowdown in response means you'd probably need 20,000 computers instead of 10,000 computers to recover the speed you lost. I'm beginning to believe that the massive spam we see in both Google and Yahoo is because they're more interested in their bottom line than in their search excellence. Whip out the SERPs, go for the 200 million searches per day, grab all the advertising revenue, cut corners wherever you can, and dazzle the media by claiming that your only interest is in excellence.

The media has swallowed it hook, line, and sinker. They think Google, in particular, is an example of corporate excellence and innovation. Bah, humbug.

#37 Black_Knight (Honored One Who Served Moderator Alumni, 9339 posts)

Posted 31 March 2004 - 11:47 PM

The word "executives" doesn't seem to appear on the page either, and yet we all know that's what it is about. Yet the only way the search engine can know is through applied semantics and through link-word analysis (or link reputation).

Most so-called Googlebombs are entirely pointless as anything but amusement to the initiated. Of all the Googlebombs that have hit the headlines, only a very few, such as "weapons of mass destruction", involved a term anyone would be likely to search for except to see the Googlebomb effect they'd been told about.

I'd certainly never made an active search for 'out of touch executives' before, and predict that I never would have except for the other kind of PR effect - the publicity you managed to get for the term, and the curiosity to see it having heard of it.

The more worrying thing is when a search for "Google executives" or "Who runs Google" returns a negative page high in the SERPs. Yet that kind of situation has been happening for as long as there have been search engines.

Bowmac Internet had such an experience recently, where a search for their own company name brought up a thread in these forums where an apparently unhappy customer was letting rip about them. That's not a Googlebomb, but is something that can happen in any and all search engines, and happens every day.

In fact, it is so established that people set up entire domains just to attack companies. For examples see all the 'sucks' domains out there that try to do this all the time.
http://www.google.co.....llinurl:sucks

#38 Ruud

Ruud

    Hall of Fame

  • Hall Of Fame
  • 4887 posts

Posted 01 April 2004 - 12:22 AM

OK, maybe I'm being dense here but between where this thread has gone and where the copy <> links thread is at, are we saying that linking is the #2 factor and anchor text linking the #1?

If so, then semantics is an additional downfall for the PR models out there, right? I mean, this way the web really becomes a model of trust: if they say it is about XYZ, it has to be XYZ.

Thinking on, this would be THE ultimate black-hat SEO trick, right? You cannot start to penalize sites for certain anchor text links, because then, via a thread like this, we could bomb ANY company out of the #1 results.

So basically, and now I'm not asking questions but thinking out loud on the paths laid out, what we're talking about is that until either the speed/power or the money comes along to go through the *fresh* results, this trick will work. It is easy to spot, but no one does it because we all want our fresh index to be there a.s.a.p.
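The "easy to spot" pattern can be sketched in a few lines: a sudden cluster of inbound links from many distinct domains, all carrying the same anchor phrase, is exactly what an anchor-text bomb looks like. A minimal sketch, assuming the engine has a list of (source URL, anchor text) pairs for a target page; the thresholds are invented for illustration:

```python
from collections import Counter
from urllib.parse import urlparse

def looks_like_anchor_bomb(links, min_links=5, max_ratio=0.6):
    """Flag a target page whose inbound anchor text is suspiciously
    uniform across many distinct domains. `links` is a list of
    (source_url, anchor_text) pairs; thresholds are illustrative."""
    if len(links) < min_links:
        return False
    domains = {urlparse(url).netloc for url, _ in links}
    _, top_count = Counter(text for _, text in links).most_common(1)[0]
    # Many distinct domains all using one identical phrase is the giveaway.
    return len(domains) >= min_links and top_count / len(links) > max_ratio
```

Running such a check over the fresh index is the part that costs the extra CPU cycles nobody wants to spend.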

Have any tests been done with terms one would actually search for? As Black_Knight pointed out, many of these projects are pointless....

Ruud

#39 Everyman

Everyman

    Whirl Wind Member

  • Members
  • 64 posts

Posted 01 April 2004 - 01:33 AM

>Bowmac Internet had such an experience recently, where a search for their own company name brought up a thread in these forums where an apparently unhappy customer was letting rip about them. That's not a Googlebomb, but is something that can happen in any and all search engines, and happens every day.

I could have done this with any term, and targeted any page -- whether it was a legitimate page that I wanted to describe adversely with my terms, or whether it was my own adverse page that I put up, that I wanted to rank highly for an information term such as "Google executives."

You can call one a Google bomb and not call the other a Google bomb -- I don't really care about how you chose to define "Google bomb." And yes, it certainly depends on how competitive the keywords are. "Google executives" would be harder than "out-of-touch executives" was. Instead of five domains I might need fifteen. And "Britney Spears" would be even harder -- I might need a hundred domains. We all know that the worse the competition, the harder it is. So what's your point?

My point is that search engines -- Google and Yahoo specifically -- are ranking on the cheap. Anchor text in links is a poor man's content analyzer that allows you to snapshot the content of a page with fewer CPU cycles. And furthermore, it's a "two-fer" -- two for the price of one. You get link popularity factored in at the same time as you construct a cheap mini-keyword dictionary of a page.
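The "poor man's content analyzer" really is this cheap: collect the anchor text of a page's inbound links, count the words, and you have a keyword profile of the page without ever parsing the page itself. A minimal sketch (the anchor phrases below are hypothetical):

```python
from collections import Counter

def anchor_text_profile(inbound_anchors):
    """Build a cheap keyword profile of a page from the anchor text
    of its inbound links -- no parsing of the page itself required."""
    counts = Counter()
    for anchor in inbound_anchors:
        counts.update(anchor.lower().split())
    return counts

# Hypothetical anchors, all pointing at one target page:
anchors = [
    "out of touch executives",
    "out-of-touch management",
    "google executives",
    "out of touch executives",
]
print(anchor_text_profile(anchors).most_common(3))
```

Note the "two-fer": the number of anchors gives link popularity, and the word counts give the mini-keyword dictionary, all in one pass.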

Cheap, cheap, cheap. For all the high-sounding, long-winded threads about Hilltop this, and topic-sensitive PageRank that, and CIRCA the other thing, what it amounts to is that Google and Yahoo are little better than the spammers who play them like a violin. Google/Yahoo and the spammers deserve each other.

#40 rcjordan

rcjordan

    Gravity Master Member

  • Members
  • 189 posts

Posted 01 April 2004 - 10:46 AM

>Instead of five domains I might need fifteen. And "Britney Spears" would be even harder -- I might need a hundred domains. We all know that the worse the competition, the harder it is.

Exactly. Automate the process and a few thousand legitimate, independently owned domains linking back with varied, yet tuned/themed backlinks are but a click away.
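What "varied, yet tuned/themed" means can be illustrated with a simple template trick -- purely illustrative, to show why a filter that only checks for identical anchor strings is trivial to defeat:

```python
import random

# Illustrative templates and phrases, not real SEO tooling.
TEMPLATES = [
    "{adj} executives",
    "executives who are {adj}",
    "{adj} corporate leadership",
]
ADJECTIVES = ["out of touch", "out-of-touch", "disconnected"]

def themed_anchor(rng=random):
    """Return anchor text that varies in wording but stays on theme."""
    return rng.choice(TEMPLATES).format(adj=rng.choice(ADJECTIVES))

print(themed_anchor())
```

Every generated anchor is different, yet they all reinforce the same theme, which is exactly the uniform-anchor detector's blind spot.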


