Cre8asiteforums Internet Marketing
and Conversion Web Design



Supplemental Index Ratio Calculator


10 replies to this topic

#1 bwelford

    Peacekeeper Administrator

  • Site Administrators
  • 9011 posts

Posted 03 November 2007 - 08:34 PM

I see there's a new Supplemental Index Ratio Calculator. (Tip of the hat to SEO Buzz Box)

From tests I've done, it seems to be in line with earlier estimating methods that no longer work. I wonder how long this one will survive, since Google seems to regard this supplemental index ratio as Company Confidential.

#2 EGOL

    Professor

  • Hall Of Fame
  • 5443 posts

Posted 03 November 2007 - 11:34 PM

I wonder what percentage of the average site is in the supplemental index? What is good, and what is deadly?

#3 bwelford

    Peacekeeper Administrator

  • Site Administrators
  • 9011 posts

Posted 04 November 2007 - 10:52 AM

I find the following with sites that have a few hundred web pages.

A blog might have only about 25%, give or take, in the supplemental index. If significantly more than that is in the supplemental index, then you've really got to work on getting inlinks to more individual blog posts. It's all a matter of PageRank. If the home page has a PageRank of 3, then it's that much tougher.

An ecommerce website whose pages are near-duplicates of the manufacturer's descriptions can have 70% or more in the supplemental index. That's tough to correct.

Andy Beard on Sphinn has noted that this method was mentioned in Search Engine Journal a while back and that it doesn't work for his site. He still uses the method based on a Google search for site:domain/*. I find that method now gives anomalous results for me.
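As it was usually described at the time, that site:domain/* method reduces to simple arithmetic: the result count for site:domain is taken as the total indexed pages, the count for site:domain/* as the main-index pages, and the remainder is treated as supplemental. A minimal sketch of that calculation in Python, with both counts entered by hand (the query semantics and the example numbers are my assumptions, not verified behaviour):

def supplemental_ratio(total_indexed: int, main_index: int) -> float:
    # total_indexed: result count reported for site:example.com (assumed meaning)
    # main_index:    result count reported for site:example.com/* (assumed meaning)
    if total_indexed == 0:
        return 0.0
    supplemental = max(total_indexed - main_index, 0)
    return supplemental / total_indexed

# Made-up example: 400 pages indexed in total, 300 returned by the /* query.
print(f"{supplemental_ratio(400, 300):.0%}")  # 25% estimated supplemental

Since the estimate is only as good as the counts Google reports, anomalous counts give anomalous ratios.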

#4 EGOL

    Professor

  • Hall Of Fame
  • 5443 posts

Posted 04 November 2007 - 11:07 AM

Thanks, Barry. I appreciate the link and the additional information. So it sounds like a lack of inlinks and a lack of unique content are the common reasons for pages going supplemental.

#5 JohnMu

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 3518 posts

Posted 04 November 2007 - 01:19 PM

Just quoting :D

From http://googlewebmast...mainstream.html

These are often pages with lower PageRank or those with more complex URLs.

The distinction between the main and the supplemental index is therefore continuing to narrow. Given all the progress that we've been able to make so far, and thinking ahead to future improvements, we've decided to stop labeling these URLs as "Supplemental Results."


Personally, I would not count on any of these tools to show you values for what you think you are measuring.

What I imagine you are trying to measure is the proportion of ancillary URLs relative to your content items, in other words:

ancillary ratio = (# indexable URLs - # content items) / (# content items)
[just made the name up :) ]

You could determine that by counting your content items (blog posts, articles for sale in your shop, etc.) and comparing that with the number of indexable URLs (check with a crawler like Xenu or one of the others -- or, if you want an easy but certainly inaccurate number, use a site:-query in one of the search engines). You'll never get that number down to zero, as there will always be pages that you need as navigational aids, but if it is very high (say over 100%), you might want to look at what kind of indexable URLs you find.
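To make that concrete, here is a minimal sketch of the calculation in Python (the example counts are made up; in practice the indexable-URL count would come from a crawler and the content-item count from your CMS or shop database):

def ancillary_ratio(indexable_urls: int, content_items: int) -> float:
    # (# indexable URLs - # content items) / (# content items)
    if content_items == 0:
        raise ValueError("need at least one content item")
    return (indexable_urls - content_items) / content_items

# Made-up example: a crawler finds 600 indexable URLs on a blog with 250 posts.
print(f"ancillary ratio: {ancillary_ratio(600, 250):.0%}")  # 140% -- very high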

John

#6 bwelford

    Peacekeeper Administrator

  • Site Administrators
  • 9011 posts

Posted 04 November 2007 - 02:14 PM

Thanks, John, that seems like a very sound measure for a site. The $64 question, of course, is whether that ancillary ratio is equal to the proportion of your indexable URLs that Google puts in the supplemental index. It would be nice, but I imagine it isn't.

#7 JohnMu

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 3518 posts

Posted 04 November 2007 - 02:46 PM

Ah, but take a step back and look at the bigger picture. The old "supplemental index" is just a metaphor for the issue you are probably really worried about: that your content is not being valued as well as it should be. I know you can promote your work (site + content), so let's ignore that part. So how could the content be valued below your expectations? If the effects of your promotion are spread too thin... or, if you were to say promotion is proportional to "PageRank", you could say that the PageRank of those URLs is too low. :D

John

#8 bwelford

    Peacekeeper Administrator

  • Site Administrators
  • 9011 posts

Posted 04 November 2007 - 06:48 PM

You're right, John. All we're trying to find out for a website is whether, in Google's eyes, the content is valued as well as it should be.

It would be nice to have something a little more informative than the Toolbar PageRank measure. That really only compares a website against the world. Clearly a small entrepreneur will never have a website that can attract the weighty inlinks. But it would be helpful to have another measure that compared the current website with how well it could do. The Supplemental Index Ratio is an attempt at that, which Google doesn't seem to want published. Is there any other solid, calculable measure that gives a quick indication of how well the website is doing?

#9 iamlost

    The Wind Master

  • Site Administrators
  • 4623 posts

Posted 04 November 2007 - 07:49 PM

Is there any other solid, calculable measure that gives a quick indication of how well the website is doing?

I have set data flags (up/down) for the following page/site exceptions:
1. Traffic volume.
2. Traffic volume by referer.
3. Traffic volume change rate.
4. Traffic volume change rate by referer.
5. Conversion rate.
6. Conversion rate by referer.
They pick up 'going supplemental' as well as other difficulties that may need treatment.
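As a rough illustration of how such an up/down flag might work, here is a toy Python sketch for items 3 and 4 (the data layout, the traffic numbers and the 30% threshold are all invented for the example):

# (page, referer) -> weekly visit counts, newest last (invented data)
weekly_visits = {
    ("/widgets.html", "google"): [120, 115, 118, 60],
    ("/widgets.html", "direct"): [40, 42, 39, 41],
}

def flag_drops(history, threshold=0.30):
    # Yield (page, referer) pairs whose latest week fell well below baseline.
    for (page, referer), counts in history.items():
        if len(counts) < 2:
            continue
        baseline = sum(counts[:-1]) / (len(counts) - 1)
        if baseline and (baseline - counts[-1]) / baseline > threshold:
            yield page, referer

for page, referer in flag_drops(weekly_visits):
    print(f"DOWN flag: {page} via {referer}")
# -> DOWN flag: /widgets.html via google (the sort of drop 'going supplemental' produces)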

I try to stay away from third-party results simply because they come with no guarantee of quality or longevity, as shown by the recent inability to access prior data and the apparent changes in methodology.

#10 incrediblehelp

    Ready To Fly Member

  • Members
  • 24 posts

Posted 05 November 2007 - 10:52 AM

Some reasons I have found that pages fall into the supplemental index (SI):

a. Duplicated content;
b. Too much content similarity;
c. Pages with low or no content;
d. Orphaned web pages, i.e. pages that no one links to, including yourself;
e. Error pages, if a site does not use If-Modified-Since, Last-Modified and/or Expires rules;
f. Poor website navigation;
g. Pages with canonicalization problems;
h. PageRank too low; not enough backlinks;
i. Long URLs, especially those with long parameter strings starting with a question mark (?) and separated by ampersands (&), that are not rewritten;
j. Pages that look suspicious of spamdexing, such as meta tags that are non-unique or irrelevant to the page content, links to bad neighborhoods, etc.
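Several of these can be screened for mechanically. A purely illustrative Python check for item (i), with length and parameter-count thresholds invented for the example:

from urllib.parse import urlparse, parse_qs

def risky_url(url: str, max_length: int = 100, max_params: int = 2) -> bool:
    # Flags long URLs and URLs with many un-rewritten query parameters.
    params = parse_qs(urlparse(url).query)
    return len(url) > max_length or len(params) > max_params

urls = [
    "http://example.com/widgets/blue-widget",
    "http://example.com/catalog.php?cat=12&item=3482&sess=ab12cd34&sort=price",
]
for url in urls:
    if risky_url(url):
        print("check:", url)  # only the second, parameter-heavy URL is flagged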

#11 redcardinal

    Ready To Fly Member

  • Members
  • 11 posts

Posted 08 November 2007 - 04:39 PM

Have you heard of Halfdeck's PageRank bot?

http://www.seo4fun.c...pagerankbot.php

If you grab the Java version, it will calculate the internal PageRank flow within your site.

It may give some strong indicators of the pages that are more likely to end up in the supplemental index, which *apparently* is caused primarily by insufficient PageRank.
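For anyone wondering what "internal PageRank flow" means, it is essentially the classic PageRank iteration run over your own site's link graph. A minimal Python sketch of the idea (not Halfdeck's actual code; the toy site graph and parameters are made up):

def internal_pagerank(links, damping=0.85, iterations=50):
    # links: dict mapping each page to the list of pages it links to.
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

site = {
    "home": ["about", "post1", "post2"],
    "about": ["home"],
    "post1": ["home"],
    "post2": [],  # weakly linked -- the sort of page that goes supplemental
}
for page, pr in sorted(internal_pagerank(site).items(), key=lambda x: -x[1]):
    print(f"{page}: {pr:.3f}")

Pages that come out at the bottom of such a list are the ones to point more internal links at.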

Rgds
Richard


