bwelford

Supplemental Index Ratio Calculator


I see there's a new Supplemental Index Ratio Calculator. (Tip of the hat to SEO Buzz Box)

 

From the tests I've done, its results seem to be in line with the earlier estimation methods that no longer work. I wonder how long this one will survive, since Google seems to regard the supplemental index ratio as Company Confidential.


I wonder what percentage the average site has in the supplemental index. What is good, and what is deadly?


I find the following with sites that have a few hundred web pages.

 

A blog might have only about 25%, give or take, in the supplemental index. If there is significantly more in the supplemental index, then you've really got to work on getting inlinks to more individual blog posts. It's all a matter of PageRank: if the homepage has a PageRank of 3, then it's that much tougher.

 

An ecommerce website whose pages may be near-duplicates of the manufacturer's descriptions can have 70% or more in the supplemental index. That's tough to correct.

 

Andy Beard has noted on Sphinn that this method was mentioned in Search Engine Journal a while back and that it doesn't work for his site. He still uses the method based on a Google search for site:domain/*. I find that method now gives anomalous results for me.
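
For anyone who hasn't seen it, the older site:domain/* approach came down to comparing two result counts. Here is a rough sketch of that arithmetic in Python; the counts and the example.com domain are made up, and Google's reported totals were always approximate:

# Rough sketch of the older site: arithmetic. The result counts are
# made up; in practice they came from Google's approximate
# "Results 1 - 10 of about N" figures.

total_indexed = 4200   # from a plain  site:example.com  query
main_index = 1100      # from the  site:example.com/*  query (main index only)

supplemental = max(total_indexed - main_index, 0)
supplemental_ratio = supplemental / total_indexed

print(f"Estimated supplemental pages: {supplemental}")
print(f"Estimated supplemental index ratio: {supplemental_ratio:.0%}")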


Thanks Barry. I appreciate the link and the additional information. So it sounds like a lack of links and a lack of unique content are the common reasons for pages going supplemental.


Just quoting :D

 

From http://googlewebmastercentral.blogspot.com...mainstream.html

These are often pages with lower PageRank or those with more complex URLs.
The distinction between the main and the supplemental index is therefore continuing to narrow. Given all the progress that we've been able to make so far, and thinking ahead to future improvements, we've decided to stop labeling these URLs as "Supplemental Results."

 

Personally, I would not count on any of these tools to show you values for what you think you are measuring.

 

I imagine what you are trying to measure is the percentage of ancillary URLs among your indexed URLs, in other words:

 

ancillary ratio = (# indexable URLs - # content items) / (# content items)

[just made the name up :) ]

 

You could determine that by checking the number of content items (blog posts, articles for sale in your shop, etc.) and comparing that with the number of indexable URLs (check with a crawler like Xenu or one of the others -- or, if you want an easy but certainly incorrect number, use a site: query in one of the search engines). You'll never get that number down to zero, as there will always be pages you need as navigational aids, but if it is very high (say over 100%), you might want to look at what kind of indexable URLs you find.
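
To make that arithmetic concrete, here is a minimal sketch of John's ancillary ratio in Python; both counts are hypothetical placeholders for whatever your crawler and your own content inventory report:

# Minimal sketch of the "ancillary ratio" described above.
# Both counts are hypothetical: take indexable_urls from a crawler such
# as Xenu (or, less reliably, a site: query) and content_items from your
# own CMS, shop database, etc.

indexable_urls = 860   # URLs a crawler can reach and index
content_items = 400    # blog posts, product pages, articles, and so on

ancillary_ratio = (indexable_urls - content_items) / content_items

print(f"Ancillary ratio: {ancillary_ratio:.0%}")   # 115% here -- worth a closer look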

 

John


Thanks, John, that sounds like a very sound measure for a site. The $64 question of course is whether that ancillary ratio is equal to the proportion of your indexable URLs that Google puts in the supplemental index. It would be nice, but I imagine it isn't.


Ah, but take a step back and look at the bigger picture. The old "supplemental index" is just a metaphor for the issue you are probably really worried about: that your content is not being valued as highly as it should be. I know you can promote your work (site + content), so let's ignore that part. So how could the content be valued below your expectations? If the effects of your promotion are spread too thin... or, if you were to say promotion is proportional to "PageRank", you could say that the PageRank of those URLs is too low :D

 

John


You're right, John. All we're trying to find out for a website is whether, in Google's eyes, the content is valued as highly as it should be.

 

It would be nice to have something a little more informative than the Toolbar PageRank measure. That really only compares a website against the world, and clearly a small entrepreneur will never have a website that can attract the weighty inlinks. But it would be nice if there were another measure that compared the current website with how well it could do. The Supplemental Index Ratio is an attempt at that, which Google doesn't seem to like to have published. Is there any other solid, calculable measure that gives a quick indication of how well the website is doing?


Is there any other solid, calculable measure that gives a quick indication of how well the website is doing?

 

I have set data flags (up/down) for the following page/site exceptions (a rough sketch of the idea follows the list):

1. Traffic volume.

2. Traffic volume by referer.

3. Traffic volume change rate.

4. Traffic volume change rate by referer.

5. Conversion rate.

6. Conversion rate by referer.

They pick up 'going supplemental' as well as other difficulties that may need treatment.
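
A rough sketch of one such up/down flag in Python, assuming you can pull daily values for a metric; the figures and the 30% threshold are made up for illustration:

# Rough sketch of a simple up/down exception flag for one metric
# (e.g. daily Google-referred visits to a page). The data and the 30%
# threshold are made up for illustration.

def exception_flag(history, current, threshold=0.30):
    """Return 'up', 'down', or None when current deviates from the
    trailing average by more than the threshold."""
    baseline = sum(history) / len(history)
    if baseline == 0:
        return "up" if current > 0 else None
    change = (current - baseline) / baseline
    if change > threshold:
        return "up"
    if change < -threshold:
        return "down"
    return None

# Hypothetical daily Google-referred visits for one page
last_week = [120, 115, 130, 125, 118, 122, 119]
today = 70

print(exception_flag(last_week, today))  # 'down' -- worth a closer look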

 

I try to stay away from other-party results simply because they come with no guarantee of quality or longevity, as shown by the recent inability to access prior data and the apparent changes in methodology.


Some reasons I have found that pages fall into the SI (supplemental index):

 

a. Duplicated content;

b. Too much content similarity;

c. Pages with low or no content;

d. Orphaned web pages, i.e. pages that no one links to, including yourself;

e. Error pages, if a site does not use If-Modified-Since, Last-Modified and/or Expires rules (a quick header check is sketched after this list);

f. Poor website navigation;

g. Pages due to canonicalization problems;

h. Too low PageRank. Not enough back links;

i. Long URLs, especially those with long query strings (parameters starting with a question mark (?) and separated by ampersands (&)) that are not rewritten;

j. Pages that look like spamdexing, e.g. meta tags that are non-unique or irrelevant to the page content, linking to bad neighborhoods, etc.
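
On point (e), a quick way to see whether a page sends the Last-Modified and Expires headers is to look at a HEAD response. A minimal sketch in Python using only the standard library (the URL is a placeholder):

# Minimal check for the caching headers mentioned in point (e).
# The URL is a placeholder; swap in one of your own pages.
import urllib.request

url = "http://www.example.com/some-page.html"
req = urllib.request.Request(url, method="HEAD")

with urllib.request.urlopen(req) as resp:
    for header in ("Last-Modified", "Expires"):
        print(header + ":", resp.headers.get(header, "(not sent)"))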


Have you heard of Halfdeck's Pagerank bot?

 

http://www.seo4fun.com/php/pagerankbot.php

 

If you grab the Java version, it will calculate the internal PageRank flow within your site.

 

It may give some strong indicators of pages that are more likely to end up in supp, which *apparently* is caused primarily by insufficient PR.
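
This isn't Halfdeck's code, but the underlying calculation is the standard PageRank iteration run over the internal link graph alone. A minimal sketch in Python, with a made-up three-page site standing in for a real crawl:

# Minimal sketch of internal PageRank, the kind of calculation a tool
# like this runs over your own link graph. The three-page "site" below
# is made up; a real run would use link data from a crawler.

def internal_pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:          # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

site = {
    "/": ["/blog/", "/products/"],
    "/blog/": ["/"],
    "/products/": ["/"],
}

# Pages with the lowest internal PageRank are the likeliest supplemental candidates.
for page, pr in sorted(internal_pagerank(site).items(), key=lambda x: -x[1]):
    print(f"{page:12s} {pr:.3f}")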

 

Rgds

Richard

