Supplemental Index Ratio Calculator
Posted 03 November 2007 - 08:34 PM
From tests I've done it seems to be in line with prior methods of estimating that no longer work. I wonder how long this one will survive, since Google seems to feel this supplemental index ratio is Company Confidential.
Posted 03 November 2007 - 11:34 PM
Posted 04 November 2007 - 10:52 AM
A blog might have only about 25% of its pages, give or take, in the supplemental index. If significantly more are supplemental, then you've really got to work on getting inlinks to more individual blog posts. It's all a matter of PageRank: if the homepage has a PageRank of 3, then it's that much tougher.
An ecommerce website that has pages that may almost be duplicates of the manufacturer's description can have 70% or more in the supplemental index. That's tough to correct.
Andy Beard noted on Sphinn that this method was mentioned in Search Engine Journal a while back and that it doesn't work for his site. He still uses a Google search for site:domain/*; I find that method now brings anomalous results for me.
Posted 04 November 2007 - 11:07 AM
Posted 04 November 2007 - 01:19 PM
These are often pages with lower PageRank or those with more complex URLs.
The distinction between the main and the supplemental index is therefore continuing to narrow. Given all the progress that we've been able to make so far, and thinking ahead to future improvements, we've decided to stop labeling these URLs as "Supplemental Results."
Personally, I would not count on any of these tools to show you values for what you think you are measuring.
What I imagine you are trying to measure is the percentage of ancillary URLs among your indexed URLs, in other words:
ancillary ratio = (# indexable URLs - # content items) / (# content items)
[I just made that name up]
You could determine that by counting your content items (blog posts, articles for sale in your shop, etc.) and comparing that with the number of indexable URLs (check with a crawler like Xenu or one of the others -- or, if you want an easy but certainly incorrect number, use a site:-query in one of the search engines). You'll never get that number down to zero, as there will always be pages you need as navigational aids, but if it is very high (say over 100%), you might want to look at what kind of indexable URLs you find.
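The ancillary ratio above is simple enough to sketch in a few lines of Python. The function name and the example counts are mine, not from the thread; the crawl and content counts would come from your own crawler output or CMS.

```python
def ancillary_ratio(indexable_urls: int, content_items: int) -> float:
    """Ratio of non-content ('ancillary') URLs to content items.

    indexable_urls: count of crawlable URLs (e.g. from Xenu or a site: query)
    content_items: count of real content pages (posts, products, ...)
    """
    if content_items <= 0:
        raise ValueError("content_items must be positive")
    return (indexable_urls - content_items) / content_items

# A blog with 120 posts whose crawl finds 150 indexable URLs:
print(f"{ancillary_ratio(150, 120):.0%}")  # → 25%
```

A value well over 100% would mean your site exposes more navigational or duplicate URLs than actual content items, which is the warning sign described above.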
Posted 04 November 2007 - 02:14 PM
Posted 04 November 2007 - 02:46 PM
Posted 04 November 2007 - 06:48 PM
It would be nice to have something a little more informative than the Toolbar PageRank measure, which really only compares a website against the rest of the world. Clearly a small entrepreneur will never have a website that attracts the weighty inlinks. But it would be nice if there were another measure that compared the current website with how well it could do. The Supplemental Index Ratio is an attempt at that, which Google doesn't seem to want published. Is there any other solid, calculable measure that gives a quick indication of how well a website is doing?
Posted 04 November 2007 - 07:49 PM
Is there any other solid, calculable measure that gives a quick indication of how well the website is doing?

I have set data flags (up/down) for the following page/site exceptions:
1. Traffic volume.
2. Traffic volume by referer.
3. Traffic volume change rate.
4. Traffic volume change rate by referer.
5. Conversion rate.
6. Conversion rate by referer.
They pick up 'going supplemental' as well as other difficulties that may need treatment.
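Measures 1, 2, 5 and 6 from that list can be computed directly from access-log data. A minimal sketch, assuming your logs can be reduced to (referer, converted?) pairs; the record format and function name are illustrative, not from the post:

```python
from collections import defaultdict

def metrics_by_referer(visits):
    """Return {referer: (traffic volume, conversion rate)} from visit records.

    visits: iterable of (referer_domain, converted?) pairs.
    """
    volume = defaultdict(int)
    conversions = defaultdict(int)
    for referer, converted in visits:
        volume[referer] += 1
        if converted:
            conversions[referer] += 1
    return {r: (volume[r], conversions[r] / volume[r]) for r in volume}

# Hypothetical visit records:
visits = [
    ("google.com", False),
    ("google.com", True),
    ("bing.com", False),
    ("google.com", False),
    ("bing.com", True),
]
print(metrics_by_referer(visits))
```

The change-rate measures (3 and 4) would then just compare these numbers between two time windows.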
I try to stay away from third-party results simply because they come with no guarantee of quality or longevity, as shown by the recent inability to access prior data and the apparent changes in methodology.
Posted 05 November 2007 - 10:52 AM
a. Duplicated content;
b. Too much content similarity;
c. Pages with low or no content;
d. Orphaned web pages. Pages that no one links to, including yourself;
e. Error pages, if a site does not use If-Modified-Since, Last-Modified and/or Expires rules;
f. Poor website navigation;
g. Pages due to canonicalization problems;
h. Too little PageRank; not enough backlinks;
i. Long, un-rewritten URLs, especially those with long query strings (starting with a question mark (?) and separated by ampersands (&));
j. Pages suspected of spam-indexing, such as meta tags that are non-unique or irrelevant to the page content, links to bad neighborhoods, etc.
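Point (i) is easy to check mechanically. A rough sketch using only the standard library; the thresholds and function name are my own illustrative choices, not anything the post prescribes:

```python
from urllib.parse import urlparse, parse_qs

def looks_risky(url: str, max_params: int = 2, max_length: int = 100) -> bool:
    """Flag long, un-rewritten URLs with many ?a=1&b=2 query parameters."""
    parsed = urlparse(url)
    params = parse_qs(parsed.query)
    return len(url) > max_length or len(params) > max_params

# Three query parameters exceed the (arbitrary) max_params=2 threshold:
print(looks_risky("http://example.com/shop?cat=12&item=993&sess=abc"))  # → True
print(looks_risky("http://example.com/about"))  # → False
```

Running such a check over a crawler's URL list would give a quick shortlist of candidates for URL rewriting.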
Posted 08 November 2007 - 04:39 PM
If you grab the Java version, it will calculate the internal PageRank flow within your site.
It may give some strong indicators of the pages most likely to end up in the supplemental index, which *apparently* is caused primarily by insufficient PR.