
Cre8asiteforums Internet Marketing
and Conversion Web Design



Can You Have Too Many Web Pages?


12 replies to this topic

#1 bwelford

bwelford

    Peacekeeper Administrator

  • Admin - Top Level
  • 8995 posts

Posted 22 April 2007 - 06:38 AM

John Scott of the v7 Network has an interesting post this morning entitled "Excess pages polluting your website?". He describes what he did as follows:

And I did not delete the threads - I simply moved them to a private, hidden, admin-access-only forum.

Within a couple of weeks, I started to see the remaining pages performing much better. Within two weeks, search engine referrals were up 7,000 per day.

I've noticed a significant increase in Google traffic over the last two weeks, and I've been wondering whether the Google algorithm has improved. I commented on John's blog that, if this were so, then in six weeks he could reverse the decision and see no harmful effect. That would be a decision not to be taken lightly, though.

What's your view on this? Is John Scott right?

#2 A.N.Onym

A.N.Onym

    Honored One Who Served Moderator Alumni

  • Invited Users For Labs
  • 4003 posts

Posted 22 April 2007 - 06:59 AM

I've heard a similar opinion somewhere before: too few backlinks combined with too many pages will push most of your pages beyond the visibility threshold. I think so too, as it can be explained naturally.

After you get solid backlinks, I'd think it'd be safe to increase the number of pages proportionally.

If John sees this thread, it'd be nice to know the ratio of pages moved to the total number of pages, as well as total traffic per day. I wonder whether those are comparable, or whether the change can be explained by a shift (if 7k per day is less than 1% of total traffic, for example).

Edited by A.N.Onym, 22 April 2007 - 07:04 AM.


#3 EGOL

EGOL

    Professor

  • Hall Of Fame
  • 5177 posts

Posted 22 April 2007 - 07:46 AM

I am glad that you brought this topic up for discussion. I've been thinking about it with regard to a blog that I would like to retool.

Let's say you have a site with a few hundred decent links. That might be enough to make a powerful niche site, but if you had a site the size of Wikipedia, you would have a ton of pages that don't get enough spidering to stay healthy in the index. I think that some of the juice would disappear into those deep pages and none of your pages would rank very well.


The blog that I am working on does news commentary and reviews. Lots of the pages now link to dead URLs or redirects on other sites. What should I do with those post pages? Here is my thinking:

1) pages that have links and get nice search traffic... these should be rewritten into small articles, updating the original post with a strong story that might attract links, then refeatured on the homepage because it is a proven popular topic.

2) pages that get a little search traffic... these should be kept; maybe edit them to say that the original reference link is no longer working, and perhaps find an alternative URL that visitors might like. Look at their SEO; maybe a few minutes of keyword research or reoptimization would pull a little more traffic.

3) pages that get little to no search traffic. Here is the dilemma... delete them? I like deleting them because I think that they add dead weight to my site - at least for Google. They might help a bit with the rankings of other pages on your site that they link to - at least on the MSN search engine. I don't like these posts because the topic has not generated traffic or links - they make my blog look untended to anyone who visits these pages.

I think that I will delete them from the blog. What do you think?

Edited by EGOL, 22 April 2007 - 08:45 AM.


#4 EGOL

EGOL

    Professor

  • Hall Of Fame
  • 5177 posts

Posted 22 April 2007 - 08:50 AM

I have also seen an increase in Google traffic over the past two weeks. Some of it is a result of better SERP position because a couple of my competitors got dropped out of the SERPs. So, it might not be improved rankings for my site - just some of the competition getting toasted. Also, Google did an update of image search that has brought in more traffic.

Traffic has been down a bit over the past two days - I am assuming that good weather in the east is attracting people outside!

#5 bragadocchio

bragadocchio

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 15634 posts

Posted 22 April 2007 - 08:52 AM

3) pages that get little to no search traffic. Here is the dilemma... delete them? I like deleting them because I think that they add dead weight to your site - at least for Google. They might help a bit with the rankings of other pages on your site that they link to - at least on the MSN search engine. I don't like these posts because the topic has not generated traffic or links - they make my blog look untended to anyone who visits these pages.



Might there be some value in going back and revisiting those posts, and adding updates? For some of them, what has happened since they were written may be interesting enough to justify it.

That could be done in conjunction with a new post on the topic, with a link pointing back to the original, and a note on the original pointing to the new link, explaining a little about why you might have wanted to do an update.

One of the more popular blog posts I've written was about "20 ways that search engines may rerank search results" and I pointed to a number of earlier blog posts in that post. Some of them had received traffic while others hadn't gotten much at the time that they were written or since. But, they provided a nice basis and background for the new post, and a chance for people to learn more than what was discussed in my overview post.

Did they help with search traffic? Without them, chances were good that the reranking post wouldn't have received the attention that it has.

Traffic has been down a bit over the past two days - I am assuming that good weather in the east is attracting people outside!


It's been unseasonably cold here on the east coast so far this spring, up until the last couple of days. There were a lot of folks around my neighborhood out running around this weekend. ;)

#6 bwelford

bwelford

    Peacekeeper Administrator

  • Admin - Top Level
  • 8995 posts

Posted 22 April 2007 - 09:07 AM

Well, perhaps I should state my position, since I probably represent one end of the spectrum of opinions.

Firstly, I believe that the dilution of PageRank transfer caused by the number of internal links on a web page is probably a minor issue. Unless you cut that number by an order of magnitude, or at least down to, say, a quarter of what it was, this won't help much.

On the other hand, I think it's good to have lots of content on the website, given the 'long tail' nature of searchers' keyword queries. So I would leave all web pages up. However, Bill's suggestion of revisiting web pages and editing them to make them stronger is excellent.

#7 AbleReach

AbleReach

    Peacekeeper Administrator

  • Admin - Top Level
  • 6457 posts

Posted 22 April 2007 - 01:15 PM

I think that the way blog posts are archived by date is a little weak. If I already remember that there was a neat post last year on some topic I may use the month by month archive listings, but I'm more likely to use topical search terms or look through a category listing. The only times I use the date-by-date links are to see how active the blog has been over time, if I don't know the blog, or to see what someone I know wrote earlier in the week.

Could sites that display a calendar interface for past links improve their link juice (and usability) by replacing it with a single link called "archives" or some such? Replacing a potential site-wide 30-link block with one link to an archive page would take control of those 30 or so links out of the hands (software has hands?) of the software. Give that room to the stars: posts that are the belles of the ball could be linked from a "featured posts", "favorite posts", or "popular posts" section. Plugins can do that, but once in a while the featured posts may need to include something not picked up on by a plugin that sorts them for you.
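A quick back-of-the-envelope way to see the effect: if a page's outbound weight is split evenly across its links (a simplifying assumption, not Google's actual formula), collapsing a 30-link calendar block into one "archives" link concentrates the share each remaining link gets. The link counts below are illustrative.

```python
# Illustrative arithmetic for AbleReach's suggestion, assuming a
# page's weight is split evenly across its outbound links.
def share_per_link(total_links):
    # Fraction of the page's passable weight each link receives.
    return 1.0 / total_links

with_calendar = 10 + 30      # 10 editorial links + a 30-link date block
with_archive_link = 10 + 1   # same page with a single "archives" link

print(share_per_link(with_calendar))      # 0.025 per link
print(share_per_link(with_archive_link))  # ~0.09 per link
```

Under this even-split assumption, each remaining link carries roughly 3.6 times more weight after the swap.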

Edited by AbleReach, 22 April 2007 - 01:16 PM.


#8 rustybrick

rustybrick

    Eyes Like Hawk Moderator

  • Moderators
  • 1361 posts

Posted 23 April 2007 - 06:01 AM

I'd just like to add that this may be totally external... There was a major Google shake-up on April 10th, which actually began a week prior but drew major notice on April 10th.

Interesting thread, nevertheless.

#9 JohnScott

JohnScott

    Mach 1 Member

  • Members
  • 350 posts

Posted 23 April 2007 - 10:51 AM

If we assume that the pages with the most inbound links carry the most weight, the rest is just logic.

http://www.v7n.com/f...b-designer.html

That is the type of thread that was removed. It isn't in Google's cache. It isn't indexed because it is too far away from any pages with link weight.

The page on v7n with the most links is the forum home page. Try to get to the example page from there and you will find it is a solid 4 clicks away from the forum home page.

Forum threads that are 3 clicks away from the forum home page tend to be indexed, and forum threads that are 2 clicks from the forum home page often generate search engine traffic.

But the way we have the forum set up, there are only 9,240 spots available within the 3-click range. Of course, we could easily change that and even double the number of threads within 3 clicks, but it would further dilute the link weight.

So, we have 9,240-odd spots. Should I fill those spots with threads titled "Hi I'm new", which will drive absolutely no search engine traffic, or should I fill those spots with threads that stand a chance of driving traffic?

:)
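John's 9,240 figure falls out of simple multiplication once you know the forum's pagination settings. A sketch of that capacity calculation, with hypothetical numbers chosen only because they happen to multiply to 9,240 (v7n's actual configuration isn't stated in the thread):

```python
# Rough model of how many forum threads can sit within 3 clicks of
# the forum home page. All inputs are hypothetical illustrations,
# not v7n's actual settings.

def threads_within_3_clicks(num_forums, listing_pages_linked, threads_per_page):
    # Click 1: home page -> a forum's first listing page.
    # Click 2: first listing page -> one of the paginated listings.
    # Click 3: listing -> an individual thread.
    return num_forums * listing_pages_linked * threads_per_page

# e.g. 22 forums, 14 listing pages reachable from each forum's
# first page, 30 threads per listing page:
print(threads_within_3_clicks(22, 14, 30))  # 9240
```

Doubling `threads_per_page` would double the capacity, which is the change John mentions - at the cost of more links per page, each passing less weight.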

#10 Ruud

Ruud

    Hall of Fame

  • Hall Of Fame
  • 4887 posts

Posted 23 April 2007 - 11:33 AM

Of course, this depends on what kind of site you have, and for whom. A company might not want to add too much content and might not want to remove older material either.

A forum is another beast. Pruning threads is in itself not a bad idea. If those are all "hi I'm new!" threads, then I don't see it negatively impacting search.

Purely working on gut feeling I would say that on a forum this large the gain in internal link distribution is similar to optimizing meta tags: if all is equal it helps a bit.

Let's say you have a site with a few hundred decent links. That might be enough to make a powerful niche site, but if you had a site the size of Wikipedia, you would have a ton of pages that don't get enough spidering to stay healthy in the index. I think that some of the juice would disappear into those deep pages and none of your pages would rank very well.


A variation of the "bleeding PageRank" idea?

I don't really believe in that. My view remains that every page is a site. For most purposes, sites don't exist; only documents do. If page A on your site has a given weight (call it PageRank or whatever) and you use that weight to vote for/lift other pages, page A does not itself lose weight.

Also, site A does not become a powerful player in a niche: single pages on site A do. If page A ranks well for "mini widgets", it will continue to rank high for that term whether site A consists of just that one page or of 100 pages.

Nor do I believe that regular spidering = healthy in the index. Google uses various factors to determine whether it should check a page again and, if so, with what priority. A document which remains unchanged over time, functions as always, and has no inbound link patterns suggesting a change needs to be spidered less often than one in constant flux.

The blog that I am working on does news commentary and reviews. Lots of the pages now link to dead URLs or redirects on other sites. What should I do with those post pages?


If this is a more or less consistent pattern, as can be the case with many news stories which disappear within 2-4 weeks, then you can use an automated solution. After 2-4 weeks, remove the link or replace it with a search for the title of the original item.
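Ruud's suggestion is easy to automate. A minimal sketch, where the 21-day cutoff and the search URL format are illustrative assumptions (any search engine's query URL would do):

```python
# Sketch of the automated fix Ruud describes: once a news link is
# older than a cutoff, swap it for a search on the story's headline.
# The cutoff and search URL are assumptions for illustration.
from datetime import date, timedelta
from urllib.parse import quote_plus

CUTOFF = timedelta(days=21)  # somewhere in the 2-4 week window

def replace_stale_link(href, title, posted_on, today=None):
    today = today or date.today()
    if today - posted_on > CUTOFF:
        # Point visitors at a search for the original headline instead.
        return "https://www.google.com/search?q=" + quote_plus(title)
    return href  # still fresh; keep the original link

print(replace_stale_link("http://example.com/story",
                         "mini widgets recalled",
                         posted_on=date(2007, 3, 1),
                         today=date(2007, 4, 24)))
```

A variant answering EGOL's follow-up question could first try an HTTP request to the target and only swap the link when it no longer resolves, rather than relying on age alone.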

#11 Halfdeck

Halfdeck

    Gravity Master Member

  • Members
  • 110 posts

Posted 24 April 2007 - 03:35 AM

Firstly, I believe that the dilution of PageRank transfer caused by the number of internal links on a web page is probably a minor issue.


Dilution of PageRank isn't a minor issue. Why? You need to satisfy a minimum PageRank to stay in the main index.

Think of IBL juice as ... a pizza pie. Weak IBL profile = small pizza. Strong IBL profile = extra large. You can cut your pie up into as many slices as you want, but if you ordered a small, do you really want to cut that up into 10,000 slices?
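The pizza analogy can be put in numbers. Under the simplifying assumptions that a site's inbound "juice" is a fixed quantity split evenly across its pages, and that there is some minimum per-page weight needed to stay in the main index (both assumptions of this sketch; the threshold value is made up):

```python
# Halfdeck's pizza analogy in numbers: a fixed pie cut into more
# slices means thinner slices. The threshold is a hypothetical
# stand-in for whatever minimum Google actually requires.
MAIN_INDEX_THRESHOLD = 0.001

def pages_in_main_index(total_juice, num_pages):
    per_page = total_juice / num_pages  # even split, for illustration
    return num_pages if per_page >= MAIN_INDEX_THRESHOLD else 0

small_pie = 1.0  # weak inbound-link profile
print(pages_in_main_index(small_pie, 500))     # all 500 slices are big enough
print(pages_in_main_index(small_pie, 10_000))  # sliced too thin: 0 stay in
```

The point of the model: with a small pie, adding pages past some count doesn't just dilute rankings gradually, it can push every page below the threshold.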

#12 A.N.Onym

A.N.Onym

    Honored One Who Served Moderator Alumni

  • Invited Users For Labs
  • 4003 posts

Posted 24 April 2007 - 03:55 AM

Well, it is known that a page can only pass a certain amount of weight, which is distributed among the links on the page. If each of your billion pages gets 1/billionth of the homepage's weight (in theory), they will be invisible to the search engines - unlike when each of a dozen pages bears substantial weight from the homepage (and internal links).

Then again, we shouldn't forget that each page links back to the homepage, thus increasing its weight. So the issue isn't as one-sided as it seems.
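Both halves of this point - weight flowing out through links and flowing back in - show up in a toy PageRank computation on a hub-and-spoke site. This is the textbook iterative PageRank formula with the usual 0.85 damping factor, run on a made-up four-page site, not anything specific to Google's current implementation:

```python
# Toy PageRank on a hub-and-spoke site: the homepage links out to
# three inner pages, and each inner page links back to the homepage.
def pagerank(links, iters=50, d=0.85):
    pages = list(links)
    pr = {p: 1.0 / len(pages) for p in pages}  # start uniform
    for _ in range(iters):
        new = {p: (1 - d) / len(pages) for p in pages}
        for p, outs in links.items():
            share = d * pr[p] / len(outs)  # weight split across outlinks
            for q in outs:
                new[q] += share
        pr = new
    return pr

links = {"home": ["a", "b", "c"],
         "a": ["home"], "b": ["home"], "c": ["home"]}
pr = pagerank(links)
print(pr["home"], pr["a"])  # the hub ends up with the largest share
```

Even though the homepage splits its weight three ways, the back-links return it, so the homepage ends up with far more weight than any spoke - which is A.N.Onym's "not as one-sided as it seems" point.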

#13 EGOL

EGOL

    Professor

  • Hall Of Fame
  • 5177 posts

Posted 24 April 2007 - 07:29 AM

If this is a more or less consistent pattern, as can be the case with many news stories which disappear within 2-4 weeks, then you can use an automated solution. After 2-4 weeks, remove the link or replace it with a search for the title of the original item.

These are great ideas! Thanks for sharing them. One question... would this automated solution be based on time (such as deleting after two weeks), or could it be based on the presence of the content on the target site?

Well, it is known that a page can only pass a certain amount of weight, which is distributed among the links on its page.

I believe that a page can pass two factors... 1) importance (as defined by the number and power of its inbound links), and 2) relevance (as defined by the topic of the pages that link to it).

I think that relevance flows through an entire site without resistance or dilution. However, the amount of importance passed decreases with each step along the path. One of these makes Wikipedia the dominant force on the internet, and the other gives my site the ability to defeat Wikipedia in specific SERPs. Webmasters who understand how best to play these can consistently defeat sites of superior strength - up to a point.
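EGOL's two-factor model can be sketched numerically. In this sketch, importance decays by a constant factor per click of depth while relevance passes through unchanged; the 0.85 decay factor is borrowed from the common PageRank damping value purely as an illustrative assumption, not a known Google constant.

```python
# A sketch of EGOL's model: relevance flows undiminished, while
# importance decays with each click away from the strongest page.
# The decay factor is an assumption for illustration only.
DECAY = 0.85

def importance_at_depth(home_importance, depth):
    # Importance shrinks geometrically with click depth.
    return home_importance * DECAY ** depth

def relevance_at_depth(home_relevance, depth):
    # Relevance passes through "without resistance or dilution".
    return home_relevance

for depth in range(4):
    print(depth,
          round(importance_at_depth(100, depth), 1),
          relevance_at_depth("widgets", depth))
```

Under this model, a huge site's sheer importance wins broad queries, while a small, tightly themed site whose relevance is undiluted at every depth can still win specific SERPs - matching EGOL's Wikipedia observation.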

Edited by EGOL, 24 April 2007 - 07:30 AM.



