Footers Go Back To The Future
Posted 19 January 2013 - 06:41 PM
I remember being intrigued by Barry's experiment/examples but felt they weren't appropriate for my sites even as an experiment - I'll get to my main reason why not later in this post. And - I'm sure he'll correct me if I'm wrong - I believe he eventually went back to more conventional footers.
The foundation of both Jill's and Ann's articles is a set of quite simple thresholds:
* if it's only in the footer for SE consumption it shouldn't be there.
* if it looks spammy it probably is and shouldn't be there.
* footer content should be relevant and useful, clean and concise; what is commonly expected.
The comments are quite revealing: spamming the crap out of the footer did and still does 'work'. However, I don't recall anyone defining 'work' in this context so will just let it stand and accept that for their purposes it provides the desired results.
Note: I presume, but don't know, that they mean they get the page returned for additional search query terms. Why they can't be bothered to incorporate such within page content is beyond me. I suspect that while they may draw additional query traffic it doesn't convert as well as it might. Of course they may also not be bothered to track such a mundane business measure.
My main concern with Barry's heavyweight footer was never spam as such - not Barry's thing so far as I know - but link value dilution, which Jill also mentions. Shortly after Google became prominent several PageRank flow emulators appeared; indeed I built one myself. And I've used it happily ever since, not because I care about PageRank per se, but because it allows me to see how links direct flow from page to page; how dampening factors, and subsequent URL link devaluing, affect that flow; which pages are dry or only getting a trickle; which pages are being flooded; and where to add or subtract links to best even out or extend flow... Yes, my site is a waterworks, the links are the pipes, and each page needs some threshold amount of flow to be viable...
In the simplest model, if a page has a value of 100, there are 10 outgoing links (intra/inter-site), and the dampening factor is 10%, then each link flows a value of 9 (100/10, less 10%). If one then loads up the footer with 10 more links, each link now flows a value of 4.5 (100/20, less 10%). One may be getting traffic for additional terms, but ranking may have been affected for some others.
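That dilution arithmetic is easy to sketch in a few lines. This is a toy model of the numbers above only - the function name, the even split, and the flat 10% dampening are my assumptions for illustration, not any SE's actual algorithm:

```python
# Minimal sketch of the simple link-flow model described above.
# Assumption: page value is split evenly across outgoing links,
# then reduced by a flat dampening factor.

def link_flow(page_value, outgoing_links, damping=0.10):
    """Value flowing through each outgoing link."""
    return (page_value / outgoing_links) * (1 - damping)

# A page worth 100 with 10 outgoing links passes 9 per link...
print(link_flow(100, 10))  # 9.0
# ...but doubling the link count via the footer halves each link's flow.
print(link_flow(100, 20))  # 4.5
```

Every footer link added is not free: it taxes the flow through every other link on the page.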
Of course, the more 'real' the inputs, values, thresholds, dampening factors, et al (only each SE knows for certain), the more complex the model. Some values may be page specific; others page summed and site applied; page or site values weighted against niche averages/means, etc. For instance, Google's statement that only the first page link to a URL is counted means that duplicating main navigation links in the footer is of neutral value - except, of course, to any visitors who follow them.
Given that SEs can apparently segment a page reliably - separating main from secondary navigation, header from sidebar from content from footer, etc. - it seems reasonable that links in each such area would be given either varying weight or a varied dampening factor. And so judging flow becomes complicated as well as complex.
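One hypothetical way such segment weighting could work: split the page value as before, then scale each link's share by the weight of the segment it sits in. The segment names and weight values below are pure guesses for illustration - nobody outside the SEs knows the real figures:

```python
# Hypothetical segment-weighted variant of the simple flow model.
# The weights are invented for illustration; only each SE knows the real ones.

SEGMENT_WEIGHT = {"content": 1.0, "main_nav": 0.8, "sidebar": 0.5, "footer": 0.25}

def weighted_flows(page_value, links, damping=0.10):
    """links: list of (url, segment) pairs.
    Split page value evenly by raw link count, apply dampening,
    then scale each link's share by its segment's weight."""
    share = (page_value / len(links)) * (1 - damping)
    return {url: share * SEGMENT_WEIGHT[segment] for url, segment in links}

links = [("/a", "content"), ("/b", "main_nav"), ("/c", "footer"), ("/d", "footer")]
for url, value in weighted_flows(100, links).items():
    print(url, value)
```

Under these assumed weights, a footer link still dilutes every other link's share of the page value, while passing on only a fraction of its own - the worst of both.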
Regardless, links transmit values as well as visitors. The more links from a page the greater the probability of diluting the values carried by each link. The only counterweight besides additional backlinks to the page is building what may be countervailing site values such as trust or accumulative site values such as authority.
Whether Google or any other SE is discounting/penalising spammy footer links is perhaps up in the air; but they could, and given other tightenings, i.e. the B&W Swimming Bird, they are, or shortly will be, targeting such to some extent. Just because something 'works' may not be all that needs to be considered: what else is affected both on page and on subsequent linked pages; and is it good, bad, or indifferent?
And a final question: is being a little bit spammy similar to being a little bit pregnant?
Posted 20 January 2013 - 10:41 AM
I was really expanding on the footer principle that Darren Rowse was applying in his Problogger blog at the time. My thinking was that if anyone did a jump to the bottom of the page then the whole screen would be filled with useful links to other parts of my online properties. I had buttons at the top and the bottom that allowed you to jump from one to the other.
It was clearly intended for any visitor who was intrigued enough to want to explore other things I had done. However, it was unexpected and I don't think most visitors (if they got that far) found it useful. So I now rely on the upper parts of the web page, and my footers are minimal, with just the 'small print' it is useful to have on the page.
In terms of the Google view on all this, I go along with Jill Whalen's thinking.
Posted 20 January 2013 - 05:39 PM
Just because something 'works' may not be all that needs to be considered: what else is affected both on page and on subsequent linked pages; and is it good, bad, or indifferent?
So true... And something I have to explain almost daily!
I feel so old when someone mentions my article and I need to go back and find it to refresh my memory. I used to know everything I'd written by heart!