Discussing Web Design & Marketing Since 1998


About phaithful

  • Rank
    Light Speed Member
  1. Per the conversation on Twitter, @Kim: there is a lot of academic research on "anonymity / second-screen" behavior and rudeness or disinhibition. You can search the ACM digital library (http://dl.acm.org/results.cfm?h=1&cfid=556599021&cftoken=73166031) for "online disinhibition". There are also plenty of articles around the web that discuss online trolling and some of the psychology behind it:
    Why We Are So Rude Online » http://www.wsj.com/articles/SB10000872396390444592404578030351784405148
    Why Is Everyone on the Internet So Angry? » http://www.scientificamerican.com/article/why-is-everyone-on-the-internet-so-angry/
    Online disinhibition and the psychology of trolling » http://www.wired.co.uk/news/archive/2013-05/30/online-aggression
  2. That's an interesting question. My gut feeling is no, that shouldn't have any effect on canonicalization. In almost every server system I know of, www.mysite.com/ should be equivalent to www.mysite.com. However, you could technically serve two distinct pages if you set up your server that way, which leads me to believe it's probably safer to update your breadcrumbs to the canonical URL you want (probably without the slash).
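To avoid any ambiguity, it helps to run every internal URL (breadcrumbs included) through one canonicalization routine. A minimal sketch, assuming a hypothetical policy of lowercasing the host, dropping fragments, and stripping the trailing slash:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, strip_trailing_slash=True):
    """Normalize a URL to one canonical form (hypothetical policy:
    lowercase host, drop fragment, optionally drop the trailing slash)."""
    parts = urlsplit(url)
    path = parts.path
    if strip_trailing_slash and path.endswith("/") and path != "/":
        path = path.rstrip("/")
    return urlunsplit((parts.scheme, parts.netloc.lower(), path, parts.query, ""))

print(canonicalize("http://www.mysite.com/products/"))
# → http://www.mysite.com/products
```

Whatever policy you pick matters less than applying it everywhere consistently, so your breadcrumbs, sitemap, and rel=canonical all agree.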
  3. You might want to check out Dave Naylor's Keyword Density Tool. Of course it does more than just keyword density analysis. I think it's relatively equivalent to what you're looking for. The only caveat is that the content needs to be accessible via a URL ... can't just copy and paste the content.
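If the copy-and-paste limitation is a deal-breaker, a rough keyword-density count is easy to script yourself. A minimal sketch (no stemming or stop-word handling, unlike a full tool):

```python
import re
from collections import Counter

def keyword_density(text, top_n=5):
    """Rough keyword-density check: the share of total words each term
    makes up. A simplistic stand-in for a full analysis tool."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    return [(w, n / total) for w, n in Counter(words).most_common(top_n)]
```

Feed it the page copy directly and you get back the top terms with their density, pasted content and all.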
  4. I would say Aaron Wall is well respected in the industry, and many people have recommended his SEO Book. I haven't personally purchased it myself, so I can't comment on its worth. However, if you're looking for a good source of free information, I'd suggest taking a look at SEOmoz's Beginner's Guide. There's also a more updated, yet incomplete, version. Also check out DazzlinDonna's post, Beginner and Comprehensive Guides to SEO. It's a very comprehensive list of the free and paid resources out on the intertubes.
  5. phaithful

    Help On Geo Targeting

    I'm not sure if this is too helpful, but I do know a few people using MaxMind for their GeoIP lookup. It's a paid service, but if you only need it at the country level it's only $50 / site with a $12 / month update fee. Not a whole lot of money if you're actually making transactions on a regular basis.
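Under the hood, a country-level GeoIP lookup is essentially a binary search over a sorted table of IP ranges. A toy sketch with a hypothetical two-entry table (a real service like MaxMind ships a database of millions of such ranges):

```python
import bisect
import ipaddress

# Hypothetical, tiny IP -> country table for illustration only.
RANGES = [  # (range start as int, range end as int, country code)
    (int(ipaddress.ip_address("3.0.0.0")), int(ipaddress.ip_address("3.255.255.255")), "US"),
    (int(ipaddress.ip_address("81.2.69.0")), int(ipaddress.ip_address("81.2.69.255")), "GB"),
]
STARTS = [r[0] for r in RANGES]

def country_for(ip):
    """Binary-search the range table for the IP's country, or None."""
    n = int(ipaddress.ip_address(ip))
    i = bisect.bisect_right(STARTS, n) - 1
    if i >= 0 and RANGES[i][0] <= n <= RANGES[i][1]:
        return RANGES[i][2]
    return None
```

The paid part of a service like MaxMind is really the data and its monthly updates, not the lookup logic, which is why the update fee exists.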
  6. phaithful

    Pagerank Question

    I posed a similar question on a separate forum, and two people I would consider experts in IR as it pertains to SEO agree with Barry and iamlost that CTR is still too noisy a signal to be utilized in the wild. I still think there's a case for engines to use 'searcher behavior' metrics, but rather than mislead anyone reading this thread: in general, the expert opinion leans toward CTR not being a current ranking factor, due to the noise that 'position click bias' introduces and how easily adversarial IR (AIR) can add further noise to the signal.
  7. You'll probably have to do a bit more troubleshooting to determine where the problem lies. If these are old pages that received a good deal of traffic and the new 302 links were added to them, it could very well be that. However, if these are new pages, just adding them shouldn't drastically decrease your SEO traffic to other pages by 5x. Having a large number of redirects on a page doesn't inherently equate to a penalty. However, the page does look somewhat like a doorway page for explorimmo, since almost all the links excluding the navigation are 302s pointing at explorimmo.

    Is there a particular reason why you're using a redirect URL as opposed to linking directly? Is it solely for analytics tracking purposes? If so, I'd recommend a different tracking implementation, mostly because redirects like that have historically been abused by nefarious hackers, something Google and the major engines are very much aware of. Again, it may not be the cause of your problems, but it is risky to have links like http://www.immobilier.com/redirection?url=.../www.google.com, where google.com could just as easily be a very bad URL that leads to a phishing site. I'd also check your redirection script's logs to see if any baddies are using it, which could definitely be a reason why you've dropped in rankings.
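If the redirect script has to stay for tracking, the usual fix for the open-redirect risk is to validate the target host against an allowlist before redirecting. A minimal sketch, with a hypothetical allowlist:

```python
from urllib.parse import urlsplit

# Hypothetical allowlist: only hosts you actually intend to redirect to.
ALLOWED_HOSTS = {"www.explorimmo.com", "explorimmo.com"}

def safe_redirect_target(url):
    """Return the URL if its host is on the allowlist, else a safe fallback.
    This closes the open-redirect hole that phishers abuse."""
    host = urlsplit(url).hostname or ""
    return url if host.lower() in ALLOWED_HOSTS else "/"
```

With a check like this in front of the redirect, a crafted ?url= pointing anywhere else simply falls back to your homepage instead of bouncing visitors to a phishing site.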
  8. phaithful

    Pagerank Question

    This issue is also addressed in some of the papers. It's also why CTR would be only one signal of many considered, similar to a PPC campaign where CTR is only one factor of many. One way engines may be addressing this issue is through factors like Query Deserves Freshness (QDF) or the new-site ranking boost that most people see last from a couple of days to a couple of weeks. This should allow the engines to collect CTR and engagement metrics and then readjust the rankings accordingly. Agreed!
  9. phaithful

    Pagerank Question

    I'd actually contend that there is a strong likelihood that the major engines are using some sort of SERP engagement factor (e.g. CTR, bounce rate, query refinements, etc.) to determine the relevancy of listings. Of course, no one really knows for sure, and research papers are not a clear indicator of usage, but the IR and adversarial IR (AIR) papers that address SERP metrics such as clickthrough rate have been numerous since 2004, and many of them have originated from contributors to the major search engines over the past 5 years. Of course, I agree with iamlost that certain signals are fairly easy to game (which is addressed in a few of those papers). However, I would imagine that even a basic Bayesian algorithm could derive a signal with a decently high level of confidence for popular head queries that have a large pool of sources (e.g. millions of clicks from millions of users). With that said, I have no expectations of how significant an effect SERP engagement factors have on search rankings, and I agree that links from authoritative sources probably have an orders-of-magnitude larger impact in comparison, just as EGOL has pointed out.
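To illustrate the Bayesian point: under a uniform Beta prior, the posterior over a listing's "true" CTR tightens quickly with volume, so a head query with millions of impressions gives a trustworthy signal while a sparse tail query doesn't. A rough sketch with made-up click counts:

```python
# Minimal sketch of the idea: with a Beta(1, 1) prior, the posterior over a
# listing's true CTR after `clicks` of `impressions` is
# Beta(clicks + 1, impressions - clicks + 1). More data -> tighter estimate.

def ctr_posterior(clicks, impressions):
    """Posterior mean and variance of CTR under a uniform Beta prior."""
    a, b = clicks + 1, impressions - clicks + 1
    mean = a / (a + b)
    var = (a * b) / ((a + b) ** 2 * (a + b + 1))
    return mean, var

head_mean, head_var = ctr_posterior(200_000, 1_000_000)  # popular head query
tail_mean, tail_var = ctr_posterior(2, 10)               # sparse tail query
```

The head query's posterior variance is tiny, so an engine could act on it with confidence; the tail query's variance is orders of magnitude larger, which is exactly where the noise argument bites.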
  10. phaithful

    Pagerank Question

    I wouldn't be so sure that a single click from page 13 would move you to page 2 or 3, but I am a believer that engagement metrics, such as CTR, are signals that have an effect on search rankings.
  11. Those redirects just look like simple 302 redirects to explorimmo.com ... I wouldn't say they're so obviously problematic that they would affect your rankings for immobilier. Are these links relatively new, or are there other changes you've made to the site that you suspect could contribute to the changes for immobilier?
  12. Check out Ping-o-Matic. Basically, pinging is a way to notify engines or services that you have new content. Most blogging CMSes have this built in or available as a module. You can find a list of ping services here: Ping Services ... but using something like Ping-O-Matic is simpler.
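For the curious, the ping itself is just a tiny XML-RPC call (weblogUpdates.ping) carrying your site's name and URL. A sketch that builds the payload with Python's standard library (the site name and URL are hypothetical):

```python
import xmlrpc.client

# Build the weblogUpdates.ping payload that ping services accept.
# (Site name and URL below are placeholder values.)
payload = xmlrpc.client.dumps(
    ("My Blog", "http://www.example.com/blog"),  # site name, site URL
    methodname="weblogUpdates.ping",
)
```

To actually send it, you would POST that payload to a ping service's XML-RPC endpoint (or let xmlrpc.client.ServerProxy handle the transport for you), which is all a relay like Ping-O-Matic does on your behalf.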
  13. For the most part your site should get "indexed" fairly quickly (a matter of days to weeks) by the major search engines, as long as there's at least a handful of external backlinks to your site from sites that get crawled regularly. As for "ALL" your pages, that may never happen, depending on the size of your site. The reason is that engines have a finite amount of time and bandwidth to crawl the web. There are many different algorithms for determining which pages they'll crawl next (e.g. depth-first, breadth-first, best-first, etc.), and I tend to think the major engines use some sort of "best first" approach. So unless ALL your pages have a good number of backlinks, they probably won't all make it into the index.

    Also, there's a big difference between making it into the index and actually being ranked. Even if all your pages have been crawled (which essentially means they've been indexed), the pages may never actually appear in a search result page (SERP). That gets into how engines rank pages, the main index vs. the supplementary index, and precision & recall. I would say: don't be too concerned that ALL your pages rank, but make sure the important pages get indexed and ranked. Which pages are important depends on your business model, but think conversions instead of just traffic.
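The "best first" idea is easy to picture as a priority queue over the known-but-unfetched frontier. A toy sketch where the priority score stands in for something like backlink count (the graph and scores below are made up):

```python
import heapq

def best_first_crawl(graph, scores, seed, budget):
    """Toy best-first crawler: always fetch the highest-scored known URL next.
    `graph` maps page -> outlinks; `scores` is a hypothetical priority
    (e.g. backlink count); `budget` caps how many pages get fetched."""
    frontier = [(-scores.get(seed, 0), seed)]  # max-heap via negated scores
    seen, order = {seed}, []
    while frontier and len(order) < budget:
        _, page = heapq.heappop(frontier)
        order.append(page)
        for link in graph.get(page, []):
            if link not in seen:
                seen.add(link)
                heapq.heappush(frontier, (-scores.get(link, 0), link))
    return order
```

Because the budget runs out before the frontier does, low-scored pages may simply never be fetched, which is the intuition behind why poorly linked pages can stay out of the index indefinitely.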
  14. phaithful

    Image File Name

    In a word, yes. But by what magnitude ... probably not a whole lot, though every little bit helps. Things like images are natural occurrences on web pages and can enhance the overall user experience (UX) of your site. Just make sure your images, videos, or other media are relevant to the content of the page. Also, don't forget that captions can help the engines understand what is happening in the picture. You can use HTML's `<figcaption>` element (inside a `<figure>`) to semantically tag the captions for your images; the `<caption>` tag is specifically for tables.
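As a quick reference, the semantic caption markup looks like this (the image path, alt text, and caption are hypothetical):

```python
# Sketch of semantically captioned image markup, built as a plain string.
figure_html = (
    "<figure>"
    '<img src="/img/golden-gate.jpg" alt="Golden Gate Bridge at sunset">'
    "<figcaption>The Golden Gate Bridge at sunset.</figcaption>"
    "</figure>"
)
```

The caption lives in markup the engines can parse, right next to the image it describes, instead of floating somewhere in surrounding text.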
  15. I'm not much of a .NET guy; Python is my game ... But you're correct: if you're pumping things out in C++ and your developer can handle the memory allocation and freeing, you'll definitely have a speedy program on your hands (otherwise it'll be fast, but crash over time due to memory leaks). I'm not a big Java guy, but I do appreciate the automatic garbage collection that C lacks. Like I said, I don't think there really is a good mass-market solution, especially for your particular needs. Since you have the horsepower, you might as well use it. Elance.com or Guru.com have developers you can contract who could probably build what you need if you just write out the business and functional requirements. I know it's less than ideal, but from my research on SEO tools, I only see 3 viable options to capture everything you need:

    1. Use the free tools, but you'll have to manually make the requests page by page (slow and a pain in the backside ... unless you have an intern).
    2. Use the paid tools, but they're mostly web apps, which means a monthly cost.
    3. Contract out / build the tools yourself.

    In the long run, building the tool yourself is probably cheaper, but only if the engines don't change. Even the popular desktop tools I've seen charge for updates ... so in the end, they're not really much more cost-effective than the monthly web services. But in your case, where you're dealing with 2,000+ domains and possibly tracking 10k+ subdomains, I'd go with contracting it out, and making sure they build it so it can scale.