EGOL
Hall Of Fame

  • Content count: 5,960
  • Joined
  • Last visited
  • Days Won: 187

EGOL last won the day on November 6

EGOL had the most liked content!

Community Reputation: 892 Excellent

About EGOL
  • Rank: Professor

Recent Profile Visitors: 47,634 profile views
  1. I started the site that I spend most of my time on about 13 years ago. It had one article when it went live. For a couple of years, every article was linked to from the homepage, and the longer the homepage became, the more time people spent on the website. Today the site is way too big to put everything on the homepage, and it has grown to the point that the homepage and some of the category pages have well over 100 links on them. On the homepage and most of the category pages, each link has an image, anchor text, and a one-sentence description. People do scroll all the way down and click items in the bottom row.

     A few years ago, Donna created a content manager for me that kept track of how often each link was clicked. Then cron jobs, which ran several times per day, would rearrange the pages, moving more frequently clicked links up one position. With that system, you can add new content, improved content, or seasonal content at the top and allow it to float or sink according to its popularity. On those dynamic pages with over 100 links, a popular link can be placed at the bottom of the page and it will be up at the top within a week or so, depending upon its popularity. We can also use weighting factors to make individual items more or less buoyant, depending upon how much promotion we want them to have.

     Why? You can place a popular item in the bottom row and it will get hammered. So I do know for a fact that lots of people will scroll all the way down a really deep page and click stuff at the bottom. I suspect that browse depth depends upon the type of website, how enthusiastic visitors are about the information, and how well the people who run the site present the content. If you have a news site, a system like the one described above might make sense: new stuff is added to the top and it floats or sinks depending upon its popularity and upon how heavily new stuff is piled on.

     Also, without a system like that, the number of links on a page can become almost irrelevant if the links are organized properly. Let's imagine that you have a website about widgets from different countries, and also imagine that widgets are made from brass, wood, stone, and plastic. Your widgets page could have four columns, one for each material, with the countries in alphabetical order in each column. Four different materials, each from about 150 countries -- that's six hundred links on that page, and people will happily scroll down to click the Zambia links if they want them. Again, popular stuff at the bottom of a deep page will get hammered if people want it.
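     The reordering scheme above can be sketched in a few lines. This is a hypothetical reconstruction, not the actual content manager Donna built; the field names ("clicks", "buoyancy") and the one-position-per-run rule are assumptions based on the description.

```python
# Hypothetical sketch of the click-popularity reordering described above.
# Field names and the one-swap-per-run rule are assumptions, not the
# real system.

def weighted_clicks(link):
    """Clicks scaled by an optional 'buoyancy' multiplier
    (> 1 promotes an item, < 1 makes it sink faster)."""
    return link["clicks"] * link.get("buoyancy", 1.0)

def cron_pass(links):
    """One cron-run pass over the page, top to bottom: a link swaps up
    one position when it out-clicks the link directly above it, so a
    popular item placed at the bottom floats up gradually over several
    runs rather than jumping straight to the top."""
    links = list(links)  # don't mutate the caller's ordering
    for i in range(1, len(links)):
        if weighted_clicks(links[i]) > weighted_clicks(links[i - 1]):
            links[i - 1], links[i] = links[i], links[i - 1]
    return links

# A new seasonal item can be given extra buoyancy so it rises faster:
page = [
    {"url": "/widgets/brass", "clicks": 120},
    {"url": "/widgets/wood", "clicks": 80},
    {"url": "/seasonal/holiday", "clicks": 50, "buoyancy": 3.0},
]
page = cron_pass(page)  # holiday (weighted 150) swaps above wood
```

     Run several times a day from cron, each pass moves a link at most one slot up, which matches the "rises to the top within a week or so" behavior; items that stop being clicked sink the same way.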
  2. Nice work by Ian Carroll. I would have been fooled by the green text. ... but I think that anyone can get a security certificate and those who issue them do very little to determine, and probably have no ability to determine, if the security certificate is issued to a criminal person, criminal company or even to a dog.
  3. I don't like it.... and somebody is making money selling our personal information... to scam callers, to junk emailers, junk snailmailers, etc.... BUT... if you have a biz, your information is much more valuable, and the professional scammers are after you.
  4. Wow. That's amazing. Frightening at what it might produce.
  5. This is some of the worst weasel stuff that I have seen in a while. And those bigass Rolex ads on the Quartz website probably pull down a fortune.
  6. lol Right! That's what it is. I enjoy when you make things so simple.

     Google talks about "looking at the author" to determine if they are expert, authoritative, and trustworthy. Google calls this E-A-T, and it is an important part of their Search Quality Evaluator Guidelines, although those Guidelines are written for human raters. The goal is to find information on the website being evaluated, or on other websites of importance, that can be used as a proxy for that subjective judgment. If "fake news" and inaccurate content are getting through, then Google isn't doing a good job of looking at the author or looking at the publisher. I'd think that it would not be very hard for an organization with the resources of Google to accomplish.

     I keep seeing prattle on garbage websites of the eHow variety rank above organizations that are small but the "world's foremost authority" in their field, with dozens of degreed and certified experts on their staff. That simply shows that Google is giving links enormous weight over E-A-T - even for "Your Money or Your Life" topics. So, until Google "learns" that organizations staffed by the world's leading experts should trump links, their "machine learning" sucks... and before they can build "artificial intelligence" they have to have subjective common sense as a foundation.
  7. An interesting article in The New York Times about "fake news" and how it might be filtered out by search engines. The article then morphs into "marginalized voices": https://www.nytimes.com/2017/09/26/technology/google-search-bias-claims.html

     It brings up interesting questions about what is "fake" and what is "marginalized"... and then whether this stuff should be censored, filtered, demoted, or whatever. Perhaps they are looking at the wrong angles. Perhaps they should be looking for "what is accurate"... "what is true"... A "socialist" or an "evangelist" website of any kind should not be filtered, censored, or demoted if it is accurate. At the same time you have sites like The Onion that write goofy stuff for its entertainment value. That's very different from advertising hiding behind a headline or an image, where the advertising purpose is not disclosed - or worse yet, not understood until the reader gets down to the last line... and maybe some readers never realize that they were snookered. This isn't a simple question, and it goes far beyond news and advertising, into content that is produced without any regard for accuracy and expertise... it was only made for AdSense.
  8. China 50, United States 10?

    Damn.... amazing... holy smoke....
  9. Those with the lowest skill simply pick a topic and blather. It matters not if their blather makes sense, or even if it is correct and precise. They just want to make text on a page that search engines will index so they can collect money from ads.

     Others with low skill visit a number of relevant sites, grab sentences here and there, sort them by topic, slightly edit each one, and then make a page of one-sentence paragraphs. This relieves them from writing in a way that makes their sentences work together. It is a page of "ideas" that Google might rank well - especially if it is a long page.

     Some with more skill go to your site, copy/paste your content into their word processor, and do a sentence-by-sentence rewrite, keeping the paragraph structure. Google likes this even better. Both of the above hyperlink relevant words to amazon.com and make buckets of money for very little work.

     Then you have the ones who will do it at scale. They pick one of the methods above, have a computer scrape the content, do an automated rewording, and toss up a few websites per day.

     Google has been talking about E-A-T and YMYL (explained here by Julia Spence-McCoy). I think that they have promise, but I have not seen them sorting the SERPs yet. I see websites owned by the world's foremost authority in certain fields outranked consistently by sites run by scamps who produce nothing but blather - even blather with negative YMYL implications of a medical nature.
  10. Yes. Some of the grammar rules that I learned suggest that "it's" would be possessive. Contractions in the writing that I read don't bother me very much... but it's obvious that Melissa Fach really dislikes them. I like to use two spaces after a period, and you can get into some huge arguments over that. Folk can spend their energy on whatever they want. :-)
  11. I enjoyed a few of the items in her numbered list...... This one was the best..... You can have advanced degrees, certifications, awards, and 40 years of experience and still not know everything about your topic - or have the ability to communicate it well. A lot of the content on the web was produced by "writers" with no experience or education who don't know what they don't know. Fake news, fake content... this fake stuff is everywhere.
  12. Most sites move to https with few problems.

     =================================

     One possible problem area is redirecting the http version of the site to the https version. If there are old redirects, and all of them are not accounted for and redirected again properly, then the link value into the historic URLs might not be passed on to the https version of the site. If your site had a lot of historic redirects, it might pay to have someone who knows a lot about htaccess take a look at how yours are being handled.

     =================================

     Another problem is when all of the possible URLs have not been addressed and some of these...

     http://www.xyzed.com
     https://www.xyzed.com
     http://xyzed.com
     https://xyzed.com

     ... are not being accommodated.

     ================================

     And, some folks have speculated that when a site goes from http to https it receives a quality review, and if it doesn't satisfy the review the traffic starts slowly slipping. I can't point to any references on this, but I do know of sites that have been on a month-by-month traffic decline for six months to a year after converting to https.
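     A minimal sketch of the canonical-host redirect described above, assuming an Apache server with mod_rewrite enabled and https://www.xyzed.com chosen as the canonical form (xyzed.com is the example domain from the post, not a real site):

```apache
# Send every http:// request, and every https:// request on the bare
# domain, to the canonical https://www.xyzed.com URL in a single 301.
RewriteEngine On

# Fire when the request is not HTTPS, OR the host is not the www form.
RewriteCond %{HTTPS} !=on [OR]
RewriteCond %{HTTP_HOST} !^www\.xyzed\.com$ [NC]
RewriteRule ^(.*)$ https://www.xyzed.com/$1 [R=301,L]
```

     Any legacy redirects in the same htaccess file should be checked against this rule so old URLs reach their https destination in as few 301 hops as possible; long redirect chains are exactly where the link value mentioned above can get lost.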
  13. Followed the link above to a post iamlost made on webmasterworld.... and kept reading some of his posts from earlier this year. He keeps saying he is semi-retired.... that means.... details here if the quote below is not satisfying. Something to think about when you are competing with people in other countries or when people start copying your widgets... (more detail here) Enjoyed those above and many more.... Thanks!
  14. The past couple of weeks have been extremely volatile for the SERPs that I watch. Huge ups, huge downs. Bigger than I have seen in a long time.

     <rant about mobile> Honestly, most people have converted their websites to some format of mobile... I think that a lot of this has been done blindly and in haste; lots of people just slapped this up.

     - how do you do it for SEO when you don't know what Google is lookin' at?
     - how do you do it to make a pleasing website experience?
     - how do you do it to maximize visitor engagement?
     - how do you do it to maximize ad revenue?
     - how do you do it to maximize sales?
     - how do you do it for speed yet maintain features and functions?

     Really, does anybody understand it? Lots of people talk a good talk, but we know how people bullshit. They are guessing at it. Lots of people have sites that might have [should have] pages in a number of different formats.

     I am still trying to understand these things for desktop, and I have been experimenting with them for a long time - and I have enough traffic that I get experimental results quickly... but experimenting is complex and costly if you are only looking at desktop. Now you have to get the same code to render properly in two browsers on an infinite number of screen sizes.
  15. Who is paying for this bandwidth? I bet that the visitor is racking up data charges on his mobile phone. I think that the advertiser should be paying for it... and for that reason, I wonder if the ISP should be the ad network, allocating the charges to the right party. I know that I am naive and only looking at one side of this many-faceted problem, but I don't like to see charges foisted onto people who shouldn't be paying for them.