SEO Industry Directs Attention to Usability

Bruce Clay Inc. hosted a Twitter chat, #SEOChat, called “The UX Force Awakens,” to which the Bruce Clay moderators invited Cre8asiteforums founder and administrator Kim Krause Berg.

We were also delighted to have Kim Krause Berg — the veritable UX whisperer — join us for the chat.

— Source: What Is UX? Who Owns User Experience Optimization? What You Need to Know About SEO & User Satisfaction from #SEOchat

At long last, online marketers are taking user experience seriously and, more importantly, are interested in understanding what, exactly, usability is.

Cre8asiteforums, launched in 1998, was the first forum for search engine marketers, website owners, and web developers to host a usability forum. Its founder, Kim Krause Berg, already established as an SEO, crossed the bridge to usability and accessibility in 2000. At that time, and for the next decade, the number of SEOs around the globe wanting usability site audits for their clients could be counted on one hand.

Today, there are two hands worth.
Just kidding.

Three hands.

Some internet marketers refer to usability as “conversions” or “customer experience,” ignoring what usability actually is. The result? Websites that fail to perform also fail to convert.

“I was pleased to see that many SEOs place a high value on user experience for all devices, and that they understood why … I’m thrilled that Bruce Clay, Inc. is educating people on UX,” Berg said post-chat. “UX is HUGE … it includes empathy for every human, using every device and every software application and every search engine wanting to provide what humans want, in every environment, with an understanding of the limitations of age, bandwidth, Internet availability, use cases and business requirements specific to one’s business or web page intent.”

Do SEOs Understand UX?

Do you? Here is a transcript of the entire discussion: Summary: The UX Force Awakens on #SEOchat.

Cre8asiteforums Thread – #seochat Twitter Discussion With Bruce Clay On Usability

The Best Damn Web Marketing Checklist, Period!

Even if you have been practicing internet marketing strategies for years, there is always room for refresher ideas and perhaps even new methods to try.

Stoney Degeyter, from Pole Position Marketing, answered the silent cry for help by countless newcomers and anyone struggling to be sure they covered everything they should be doing. His new book is a hot seller, packed with 35 checklists providing 675 action steps. Not only that, he offers the reasoning behind each technique.

Hurry while supplies last!  We hear there may be a Kindle version coming as well.


Here is my pre-publication review, which was included with the book:

Checklists on steroids! Stoney’s Best Damn Web Marketing Checklist, Period! is going to be attached to you from the first page and follow you around wherever you go until your online business dreams come true. I have always been a checklist and cheat sheet fanatic. There never seem to be enough of them, and they are always spread out in various places. Stoney compiled and organized 36 checklists, with more than 675 action points, along with examples, and he explains the reasoning behind each recommendation. Not only do you get the checklists with descriptive information, but you can download a quickie cheat sheet to apply to a number of websites. Everybody will want this book.

When Rand Fishkin Defended A Guest Article and Google Shot Its Arrow

In this latest episode of the Dark Night of SEO, a guest article landed in a firestorm of Google wrath because MOZ ran the piece on its site.

What Was It About That Link?

When Scott Wyden wrote an article, MOZ published it. The article’s URL contains a category called “UGC,” which stands for user-generated content. Some SEOs consider this a signal to Google that the article is not good enough for its search engine.

Subsequently, anyone who has ever written for MOZ is freaking out, because Scott got a nice note from Google telling him he needed to remove a link from his site to the guest article he wrote on the MOZ site.

Add to this, there were links inside the guest article that, according to Google, were not allowed. One of them was a link in the author profile to his own website. This means that any link to a site we own or work for, found in a bio or author profile, is not safe, and to appease Google it is best to “no-follow” it.
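For readers unfamiliar with the mechanics, “no-following” a link simply means adding the rel="nofollow" attribute to the anchor tag. A minimal sketch, with a placeholder URL, might look like this:

```html
<!-- A normal ("followed") bio link passes link equity to the target site -->
<a href="https://example.com/">My photography site</a>

<!-- Adding rel="nofollow" asks search engines not to treat the link
     as an endorsement; the URL here is only a placeholder -->
<a href="https://example.com/" rel="nofollow">My photography site</a>
```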

It doesn’t matter if the link is to a reputable, legitimate website. It is a Google sin and nobody has died to save it.

Except Rand Fishkin, CEO of MOZ. When he learned that his guest author was penalized for his article link from the MOZ site, he wrote in Dear Google, Links from YouMoz Don’t Violate Your Quality Guidelines:

Scott’s link, ironically, came from this post about Building Relationships, Not Links. It’s a good post with helpful information, good examples, and a message which I strongly support. I also, absolutely, support Scott’s pointing a link back to the Photography SEO community and to his page listing business books for photographers (this link was recently removed from the post at Scott’s request). Note that “Photography SEO community” isn’t just a descriptive name, it’s also the official brand name of the site. In both cases, Scott linked the way I believe content creators should on the web: with descriptive anchor text that helps inform a reader what they’re going to find on that page. In this case, it may overlap with keywords Scott’s targeting for SEO, but I find it ridiculous to hurt usability in the name of tiptoeing around Google’s potential over enforcement. That’s a one-way ticket to a truly inorganic, Google-shaped web.

Meanwhile, discussions erupted after Rand demanded to know why his site caused this issue for one of his writers.

In Google Hypocrisy: Keyword Rich & User Friendly Links Should Die, Barry Schwartz wrote:

Back in the days before Google, online usability folks were all about making user friendly hyperlinks that communicated to the user what the link was about and what to expect when they clicked it. That means, a keyword rich anchor text link that describes the page it is linking to.

Hypocritical Google Dislikes User Friendly Links. Some comments:

1. How does Google know what our intent is? Persuasive design is about presenting an idea or call to action, and at the exact moment the reader has been given the incentive to act, that is where the link goes. This is also why I never advise putting a pile of embedded text links into large chunks of content: we have short attention spans and are easily distracted. Nobody ever reads, follows a link, returns, reads more, follows a link, returns, and so on, repeatedly. That practice is a dead giveaway that the site is spamming and not directed at humans. But to say never link to your own stuff is not something I advise.


2. Frankly, I find this issue infuriating. A short while ago I read a piece on an SEO blog and found one small issue interesting. I discussed it with the blogger. It ended up that we both looked at it independently, did two different pieces of research, and found some interesting results. Our findings have not, as far as I can see, been published anywhere. I think they are newsworthy in a geeky sort of tech way, and possibly link worthy. Should we publish the articles side by side on the blogger’s site, as was our first thought, a link back to one of my SMB sites runs the risk of penalties from Google’s dictatorial perspective.

Like Rand, I found nothing wrong with the links, their landing pages, or the anchor text. However, other SEOs did, such as finding fault with the link in the author profile to the author’s own website.

Before leaving on his “vacation,” Matt Cutts wrote in an email to MOZ:

Short of that, keyword rich anchor text is higher risk than navigational anchor text like a person or site’s name, and so on.

This fueled the fire, because now we have moved past the practice of spammy links into putting descriptive links in danger of being suspect. Mystery links are ignored by humans. The only links we click are those that promise to give us something we want: a different site, a product, more information on a specific topic, a link in a sales funnel, a link that describes an image in detail, or a link to take an action. The words in the anchor text are needed for accessibility (screen readers) and to motivate us to click.
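To illustrate the difference between a “mystery” link and a descriptive one, here is a small sketch; the URL and wording are hypothetical:

```html
<!-- Mystery link: a screen reader reading links out of context
     announces only "click here" -->
<a href="/find-a-dermatologist">Click here</a>

<!-- Descriptive anchor text: tells both humans and screen readers
     what the destination page offers -->
<a href="/find-a-dermatologist">Find a dermatologist near you</a>
```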

Google demands a “no-follow” on links that many site owners have no issue sharing. For example, if someone writes a guest article for your site for free, the least you could do is send them “link juice” or a click to their site.

What’s also confusing about the Scott article disavow example is that the article is from 2012 and he is only now getting called out on it. One theory held that the problem was not so much the MOZ URL as the fact that the site hosts guest blog posts at all. Google went after MyBlogGuest for the same reason.

Danny Sullivan weighed in at the MOZ discussion:

Let me start by saying I’d be as annoyed as Rand is if Google started telling people that Search Engine Land was a source of “inorganic” links. We have contributors; we take care to edit and be selective in what we allow. And ultimately, it’s our site — we’ll decide what we think makes sense to have as links and how they should appear.

It is understandable that Google is doing whatever it can to build a database of accurate content but at what cost? How many businesses will fail because they broke a Google “guideline”? When did Google stop caring about the user experience of the sites in its index?

Danny concluded, as did a few other SEOs, that the issue was not MOZ, and not Scott’s article, but actually Scott’s website itself. That may be so, but don’t mess with Rand by treating a link from his site as a potential threat!

When Search Engines Fail

Search Engines Limit Our Queries

During a visit to my doctor, my curiosity was aroused when I watched her search for information she wanted to give me. She had been trying to get me to go to a dermatologist and was unhappy that I hadn’t made an appointment yet. My excuse was that I had gotten as far as collecting names but didn’t know whom to choose, and did she have any recommendations?

She always carries her tablet that contains our medical records and apparently, a direct connection to the Web.  She sat next to me and went right to Google, searching like any of us would…dermatologists near where I live.  She printed out the page of search results she got, circled the names of people she knew and handed me the paper.

I later thought it was interesting that, first of all, she ran a search the same way you or I would. Google had no idea a doctor was looking for other doctors. Google did not know she might have had previous knowledge of or experience with any of the professionals it provided her. She was not able to somehow say, “Hello Google. I am a doctor. Show me the names of dermatologists I know and can refer my patients to in such and such a place.”

Google intends to be the world’s most massive knowledge center, and doing its work means relying on us to tell it where we are, who we are, what we like and dislike, who our friends are, where we eat, who we email, our favorite topics…and yet a doctor could not make a referral. Perhaps she could have, had she had time to participate in Google Plus, but how many physicians have time for social networking online? How much would consumers really tolerate a system where doctors are paid to refer other doctors? That would be one way of tracking referrals.

There is likely software available, or that could be built, for the tablets physicians use, but my doctor did not use any. She went out and searched like any of us. There was no “Refer this doctor” box for her to check in her search results. And even if there was, would Google place more value on a “vote” from a physician? Would it somehow find a way to monetize this and wreck our trust in the ratings?

The Meaning of Words

In this example, taxonomies come to mind. More and more, semantic search relies on metadata, such as tags, author, title, and anchor text, to “understand” meaning.
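As a purely illustrative sketch (the page, author, and keywords are invented, not from any real site), the kinds of metadata semantic search can draw on look like this in ordinary HTML:

```html
<head>
  <!-- Title and author metadata a search engine can read directly -->
  <title>Where the Search Engines Fail Us</title>
  <meta name="author" content="Jane Example">
  <meta name="keywords" content="usability, semantic search, taxonomy">
</head>
<body>
  <article>
    <!-- Descriptive anchor text also acts as metadata about the target page -->
    <a href="/usability-forum">Read more in the usability forum</a>
  </article>
</body>
```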

Someone started an interesting discussion on Facebook about the use of the word “marriage” for same-sex unions. He felt that part of the problem some people have is the use of a word that, by tradition, describes something that happens between a man and a woman. He was not judging same-sex marriages. He was suggesting that maybe the gay community should find a word of its own to define that legal commitment.

So I started to think, what if the word “marriage” in a search engine brought back male/female unions and another word brought same-sex unions?  Would it matter?  Is it a way of passing judgment because a sacred, traditional term is being ousted or is having one’s own unique definition a thing of pride?  As it is now, the way to separate out the difference is to search “same-sex marriage” vs “marriage”.

If you live in a state where same-sex relationships have no legal status, would you search for information differently than in states where it is legal?  Can a search engine be programmed to understand marriage laws by state and then deliver results based on the source of the search query?

Even a simple word like “happy” used as a search term needs qualifiers and maybe even help with location. For example, the video made in Iran to the song “Happy” landed the three women and three men who made it in jail. A search for “happy” in Iran clearly has a different meaning than the same search in other countries. Do search engines deliver results based on the customs of each country?

Any thoughts?  Join the discussion in Where the Search Engines Fail Us

Let’s No-Follow Google, Shall We?

Google has generated such a negative reputation that it is called “G#######” inside the forums by angry Cre8asiteforums members.

Penalizing MyBlogGuest, which is a “good person’s intent” website and not evil, is for many people the last straw. When Matt Cutts shared his rant, The decay and fall of guest blogging for SEO, he upset millions of guest contributors and blog owners.

Cutts wrote:

Back in the day, guest blogging used to be a respectable thing, much like getting a coveted, respected author to write the introduction of your book. It’s not that way any more.


To the Google Spam Protection Squad, it may appear as though all guest posts are spam, but to those of us who take our guest writing seriously, pore over research, gather our facts, and struggle over every comma and period, this was unkind and untrue. And unfair.

Cutts wrote:

In general I wouldn’t recommend accepting a guest blog post unless you are willing to vouch for someone personally or know them well. Likewise, I wouldn’t recommend relying on guest posting, guest blogging sites, or guest blogging SEO as a linkbuilding strategy.

Wrong again.

This thinking is the same as a teacher punishing the entire classroom for the act of one bad student. I have never respected those teachers, and I have now lost even more respect for Google than I already had.

I recently put up a website for my high school. It is entirely guest-post driven, by students who have stories to share about life, growing up, high school memories, or anything else they want to share. I don’t promote it. I don’t optimize it much. I don’t ask for links to it. Its readership comes from our high school classmates on Facebook, where the idea to share our stories originated.

No spam. I monitor the comments. There are no “no-follow” links in it. Its intent is to share stories among old friends and classmates from a small country high school. We can vouch for each other but I doubt that matters much to Google.  We clearly must be up to something rotten.

Cutts softened up a bit later:

I’m also not talking about multi-author blogs. High-quality multi-author blogs like Boing Boing have been around since the beginning of the web, and they can be compelling, wonderful, and useful.

And then MyBlogGuest was shot.

Eric Enge wrote Is Link Building Dead?, which describes the latest head games.

Here are some of the lessons we need to learn:
1. Intent is Not What Matters: It is only the results that matter. Ann Smarty (the owner of MyBlogGuest) did not design her service to support spammy behavior, but it appears that much of that resulted, and this was exacerbated by the No NoFollow policy. To Google, this ended up supporting bad link building practices by others. So regardless of Ann Smarty’s intent, Google did not like it, and they acted.

If you own a website and are unable to control the actions of others, you risk receiving a Google penalty.

As Barry Welford wrote in one of the discussions covering this latest hoopla,

Does Google expect the whole world to change its practices in order that the Google PageRank-based algorithm should continue to function? It’s ludicrous. I think Matt is rattling sabers here.

What are Cre8tive Members feeling about all this?

Matt Cutts Warning: Guest Blogging Is Done

Google’s New Unnatural Links Messages For MyBlogGuest And Other Networks

The Only Safe SEO Method?