Cre8asiteforums

Discussing Web Design & Marketing Since 1998

  • Announcements

    • cre8pc

      Thank you! Cre8asiteforums 1998 - 2018   01/18/2018

      Internet Marketing Ninjas released many of the online forums they had acquired, such as WebmasterWorld, SEOChat, several DevShed properties, and these forums, back to their founders. You will notice a new user interface for Cre8asiteforums: the software was upgraded and the site was moved to a new server. Thank you for your support as we turn 20 years old.

Michael_Martinez

1000 Post Club
  • Content count

    1,278
  • Joined

  • Last visited

  • Days Won

    7

Michael_Martinez last won the day on June 29, 2012

Michael_Martinez had the most liked content!

Community Reputation

12 Good

About Michael_Martinez

  • Rank
    Time Traveler Member
  1. Google doesn't care about "guest posting" or "guest blogging" or "guest articles" with proper attribution. Google cares about manipulative linking practices, which may include using guest posts for the sake of obtaining links that would not be sought if the people asking for the links didn't believe they would help with Google's search results. If you felt you were providing your visitors with good quality content and if your visitors were giving you good feedback, keep doing what you're doing for your visitors. Let the search engines figure out what they want to do with their indexes.
  2. Donna's idea is the simplest and probably the most effective. iFraming content publishes once, serves many times; so far it seems to be the most elegant solution. Google will attempt to crawl the iFramed page, so you could either block it in robots.txt or embed a "robots" meta directive on the page that tells Google not to index it. If you embed any links in the page, though, you should let Google crawl it and follow the links (see the robots sketch after this list).
  3. Recaptcha Programming Problem

    If you're using a platform that supports a Stop Forum Spam plugin (like Wordpress), I recommend you use it in addition to any other measures you take. It checks a central database and blocks a huge number of spammy submissions. You'll occasionally get false positives, but you should be able to whitelist any real people (there's a rough sketch of the lookup after this list).
  4. Recaptcha Programming Problem

    Why would you want to use ReCaptcha? It's user-unfriendly and doesn't work against spam.
  5. Ideas For Mortgage Site?

    I don't mean to sound condescending but you do realize that Florida's mortgage market is declining, right?
  6. I tried looking at the site on my 'Droid and it never finished loading. I have no doubt that Donna is right -- more companies will design for mobile environments. But they have to get the design right. For now I would say this site is a failed experiment. It probably works for some users but since it doesn't work for all it's not an acceptable design.
  7. I don't think you can put two custom search widgets on the same page, Barry. If they are using DOM variables for any reason, they will experience a conflict in data ownership. I've had to write multiple variations of simple Javascript code that use different variable names in order to get two or more instances running on the same page (there's a sketch of that pattern after this list). It wasn't clear to me before that this was what you were trying to do. Maybe you could iFrame the second widget, but I'm not sure how that would look.
  8. They do seem to be emotionally invested in a mathematically failed concept. Nonetheless, they are making billions of dollars in profit each year so I don't expect them to change their system that much any time soon.
  9. Every one of these notices that I have evaluated for clients has checked out. I think Google does a pretty darned good job of evaluating links. They'll never be perfect but they have found a LOT of naughty links. Of course, "penalties" can be masked by improper goals. For example, if the links are targeting unuseful anchor text (as is the case more often than most people realize) then you may not see much of a traffic dip. The ridiculous debate over negative SEO needs to end, however. It has been around for a long time, isn't going away, and isn't any more of a problem today than it was before Penguin.
  10. Is the blog's PRIVACY setting activated? If so, you're probably blocking some crucial Google crawler-thingamajigee. Last year I accidentally blocked the media-bot on a subdomain and couldn't get AdSense to work. Such a simple oversight in rewriting a robots.txt file can have unexpected and frustrating consequences (see the robots.txt sketch after this list).
  11. Of course, there is nothing about "rel='nofollow'" that indicates a link was paid for.
  12. Google announced an algorithmic change a few months ago in which they said something like "you will see more results from trusted Websites in certain queries." Combine that with the fact that many Websites were downgraded for keyword stuffing, bad links, or low quality by the Penguin and Panda algorithms, and you should see many more SERPs with the "Amazon Effect" (so named because about 12 years ago Amazon and Altavista were criticized for injecting so many Amazon pages into the Altavista SERPs).
  13. Arrogant Wordpress

    To preserve empty lines I just use non-breaking space inserts (&nbsp;) on each line (there's a small example after this list). I have had pretty good luck with tables, but it seems that some themes work with them better than others. It may not be Wordpress that is causing your problems but the theme you have chosen. As an alternative to "Pretty View" I usually just right-click on "View Page" and open it in a new browser tab so I don't have to leave my editing window anyway.
  14. Arrogant Wordpress

    Nobody's perfect and I'm not necessarily the biggest fan of Wordpress, but I do find that Wordpress 3.4.1 is pretty tolerant of a lot of stuff I do. What have you been including that is getting retouched internally?
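
A note on the robots directives mentioned in post 2: below is a minimal sketch, assuming the iFramed content lives at a hypothetical path /widgets/shared-content.html. Blocking in robots.txt stops Google from crawling the framed page at all; the meta directive lets Google crawl it (and follow any links in it) while keeping the page itself out of the index.

  # Option 1 -- robots.txt on the host serving the framed page
  # (hypothetical path; use this only if the page contains no links you care about)
  User-agent: *
  Disallow: /widgets/shared-content.html

  <!-- Option 2 -- placed in the <head> of the framed page itself:
       crawlable, links followed, but not indexed -->
  <meta name="robots" content="noindex, follow">

  <!-- The embedding pages just pull the shared content in -->
  <iframe src="https://www.example.com/widgets/shared-content.html" width="600" height="400"></iframe>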
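
On the Stop Forum Spam recommendation in post 3, the plugins essentially query the project's central database before accepting a submission. Here is a rough Javascript (Node) sketch of that lookup, assuming the public api.stopforumspam.org endpoint and its JSON response format (check the current Stop Forum Spam API docs before relying on it); the whitelist array is purely hypothetical.

  // Ask the central Stop Forum Spam database about a new registration.
  const WHITELIST = ['donna@example.com'];          // hypothetical list of cleared real people

  async function looksLikeSpammer(email, ip) {
    if (WHITELIST.includes(email)) return false;    // whitelisted false positives get through
    const url = 'https://api.stopforumspam.org/api?f=json'
              + '&email=' + encodeURIComponent(email)
              + '&ip=' + encodeURIComponent(ip);
    const data = await (await fetch(url)).json();
    // "appears" is non-zero when the address is already in the database
    return Boolean((data.email && data.email.appears) || (data.ip && data.ip.appears));
  }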
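
The variable-name conflict described in post 7 looks roughly like this: two copies of one widget script fighting over the same global versus two instances with their own names. The widget itself is invented for illustration; only the naming pattern matters.

  // Broken: both copies of the widget script declare the same global,
  // so the second one clobbers the first.
  // var searchBox = document.getElementById('search1');   // instance #1
  // var searchBox = document.getElementById('search2');   // instance #2 overwrites it

  // Workable: give each instance its own variable (or namespace object).
  var cre8SearchA = { box: document.getElementById('search1'), results: [] };
  var cre8SearchB = { box: document.getElementById('search2'), results: [] };

  function runSearch(widget, query) {
    widget.results = [];          // each instance keeps its own state
    // ... fetch and render results into widget.box here ...
  }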
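
On the robots.txt accident in post 10: the AdSense crawler identifies itself as Mediapartners-Google, and a broad Disallow written with other bots in mind can lock it out too. A minimal before/after sketch (the blanket rule here is just an example of the kind of line that causes the problem):

  # Before -- blanket rule on the subdomain blocks the AdSense media bot as well
  User-agent: *
  Disallow: /

  # After -- explicitly let the media bot through, keep the blanket rule for everyone else
  User-agent: Mediapartners-Google
  Disallow:

  User-agent: *
  Disallow: /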
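
And the empty-line trick from post 13, shown as roughly the markup that ends up in the post: a non-breaking space entity on each "blank" line keeps the editor from collapsing it.

  <p>First paragraph.</p>
  <p>&nbsp;</p>   <!-- "empty" line preserved by the non-breaking space -->
  <p>&nbsp;</p>
  <p>Second paragraph, now sitting two blank lines lower.</p>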