Cre8asiteforums Internet Marketing
and Conversion Web Design



Write Fake Forums Conversions


14 replies to this topic

#1 cre8pc


    Dream Catcher Forums Founder

  • Admin - Top Level
  • 13454 posts

Posted 29 December 2011 - 06:54 PM

Someone somewhere has advertised for freelancers to write fake forum posts, at the generous rate of $1 US per 500 words.

Just when I think I've seen it all!!!

:emo_gavel: :emo_gavel:

:saywhat:

#2 jonbey


    Eyes Like Hawk Moderator

  • Moderators
  • 4361 posts

Posted 29 December 2011 - 07:15 PM

wow.

sad thing is, there is probably a long queue of people signing up!

#3 AbleReach


    Peacekeeper Administrator

  • Site Administrators
  • 6467 posts

Posted 29 December 2011 - 09:45 PM

Can you spell Mechanical Turk?

#4 EGOL


    Professor

  • Hall Of Fame
  • 5360 posts

Posted 30 December 2011 - 08:37 AM

With that low rate of pay I bet most of the postings are copy/pasted or done by a robot.

No wonder the web is so full of crap.

#5 jonbey


    Eyes Like Hawk Moderator

  • Moderators
  • 4361 posts

Posted 30 December 2011 - 09:07 AM

Yeah, not very likely to be highly converting conversations :)

#6 DCrx


    Hall of Fame

  • 1000 Post Club
  • 1280 posts

Posted 30 December 2011 - 09:14 AM

The potential of the term content realized and the full flowering of its meaning, at last, understood.

Edited by DCrx, 30 December 2011 - 09:15 AM.


#7 tam


    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 2041 posts

Posted 30 December 2011 - 11:53 AM

That used to be the way to get your forum going: you'd pay people to post so it looked busy. Which sort of makes sense, if you pick the right people and pay them enough that they don't automate/scrape.

#8 iamlost


    The Wind Master

  • Site Administrators
  • 4567 posts

Posted 30 December 2011 - 01:57 PM

Much web content became a commodity (interchangeable, without noticeable differentiation) years ago, driven by scanning (lowered comprehension level), texting (mutilated grammar and tortured spelling), search (little to no real language capability), and ad networks (content as disposable wrapper). It is what you do with a commodity, i.e. web content, that creates or adds tangible value; the commodity itself has little. Webdevs are in a race to find the broadest, most popular common denominator, and it appears (to me) to be depressingly low.

This is not something new, printed content has a similar history. Television and Cinema as well.

Fortunately...
* there remains a market for literate thoughtful prose and illuminating non-stock imagery.
* articles can be constructed to appeal to both the scanner and the reader, the 'just the answer', 'why?', and 'what's this mean' audience segments.
* specialty outlets can thrive in the very shadow of massive cheap cheaper cheapest emporiums; it pays to be nimble and quick, to have high standards and wild imaginations...
* ... :)

There is always someone somewhere who will do whatever for less.
And if not a person, then a robot or algorithm that can.
Deal with it.

Pick two: fast, cheap, good.

Remember the maxim of David and Goliath: do not engage on your competitor's terms.

Etc.

#9 glyn


    Sonic Boom Member

  • Hall Of Fame
  • 2486 posts

Posted 03 January 2012 - 03:44 AM

Kim - Ha, it's bad.
AbleReach.... Mechanical Turk is going back a few years now; the tech has got much more advanced :)

On autoposting.... I am of the opinion that if such a post helped a person, then whether it's automatic, scraped, posted for payment, or human (which seems to be the ultimate justification of ethical validity, when in fact history shows humans don't have a particularly great track record), it really doesn't matter.

Two cases:

A tsunami sweeps across Europe, displacing 30 million people. It is estimated that 25% of the population are connected to Twitter, with 10% actively participating in forums based in Europe. A government website is created where people can go and re-register their details so that relatives can make contact with the displaced persons. Would it be ethical for a government to tweet to every Twitter user in Europe, or to post a message in a relevant section of every European forum, with information about the website and the helpline?

A company sells a piece of software that claims to get you to the first page of Google in a matter of weeks. Would it be ethical for that company to post information about this product in the relevant advertising sections of hundreds of forums relating to online marketing tools?

Automatic versus manual is, for me, a bit of a lame argument that doesn't carry much weight, and in the cases above coming down on one side or the other could leave a useful tool out in the cold.

With good automation you won't be able to discern it from useful content, but you don't hear about that because it looks like content.

:)

#10 iamlost


    The Wind Master

  • Site Administrators
  • 4567 posts

Posted 03 January 2012 - 03:44 PM

Kim - Ha, it's bad.
...
With good automation you won't be able to discern it from useful content...

The real concern is not so much whether it is automated or manual but (1) whether it is useful (define as you will) or mere disposable filler, and (2) what the people involved receive for their work.

The default definition of quality on the web has become Google's algorithmic approval. And that is depressingly low: contrary to some folks' belief, Google does not, and can not, reward content 'quality' per se, but must infer it indirectly via various signals, all of which can be gamed.

Granted, crappy content sites get burned all the time, but the investment is typically so low that a day and ten bucks replicates whatever was dropped from the Google index, and it rises again and again ad nauseam. The real victims tend to fall into three camps: those who know not quite enough and run build-and-burn automated webdev; those who are generally trying hard but incorporating some practice(s) that are now targeted, often via faulty SEO understanding or misplaced belief in personal immunity; and those who are false-positive collateral damage (a 0.01% algo error is still tens of thousands of improperly affected sites).

The web is not some fair playing field.
ToS are neither regulations nor laws.
Much of the problem with SEs is that they can not actually discern value directly via algorithmic means, and the immediate (wages) and future (liability) costs of widespread manual inspection mean that the crap (which makes the SEs, especially Google, as well as the automated site builders, a lot of money, and really weighs against any definitive change) will continue to propagate.

Fortunately, there are increasing alternatives for site traffic.
Of course most involve some amount of additional effort and that can be in short supply without automation where appropriate. Don't be afraid of automation: the SEs thrive on it. :)

#11 A.N.Onym


    Honored One Who Served Moderator Alumni

  • Invited Users For Labs
  • 4003 posts

Posted 04 January 2012 - 02:58 AM

Glyn, it's not a matter of automation or manual, but of perceived quality.

Quality, in this case, comes from the nature of the message:
- whether the publication had published it itself, in support of the victims
- whether the readers actually need that information and/or want it
- whether it's posted in the right forum/article section (otherwise, it won't be wanted by those who only want to read the niche [technical] forum threads)

One might argue that run-of-the-mill automated posting doesn't guarantee keeping up with the quality elements I've outlined above, and a few more (one of which is the text of the message itself, but that's another story).

That being said, I don't really mind automation. I mind badly crafted and performed automation.

Let's take your tsunami example.

Suppose it does hit Europe. Here's what might happen:
- all reasonable publications immediately create sections where they post updates on the event from their sources
- a number of social sites appear, including .gov ones, to help victims find their families/friends and get help
- some sites might scrape the Twitter API for mentions of victims or the tsunami topic and display them in a consumable way
- people with blogs, journalists, and other online publications link to the best websites where people can get help (if a .gov site is the most helpful, it'll get noticed)
- if the .gov site marketers really want to get noticed, they only need to contact the publications that write about finding relatives, so that those seeking this information would find it via the .gov website.
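The "scrape the Twitter API for mentions and display them in a consumable way" idea above can be sketched roughly like this. The field names (`from_user`, `text`) and sample tweets are illustrative assumptions mimicking the general shape of the 2012-era public search API's JSON; a real implementation would fetch the data over HTTP rather than use an in-memory sample.

```python
# Hypothetical sketch: filter tweets for topic mentions and render a
# simple text digest. The dict keys below are assumed for illustration,
# not a guaranteed Twitter schema.

def find_mentions(tweets, keywords):
    """Return the tweets whose text contains any of the given keywords."""
    lowered = [k.lower() for k in keywords]
    return [t for t in tweets
            if any(k in t["text"].lower() for k in lowered)]

def render_digest(tweets):
    """Format matching tweets one per line as '@user: text'."""
    return "\n".join("@%s: %s" % (t["from_user"], t["text"]) for t in tweets)

# Sample data standing in for a fetched API response.
sample = [
    {"from_user": "alice", "text": "Tsunami warning issued for the coast"},
    {"from_user": "bob",   "text": "Lunch was great today"},
    {"from_user": "carol", "text": "Looking for relatives after the tsunami"},
]

matches = find_mentions(sample, ["tsunami"])
print(render_digest(matches))
```

A real site would run something like this on a schedule and cache the digest, so that victims' relatives see a single consumable page rather than a raw firehose.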

Suppose someone posted a note about the tsunami in our SEO or, moderator forbid, Website Hospital forum (not asking for help, sadly); it'd be removed as spam, whether automated or not. Before that, however, I'm sure we'd discuss it in our After Hours section, with links to relevant websites and pages, wherever relevant discussions take place.

However, my point of view might seem naive to you. I prefer to consider it optimistic (and much more efficient in the short and long runs) ;)

Edited by A.N.Onym, 04 January 2012 - 02:59 AM.


#12 glyn


    Sonic Boom Member

  • Hall Of Fame
  • 2486 posts

Posted 04 January 2012 - 11:38 AM

Glyn, it's not a matter of automation or manual, but of perceived quality.

No, perceived quality is too subjective. I can't perceive your world and you can't perceive mine. Do you mean that the message makes sense and doesn't look like spam?

I mind badly crafted and performed automation.

You mean content that looks like spam? I agree.


Suppose it does hit Europe. Here's what might happen:
- all reasonable publications immediately create sections where they post updates on the event from their sources
- a number of social sites appear, including .gov ones, to help victims find their families/friends and get help
- some sites might scrape the Twitter API for mentions of victims or the tsunami topic and display them in a consumable way
- people with blogs, journalists, and other online publications link to the best websites where people can get help (if a .gov site is the most helpful, it'll get noticed)
- if the .gov site marketers really want to get noticed, they only need to contact the publications that write about finding relatives, so that those seeking this information would find it via the .gov website.


I agree, these are all things that could and probably would happen. My point is whether a debate in which all automation is bad, because everyone thinks it's spammy, is a good place to be, and I don't think it is. I like to mix a bit of light with dark to get fantastic results.


PS, those publications you spoke of are under water.

#13 A.N.Onym


    Honored One Who Served Moderator Alumni

  • Invited Users For Labs
  • 4003 posts

Posted 04 January 2012 - 07:23 PM

I don't think that automation itself is dark; it's how you use it that defines the color.

You can read a more detailed response about using black hat SEO in white hat here.

P.S. Well, whoever remains online, plus those publications which are hosted elsewhere and whose journalists survived or had arrived from abroad. Clearly, we don't take into account those publications that did go under water.

Edited by A.N.Onym, 04 January 2012 - 08:14 PM.


#14 jonbey


    Eyes Like Hawk Moderator

  • Moderators
  • 4361 posts

Posted 04 January 2012 - 10:17 PM

yeah, if automation were bad, then Google would have a tough time building their index!

#15 glyn


    Sonic Boom Member

  • Hall Of Fame
  • 2486 posts

Posted 05 January 2012 - 08:39 AM

Clearly, we don't take into account those publications that did go under water.


Just messing with you :infinite-banana:

Edited by glyn, 05 January 2012 - 08:40 AM.



