
Web 2.0 Spam: Advanced Content Recycling | Manipulating Digg

Looking for a perfect example of somebody using someone else’s old content and Web 2.0 tools to create a lot of buzz for themselves?

The guy over at SiliconCloud.com, a weblog just two months old, went through Jakob Nielsen’s old lists and a couple of other lists floating around out there and picked twelve mistakes that web designers/owners are still making.

Here’s just one example: Forms.

From Silicon Cloud’s 12 Ways to Irritate Your Visitors:

7) Unnecessary Questions – Ensure that the subscription form to your ezine or newsletter spam contains at least 36 questions more than needed. Why stop at the username and email address when you can ask them for information such as their mailing address and at least 3 different phone numbers (home, work and mobile)? Adding other pointless questions such as age, sex, hobbies, religion and inside leg measurement is a sure-fire way to prevent people ordering your product or subscribing to your mailing list.

From Jakob Nielsen’s Top Ten Web Design Mistakes 2005:

7. Cumbersome Forms

People complained about numerous form-related problems. The basic issue? Forms are used too often on the Web and tend to be too big, featuring too many unnecessary questions and options. In the long run, we need more of an applications metaphor for Internet interaction design. For now, users are confronted by numerous forms and we must make each encounter as smooth as possible. There are five basic guidelines to this end:

  • Cut any questions that are not needed. For example, do you really need a salutation (Mr/Ms/Mrs/Miss/etc.)?
  • Don’t make fields mandatory unless they truly are.
  • Support autofill to the max by avoiding unusual field labels (just use Name, Address, etc.).
  • Set the keyboard focus to the first field when the form is displayed. This saves a click.
  • Allow flexible input of phone numbers, credit card numbers, and the like. It’s easy to have the computer eliminate characters like parentheses and extra spaces. This is particularly important for elderly users, who tend to suffer when sites require data entry in unfamiliar formats. Why lose orders because a user prefers to enter a credit card number in nicely chunked, four-digit groups rather than an undifferentiated, error-prone blob of sixteen digits?

Forms that violate guidelines for internationalization got dinged by many overseas users. If entering a Canadian postal code generates an error message, you shouldn’t be surprised if you get very little business from Canada.

Frankly, Nielsen’s advice is far better and more detailed.
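
Nielsen’s point about flexible input is also the cheapest to fix in code. Here is a minimal sketch in TypeScript of what lenient form handling looks like – normalise what the user actually typed instead of rejecting it, and accept a Canadian postal code alongside a US ZIP. The helper names and patterns are my own illustration, not Nielsen’s:

```typescript
// Lenient form handling: normalise user input instead of erroring out.
// Illustrative helpers only – not from any particular library.

// "4111 1111 1111 1111" and "4111-1111-1111-1111" are as good as the
// undifferentiated sixteen-digit blob: strip spaces, dashes, parentheses.
function normalizeCardNumber(raw: string): string {
  return raw.replace(/[\s()-]/g, "");
}

// Accept phone numbers however the user chunks them; keep a leading "+".
function normalizePhone(raw: string): string {
  const hasPlus = raw.trim().startsWith("+");
  const digits = raw.replace(/\D/g, "");
  return hasPlus ? "+" + digits : digits;
}

// Don't lose Canadian business: recognise "K1A 0B1" alongside a US ZIP
// instead of rejecting everything that isn't five digits.
function looksLikePostalCode(raw: string): boolean {
  const usZip = /^\d{5}(-\d{4})?$/;
  const caPostal = /^[A-Za-z]\d[A-Za-z] ?\d[A-Za-z]\d$/;
  const trimmed = raw.trim();
  return usZip.test(trimmed) || caPostal.test(trimmed);
}

console.log(normalizeCardNumber("4111 1111 1111 1111")); // "4111111111111111"
console.log(normalizePhone("(555) 123-4567"));           // "5551234567"
console.log(looksLikePostalCode("K1A 0B1"));             // true
```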

Anyway, our friend Thomas over at Silicon Cloud then went on to post his own linkbait article to Digg. It took. Far more interesting from an SEO perspective than the recycled twelve errors is his own account of his Web 2.0 manipulation:

Step 1 was to post the article into the Digg site. This was fairly easy as we already had a Digg account. Once our article was in digg on the diggall list we sat back and watched what happened next. Quite quickly a few people ‘dugg’ the posting and within about 15 minutes the post had 10 diggs and appeared as the next level of popularity in the cloud view. Things were going well. All this was helped by the first comment received on the article which was almost as funny as the article itself. Thanks James.

I have to agree with reader James’s comment – the most annoying current practice on the web is breaking long articles up into multiple pages, which makes them slower to read and harder to reference (e.g. over at Silicon Cloud). Why do commercial site owners do this? To increase the number of ad impressions and click-throughs. Strangely, it has the opposite effect on me. I avoid sites that slow down and attack my browser or make me click through three or five pages (SEOchat.com, anybody?) to read what is a standard 1000-word article.

For those actually interested in usability issues and the various plagues that site owners and web designers unleash on us, the hapless users (instant remedy: Firefox and AdBlock), here is a list of most of Jakob Nielsen’s top ten no-no lists. I’ve bolded the three that I find most useful and still relevant (the list includes one from 1997!).

Read Nielsen and weep. The errors of 1996, in large part, persist.

Takeaway lesson: Web 2.0 is doomed to fall to the spammers shortly if the ramparts are not built high. The amount of trackback spam and blog spam I get even on uncoy.com is astonishing, and a nuisance.

Spammers and cloakers – Web 2.0 has arrived – on your marks, get set, go.

* Thomas Clay is also the creator and owner of Whatbooks.com – another fine example of search engine manipulation – a review site of best-selling books only: Tom Clancy, Stephen King, John Grisham, J.K. Rowling – you get the drift. Thomas is holed up in the Cotswolds, in the south of England. For some reason the Brits are a good deal better at subtle, long-lasting manipulation of search engine results. I attribute it to the life-long vow of hypocrisy and dissembling which is British society. Manipulation of the social atmosphere just comes naturally.

My favorite SEO, Ammon Johns (where the hell is his website?), is a Brit. Why Ammon Johns? He is one of the most helpful people in the SEO world and was one of the first to fix his attention on helping his clients market their business rather than on pure rankings.

3 Comments


  1. Seem to remember his website is http://www.webmarketingplus.co.uk

  2. Nice to know I have made an impression. Thanks.

    Thanks also to Mr. Young there for correctly identifying my personal website.

    I also blog sporadically at the Fresh Egg Blog (freshegg.com) when I get the time.

    I really enjoyed this post – there’s a certain irony to reusing ‘user contributed content’ and web 2.0 tools, since Tim O’Reilly’s truest definition of web 2.0 is things that creatively use or rely upon community-generated content. Google is web 2.0 because it takes community content (sites and data) and processes it to create new content.

    Of course, this somewhat loose definition of Web 2.0 means that scraper sites are also very Web 2.0 :)

  3. I’m not sure I can agree with Web 2.0 being about user-contributed content.

    For me, it is about tools which actually work – making the browser effectively the front end of a full-featured application, bringing much of the desktop to the web.
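
    Concretely, that “front end of an application” idea is the XMLHttpRequest pattern: the page asks the server for data and updates itself in place, with no reload. A minimal sketch (the /api/comments endpoint is hypothetical):

```typescript
// The core Web 2.0 move: fetch data and update the page in place,
// no full page reload. The "/api/comments" endpoint is hypothetical.
function loadComments(): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", "/api/comments");
  xhr.onload = () => {
    const target = document.getElementById("comments");
    if (xhr.status === 200 && target) {
      // Swap in the fresh content without leaving the page.
      target.textContent = xhr.responseText;
    }
  };
  xhr.send();
}

loadComments();
```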

    As a Mac user since forever (first round was actually Atari ST and GEM), I think those who have been working within the Apple paradigm are particularly well-qualified to design and create the simple and limited interfaces which work well on the web.

    Interesting notion about Google being Web 2.0. I would agree in principle, but on different grounds. I would say their candidacy hangs on making all kinds of tools which work, not on their world-leading scraping ability.

    An engaging sophism, though, concerning scraper sites’ Web 2.0 status.

    Thanks for stopping by, Ammon.

    In terms of SEO, I’ve found that your strategic focus on helping clients’ businesses rather than their rankings has been a very successful paradigm. Rankings are important only insofar as they advance the client’s business. The greatest difficulty is getting the client to understand and fully act on the potential we can create.
