December 18, 2011
Google's New Algorithm Makes the World Less Safe for Spammers
It's that time of year again: Google, the world's most prominent and oft-used search engine, is revamping its algorithms in an effort to crack down even harder on spam sites. The trouble is, Google is such a megalithic force on the internet that any shift, however minor, forces everyone else to adjust accordingly. Here's a breakdown of the changes that have gone into effect, based on Google's announcement, along with some changes you may need to make on your own website to ensure you're not negatively impacted.

First up is a completely revamped spam detection program that looks at individual pages and filters out those it identifies as containing numerous instances of repeated keywords and phrases. By pushing such sites to the bottom of the search results, Google limits the likelihood that someone searching for information on European vacations, for example, will be directed to an ad-ridden site. Sites with actual content about vacations to Europe will instead populate the top of the heap, giving the user a better overall browsing experience.

To make sure you're not lumped in with these spammy sites, now is the time to reevaluate your content. If you're not sure whether your site will be affected by the new algorithm, review your copy and look for instances of repeated keywords. You may need to restructure your web copy to be less keyword-driven and more content-driven.

Google is also cracking down on duplicate content and content farms. The goal here is not so much to expunge plagiarism from the face of the earth as to prevent the duplicate and triplicate re-posting of legitimate articles that originally appeared elsewhere, a tactic used to draw visitors to an ad-riddled website and thereby collect click-throughs and revenue.
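If you want a rough, do-it-yourself version of that keyword review, a few lines of Python can flag the words that dominate a page's copy. This is only a back-of-envelope check for keyword stuffing, not Google's actual algorithm, and the example text is made up:

```python
import re
from collections import Counter

def keyword_density(text, top_n=5):
    """Report the most frequent words in a page's copy as a share of
    all words -- a rough proxy for keyword stuffing, not Google's method."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    counts = Counter(words)
    return [(word, count / total) for word, count in counts.most_common(top_n)]

# Hypothetical over-optimized page copy:
copy = ("European vacations are great. Book European vacations now. "
        "European vacations, cheap European vacations!")

for word, share in keyword_density(copy):
    print(f"{word}: {share:.0%}")
```

If any one keyword accounts for a large share of your total word count, that's a sign the copy reads as keyword-driven rather than content-driven.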
This duplication of existing articles is called "scraping," and it has long been the bane of article authors, as well as of users who, looking for unique, original content, wind up running into the same article over and over again. Content farms use programs to aggregate material from various sources and re-post it in the hope of attracting visitors. Under the new algorithm, someone conducting a Google search will be more likely to land on the website that originally published an article or blog post, rather than on a site that scraped or accumulated the material from elsewhere. Unless you're running a program that scrapes articles from other websites to bring traffic to your own, or unless your website consists primarily of re-posts of other people's work, this change shouldn't affect you.

Google also says it has greatly improved its ability to detect sites that have fallen prey to hackers and been taken over for spamming. That should come as a great relief, especially to site owners just starting out, who worry about hackers doing major reputational damage with minimal effort.

As for everything else, the best advice remains what it has been for years: create original content frequently, and you'll rise in the search rankings. Period.
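A postscript for the curious: if you suspect a page has scraped your article, one simple way to quantify the overlap is to compare the two texts' word "shingles" (overlapping runs of a few words) and compute their Jaccard similarity. This is a classic near-duplicate-detection sketch, not what Google actually runs, and the sample texts are invented:

```python
def shingles(text, k=3):
    """Break text into the set of overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=3):
    """Fraction of k-word shingles the two texts share (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical original article title vs. a lightly reworded scrape:
original = "ten tips for planning affordable european vacations this summer"
scraped = "ten tips for planning affordable european vacations this year"

print(f"overlap: {jaccard(original, scraped):.2f}")
```

A score near 1.0 means the two texts are essentially the same article; unrelated pages score near zero.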