The way we write our articles, and the content we accept from third parties, is changing dramatically and swiftly. Looking over our shoulder is big G (Google, to be formal), watching and appraising every move we make. We have all heard many times that good, unique content is king, and Google continues to take steps to weed out poor-quality spun and duplicate content.
The page theme, latent semantic indexing, page context, and the "buzz" words found within any specific niche will be just part of what the Google algorithm looks for when determining the value of our articles.
This is not something new that has appeared overnight; it is the logical next step for search engines trying to combat the latest wave of people attempting to outsmart the system with article-spinning software and automatic website-creation scripts.
Although there is nothing inherently wrong with using PLR articles, the unfortunate truth is that hardly anybody knows how to make them work effectively. Through laziness or ignorance, or perhaps a little of both, these articles are purchased at low cost, or obtained free, and then pumped out unchanged at a great rate to every article directory on the Net.
There will always be a battle between the engines and the wizards, but it doesn't take Google long to catch up and start working on solutions to maintain quality. As I write this, I have read today of sites vanishing from Google overnight, with a subsequent loss of revenue to the website owners. I will not speculate on whether those sites were of poor quality, but the message is clear: if your house is built of cards, it won't stay up long.
Don't forget that the people churning out the latest script or PC program make money by fooling the engines, and that will be a lifelong process for them. It could also be a lifelong process for anyone who keeps buying the scripts but never actually builds a solid business.
At the moment Google is dealing with the problem of duplicate content. Its answer has generally been to index the original copy of an article and ignore the others found around the web with the same or nearly the same content. There never was any "penalty" imposed by Google for duplicates.
Perhaps all this is about to change very soon. There is no doubt that Google's management wants and demands quality to give the best possible user experience, and I can't find fault with that. If advertisers become dissatisfied with the quality of the pages on which their paid advertisements are shown, then Yahoo is only a mouse click away.
So the overall continuing strategy is to clean house and accept only good-quality, relevant content into the index. Trying to "lead" Google with meta tags, titles, keyword-laden filenames, keyword density and the like may be futile in the future. The best answer is to go back to what we all used to do anyway: write naturally on relevant topics centered around a specific theme, and give each article our best attention for our readers.