For years, it has been known that an important part of off-site SEO is backlink building. Having plenty of “quality” links pointing back to your site has counted as a “vote” of sorts toward where your site should rank in search engines, weighted heavily by the anchor text used.
Out of this, debate has raged in some corners about whether obtaining too many links could hurt a website, or whether having lots of inbound links from low-quality or “bad neighbourhood” sites would penalize your website in the eyes of Google and other search engines.
Until recently, many SEO experts, as well as representatives from Google, have maintained that there really isn’t such a thing as a penalty for too many links. Some do believe that obtaining too many in an unnatural fashion could devalue the worth of those links as far as your search engine rankings are concerned.
But the problem with this is that it could make negative SEO a profitable venture. What is negative SEO? It’s a theoretical (at this point) tactic: you want to drop a competitor’s site in the rankings, which would benefit your own site’s rankings, and to do this, you use all kinds of automation to build thousands of links from low-quality websites pointing at your competitor.
Today, many are worried that with some of Google’s recent algorithm changes, negative SEO is now a possibility. Last month, the mighty search engine discovered the footprint of a popular blog network and deindexed the entire network from its search databases. Many webmasters would pay Build My Rank (BMR) a monthly fee to submit articles with built-in links pointing back to their own websites, thereby increasing their number of backlinks from websites with a PageRank of at least 2 and up to 6.
At the same time, many webmasters who participated in this scheme are also reporting that they received the dreaded Google Webmasters’ message informing them of the discovery of “unnatural link building” pointing to their websites. A good number of those involved in various forums that discuss such matters also report that their own search engine rankings dropped dramatically, or that their sites were deindexed altogether.
Google’s Matt Cutts has also recently come out and said that there are new algorithms to detect “over SEO optimization” of websites and to take it into account when ranking sites in search results.
This, of course, leads to all the speculation that negative SEO is about to become a reality. How can Google know who built a bunch of backlinks to a website? Was it the website owner or a competitor, or are they perhaps legitimate backlinks resulting from a sudden interest in some new topic, subject or product?
I do not believe that this type of SEO is in any way, shape or form ethical. Trying to negative-SEO your competitors is a very poor and unethical way to get better search engine rankings for your own site. Having said that, I do like information, and there is a case study under way with a participant at the TrafficPlanet.com website. Now, I don’t know that this person’s results will prove anything one way or the other, but they will be interesting to see if they are ever posted. I’ll be following the thread and you can too – this link will take you to page 7 of the forum discussion.
Personally, I hope we’re not in the age of negative SEO – but it’s difficult to know what Google has in mind when it talks about looking for signals that indicate “over optimization” while, at the same time, apparently sending out quite a number of “unnatural link” detection messages.