Rank Jumpers Is Dead

Google is hitting blog networks hard and clearly has them in its sights. Today it became official that one network that tried to emulate Build My Rank has been all but wiped out by Google’s de-indexing.

Rank Jumpers sent out the following to their subscribers today:

Rank Jumpers is very disheartened to say that the recent change by Google to go after blog networks has now affected Rank Jumpers. We have had a large deindexing of blogs. After careful consideration we have decided to cease the blog network services by Rank Jumpers. In short, we are shutting down the blog network. Google is very aggressive with their attempts to go after the networks and it would be worthless to try to replace blogs etc.

Our Process:
We will offer a full credit back to any customers that purchased our services in April. We will pro-rate the credit back for customers that purchased the monthly service in March. We will also credit back any credits that have not been used!

The deindexed sites will have all posts removed as soon as possible. All other posts will be removed shortly after!


It’s interesting that after the fall of Build My Rank, Rank Jumpers was touted by many, including themselves, as a viable alternative. The problem is, they had such a similar footprint that I predicted it would not be long before Google tracked them down and de-indexed them. It wasn’t that difficult to figure out, even for me (and I’m not a Google algorithm), which sites were probably in the Rank Jumpers network.

And again, of course, many website owners are lamenting the fact that their search engine rankings have dropped considerably and that they are getting the dreaded “unnatural link building” warning message from Google.

If you relied solely on this type of back linking, that spells trouble for you.

But all these “unnatural link building” warning messages from Google certainly do raise the specter of negative SEO, as I wrote about the other day.

I am actually seeing this happen to one of my own sites, and it is not because of any unnatural link building that I’ve done. A couple of months ago, Google Webmaster Tools (GWT) reported about 14,000 backlinks to the site, which was pretty normal; no unnatural link building was ever done to the site, as that is not how I work.

A couple of weeks ago, I was shocked to see that GWT was reporting over a million backlinks! Upon further investigation, I discovered that Google had somehow found the IP address as well as the name of the server that the website sits on, and in doing so had picked up two different “aliases” for the website that, in reality, it should not have. Google is actually “seeing” these as several different sites, all linking back to the main website, and in this manner has counted up hundreds of thousands of backlinks that are not “real” backlinks at all. This is a huge problem with the way the Googlebot scours the internet, and it is obviously a weakness on Google’s part.

There is a plan in place to correct this, because going from roughly 14,000 backlinks to over a million in the span of a month or so would appear unnatural under most circumstances (but not all).
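The post doesn’t spell out the fix, but the usual remedy for this kind of host duplication is to make sure every alternate hostname (the raw IP address, the hosting server’s name) issues a permanent redirect to the canonical domain. Below is a minimal sketch, using only Python’s standard library, of how one might check that; the hostnames are placeholders for illustration, not the actual site involved.

```python
# Hedged sketch: confirm that alternate hostnames for a site (raw IP address,
# hosting server alias) return a permanent redirect to the canonical domain,
# so a crawler has no reason to treat them as separate sites.
# All hostnames below are placeholders, not real values from this post.
import http.client
from urllib.parse import urlsplit

CANONICAL_HOST = "www.example.com"        # the one hostname that should be indexed
ALIASES = [
    "203.0.113.10",                       # raw server IP (placeholder)
    "server1.examplehost.com",            # hosting server alias (placeholder)
]

for host in ALIASES:
    conn = http.client.HTTPConnection(host, timeout=10)
    try:
        conn.request("HEAD", "/")
        resp = conn.getresponse()
        location = resp.getheader("Location", "")
        if resp.status in (301, 308) and urlsplit(location).hostname == CANONICAL_HOST:
            print(f"{host}: permanent redirect to {CANONICAL_HOST} (good)")
        else:
            print(f"{host}: status {resp.status}, Location={location!r} "
                  f"- a crawler may index this as a separate site")
    except OSError as exc:
        print(f"{host}: request failed ({exc})")
    finally:
        conn.close()
```

The redirect itself is typically a one-line rule in the web server configuration; the point of a check like this is simply to confirm that every alias collapses to a single canonical host.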

But subsequently, I noticed the site drop out of sight for some major search terms it used to rank for. I have yet to see an “unnatural link building” warning, but it is puzzling to me how Google can claim to have precautions in its algorithm to ensure it is not counting the wrong thing as “unnatural link building” when it would appear that it may not.

I am fairly convinced the drop in rankings is related to the enormous number of so-called “backlinks” (which are not really backlinks at all) that Google claims to have found.

And that leads me to believe that negative SEO is truly something that will be experimented with in a major way by those unethical enough to do so. I’m not sure what Google is going to do about this, because it is obvious to me that in punishing some quality sites, they are promoting others that are plainly garbage, and even some that have nothing to do with the search term at all.

Time to let the dust settle after these changes.

Negative SEO – Are We Almost There Yet?

For years, it has been known that an important part of external SEO is backlink building. Having plenty of “quality” links back to your site(s) has counted as a “vote” of sorts for where your site should rank in search engines, with heavy weight placed on the anchor text used.
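To make the “vote” metaphor concrete, here is a toy, PageRank-style calculation on a made-up four-page link graph. It is only an illustration of how inbound links pass weight around; it is not Google’s actual algorithm, and it ignores anchor text, link quality, and everything else discussed in this post.

```python
# Toy "links as votes" illustration: a simplified PageRank iteration over a
# made-up link graph. Page names and link structure are invented for the example.
links = {
    "siteA":  ["mysite"],
    "siteB":  ["mysite", "siteA"],
    "siteC":  ["siteB"],
    "mysite": ["siteA"],
}

DAMPING = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}    # start every page off equal

for _ in range(50):                            # iterate until scores settle
    new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)     # each page splits its "vote"
        for target in outlinks:
            new_rank[target] += DAMPING * share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")              # pages with more inbound "votes" score higher
```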

Out of this, debate has raged in some corners about whether obtaining too many links could hurt a website, or whether having lots of inbound links coming in from low quality or “bad neighbourhood” sites would penalize your website in the eyes of Google and other search engines.

Up until recently, many SEO experts, as well as representatives from Google, have maintained that there really isn’t such a thing as a penalty for too many links. Some do believe that obtaining too many in an unnatural fashion could devalue the worth of those links as far as your search engine rankings are concerned.

But the problem with this is that it could make negative SEO a profitable venture. What is negative SEO? It’s a (so far theoretical) scenario where you want to drop a competitor’s site in the rankings, which would benefit your own site’s search engine rankings, and to do this you use all kinds of automation to build thousands of links from low-quality websites pointing at your competitors.

Today, many are worried that with some recent Google algorithm changes, negative SEO is now a possibility. Last month, the mighty search engine was able to discover the footprint of a popular blog network and de-index the entire network from its search databases. Many webmasters would pay Build My Rank (BMR) a monthly fee to submit articles with built-in links pointing back to their own websites, thereby increasing the number of backlinks from websites with a PR of at least 2 and up to 6.

At the same time, many webmasters who participated in this scheme are also reporting that they received the dreaded Google Webmaster Tools message informing them of the discovery of “unnatural link building” to their websites. A good number of those involved in the various forums that discuss such matters also report that their own search engine rankings dropped dramatically, or that their sites were de-indexed altogether.

Google’s Matt Cutts has also recently come out and said that there are new algorithms to detect “over SEO optimization” of websites and to take that into account when ranking sites in searches.

This, of course, leads to all the speculation that negative SEO is about to become a reality. How can Google know who built a bunch of backlinks to a website? Was it the website owner or a competitor? Or perhaps they are legitimate backlinks, the result of a sudden surge of interest in some new topic, subject, or product.

I do not believe that this type of SEO is in any way, shape, or form ethical. Trying to negative-SEO your competitors is a very poor and unethical way to get better search engine rankings for your own site. Having said that, I do like information, and there is a case study underway with a participant at the TrafficPlanet.com website. Now, I don’t know that this person’s results will prove anything one way or the other, but they will be interesting to see if they are ever posted. I’ll be following the thread and you can too; this link will take you to page 7 of the forum discussion.

Personally, I hope we’re not in the age of negative SEO, but it’s difficult to know what Google has in mind when they talk about looking for signals that indicate “over optimization” while, at the same time, apparently sending out quite a number of “unnatural link building” warning messages.