Changing Algorithms and Google’s Achilles’ Heel

The Old SEO: Automated Linkage

My first website was a contractor directory. It listed contractors by state, city, zip code and specialty. It had one of the best MySQL backends I’ve ever seen on a contractor directory.

Contractors paid annually to be listed in the directory, and we promoted it like crazy. However, it’s an old and stale business model. There are probably more contractor directories and contractor referral services on the Internet than there are contractors (slight exaggeration). We were spending tens of thousands of dollars monthly with Overture, and we wanted to save some money by getting organic rankings.

We turned to a so-called professional SEO, David Gilmore of EvergreenWebServices.com. He changed some page titles, changed the meta keywords and meta descriptions and got us booted out of Inktomi (this is back when Inktomi actually drove traffic).

On the SEO forums at that time, SEOs were talking about page titles and meta tags and keyword density. We tried all of that and it didn't work.

There came a point when I said, "If you want it done right, do it yourself." I set out to reverse engineer the algorithm. I'd do searches like "contractor" and "contractors" and "home construction" and find pages listed in Google that didn't even mention those terms. It wasn't long before I realized that Anchor Text was King.
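
If you want to run that same check without eyeballing every result, a few lines of Python will do it. This is only a sketch: the keyword and URLs are placeholders you would paste in from a manual search, and it does nothing smarter than a naive string match against the raw HTML.

# A minimal sketch of the manual check described above: given a keyword and a
# hand-picked list of top-ranking URLs (placeholders), fetch each page and see
# whether the keyword even appears in its HTML. Pages that rank without
# mentioning the term are the tell-tale sign of off-page (anchor text) signals.
import requests

KEYWORD = "contractors"
TOP_RANKED_URLS = [  # hypothetical URLs copied from a manual search
    "https://example.com/page-one",
    "https://example.org/page-two",
]

for url in TOP_RANKED_URLS:
    try:
        html = requests.get(url, timeout=10).text.lower()
    except requests.RequestException as exc:
        print(f"{url}: fetch failed ({exc})")
        continue
    found = KEYWORD.lower() in html
    print(f"{url}: keyword {'present' if found else 'ABSENT'} on the page")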

We immediately hired folks to spam every guestbook on the Internet, and within a month or so we had first-page (mostly top-three) rankings for most of our keywords, including "contractor" and "contractors".

Even with those rankings, the site lost money, and was eventually bought out by ServiceMagic.

The New SEO: Editorial Links

Somewhere along the way, Google bombing became Google bowling. Google implemented filters that treated low-quality links one way and high-quality links another. Editorial links are valued; non-editorial links (forums, guestbooks, blog comments) are not.

The entire basis of this change had to do with human review. Google knows that links that are not human-reviewed are low quality, and wants to treat them accordingly.

Yet SEOs, in their laziness, are still attempting to beat Google with automation. This is folly. Google's Achilles' heel is its blind and fanatical obsession with algorithmic filters. Google is religiously opposed to human review. And that, to the SEO, is a wide-open barn door.

So how does one exploit Google’s weakness? Well, let’s get into that.

Let's go back to reverse engineering. But instead of reverse engineering the algorithm itself, which we know to be based on links, reverse engineer the link profiles of the top-ranked sites.

Look at the sites that dominate the SERPs. What do their backlinks look like? Going through those links, you may be surprised to see a lot of low-PR link pages, low-PR blog posts and low-PR article pages.

It isn't so much the PR that makes a link count, or not count. It's often the uniqueness, or lack thereof.
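
To see this for yourself, pull a competitor's backlinks into a spreadsheet and tally them up. Here is a rough sketch in Python; the file name and columns (source_url, anchor_text, pagerank) are assumptions about whatever export your backlink tool gives you, not a real format.

# Rough sketch: profile a competitor's backlink export to see how much of it
# is low-PR pages and how repetitive the anchor text is. The CSV file name and
# column names are hypothetical.
import csv
from collections import Counter

pr_buckets = Counter()
anchors = Counter()

with open("competitor_backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        pr = int(row["pagerank"] or 0)
        pr_buckets["PR 0-2" if pr <= 2 else "PR 3-5" if pr <= 5 else "PR 6+"] += 1
        anchors[row["anchor_text"].strip().lower()] += 1

total = sum(pr_buckets.values())
print("PageRank distribution:")
for bucket, count in sorted(pr_buckets.items()):
    print(f"  {bucket}: {count} links ({count / total:.0%})")

print("Most repeated anchor texts:")
for anchor, count in anchors.most_common(5):
    print(f"  {anchor!r}: {count} links")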

A lot of folks think the supplemental index is a bad thing because pages relegated to it do not drive traffic.

I believe the supplemental index is almost wholly made up of pages with non-unique text, and that Google's motivation for creating a separate index has more to do with filtering (devaluing) links from those pages than with keeping the pages out of its search results.

Google's algorithm is still 95% link based. When you are a search engine that is driven by links and manipulated by links, it is in your interest to restrict which pages get to distribute link weight. Google has smartly done this by devaluing link weight from non-unique pages (read: the supplemental index); I am guessing that they also devalue non-unique links.
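
To make "non-unique" concrete, here is a toy sketch of the standard way to flag near-duplicate text: break each page into word shingles and measure the overlap. This is not Google's actual filter, just an illustration of how cheaply duplicate content can be detected, and why a link sitting on a duplicated page is easy to discount.

# Toy near-duplicate check using word shingles and Jaccard similarity.
# Two pages with a high overlap of word n-grams are effectively the same page,
# and links appearing on both would plausibly be collapsed or discounted.
import re

def shingles(text: str, n: int = 5) -> set:
    """Return the set of n-word shingles (word n-grams) in the text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    """Shingle overlap: size of the intersection over size of the union."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

page_a = "Acme Contractors offers roofing, siding and remodeling in Dallas."
page_b = "Acme Contractors offers roofing, siding and remodeling in Austin."

score = jaccard(shingles(page_a), shingles(page_b))
print(f"Similarity: {score:.2f}")  # anything close to 1.0 means non-unique text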

So, for those who need it spelled out, automated linking is out. Unique links count more. Unique anchor texts, unique contexts on pages with entirely unique text.

Non-unique = bad, very bad.

Unique = good, very good.
