Cloaking Is As Cloaking Does

The cloaking debate has popped up again. It rears its head once or twice a year, makes a lot of noise, then disappears back beneath the surface. Unresolved.

In the ironically named thread “a quick word about cloaking“, a l-o-n-g debate is raging once again, amidst frequent calls for people to please stop debating cloaking, as it is making everyone’s heads hurt.

Those calls will probably fall on deaf ears while the hypocrisy around cloaking remains.

The reality is that some search engines allow some sites to cloak. I suspect they don’t want to say that some cloaking is ok, because they fear this will give more people the incentive to push the boundaries. So, search engine guidelines and reps often state all cloaking is bad, then attempt a tortuous, and not-entirely-convincing, redefinition of the term “cloaking” in order to validate their ongoing inclusion of sites which appear to be showing one page to humans, and a different page to search bots.

For those not familiar with cloaking, the technical process is simple. A site shows a different page to a search bot than it shows to a subsequent human visitor. Obviously, the potential for abuse is huge, although there are valid reasons why a webmaster may want to show one page to search engines and another to visitors, as outlined in the thread.
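To make the mechanism concrete, here is a minimal sketch of user-agent-based cloaking. Everything here is a hypothetical illustration — the function name, the bot list, and the page contents are invented for the example, and real cloaking setups often key off IP ranges rather than the easily spoofed User-Agent header.

```python
# Hypothetical illustration of the cloaking mechanism: serve one page to
# crawlers and a different page to everyone else. Bot names are examples.
BOT_USER_AGENTS = ("Googlebot", "Bingbot", "Slurp")

def page_for(user_agent):
    """Return different content depending on who appears to be asking."""
    if any(bot in user_agent for bot in BOT_USER_AGENTS):
        # What the search engine indexes.
        return "<html>Full article text, visible for indexing</html>"
    # What a human visitor actually sees.
    return "<html>Please log in or subscribe to read this article</html>"
```

The same few lines could serve a legitimate purpose (say, stripping session IDs for crawlers) or a deceptive one (indexing content visitors never get to read) — which is exactly the hammer point below.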

The problem arises when a moral redefinition of what is essentially a technical process occurs. Some say cloaking is “evil”, “deceptive”, “tricking” or “bad”, and that if it isn’t those things, then the technical process of showing one page to bots and another to humans isn’t cloaking. In reality, cloaking isn’t morally dubious, any more than a hammer is morally dubious. A hammer could be used for dubious purposes, but not all hammers are. A hammer is, in the end, just a tool.

The search engines’ internal definition of cloaking may differ from what they state in public. The reality is that if a search engine wants to delist your site, they’ll delist it. They don’t need to provide a reason, cloaking or otherwise.

So, the argument remains unresolved, and it will remain unresolved while the search engines are inconsistent in their application of policy in this area. While sites that show one page to bots and another to humans stay listed, the cloaking debate will resurface, time and again.

For the record, I want the SERP to lead to the same content outlined in the snippet. Not a login page, a subscription page, or anything else.
