Top 11 Euphemisms for Cloaking
Euphemisms are used in many areas of politics. The definition of cloaking to an engineer and to an SEO differs marginally in semantics. Cloaking has been vilified by search engines when users and bots are served different content. Engineers believe bots are pretty smart (they normally are), while SEOs believe bots should be led around by the nose only to appropriate areas. “Cloaking” often implies intent and extent that conflict with search engine terms of service, but there are many very grey areas as far as what is acceptable and what isn’t. By definition, cloaking is NEVER acceptable, so be sure you are using the proper terminology. Of course this is a bit tongue in cheek, but the point is that there are certainly valid reasons for selectively delivering content, and that “cloaking” is mainly defined by intent. I’m pretty glad I’m not the guy at the search engines who has to determine the intent of redirects.
1. IP delivery
2. Geo-targeting
3. Flash detection
4. Server speed analysis
5. Duplicate content detection and reduction
6. Member experience discovery
7. User agent detection
8. Browser extension
9. Spider detection
10. User experience maximization
11. Selective demographic delivery
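To make the first couple of items on the list concrete, here is a minimal sketch of IP delivery / geo-targeting: pick a page variant from the visitor’s IP address. The network ranges, country codes, and page paths below are made-up placeholders standing in for a real GeoIP database, not anything from the original post.

```python
from ipaddress import ip_address, ip_network

# Toy stand-in for a GeoIP database: network ranges mapped to country codes.
GEO_TABLE = {
    ip_network("203.0.113.0/24"): "AU",
    ip_network("198.51.100.0/24"): "US",
}

# Page variant per country; everyone else gets the default page.
VARIANTS = {"AU": "/home-au.html", "US": "/home-us.html"}

def country_for(ip):
    """Return a country code for the IP, or None if no range matches."""
    addr = ip_address(ip)
    for net, country in GEO_TABLE.items():
        if addr in net:
            return country
    return None

def page_for(ip):
    """Pick the localized page for this visitor's IP."""
    return VARIANTS.get(country_for(ip), "/home.html")

print(page_for("203.0.113.7"))  # -> /home-au.html
print(page_for("192.0.2.10"))   # -> /home.html (no matching range)
```

Whether an engine would frown on this presumably comes back to the intent question above: the page differs by locale, not by whether the visitor is a bot.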
What are the best reasons for “selective delivery” that you’ve heard? Do you think search engines would frown on that type of delivery if detected?
Thanks to Dan, Marshall, Brad, Neil, and Cameron for their contributions to the conversation that spawned this.

You forgot “Adsense Optimization”…
Here’s a reason voiced by Stephan Spencer to the search engines at SES Chicago last year.
What is your current official position on simplifying the URLs selectively for bots like Googlebot, Yahoo Slurp, etc. by user-agent detection in order to drop session IDs and other superfluous parameters from the URL? Do you consider it cloaking? And if so, is it good cloaking or bad cloaking?
They asked back: will the same page content display to the user if that user types the URL given to the bot into their browser? I responded with a “Yes,” and then all four search engines confirmed individually:
No problem.
Then Charles Martin from Google jumped in again with:
Please do that!
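As a rough illustration of the URL simplification Stephan was asking about, here is a minimal sketch based on user-agent detection. The bot signatures and the list of session parameters are assumptions for the example, not an exhaustive or engine-endorsed set.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical User-Agent substrings and "superfluous" query parameters.
BOT_SIGNATURES = ("googlebot", "slurp", "bingbot", "msnbot")
SUPERFLUOUS_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

def is_bot(user_agent):
    """Crude check: does the User-Agent string look like a known crawler?"""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def clean_url_for_bots(url, user_agent):
    """Return the URL unchanged for humans; strip session params for bots."""
    if not is_bot(user_agent):
        return url
    scheme, netloc, path, query, fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True)
            if k.lower() not in SUPERFLUOUS_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), fragment))

print(clean_url_for_bots(
    "https://example.com/product?id=42&PHPSESSID=abc123",
    "Mozilla/5.0 (compatible; Googlebot/2.1)"))
# -> https://example.com/product?id=42
```

The condition the engines attached still applies: the cleaned URL has to return the same content to a human who types it in.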
So easy to justify: paid advertisements from an external feed or source should be cloaked from bots. By removing the feed, you protect your advertisers from inflated hit counts caused by non-human visitors; non-humans don’t convert. You also don’t want those links getting indexed as part of your cached page.
Likewise any content that is paid for by impression (CPM) should not be shown to non-human visitors, because it’ll get indexed, cached, crawled and abused.
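A sketch of that suggestion might look like the following, assuming the same kind of User-Agent check as above. The bot signatures and HTML snippets are placeholders, not a complete detection scheme.

```python
# Omit the external ad feed when the visitor looks like a crawler, so paid
# placements are neither hit-counted by non-humans nor indexed with the page.
BOT_SIGNATURES = ("googlebot", "slurp", "bingbot", "msnbot")

def render_page(body_html, ad_feed_html, user_agent):
    is_bot = any(sig in user_agent.lower() for sig in BOT_SIGNATURES)
    if is_bot:
        return body_html                      # crawlers: content only
    return body_html + "\n" + ad_feed_html    # humans: content plus paid feed

print(render_page("<p>Article</p>", "<div>Sponsored links</div>",
                  "Mozilla/5.0 (compatible; Yahoo! Slurp)"))
# -> <p>Article</p>
```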
Cloaking can be a great way to preview a new design/website in situ for stakeholders while delivering the current version to the general public.