Cloaking is the practice of presenting one version of a webpage to Google’s crawler bots and a different version to human visitors. It’s a “Black Hat” technique that violates Google’s guidelines and can seriously harm SEO if detected, with offending pages demoted or delisted entirely.
Cloaking is typically achieved by inspecting the User-Agent header or IP address of each incoming request and serving different content when the visitor appears to be a search engine crawler. Sites that cloak often also add a “noarchive” meta tag, which stops search engines from showing a cached copy of the page that would reveal the discrepancy. The purpose of cloaking is to conceal nefarious activity, spam content, or other content that webmasters know would result in demotion or delisting on search engines. You may be able to expose cloaking by viewing a suspect page through Google Translate, since the page is then fetched from Google’s own servers and may be served the crawler-facing version.
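To make the mechanism concrete, here is a minimal sketch in Python of how user-agent cloaking works on the server side, and the crude detection idea behind comparing the two versions. All names and page contents here are hypothetical, purely for illustration.

```python
# Hypothetical crawler signatures a cloaking site might match against.
CRAWLER_SIGNATURES = ("Googlebot", "bingbot")

def serve_page(user_agent: str) -> str:
    """Return different HTML depending on who appears to be asking --
    this user-agent sniffing is the mechanism search engines penalise."""
    if any(sig in user_agent for sig in CRAWLER_SIGNATURES):
        # Clean, indexable content shown only to crawlers.
        return "<html><body>A helpful article about widgets</body></html>"
    # Spam content shown to real visitors.
    return "<html><body>Buy cheap pills now!!!</body></html>"

def looks_cloaked(as_user: str, as_crawler: str) -> bool:
    """Crude check: flag the page if the crawler-facing and
    user-facing responses differ."""
    return as_user != as_crawler

user_view = serve_page("Mozilla/5.0 (Windows NT 10.0)")
bot_view = serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)")
print(looks_cloaked(user_view, bot_view))  # → True
```

Real detection is less clear-cut than a simple equality check, since legitimate pages vary by ads, A/B tests, or personalisation, but fetching a page with two different user agents and diffing the results is the basic idea.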
While cloaking is usually associated with deception, there are some cases in which serving crawler-specific content doesn’t breach search engine guidelines. For instance, a site may replace complex, search-engine-unfriendly URLs (such as those containing session IDs) with cleaner equivalents that are easier for crawlers to read and index.
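The legitimate case above can be sketched as a small URL-normalisation step. This is an illustrative example only; the set of session parameter names is hypothetical and would depend on the site’s own platform.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical session-tracking parameter names to strip.
SESSION_PARAMS = {"sessionid", "sid", "phpsessid"}

def crawler_friendly(url: str) -> str:
    """Drop session-tracking query parameters so every visitor -- and
    every crawler -- sees one stable URL for the same content."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(crawler_friendly("https://example.com/shop?item=42&PHPSESSID=ab12cd34"))
# → https://example.com/shop?item=42
```

Because both users and crawlers ultimately reach the same content, this kind of URL cleanup is generally treated as good site hygiene rather than cloaking.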