Cloaking

David Kaufmann
SEO Tutorials
7 min read

There are many search engine ranking techniques, but we can distinguish between the legitimate, careful ones that fall under "White Hat SEO" and the risky or deceptive practices known as "Black Hat SEO." It all comes down to how transparent and ethical we are as SEO professionals when presenting our website to the search engine. Today, Google's algorithm is increasingly sophisticated and harder and harder to fool, so the less ethical techniques are falling into disuse.

One of them is cloaking, which had its "boom" many years ago. It is barely used now, but it is essential to understand it so we can make sure it is not present on any of the websites we manage.

What is Cloaking in SEO?

Cloaking (from the English verb "to cloak," meaning "to cover up") is a concealment technique that consists of showing different content for the same website to the user and to Googlebot when each requests the page from the server.

This technique belongs to the well-known Black Hat SEO: the set of deceptive, unethical techniques and strategies aimed at improving a website's ranking by fooling Google.

What is the crawling process like?

To understand how the cloaking strategy works, we need to know how bots crawl. The crawling and indexing process is carried out by spiders in charge of exploring websites. Google's spider is known as Googlebot, and it historically ran as two different versions: deepbot, which thoroughly follows all the links contained in a site's content and visits the pages, and freshbot, which looks for new content on the web.

What are its origins?

The origins of cloaking can be traced to websites built around video, graphics, or animation, which have greater difficulty ranking in search results; the technique helped compensate for that disadvantage. The crawler was shown a full page of text describing the images or video, content that bots can process quickly, while visitors saw the multimedia version.

How does the Cloaking process work?

The purpose of this technique is to deceive search engines in order to improve a website's ranking in search results. It is considered unethical because the content shown to the user is often pornographic or gambling-related (a casino, for example), while very different content is shown to Google's robots.

Generally, it is carried out through two different techniques:

Agent name delivery: access to a website is made through a "user agent," which can be anything from a person's browser to a bot. The server can therefore adjust the content it delivers depending on the type of user agent. Cloaking occurs when different content is delivered based on which user agent visits the site.
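To make the mechanism concrete, here is a minimal, hypothetical sketch of agent name delivery. The function name and page snippets are invented for illustration; serving different content to bots like this violates Google's spam policies and can get a site removed from the index.

```python
# Illustrative sketch of agent-name cloaking: the server inspects the
# User-Agent header and chooses a page variant accordingly.
# (Shown only to explain the mechanism -- this violates Google's
# spam policies and can lead to severe penalties.)

BOT_SIGNATURES = ("Googlebot", "bingbot", "DuckDuckBot")

def choose_content(user_agent: str) -> str:
    """Return the page variant a cloaking server would deliver."""
    if any(sig.lower() in user_agent.lower() for sig in BOT_SIGNATURES):
        # Keyword-rich, crawler-friendly HTML shown only to bots
        return "<h1>Helpful article about chess openings</h1>"
    # Unrelated content shown to real visitors
    return "<h1>Welcome to our online casino!</h1>"

print(choose_content("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(choose_content("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))
```

The whole trick rests on trusting a header the client itself supplies, which is also why the plugin check described below works: anyone can claim to be Googlebot.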

Some time ago, the "User agent switcher" plugin let you browse a page with Googlebot's user agent. This is very useful for checking whether there are differences between the content shown to users and to bots.

User agent switcher plugin

IP delivery: the delivery of a website's content takes into account the IP address from which the request is made.

This technique is carried out by modifying the .htaccess file. For example, the Apache server has a module called "mod_rewrite" that allows this modification.
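The same idea can also be expressed in application code rather than in .htaccess rules. The sketch below is hypothetical: the network range and file names are placeholders, not real Googlebot infrastructure (Google actually recommends verifying its crawlers via reverse DNS or its published IP lists, not a hard-coded range).

```python
import ipaddress

# Illustrative sketch of IP delivery: content is chosen by the client's
# IP address instead of its User-Agent. The network below is a made-up
# placeholder for "a range we believe belongs to a crawler".
CRAWLER_NETWORKS = [ipaddress.ip_network("66.249.64.0/19")]  # hypothetical

def is_crawler_ip(client_ip: str) -> bool:
    """True if the client IP falls inside one of the listed networks."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in CRAWLER_NETWORKS)

def serve(client_ip: str) -> str:
    """Pick which page variant a cloaking server would return."""
    if is_crawler_ip(client_ip):
        return "optimized-page.html"  # variant shown to crawlers
    return "real-page.html"           # variant shown to everyone else

print(serve("66.249.66.1"))   # inside the listed range: treated as a crawler
print(serve("203.0.113.7"))   # outside the range: treated as a normal visitor
```

IP delivery is harder to detect from the outside than agent name delivery, since you cannot simply change a header to impersonate the crawler's address.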

Are there more Cloaking techniques?

Some techniques are not strictly cloaking, but they hide content or make it harder for bots to crawl, so they are often grouped with it:

  • Hidden or invisible text: used to stuff in extra keywords and information hidden from the user. This text is "masked" by giving it the same color as the page background.

  • Flash websites: as you know, the use of Flash is discouraged in SEO guidelines, because search engines cannot read the content and therefore cannot rank it.

  • E-mail: occurs when neither the name nor the email address of the sender is specified, so the recipient cannot tell who is sending the email.

  • Websites with lots of HTML: good SEO practice recommends a high text-to-HTML ratio, which helps pages rank in the search engine. When a page has little text, the ratio is low, and cloaking has been used to counter this and avoid a redesign.

  • Image gallery: to compensate for pages made up mostly of images, webmasters included keyword text that would help them rank.

  • Geolocation: showing different pages depending on the visitor's location. This is a legitimate, current tactic; many websites show different content depending on the country, for example. But it becomes malicious when different content is shown depending on whether the visitor is a user or a bot.

  • With CSS and JavaScript: historically, Google struggled to interpret JavaScript, so keywords and links could be injected into a website via script. A function could be programmed so that they did not disturb the user's navigation while still being presented to Google as prominent content under H headings.

  • Redirects: redirects send the user who clicks on a search result to a different website than the one the crawler saw. This can easily be checked with the "Redirect Path" plugin. It is a spam technique still used today to deceive users and serve controversial content.

  • The most recent variant has appeared with the rise of SPAs (Single Page Applications). These pages are built with JavaScript, and the rendering difficulty this creates for Google can be interpreted by the search engine as an attempt at deception. With the Lighthouse tool we can observe a website's rendering process in the console, see how the browser processes the site, and generate a complete report with recommendations and improvement opportunities.
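A simple self-audit follows the same logic the plugins above automate: request the page once with a normal browser User-Agent and once with Googlebot's, then compare the responses. Below is a rough sketch; the URL is a placeholder, and since dynamic pages differ between requests for benign reasons, a mismatch is only a hint, not proof of cloaking.

```python
import hashlib
import urllib.request

def fetch(url: str, user_agent: str) -> bytes:
    """Fetch a URL, sending the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def fingerprints_differ(body_a: bytes, body_b: bytes) -> bool:
    """Compare two response bodies by hash. A difference is a cloaking
    signal, though dynamic content can also cause benign mismatches."""
    return hashlib.sha256(body_a).hexdigest() != hashlib.sha256(body_b).hexdigest()

if __name__ == "__main__":
    URL = "https://example.com/"  # placeholder: the page to audit
    browser = fetch(URL, "Mozilla/5.0 (Windows NT 10.0; Win64) Chrome/120.0")
    bot = fetch(URL, "Mozilla/5.0 (compatible; Googlebot/2.1; "
                     "+http://www.google.com/bot.html)")
    print("possible cloaking" if fingerprints_differ(browser, bot)
          else "identical responses")
```

Note that this only catches agent name delivery; IP delivery would serve the same content to both requests because they come from the same address.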

Lighthouse plugin


Cloaking outlook in 2020

This practice is obsolete, and using it today can get you severely penalized by Google. While years ago you could deceive the search engine, nowadays it is almost impossible, thanks to algorithm updates that have made Google a more natural and ethical search engine focused on users and on offering quality results that match search intent.

Black Hat SEO practices are pursued by Google's webspam team, which penalizes sites that use them, even going so far as to remove an entire website from search results. Any results this technique produces are therefore completely ephemeral.

Sources consulted:

  • Cyberclick: "What is Cloaking?"

  • Luis Villanueva: "What is Cloaking?"

  • Ionos: "What is Cloaking and why should you avoid it?"

  • WeLiveSecurity: "What is Cloaking?"

  • Iebschool: "What is concealment or Cloaking SEO?"

  • Sistrix: "What is Cloaking?"

  • Catchupdates: "What is Cloaking in SEO & Should You Do Cloaking?"

  • Search Engine Journal: "What is Cloaking & Is All Cloaking Evil?"

Author: David Kaufmann


I've spent the last 10+ years completely obsessed with SEO — and honestly, I wouldn't have it any other way.

My career hit a new level when I worked as a senior SEO specialist for Chess.com — one of the top 100 most visited websites on the entire internet. Operating at that scale, across millions of pages, dozens of languages, and one of the most competitive SERPs out there, taught me things no course or certification ever could. That experience changed my perspective on what great SEO really looks like — and it became the foundation for everything I've built since.

From that experience, I founded SEO Alive — an agency for brands that are serious about organic growth. We're not here to sell dashboards and monthly reports. We're here to build strategies that actually move the needle, combining the best of classical SEO with the exciting new world of Generative Engine Optimization (GEO) — making sure your brand shows up not just in Google's blue links, but inside the AI-generated answers that ChatGPT, Perplexity, and Google AI Overviews are delivering to millions of people every single day.

And because I couldn't find a tool that handled both of those worlds properly, I built one myself — SEOcrawl, an enterprise SEO intelligence platform that brings together rankings, technical audits, backlink monitoring, crawl health, and AI brand visibility tracking all in one place. It's the platform I always wished existed.
