
Your website has been published and can be accessed on the internet. But how do you know whether it can be indexed by search engines like Google, Yahoo, Bing, or Yandex?

In some cases, this will depend on the technology your website uses.

If you are using a popular CMS like WordPress, Drupal, or Joomla, your site might be indexed by search engines automatically.

Sometimes, it is necessary to prevent certain pages on a site from being indexed by a search engine.

There are several methods you can use to keep a website, or specific pages on it, out of search engine indexes.

Here are the most common ones:

Using robots.txt

In your website's root directory, edit or create a file named robots.txt and add this code:

User-agent: *
Disallow: /
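
The rule above asks every crawler to stay away from the entire site. If you only want to keep crawlers out of part of it, you can disallow specific paths instead; here is a minimal sketch assuming a hypothetical /private/ directory:

User-agent: *
Disallow: /private/

Keep in mind that robots.txt only asks crawlers not to crawl those paths; it does not guarantee that a URL will never appear in search results.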

Using .htaccess

Go to the File Manager in cPanel and open your domain's root directory. Edit or create a file named .htaccess and add this code:

<IfModule mod_headers.c>
	# Tell crawlers not to index, follow links from, or archive any page
	Header set X-Robots-Tag "noindex, nofollow, noarchive"
	# For documents and images, also forbid showing a snippet
	<FilesMatch "\.(doc|pdf|png|jpe?g|gif)$">
		Header set X-Robots-Tag "noindex, noarchive, nosnippet"
	</FilesMatch>
</IfModule>

Using the robots meta tag

Another alternative is to add a robots meta tag inside the head tag of your page.

<meta name="robots" content="noindex,nofollow">

Which of these methods to use depends on your situation.

The .htaccess method is powerful when you want to stop robots from indexing the entire website. But if you only need to exclude a specific page, you can manually add a meta tag to that page.
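
You can also combine the two ideas and send the noindex header from .htaccess for a single file only, which is handy for files like PDFs where you cannot add a meta tag. Here is a small sketch with a hypothetical filename:

<IfModule mod_headers.c>
	# Send the noindex header only for this one file (hypothetical name)
	<Files "example-report.pdf">
		Header set X-Robots-Tag "noindex, noarchive"
	</Files>
</IfModule>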


Let me know your thoughts, or if you find a mistake or outdated content in this post.
