How to block all website crawlers using a robots.txt file

What is a robots.txt file?

Robots.txt is a text file located in the site's root directory that tells search engine crawlers and spiders which pages and files on your website they may or may not visit.
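For example, a robots.txt file can allow most crawling while blocking specific areas of the site. The directory names below are hypothetical, for illustration only:

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```

Here, the `User-agent: *` line applies the rules to all crawlers, and each `Disallow` line excludes one path from crawling.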

Follow the steps below to set up a robots.txt file and block web crawlers:

Step 1: Log in to cPanel.
Step 2: Open File Manager and go to the root directory of your website.
Step 3: The robots.txt file should be in the same location as your website's index file. If it does not exist, create a new file named robots.txt there.
Step 4: Right-click the robots.txt file and choose Code Editor to edit it.

User-agent: *
Disallow: /
Step 5: Add the lines of code provided above to the file and click 'Save' to save your changes. `User-agent: *` applies the rule to every crawler, and `Disallow: /` blocks access to the entire site.
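You can confirm what these two directives do with Python's standard-library robots.txt parser. This is a local sketch using the example domain `example.com`; it parses the same rules the article adds and checks whether a crawler such as Googlebot would be allowed to fetch any page:

```python
import urllib.robotparser

# The same rules the article adds to robots.txt.
rules = """User-agent: *
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# With "Disallow: /" under "User-agent: *", every crawler is
# blocked from every path on the site.
print(parser.can_fetch("Googlebot", "https://example.com/"))           # False
print(parser.can_fetch("Googlebot", "https://example.com/page.html"))  # False
```

To check the live file after saving, you can also fetch `https://yourdomain.com/robots.txt` in a browser and verify it shows the two lines above.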

Click here to view this article as an Oryon knowledge base video tutorial playlist.
