What is Robots.txt

A robots.txt file implements the Robots Exclusion Protocol, a standard used to ask web crawlers and robots not to access all or part of a website. It lets you indicate which web crawlers and robots should access your site and, where the crawler supports it, how often they do so. 
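For example, the non-standard Crawl-delay directive asks a crawler to pause between requests. Support varies: some crawlers (such as Bing's) honour it, while others (such as Google's) ignore it, so treat it as a polite suggestion rather than a guarantee.

# Ask all robots to wait 10 seconds between requests
User-agent: *
Crawl-delay: 10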

It’s important to understand that robots.txt does not actually block robots from scanning your site; it is simply a request not to visit certain pages, which well-behaved crawlers honour and badly-behaved ones can ignore. Also, because this file is publicly available, anyone can see which sections of the site you don’t want robots to access. 
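You can check this yourself: the file always sits at the root of a domain, so it can be fetched directly in a browser or from the command line (example.com below stands in for any site you want to inspect):

curl https://www.example.com/robots.txt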

A robots.txt file can be created in a plain-text editor such as Notepad (Windows) or TextEdit (Mac). You can find a few examples below of how best to use it. 

To allow all robots complete access: 

User-agent: *
Disallow:

To exclude all robots from the entire server: 

User-agent: *
Disallow: /

Lastly, to allow a single robot (here Google's crawler, Googlebot) while excluding all others: 

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
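A common variation on the above is to allow every robot in but keep a single folder out of search results. Here /private/ is just a placeholder for whichever directory you want to exclude:

User-agent: *
Disallow: /private/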

Once you have created your file, upload it to the root directory of your website via FTP or your File Manager. Details of how to do this can be found at the links below: 

How to upload files to your website with FTP 

How to upload files using File Manager
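If you are comfortable with the command line, the same upload can be done with an SFTP client. The sketch below is illustrative only: the hostname, username and remote folder (public_html is a common default, but yours may differ) are placeholders for your own hosting details.

sftp username@example.com
sftp> put robots.txt public_html/robots.txt
sftp> exit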

That’s it!  
