What is a robots.txt file and what is it used for?

What is a robots.txt file?

A robots.txt file implements the robots exclusion standard (also known as the robots exclusion protocol), a convention websites use to communicate with web crawlers and other web robots. The file tells a web robot which areas of the website should not be processed or crawled.
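For example, a minimal robots.txt file, served from the root of the site (e.g. example.com/robots.txt), might look like this. The paths shown are purely illustrative:

```text
# Rules for all crawlers
User-agent: *
Disallow: /internal-search/
Disallow: /private/

# Rules for a specific crawler
User-agent: Googlebot
Disallow: /drafts/
```

Each `User-agent` line names the crawler a group of rules applies to (`*` matches all), and each `Disallow` line lists a URL path prefix that crawler should not fetch.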

What is it used for?

  1. It is used when you want crawlers to skip duplicate pages on your website
  2. It is used when you don’t want search engines to index your internal search results pages
  3. It is used when you don’t want search engines to index certain areas of your website or a whole website
  4. It is used when you don’t want search engines to index certain files on your website (images, PDFs, etc.)
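The effect of these rules can be checked programmatically. This is a small sketch using Python's standard-library `urllib.robotparser`; the rules and URLs below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules blocking internal search results
rules = """
User-agent: *
Disallow: /internal-search/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A blocked path: crawlers honoring robots.txt will not fetch it
print(parser.can_fetch("*", "https://example.com/internal-search/results"))

# An unblocked path: crawlers may fetch it
print(parser.can_fetch("*", "https://example.com/blog/post"))
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism, so it should not be relied on to hide sensitive content.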

