When should you use a robots.txt file?

(A) When you have multiple versions of a page to indicate the preferred version

(B) When your website receives a penalty from Google

(C) When you have pages that you don’t want search engines to crawl and index

(D) Whenever you feel like it


The correct answer is (C). You should use a robots.txt file when you have pages that you don’t want search engine bots to crawl. Note that robots.txt controls crawling, not indexing: a blocked URL can still appear in search results if other sites link to it, so to reliably keep a page out of the index you should allow crawling and use a noindex meta tag instead.
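As an illustration, a minimal robots.txt file (served from the site root, e.g. example.com/robots.txt) that blocks all crawlers from a hypothetical /private/ directory while leaving the rest of the site crawlable might look like this:

```text
# Applies to all crawlers
User-agent: *
# Block the /private/ directory (hypothetical path for this example)
Disallow: /private/
# Everything else remains crawlable
Allow: /

# Optional: point crawlers to the XML sitemap
Sitemap: https://example.com/sitemap.xml
```

Crawlers that honor the Robots Exclusion Protocol will skip URLs matching the Disallow rule; the file is advisory, not an access control.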


This question is from the HubSpot SEO Certification Exam. You can get all the answers to the questions asked in this exam on our HubSpot SEO Certification Exam Answers page.
