robots.txt is a plain text file that tells search engine robots which pages of our site they may visit and which they should skip. Search engines give importance to robots.txt and act according to it. robots.txt is not a way of preventing search engines from crawling our site; it is just a polite request: "boss, please do not visit this page. Only visit the pages I listed." We have to place this robots.txt file in the root of our site, ex: domain.com/robots.txt.
How to create a robots.txt file:
Open Notepad (or any plain-text editor) and type the directives described below. Note that robots.txt is plain text, not HTML.
User-agent names the search engine crawler a rule applies to, and Disallow lists the files and directories to be excluded from indexing.
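As an illustration, here is a minimal robots.txt (the paths /admin/ and /private/ are made-up examples, not required names):

```
User-agent: *
Disallow: /admin/
Disallow: /private/
```

The asterisk in User-agent means the rules apply to all crawlers; an empty Disallow line would mean nothing is blocked.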
Enter the page names, save the file as robots.txt, and upload it to the root of your site. That's it. Hope this helped you learn about robots.txt. If anything is missing, please mention it in the comments.
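To see how a crawler interprets these rules, you can test them with Python's standard urllib.robotparser module. This is just a sketch; the domain and paths are hypothetical:

```python
from urllib import robotparser

# Sample robots.txt rules (hypothetical paths, for illustration only)
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A disallowed directory is reported as not fetchable
print(rp.can_fetch("*", "https://domain.com/private/data.html"))  # False
# Anything not listed under Disallow remains fetchable
print(rp.can_fetch("*", "https://domain.com/index.html"))         # True
```

Polite crawlers run exactly this kind of check before requesting a page; remember, though, that robots.txt is only a request and cannot actually block access.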