How to Restrict Google from Indexing Your Web Pages
Robots.txt file: The robots.txt file acts as a guide to your site, telling crawlers what to index and what to skip. Parts of your site that you don't want crawled by search engines — for example admin pages, add-to-cart pages, and other private or personal pages where you don't need direct traffic — can be restricted in this file. In it, you give instructions about which folders and files are off limits. Crawlers normally identify themselves with a user-agent string, which is why robots.txt rules are grouped under a "User-agent" line.
You may want to exclude some of your web pages from Google because they are not useful to users, so you don't want them visible in search results. The robots.txt file lists these restricted pages, which should not serve as landing pages and whose paths should not appear in search engines. To set this up, create a robots.txt file, upload it to the root of your server or hosting account, and then check the rules with the robots.txt testing tool in Google Search Console (formerly Google Webmaster Tools). Typical uses include:
• Keeping crawlers out of your site's internal pages.
• Preventing pages like naeemrajani.com/proxy/google/com from being indexed, since they are direct copies of other sites.
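As a sketch of the cases above, the following shows a minimal robots.txt (the paths are hypothetical examples, not required names) and uses Python's standard-library `urllib.robotparser` to check how a compliant crawler would interpret the rules:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt blocking admin, cart, and proxy pages.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/
Disallow: /proxy/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Disallowed paths are reported as not fetchable for any crawler ("*").
print(parser.can_fetch("*", "https://example.com/admin/login"))      # False
print(parser.can_fetch("*", "https://example.com/proxy/google/com")) # False

# Ordinary content pages remain crawlable.
print(parser.can_fetch("*", "https://example.com/blog/some-post"))   # True
```

The same parser is what well-behaved crawlers consult before fetching a URL, so it is a convenient way to sanity-check your rules before uploading the file.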
A More Secure Way to Keep Sensitive URLs Out of Search Engines
Robots.txt is not a 100% effective way to block confidential or sensitive material. It only tells crawlers that certain pages should not be indexed; it does not stop your server from delivering those pages to a browser. Search engines sometimes still index a blocked URL, showing the bare URL without any title or snippet.
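The blocking tag in question is the robots meta tag, placed in the `<head>` of each page you want kept out of the index; a minimal sketch:

```html
<!-- Inside <head>: tells compliant crawlers not to index this page -->
<meta name="robots" content="noindex">
```

For non-HTML files such as PDFs, the same effect can be achieved with an `X-Robots-Tag: noindex` HTTP response header. Note that crawlers must be able to fetch the page in order to see the tag, so a page using `noindex` should not also be blocked in robots.txt.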
In this situation, use the noindex robots meta tag to keep those URLs out of search results. But since any user can still reach the pages directly, don't rely on it alone: use a real security method, such as password protection, to block access to those pages properly.
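As one example of such a security method — assuming an Apache server; the file paths here are assumptions for illustration — HTTP Basic Authentication can be enabled from an `.htaccess` file in the directory you want to protect:

```apacheconf
# .htaccess — the server refuses to deliver anything in this directory without a login
AuthType Basic
AuthName "Restricted Area"
# Path is an assumption; create the file with: htpasswd -c /home/user/.htpasswd username
AuthUserFile /home/user/.htpasswd
Require valid-user
```

Because the server now demands credentials before serving the pages at all, search engines cannot read or index their content, which is stronger than anything robots.txt or a meta tag can guarantee.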