Create a robots.txt file to control search engine crawlers, with simple checkboxes or full manual control.
Optional: Your domain for the Host directive
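If you fill this in, the generator would emit a line like the sketch below, where example.com stands in for your own domain. Note that Host is a non-standard directive: historically only Yandex honored it, and most crawlers, including Googlebot, ignore it.

Host: example.com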
Add custom folders:
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /wp-admin/
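As a sketch, adding a custom folder named private (a hypothetical name) would append one more rule to the block above:

Disallow: /private/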
A plain text file that tells search engine crawlers which pages of a site they are allowed to visit.
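Rules can also be scoped to a single crawler by naming it in the User-agent line. A minimal sketch, assuming you want to keep Googlebot (Google's crawler) out of a hypothetical /search/ path while leaving the site open to all other bots:

User-agent: Googlebot
Disallow: /search/

User-agent: *
Allow: /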
Malicious bots can simply ignore robots.txt; for real protection, use authentication.
Download the file and place it in the root directory of your domain so it is reachable at example.com/robots.txt.