Your site’s robots.txt file allows you to tell search engines how you’d like them to crawl (or not crawl) parts of your site. This file lives in the root of your site, so it’ll have a URL like ours:
https://www.machine-agency.com/robots.txt
Looking through your site’s root, however, you’ll notice that the robots.txt file doesn’t exist! No need to worry – it’s missing because it’s being dynamically generated by WordPress. Here’s what’s included by default:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
So, how do we add our own custom rules to the dynamically generated robots.txt? We can use the robots_txt WordPress filter. Simply add this code to your theme’s functions.php file:
add_filter( 'robots_txt', 'machine_robots', 10, 2 );
function machine_robots( $output, $public ) {
    // Append our custom rule to the generated robots.txt content.
    $output .= "Disallow: /somesecretfolder/\n";
    return $output;
}
Your robots.txt file will now look like:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /somesecretfolder/
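You can append as many lines as you like from the same filter. Here’s a hedged sketch that adds several directives at once – the extra folder name and the sitemap URL below are hypothetical placeholders, not part of the original example. It also checks the filter’s second argument, $public, which WordPress passes as the site’s “discourage search engines” setting, so the rules are only added when the site is meant to be crawled:

```php
<?php
// Sketch: append multiple rules via the robots_txt filter.
// "/drafts/" and the sitemap URL are hypothetical placeholders.
function machine_robots_extra( $output, $public ) {
    // $public is falsy when the site discourages search engines;
    // in that case, leave the generated output untouched.
    if ( $public ) {
        $output .= "Disallow: /somesecretfolder/\n";
        $output .= "Disallow: /drafts/\n";
        $output .= "Sitemap: https://www.machine-agency.com/sitemap.xml\n";
    }
    return $output;
}

// Guarded so the snippet also runs outside WordPress; inside a theme's
// functions.php you would call add_filter() directly.
if ( function_exists( 'add_filter' ) ) {
    add_filter( 'robots_txt', 'machine_robots_extra', 10, 2 );
}
```

Listing a Sitemap line this way is a common convention: it points crawlers at your sitemap without you having to maintain a static robots.txt file.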
In the end, it’s up to each search engine to decide whether to follow your rules, but Google and the other major crawlers respect them quite well.