Adding Items to the Dynamic WordPress robots.txt

July 20, 2016
Posted in: Code Snippets, How To, Web Development, WordPress

Your site’s robots.txt file tells search engine crawlers which parts of your site they may (or may not) crawl. This file lives in the root of your site, so it has a URL like ours:

https://www.machine-agency.com/robots.txt

Looking through your site’s root, however, you’ll notice that the robots.txt file doesn’t exist! No need to worry – as long as no physical robots.txt file is present, WordPress generates one dynamically on request. Here’s what’s included by default:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

So, how do we add our own custom items to the dynamically generated robots.txt? We can use a WordPress filter. Simply add this code to your theme’s functions.php file:


add_filter( 'robots_txt', 'machine_robots', 10, 2 );
function machine_robots( $output, $public ) {
  // $output holds the robots.txt content generated so far;
  // $public reflects the site's "Search engine visibility" setting.
  $output .= "Disallow: /somesecretfolder/\n";
  return $output;
}

Your dynamically generated robots.txt will now look like this:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /somesecretfolder/
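
Since every directive simply appends to $output, the same filter can carry anything else robots.txt supports. As a sketch (the folder name and sitemap path here are placeholders, not part of the original snippet), you could also advertise an XML sitemap and leave the output untouched when the site is set to discourage search engines:

add_filter( 'robots_txt', 'machine_robots_extra', 10, 2 );
function machine_robots_extra( $output, $public ) {
  // When "Discourage search engines" is enabled, $public is '0' –
  // leave WordPress's restrictive default output alone.
  if ( '0' == $public ) {
    return $output;
  }
  $output .= "Disallow: /somesecretfolder/\n";
  // Hypothetical sitemap location – swap in your real sitemap URL.
  $output .= 'Sitemap: ' . home_url( '/sitemap.xml' ) . "\n";
  return $output;
}

As with the snippet above, this goes in your theme’s functions.php (or in a small plugin, which has the advantage of surviving theme changes).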

In the end, each crawler decides for itself whether to follow your rules. Google and the other major search engines honor them, but robots.txt is advisory only – don’t rely on it to keep sensitive content private.

Scott Buckingham

President / Owner
613-801-1350 x101
[email protected]
Scott is a WordPress expert who has worked on hundreds of web design and development projects. He excels at finding creative ways to solve technical problems.