WordPress generates robots.txt dynamically. On a normal, single-site installation you can override it simply by uploading a static robots.txt file to the server. On a multisite installation, however, a static file replaces robots.txt for every site in the network, which is not always what you want. In this post, we will look at how to customize robots.txt for individual sites in a multisite network.
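For reference, the dynamically generated file on a default, public installation looks roughly like this (the exact rules vary by WordPress version; recent versions also append a Sitemap line):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php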
WordPress has a do_robots() function, which outputs the dynamically generated robots.txt, and a robots_txt filter that lets you modify that output. The is_multisite() function checks whether multisite is enabled, and get_current_blog_id() returns the ID of the current site, which we can use to target a specific site and add rules to its robots.txt. Roughly, it might look like this:
// Append per-site rules to the dynamically generated robots.txt.
function wpz_robots_txt( $output, $public ) {
	if ( is_multisite() ) {
		if ( get_current_blog_id() === 1 ) {
			// Site ID 1 (the online store): block the account and cart pages.
			$output .= "Disallow: /account/\n";
			$output .= "Disallow: /cart/\n";
		} else {
			// All other sites: block the category and news archives.
			$output .= "Disallow: /category\n";
			$output .= "Disallow: /news\n";
		}
	}
	return $output;
}
add_filter( 'robots_txt', 'wpz_robots_txt', 20, 2 );
For the site with ID 1 (the online store) we added rules that block the account and cart pages, and for all other sites we blocked the category and news archives.
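With this filter in place, the store site's robots.txt would end up looking roughly like this on a recent, public installation, with WordPress's own rules first and ours appended at the end:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /account/
Disallow: /cart/

Since this affects the whole network, the snippet is best placed somewhere that loads on every site, for example a must-use plugin or a network-activated plugin, rather than in an individual site's theme.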