How to change robots.txt in a WordPress multisite

WordPress generates robots.txt dynamically. In a normal single-site installation, you can override it simply by uploading a static robots.txt file to the server. In a multisite installation, however, a static file overrides robots.txt for every site in the network, which is not always what you want. In this post, we will discuss how to change robots.txt for individual sites in a multisite network.

WordPress has a do_robots() function, and the robots_txt filter lets you change the output of the dynamically generated robots.txt file. The is_multisite() function checks whether multisite is enabled, and get_current_blog_id() returns the ID of the current site, which we can use to target a specific site and add rules to its robots.txt. This is roughly what it might look like:

function wpz_robots_txt( $output, $public ) {

	// Only adjust the output on a multisite installation.
	if ( is_multisite() ) {
		if ( get_current_blog_id() === 1 ) {
			// Site ID 1 (the online store): hide account and cart pages.
			$output .= "Disallow: /account/\n";
			$output .= "Disallow: /cart/\n";
		} else {
			// All other sites in the network: hide category and news archives.
			$output .= "Disallow: /category\n";
			$output .= "Disallow: /news\n";
		}
	}

	return $output;
}

// Priority 20 runs after WordPress appends its default rules; 2 is the number of accepted arguments.
add_filter( 'robots_txt', 'wpz_robots_txt', 20, 2 );

For the site with ID 1 (an online store in this example), we added rules that block the account and shopping cart pages; for all other sites in the network, we blocked the category and news archives.
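With the filter above in place, requesting /robots.txt on site ID 1 might produce something like the following. The exact default lines depend on your WordPress version and reading settings (this sketch assumes a recent version with search engine visibility enabled); our appended rules show up after them:

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /account/
Disallow: /cart/
```

If you still see the old output, make sure no static robots.txt file exists in the site root, since a physical file bypasses the dynamic generation and the filter entirely.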
