
Is a static robots.txt file good for performance?

This article is based on a recent tweet by Konstantin Kovshenin:

Is a static robots.txt file good for WordPress performance?

To display the default robots.txt, a fresh WordPress install will:

Run 15 SQL queries: load all options, can_compress_scripts and WPLANG; query the last 10 posts; query the recent posts widget options; query the terms, taxonomies, etc. for the found posts, plus all post metadata for the found posts; plus queries for the recent comments widget, the recent entries widget, and a few others.

Call file_get_contents() 61 times to register some core Gutenberg blocks; that's in addition to a json_decode() of each file and about 100 calls to file_exists() on those files.

Register 22 post types, 24 post statuses, 22 sidebar widgets, 12 taxonomies, 7 block pattern categories, 10 block styles, and 3 image sizes.

Call gettext translations 1917 times for regular strings and 875 times for strings with context. With the default locale there is zero overhead; either way, none of those strings are used in robots.txt.

Check whether the front page has been set as a static page, and whether the request is_front_page() or is_home(). Also, is_single(), is_feed(), is_admin(), is_category(), is_search(), and the list goes on.

Check whether the user is logged in (14 times), whether an admin bar needs to be displayed, and what the heartbeat settings are. It will also attempt to read the user session and create 3 nonces.
Reminder: this is an anonymous request.

Escape some HTML 78 times.
Reminder: robots.txt is a text/plain file; there is no HTML. It will also check whether the admin needs to be forced to SSL, initialise smilies, and set up Twenty Twenty-One "dark mode".

Run 83 unique actions (one of them is do_robotstxt) and apply 530 unique filters (one of them is robots_txt; see the sketch below).

All combined, that's more than 42,000 function calls at 5.46 megabytes of peak memory and about 100 ms of wall time. So yes, by all means, please use a static robots.txt file.
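These numbers are easy to reproduce on your own install. A minimal sketch, assuming SAVEQUERIES is defined in wp-config.php and using a hypothetical must-use plugin file (wp-content/mu-plugins/profile-requests.php); timer_stop() reads the timer WordPress starts at the top of every request:

// In wp-config.php: make $wpdb keep a log of every query it runs.
// define( 'SAVEQUERIES', true );

// wp-content/mu-plugins/profile-requests.php (hypothetical file name)
add_action( 'shutdown', function () {
    global $wpdb;
    error_log( sprintf(
        '%s: %d queries, %.2f MB peak, %.1f ms',
        $_SERVER['REQUEST_URI'] ?? '',
        is_array( $wpdb->queries ) ? count( $wpdb->queries ) : 0,
        memory_get_peak_usage() / 1048576,
        timer_stop( 0, 6 ) * 1000
    ) );
}, PHP_INT_MAX );

Request /robots.txt with that in place and the log line shows the cost of exactly the code path described above.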
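The do_robotstxt action and robots_txt filter mentioned above are the supported way to customise the dynamic output, and every callback attached to them pays this full start-up cost. A sketch of the filter, whose second argument carries the blog_public option (the Sitemap line is a hypothetical addition; the core wp-sitemap.xml index exists since WordPress 5.5):

add_filter( 'robots_txt', function ( $output, $public ) {
    if ( $public ) {
        // Hypothetical example: advertise the core sitemap index.
        $output .= "\nSitemap: " . home_url( '/wp-sitemap.xml' ) . "\n";
    }
    return $output;
}, 10, 2 );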

This behaviour has previously been discussed in two WordPress Trac tickets:

robots.txt requests cause WordPress to query the database for all the homepage posts. It’d be good if there was perhaps a query var to short circuit the internal querying and instead perhaps fake a page template… Or probably better to just offer an appropriate hook.

On a robots request, WordPress appears to make a SQL request for the same as it would on a Home index page. Ideally, WordPress should not query the database for posts on robots.txt requests.
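Neither suggestion has made it into core (hence the next section), but a plugin can approximate the idea today. A hedged sketch: this does not stop WordPress from booting, it merely stops the main query from fetching the homepage posts on robots.txt requests:

add_action( 'pre_get_posts', function ( $query ) {
    if ( $query->is_main_query() && $query->is_robots() ) {
        $query->set( 'post__in', array( 0 ) ); // match no posts at all
        $query->set( 'no_found_rows', true );  // skip the found-rows count too
    }
} );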

What is the solution?

Currently, on the WordPress side, none. However, as part of our Supernova theme package, we create a static robots.txt file in order to bypass all of the work above.
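One way to produce such a file is to capture WordPress's own dynamic output once and write it to disk. A minimal sketch, not the Supernova implementation, with generate-robots.php as a hypothetical one-off script in the WordPress root:

// generate-robots.php (run once with: php generate-robots.php)
require __DIR__ . '/wp-load.php'; // boot WordPress one final time

ob_start();
do_robots();                      // the core function behind /robots.txt
$body = ob_get_clean();

file_put_contents( __DIR__ . '/robots.txt', $body );
echo 'Wrote ' . strlen( $body ) . " bytes to robots.txt\n";

Once a physical robots.txt sits in the document root, the standard rewrite rules (which only hand a request to PHP when no matching file exists) serve it directly, and none of the work listed above runs.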

Quick tip: do not use a redirection plugin; set your redirects up directly on the server, where, like a static robots.txt, they are handled before WordPress ever loads.


