In a previous post, we talked about how increasing the WP Super Cache “Expire time” from 1 hour to 48 hours can improve the performance of WordPress blogs.
Here’s another tip that can help dramatically: Remove “bot”, “ia_archive”, “slurp”, “crawl”, “spider” and “Yandex” from the “Rejected User Agents” box in the WP Super Cache plugin settings. (In most cases, this will leave the box completely empty.)
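If you’d rather make the change by hand, WP Super Cache keeps its settings in wp-content/wp-cache-config.php. Here’s a minimal sketch of what the change amounts to, assuming your version of the plugin stores the list in the $cache_rejected_user_agent variable, as recent versions do (check your own config file to be sure):

```php
<?php
// Sketch of the relevant line in wp-content/wp-cache-config.php.
// Before: the default list rejects common crawler user agents, so
// visits from search engines never create cached files.
$cache_rejected_user_agent = array('bot', 'ia_archive', 'slurp', 'crawl', 'spider', 'Yandex');

// After: an empty list means every visitor, crawler or human,
// can leave a cached copy behind for the next request.
$cache_rejected_user_agent = array();
```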
Those “Rejected User Agents” prevent cached copies of pages from being created when a search engine “visits” your site. The plugin’s FAQ says this is because there’s no point in creating a cached file for a page that isn’t popular, which may be true for small sites that don’t have many posts and don’t get much traffic. The plugin’s author is wisely being conservative to avoid problems at hosting companies that store sites on NFS disks, where saving a cached file can be slow.
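To make the mechanism concrete, here’s an illustrative sketch (not the plugin’s actual code) of how a rejected user agent check typically works: a case-insensitive substring match against the visitor’s User-Agent header decides whether the page gets saved to the cache.

```php
<?php
// Illustrative sketch, not WP Super Cache's actual code: decide whether
// the current visitor's User-Agent matches any rejected entry.
function is_rejected_user_agent(array $rejected): bool
{
    $ua = $_SERVER['HTTP_USER_AGENT'] ?? '';
    foreach ($rejected as $needle) {
        // Case-insensitive substring match: "Googlebot" contains "bot".
        if ($needle !== '' && stripos($ua, $needle) !== false) {
            return true;
        }
    }
    return false;
}

// With the default list, every major crawler matches something
// ("Googlebot" -> "bot", "Yahoo! Slurp" -> "slurp"), so the pages they
// request are generated fresh each time and never saved for reuse.
```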
We don’t use NFS like that, though. And on a busy site with lots of archived posts, there’s a very good chance that a page indexed by a search engine will be reindexed, or viewed by a person, several times within the next couple of days. Allowing those pages to be cached can make a big difference. Besides, there’s no good reason not to cache a copy of the page: you should be sending search engines exactly the same page you send to actual users (serving them something different is “cloaking”, which can get a site penalized).
We removed those “Rejected User Agents” on a busy site with several thousand archived posts, with the “Expire time” set to 48 hours (172800 seconds) as we recommend, and here’s what happened to that site’s CPU usage:
[Chart: the site’s CPU usage over time, dropping after the change]
As you can see, the average load dropped by almost half as more and more pages were cached. With the same traffic, that means the server spent about half as much CPU time on each page, which is another way of saying that the average page loading speed almost doubled. Those are pretty good results for a change that took just a few seconds to make.
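For reference, here’s what the two settings from this post and the previous one look like together. Again, this is a sketch: the variable names match recent versions of WP Super Cache, and you can just as easily set both values on the plugin’s settings page instead.

```php
<?php
// The two changes side by side, as they appear in
// wp-content/wp-cache-config.php (double-check your own file):
$cache_max_time = 172800;             // "Expire time": 48 hours, in seconds
$cache_rejected_user_agent = array(); // "Rejected User Agents": empty
```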