WordPress, Caching, and Choice of Themes

It’s nice that those of us who want to blog have WordPress available as a platform. It is free and open source, and it is frequently updated with new features and patches for the inevitable bugs.

But it is not built for serving more than one or two pages per second. That might not sound bad (one page per second works out to more than 86,000 pages per day), but if a high-profile blog links to your site, you could suddenly receive hundreds of page requests per second. At that point, your blog would crumble and visitors would not be able to view it.

If you use a caching plugin, however, you can serve many more pages per second, which will likely save your blog from crashing during those periods when you get a flood of requests. Caching can have downsides, though, and can even limit your choice of themes. I’ll get into that in a moment, but first I’ll share my own benchmark data.

When I started using the BasicMaths theme from Khoi Vinh, I figured it was so lightweight that WordPress could serve up several page views per second. So, I turned caching off. (I had some hints that the caching plugin might have been interfering with comments. No proof … just suspicions. So, I wanted to see how things went without caching.)

But then I came across Gruber’s article (linked above) and wondered if I could really do without caching. So, I ran a test on my server using ApacheBench, making 40 requests at a concurrency level of 4.
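The command was along these lines (the URL here is a stand-in for my home page):

    ab -n 40 -c 4 http://www.example.com/

ApacheBench reported: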

    Concurrency Level:      4
    Time taken for tests:   26.008 seconds
    Complete requests:      40
    Failed requests:        0
    Write errors:           0
    Total transferred:      1493320 bytes
    HTML transferred:       1482280 bytes
    Requests per second:    1.54 [#/sec] (mean)
    Time per request:       2600.806 [ms] (mean)
    Time per request:       650.202 [ms] (mean, across all concurrent requests)
    Transfer rate:          56.07 [Kbytes/sec] received

    Percentage of the requests served within a certain time (ms)
      50%   1736
      66%   2112
      75%   2155
      80%   2541
      90%   3010
      95%   3178
      98%  26008
      99%  26008
     100%  26008 (longest request)

I was surprised and disappointed by the 1.54 pages per second, with some requests taking 26 seconds to complete. So, I began disabling plugins and saw a little improvement, but not enough to give me confidence in my setup.

Then I installed the latest version of WP Super Cache and set it up to cache each page the first time it is viewed, and to modify my .htaccess file so that subsequent requests for cached pages bypass WordPress entirely and are served as static files (I’ll sketch the rewrite rules after the results). The improvement was significant:

    Concurrency Level:      4
    Time taken for tests:   0.168 seconds
    Complete requests:      40
    Failed requests:        0
    Write errors:           0
    Total transferred:      1501400 bytes
    HTML transferred:       1487760 bytes
    Requests per second:    238.72 [#/sec] (mean)
    Time per request:       16.756 [ms] (mean)
    Time per request:       4.189 [ms] (mean, across all concurrent requests)
    Transfer rate:          8750.26 [Kbytes/sec] received

    Percentage of the requests served within a certain time (ms)
      50%    15
      66%    18
      75%    20
      80%    21
      90%    24
      95%    33
      98%    52
      99%    52
     100%    52 (longest request)

With Super Cache, my site delivered over 200 pages per second, and no request took longer than 52 milliseconds. That’s a remarkable improvement!
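For the curious, the rules that Super Cache writes into the .htaccess file work roughly like the sketch below. This is my simplified illustration, not the plugin’s exact output (the real rules also handle query strings, mobile user agents, and compressed copies); the idea is that if a static cached copy of the requested page exists on disk, Apache serves it directly and WordPress never runs.

    # Simplified illustration of WP Super Cache-style rewrite rules;
    # not the plugin's exact output.
    RewriteEngine On
    RewriteBase /
    # Don't serve cached copies for POSTs, query strings, commenters,
    # or logged-in users.
    RewriteCond %{REQUEST_METHOD} !POST
    RewriteCond %{QUERY_STRING} ^$
    RewriteCond %{HTTP_COOKIE} !(comment_author_|wordpress_logged_in) [NC]
    # If a static cached copy exists, serve it and stop; PHP is bypassed.
    RewriteCond %{DOCUMENT_ROOT}/wp-content/cache/supercache/%{SERVER_NAME}/$1/index.html -f
    RewriteRule ^(.*)$ /wp-content/cache/supercache/%{SERVER_NAME}/$1/index.html [L]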

(As an aside: I also experimented with W3 Total Cache, but was unable to get more than 13 pages per second out of it, at least with “disk” as my choice of caching method … I may experiment more with it someday, since a lot of people swear by it.)

There is a downside, though, at least with regard to the BasicMaths design (and with some of my past theme choices, too). The BasicMaths design is set up to account for discrepancies in how different browsers render pages, which is great. (For example, when it detects that the reader is using Internet Explorer, it serves slightly different CSS than it would for Chrome, Safari, or Firefox.)

But the way it does this appears to cause a problem with caching: the caching software saves whatever version of a page was served to its first visitor. So, if the first person to read post xyz uses Firefox, the caching software saves the page with the Firefox CSS. When someone later views that cached page in Internet Explorer, parts of it may be all messed up, because Internet Explorer needed different CSS.
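To make the failure mode concrete, here is a minimal, hypothetical sketch of server-side browser detection in a theme header. This is not BasicMaths’ actual code, just the general pattern: because the stylesheet choice is baked into the HTML on the server, a whole-page cache freezes whichever branch the first visitor happened to trigger.

    <?php
    // Hypothetical theme header (NOT BasicMaths' actual code), showing
    // the general pattern of server-side browser detection.
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    if (strpos($ua, 'MSIE') !== false) {
        // If the first visitor is on IE, the IE stylesheet gets baked
        // into the cached copy that every later visitor receives.
        echo '<link rel="stylesheet" href="/css/ie.css" />';
    } else {
        // Anyone else (including a crawler) bakes in the default stylesheet.
        echo '<link rel="stylesheet" href="/css/default.css" />';
    }
    ?>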

To help you picture this, we’ll work with an example. I fired up my Windows machine (I normally use a Mac) to view the Keener Living home page in Internet Explorer. I looked at the source of the page that was served, and it contained the following line:

That is, this cached page was created when it was first viewed by an unknown browser (possibly even Googlebot), so the design just used the default CSS. The default CSS seemed to work okay except for blockquotes, which were indented a couple of hundred pixels, making the quote look really odd. Here is a snapshot:

There are ways to use supplemental stylesheets for different browsers (typically to account for Internet Explorer quirks) without causing caching problems. Paul Irish discusses the best methods in his popular post; the key is to let the browser choose the right CSS on the client side, so the server can send everyone the same cached HTML. That post and the posts it references are important reading for web designers and bloggers.
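As a rough illustration of that client-side approach (my sketch, not a snippet from Paul Irish’s post; the class names are illustrative): Internet Explorer evaluates conditional comments itself, so every browser can receive identical cached HTML and still end up with the right CSS hooks.

    <!-- IE parses these conditional comments on the client, so the same
         cached HTML works everywhere; the CSS then targets .ie6, .ie7, etc. -->
    <!--[if lt IE 7]> <html class="ie6"> <![endif]-->
    <!--[if IE 7]>    <html class="ie7"> <![endif]-->
    <!--[if IE 8]>    <html class="ie8"> <![endif]-->
    <!--[if (gte IE 9)|!(IE)]><!--> <html> <!--<![endif]-->

Because the branching happens in the visitor’s browser rather than on the server, the cache can store a single copy of each page and every browser still gets appropriate styling.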

I’m not entirely sure how I will proceed at this point. The options are to (1) keep the current design without caching and move to a dedicated server, hoping it can serve a lot more than one page per second; (2) live with the problem, even though that means some readers are occasionally going to see dorky-looking pages; (3) get a different design; or (4) move to a new platform, such as ExpressionEngine, in hopes that it can handle much higher throughput.

I really don’t want to move servers. I love Tiger Technologies and have gotten great (actually fantastic) service any time I’ve had a question. (Search Engine Land also uses TT, and the head of the webspam team at Google, Matt Cutts, uses TT for his personal blog.) And I doubt that a dedicated server would really give me the throughput assurance I’d like to have anyway, at least not without caching.

I guess I’ll go with option 2 for now. Specifically, I am going back to using Genesis and WP Super Cache, and will do a lot of testing to see how well the cached pages look in different browsers.
