[URL](https://wphostingbenchmarks.com/methodology/)
Author: [[Kevin Ohashi]]
Goal: Test performance of different WordPress hosting providers.
My thoughts on possible gotchas before reading the rest of the article
- front-end performance can also affect the perceived speed of a site but is not necessarily tied to WordPress hosting
- DevTools, Lighthouse metrics
- general UI elements should also be reviewed, because they can shape how users perceive a site's performance: progress bars and "loading" icons give a better impression even while the back-end is still processing
- What types of applications did he deploy on the hosting providers? SPAs? Database-dependent apps? Many requests all at once, or fewer polling requests more regularly?
- back-end testing
- reliability
- load
- Do they use CDNs? How many servers and where?
- Effect of geographical locations of load generators on reported response time
- different types: soak, stress, spike (see the k6 sketch after this list)
- multiple scenarios: login, checkout, download
- an empty cache should be used to simulate completely new visitors, a primed cache for repeat visitors
- Did the WP hosting providers have plans with different SLAs?
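A minimal k6 sketch of what those different load shapes could look like. The target URL, VU counts, and durations are my own illustrative assumptions, not values from the article; note also that k6 works at the protocol level and doesn't cache responses, so every VU is effectively an empty-cache visitor.

```
import http from 'k6/http'
import { sleep } from 'k6'

// Three classic load shapes expressed as k6 scenarios, run back to back.
// All numbers are placeholders, not Kevin's actual settings.
export const options = {
  scenarios: {
    // Soak: moderate, constant load held for a long time.
    soak: {
      executor: 'constant-vus',
      vus: 100,
      duration: '1h',
    },
    // Stress: keep ramping past expected capacity to find the breaking point.
    stress: {
      executor: 'ramping-vus',
      startTime: '1h',
      startVUs: 0,
      stages: [
        { duration: '10m', target: 500 },
        { duration: '10m', target: 1500 },
        { duration: '10m', target: 3000 },
      ],
    },
    // Spike: sudden jump, brief hold, sudden drop.
    spike: {
      executor: 'ramping-vus',
      startTime: '1h30m',
      startVUs: 0,
      stages: [
        { duration: '30s', target: 2000 },
        { duration: '1m', target: 2000 },
        { duration: '30s', target: 0 },
      ],
    },
  },
}

export default function () {
  http.get('https://example.com') // placeholder target
  sleep(1)
}
```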
Kevin focused on peak performance and consistency (uptime).
Tools used:
- LoadStorm
- k6
- Tools that don't impact performance:
- WebPageTest from 12 locations
- WPPerformanceTester
- SSL test
- Security tests: Internet.nl and Mozilla Observatory tests
Uptime
- 3 months
- Tools: [Hetrix Tools](https://hetrixtools.com/uptime-monitor/166399.html) and [Uptime Robot](https://uptimerobot.com/), with [PHP Server Monitor](https://www.phpservermonitor.org/) as backup.
> In some circumstances where turning on performance enhancements is very simple, e.g. clicking an option to turn on caching, this will be done.
- Caching behavior on the client side should also be examined to decide which setup is more realistic.
He turned on easy-to-find performance options, but nothing else.
He tested with permission because load testing can be a security risk without it.
WordPress 5 running the Twenty Twenty-One theme. [Dummy site](http://wordpresshostingbenchmarks.reviewsignal.com/uptime-monitoring/), identical plugins and code except where the hosts added their own.
[LoadStorm](https://loadstorm.com/)
- Why did he feel the need to use LoadStorm?
- Protocol level
- Scenario
- Homepage
- Login page
- Login
- Several pages and posts
- 30 min total: ramp up from 500 to n,000 users over 20 min, then sustain for 10 min (a k6 approximation is sketched below)
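A rough k6 equivalent of that LoadStorm scenario, under my own assumptions: the site URL, credentials, page paths, and the 2,000-VU stand-in for the plan-dependent "n,000" ceiling are all illustrative, not from the article. The `log`/`pwd` fields are the standard WordPress login form names.

```
import http from 'k6/http'
import { check, sleep } from 'k6'

const BASE = 'https://example.com' // hypothetical test site

export const options = {
  scenarios: {
    loadstorm_like: {
      executor: 'ramping-vus',
      startVUs: 500,
      stages: [
        { duration: '20m', target: 2000 }, // ramp-up
        { duration: '10m', target: 2000 }, // sustain
      ],
    },
  },
}

export default function () {
  // 1. Homepage
  let res = http.get(`${BASE}/`)
  check(res, { 'homepage is 200': (r) => r.status === 200 })
  sleep(1)

  // 2. Login page (also puts the WordPress test cookie in this VU's jar)
  http.get(`${BASE}/wp-login.php`)
  sleep(1)

  // 3. Login (log/pwd are the standard WordPress form field names)
  http.post(`${BASE}/wp-login.php`, {
    log: 'testuser',
    pwd: 'testpass',
    'wp-submit': 'Log In',
  })
  sleep(1)

  // 4. Several pages and posts (placeholder paths)
  for (const path of ['/sample-page/', '/?p=1']) {
    http.get(`${BASE}${path}`)
    sleep(1)
  }
}
```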
k6
- ramp up from 1 to n,000 users over 15 min
```
import { sleep } from 'k6'
import { Rate } from 'k6/metrics'
import http from 'k6/http'

// let's collect all errors in one metric
let errorRate = new Rate('error_rate')

// See https://k6.io/docs/using-k6/options
export let options = {
  batch: 1,
  throw: true,
  stages: [
    { duration: '15m', target: 1000 },
  ],
  ext: {
    loadimpact: {
      distribution: {
        Virginia: { loadZone: 'amazon:us:ashburn', percent: 10 },
        London: { loadZone: 'amazon:gb:london', percent: 10 },
        Frankfurt: { loadZone: 'amazon:de:frankfurt', percent: 10 },
        Oregon: { loadZone: 'amazon:us:portland', percent: 10 },
        Ohio: { loadZone: 'amazon:us:columbus', percent: 10 },
        Tokyo: { loadZone: 'amazon:jp:tokyo', percent: 10 },
        Sydney: { loadZone: 'amazon:au:sydney', percent: 10 },
        Mumbai: { loadZone: 'amazon:in:mumbai', percent: 10 },
        Singapore: { loadZone: 'amazon:sg:singapore', percent: 10 },
        Brazil: { loadZone: 'amazon:br:sao paulo', percent: 10 },
      },
    },
  },
}

export default function () {
  let params = {
    headers: { 'X-CustomHeader': '1' },
  }
  let res = http.get('https://example.com', params)
  errorRate.add(res.status >= 400)
  sleep(1)
}
```
My thoughts on the script
- doesn't fetch embedded resources (CSS, JS, images), which often make up the bulk of a real page load; see the sketch after this list
- Think time should be dynamic
- multiple scenarios would have been good, like what he did for LoadStorm.
- HTTP 4xx and 5xx errors are now automatically counted as failures by default (k6's http_req_failed metric), and you can set a response callback to change this behavior.
- Ramping up throughout the whole test is a little odd - no steady state?
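A rough sketch of how the first, second, and fourth points could be addressed in one script. The resource paths and timings are my own assumptions for illustration, not from the benchmark sites:

```
import http from 'k6/http'
import { sleep } from 'k6'

// Count anything outside 200-399 as a failure, mirroring what newer k6
// versions do by default via the http_req_failed metric.
http.setResponseCallback(http.expectedStatuses({ min: 200, max: 399 }))

export default function () {
  http.get('https://example.com') // placeholder target

  // Fetch a few embedded resources in parallel, roughly as a browser would.
  // These paths are illustrative guesses at typical WordPress assets.
  http.batch([
    ['GET', 'https://example.com/wp-includes/css/dist/block-library/style.min.css'],
    ['GET', 'https://example.com/wp-content/themes/twentytwentyone/style.css'],
  ])

  // Dynamic think time: 1-5 s instead of a fixed sleep(1).
  sleep(1 + Math.random() * 4)
}
```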
What did he use to graph the results?
> Kevin Ohashi is the geek-in-charge at Review Signal. He is passionate about making data meaningful for consumers. Kevin is based in Washington, DC.
Winners
- CynderHost
- WPX
- 20i