As much as I love WordPress, there are some fundamental flaws in how it works. For the average user, WordPress out of the box will do everything you want and can run on affordable hardware. For the project I have been working on, scaling considerations have reached the code level.
In my use case, I needed to speed up some WordPress REST API endpoints. Between the heft of core, the weight of plugins, and the size of the database, some queries were slower than they would be on your average site, so I had to look for a solution that functioned like a cache.
Reading and writing JSON files
Instead of expensive database reads and writes for what is simple data, I explored using JSON files hosted on Amazon S3. I knew that calling Amazon S3 could introduce its own latency into the application, but it could still be faster than the combined weight of core, plugins, and the database.
The code below defines the REST API endpoint. It takes an array of post IDs and a blog ID (this is for a WordPress Multisite installation) and saves their position (it’s a story queue).
// Route registration (normally hooked to rest_api_init).
register_rest_route( 'queue/v1', '/update/(?P<name>[a-zA-Z0-9-]+)', array(
    'methods'             => 'POST',
    'callback'            => 'rest_set_queue_positions',
    'permission_callback' => '__return_true',
) );

function rest_set_queue_positions( WP_REST_Request $request ) {
    $body     = json_decode( $request->get_body() );
    $category = $request->get_param( 'name' );
    $ids      = (array) $body->ids;
    $siteId   = (int) $body->siteId;

    $file_name = "{$siteId}-{$category}";

    $options_order = [];
    foreach ( $ids as $id ) {
        $options_order[] = [
            'id'      => (int) $id->id,
            'blog_id' => (int) $id->blog_id,
        ];
    }

    write_to_json_file_on_s3( $file_name, json_encode( $options_order ) );

    return new WP_REST_Response( $options_order, 200 );
}
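To make the expected payload concrete, here is a minimal sketch of calling the endpoint with wp_remote_post. The site URL, queue name, and IDs below are placeholders for illustration, not values from the real project:

// Hypothetical example: push a new order for the "homepage" queue on blog 2.
$response = wp_remote_post(
    'https://example.com/wp-json/queue/v1/update/homepage',
    array(
        'headers' => array( 'Content-Type' => 'application/json' ),
        'body'    => json_encode( array(
            'siteId' => 2,
            'ids'    => array(
                array( 'id' => 101, 'blog_id' => 2 ),
                array( 'id' => 87,  'blog_id' => 2 ),
            ),
        ) ),
    )
);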
In the endpoint callback above, we call a function named write_to_json_file_on_s3 that takes our JSON-encoded array of values and saves it to a JSON file:
use Aws\Credentials\Credentials;
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

function write_to_json_file_on_s3( $name, $body ) {
    $credentials = new Credentials( 'key', 'key' );

    $s3 = new S3Client( [
        'region'      => 'your-region',
        'version'     => 'latest',
        'credentials' => $credentials,
        'debug'       => false,
    ] );

    $filename = "queues/{$name}.json";

    try {
        $s3->putObject( array(
            'Bucket' => 'your-bucket-name',
            'Key'    => $filename,
            'Body'   => $body,
        ) );
    } catch ( S3Exception $e ) {
        echo $e->getMessage() . "\n";
    }
}
Using the AWS SDK for PHP, installed via Composer, we create an instance of S3Client to interface with Amazon S3, storing our JSON file with the putObject method.
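If you are following along, the sketch below shows roughly how the SDK becomes available to the code above; the exact path depends on where your plugin or theme keeps its Composer files:

// Installed with: composer require aws/aws-sdk-php
// Load Composer's autoloader so S3Client and the other SDK classes resolve.
require_once __DIR__ . '/vendor/autoload.php';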
Similarly, we create a function that reads these queues back from Amazon S3:
function get_queue_from_s3( $name, $siteId ) {
    $url = "https://my-bucket.s3.my-region.amazonaws.com/queues/{$siteId}-{$name}.json";

    $response = wp_remote_get( $url );

    if ( is_wp_error( $response ) ) {
        return $response;
    }

    $body = wp_remote_retrieve_body( $response );

    return json_decode( $body, true );
}
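How that reader gets exposed is up to you. As one possibility, here is a minimal sketch of wrapping it in its own GET route; the route name, the siteId query parameter, and the error handling are assumptions for illustration, not part of the original code:

// Hypothetical companion route, e.g. GET /wp-json/queue/v1/get/homepage?siteId=2
// (normally hooked to rest_api_init).
register_rest_route( 'queue/v1', '/get/(?P<name>[a-zA-Z0-9-]+)', array(
    'methods'             => 'GET',
    'callback'            => 'rest_get_queue_positions',
    'permission_callback' => '__return_true',
) );

function rest_get_queue_positions( WP_REST_Request $request ) {
    $name   = $request->get_param( 'name' );
    $siteId = (int) $request->get_param( 'siteId' );

    $queue = get_queue_from_s3( $name, $siteId );

    if ( is_wp_error( $queue ) ) {
        return new WP_REST_Response( array( 'error' => $queue->get_error_message() ), 502 );
    }

    return new WP_REST_Response( $queue, 200 );
}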
Using JSON files on Amazon S3, requests went from taking upwards of four seconds to under one second. There may have been other avenues to explore for reducing those long requests, but this solution was the easiest, and it also allows the saved content to be accessed by applications outside the WordPress site.