Thoughts On Svelte.

The hype surrounding Svelte right now is inescapable. Every blog post comment section or Twitter thread/hot take seems to elicit a response about Svelte.

If you are not familiar, Svelte is a Javascript library which leverages a compiler to turn your Svelte code into plain old Javascript and HTML. You write your applications inside of .svelte files and they get compiled to something that has no runtime.

This puts Svelte into a similar league alongside Elm, Imba and a few others, not directly in line with React or Vue. However, being compared to React or Vue seems to be unavoidable in 2019 and will continue to be the case into 2020 and beyond.

In many ways, Svelte is an indirect competitor to the likes of React and Vue, both options which like to tout their small bundle and application sizes. On that front, they can’t compete with Svelte.

Where Svelte differs from options like React and Vue is that it has no virtual DOM, no runtime, and nothing else non-standard after it is built. Syntactically, your applications end up looking like a mix of old-school AngularJS and Vue syntax, with a little sprinkling of React's JSX thrown in:

<button on:click={handleClick}>Click me</button>

A lot of the examples you will see highlighting Svelte's simplicity are not indicative of real-world applications. This rings true for examples for many frameworks and libraries, which show as little code as possible. You don't want to scare away users.

Where things start to look less like HTML and Javascript is constructs such as conditionals and loops. The following is an example of iterating over an array of users.

    {#each users as user}
        <li>{user.name}: {user.email}</li>
    {/each}

If you have ever worked with Handlebars templating in Javascript before, then this syntax will take you back to the mid-2000s right away. This is one example of a few other uses which also resemble Handlebars in Svelte.

Syntax aside, this is by no means a large criticism of Svelte. Every framework and library deals with looping over collections differently, except maybe React, where the community mostly uses map to loop over items in collections.

In React’s JSX you will find the following approach is the most widely used:

      {users.map((user, index) => {
        return <li key={index}>{user.name}</li>
      })}

I actually would have loved to see Svelte adopt Aurelia’s approach to syntax instead of going down the path of Vue-like syntax and throwing in that Handlebars syntax.

In Svelte binding the value of a text input looks like this:

<input bind:value={name} placeholder="enter your name">

And in Aurelia, binding a text input value looks like this:

<input value.bind="name" placeholder="enter your name">

I realise my Aurelia bias is starting to show here, but in my opinion, the Aurelia approach looks a heck of a lot nicer and more JS-like than the Vue-style bind:value approach. Not having to type a colon is a huge plus and it just looks neater.

Anyway, moving on. We are nitpicking here.

It is Fast.

There is no denying that Svelte is fast. The lack of runtime is the contributing factor here. The closer you are to the bare metal of the browser, the faster things will be.

The truth is, all frameworks and libraries are fast. When it comes to speed and performance, the contributing factor is rarely the framework or library itself, it is the user code.

Start pulling in various Node packages like Moment, adding in features such as validation and routing, and your bundle size is going to grow significantly. The end result might be that the framework or library itself (even one with a runtime) accounts for only 10% of your overall application size.

This is why I always tell people to be wary of benchmarks. Sure, benchmarks might look impressive, but they are not indicative of real-world conditions where things like latency, bundle size, what libraries you are using, and how you write your code are really the determining factors.

I think considerations to how a framework or library lets you author components and write code, what its features are, and what it allows you to easily and not easily do are more important than its speed.

To put things into context, there are still many AngularJS 1.x applications out there in production, which are still working fine. I also know of many Durandal, Knockout and Backbone applications still being used which are also working fine.

The generated code I have seen from Svelte applications is surprisingly readable as well. Usually compiled code is not easy to read (for humans) at all, so I was really surprised.

Svelte Exposes The True Complexity of React

For years, React has hidden behind the claim that it is the V in MVC: a simple and humble view library. Anyone who has ever worked with React on an actual application will tell you that you never need just the view part.

I cannot recall a time where I have ever built a web application that didn’t have at least:

  • Routing (the ability to define routes to different screens)
  • The need to work with data from an API
  • Form validation (not always, but more often than not)

If you want to add these features to a React app, you have to glue them all together. And because React utilises a Virtual DOM, you cannot simply drop in any library that touches the DOM.

The problem with React itself (without turning this into a post bashing React), is that it is too heavily invested into itself. It is also responsible for perpetuating FUD in the front-end ecosystem on quite a few fronts.

React popularising the Virtual DOM (and, later on, Vue adopting it) resulted in a lot of FUD around the DOM. When people tell you that the DOM is slow, they are repeating what they were programmed to believe by the React community, which drank the "DOM is slow" Kool-Aid a few years ago.

Svelte has proven that the DOM is not slow. Although to be fair, Aurelia has eschewed the Virtual DOM (in favour of reactive binding) since it launched in 2015 and managed to keep step with other frameworks and libraries for years (upcoming Aurelia 2, even more so).

Now that React has introduced the concept of hooks, it is yet another thing for developers to learn. Solutions like Svelte, which do not require you to learn new abstractions and ways of authoring applications, definitely feel lighter and saner in the face of React.

Cognitively, React requires a few React-specific ways of working, which just adds to the learning curve. The React of 2019 is not the React of 2014, that is for sure. Authoring applications using plain Javascript and HTML is kind of refreshing.

Lack of ability to functionally compose views

This is one of those downsides of Svelte that some developers will struggle to look past. Svelte requires you to use script tags and HTML to build your components, which means you are forced into its templating syntax: #if for conditionals and #each for looping.

For developers who have had a taste of “pure components” where all components are written in Javascript, this is going to be a hard pill to swallow.

No TypeScript Support (Yet)

Right now, there is no official support for TypeScript in Svelte. If you are not a TypeScript user, or perhaps you work with Vue 2 (which admittedly is not much better at supporting TypeScript), then this will not be a dealbreaker for you at all.

If you are like many other developers who realise the future is TypeScript and have switched over, the lack of TS support is going to be a dealbreaker. Some developers have gotten it working, sort of, using hacks, but that is not ideal support by any means.


I think what Svelte has brought to the table is going to kickstart some innovation and competition in the front-end space. While React has been trudging along for quite a few years now and Vue picking up popularity as well, it’s nice to see some new thinking that doesn’t revolve around a Virtual DOM or leaky abstraction.

Rest assured, the other frameworks and libraries are not going to sit idle while Svelte comes in and pulls the tablecloth right off the dinner table.

The AOT compiler coming in Aurelia 2, for example, is going to optimise your Aurelia applications to a high degree, stripping away runtime and unneeded code. Angular has been focusing its efforts on improved AOT compilation with the Ivy compiler and renderer, and other options are focusing on compilation as well.

Even after playing around with Svelte just briefly, the small amount of resulting code and the lack of marketing spin were refreshing to see after years of other players in the industry perpetuating immense amounts of hype.

Having said that, the safety and stability that I get using a featured framework (in my case, Aurelia) still feels too hard to beat.

I think Svelte is definitely going to get more popular, and for non-complex UIs it would be a great choice over React or Vue. But I still have hope that one day Web Components become the norm and we see light abstractions on top of them that just compile to Web Components behind the scenes.

I would love to see how Svelte scales in a large-scale web application. Not specifically in performance (because I think it would remain fast), but rather code organisation, maintainability, testability and how easy it is to bring new team members up to scratch with an existing codebase.

Massive kudos to Rich Harris and everyone else who has worked on Svelte. I can definitely see the hype around Svelte is more than warranted and in the end, competition is healthy. We need fresh thinking and solutions to help drive standards and the ecosystem forward as a whole.

Callback Functions in Aurelia Using .call

Aurelia’s robust binding system allows you to not only bind values into your custom attributes and elements, but also pass in callback functions. While your first instinct might be to try <my-element callback.bind="myFunction">, you will quickly realise that this will not work because of scoping issues.

This is where .call comes in. Using .call allows you to pass in callback functions to your bindables, and when they get called, their scope is retained.

Callbacks Without Parameters

Say we have a custom element we have created called my-custom-element, and it has a bindable property called callback defined inside of the view-model using:

@bindable callback = () => {};

Then our custom element callback binding looks like this:

<my-custom-element callback.call="someCallbackFunction()"></my-custom-element>
Inside of our custom element, when the callback bindable is fired, it will call our callback function and the scope of someCallbackFunction will be retained (the view-model it is defined in).

When you are not using parameters, things are easy enough. You just need to write your callback with parentheses, like you would if you were using click.delegate or other event-type bindings.
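Conceptually, what makes .call special is that the expression is evaluated against the parent view-model when the element invokes it, much like wrapping the call in an arrow function. Here is a plain Javascript sketch of that idea, outside of Aurelia (ViewModel and someCallbackFunction are illustrative names):

```javascript
// Sketch of scope retention: the callback closes over the parent view-model,
// so `this` inside someCallbackFunction is the view-model it was defined in.
class ViewModel {
  constructor() {
    this.name = 'parent view-model';
  }
  someCallbackFunction() {
    return `called with scope: ${this.name}`;
  }
}

const vm = new ViewModel();
// What callback.call="someCallbackFunction()" conceptually produces:
const callback = () => vm.someCallbackFunction();

console.log(callback()); // "called with scope: parent view-model"
```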

Callbacks With Parameters

This is where I see developers get caught out quite a bit, passing parameters to callback functions. Using our above function, let’s say that our callback accepts two parameters: user and item.

<my-custom-element callback.call="someCallbackFunction(user, item)"></my-custom-element>

Inside of your custom element, when you call the callback you might try something like this if you didn’t read or understand the documentation correctly:

this.callback(this.selectedUser, this.selectedItem)

Because of how the .call feature works, this will not work (as you might have possibly already discovered). This is because you need to pass an object to the callback with your parameters matching the names you use in your HTML.

In our case, we are expecting two parameters: one called user and one called item to be passed into the callback.

Inside of our custom element, we need to pass them like this:

this.callback({user: this.selectedUser, item: this.selectedItem})

Now, our values get passed correctly and the callback retains its scope.
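The whole pattern can be sketched in plain Javascript outside of Aurelia. This is not Aurelia's actual implementation, just the shape of the contract; MyCustomElement, selectedUser and selectedItem are illustrative names:

```javascript
// The element invokes the callback with a single object whose keys
// match the parameter names used in the HTML binding expression.
class MyCustomElement {
  constructor(callback) {
    this.callback = callback;
    this.selectedUser = 'Alice';
    this.selectedItem = 'Book';
  }
  fire() {
    // Keys must match the names in callback.call="fn(user, item)"
    this.callback({ user: this.selectedUser, item: this.selectedItem });
  }
}

let received;
const element = new MyCustomElement(({ user, item }) => {
  received = `${user} picked ${item}`;
});
element.fire();

console.log(received); // "Alice picked Book"
```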

Reasons To Use Aurelia in 2020

It is almost the year 2020, and you are still not using Aurelia, say it isn’t so. No more excuses, it’s time to charge up your Bluetooth keyboard and mouse batteries, make yourself a coffee and start using Aurelia.

Version 2 is coming and it’s going to be FAST

While Aurelia 1 is plenty fast, Aurelia 2 is a complete rewrite of Aurelia from the ground up. It not only focuses on JIT (Just In Time) runtime performance but also sees the introduction of AOT (Ahead of Time) compilation support, which will yield upwards of 100x speed improvements in your applications.

Aurelia 2 takes the learnings of Aurelia 1 and the features that many developers love and supercharges them, as well as adding new features that modern web applications require.

Low Learning Curve, Highly Extendable

The low learning curve of Aurelia is unrivalled by any other framework. Show me another fully-featured client-side framework that can build ambitious, large-scale web applications while requiring minimal documentation diving.

I have helped upskill developers from all backgrounds in Aurelia and it truly speaks for itself. Just recently, two backend developers at my work started moving into the front-end and within the space of a week were writing and shipping Aurelia code.

Besides the ease-of-use, the extensibility is once again unrivalled. From the dependency injection through to templating, routing and compiling, every part of Aurelia can be overridden and replaced.

Dependency Injection

There are too many benefits of Dependency Injection to list, but there is no denying that DI is immensely useful and powerful. Aurelia has a great DI layer allowing you to pass around dependencies in your applications and specify the type of injection mode they should use.

One of the great benefits of Dependency Injection is unit testing. Because dependencies are passed through the constructor in your components and view-models, you can easily mock them; my favourite approach is overriding the value in the container with a mocked version.

While it is possible to shoe-horn some semblance of DI into other libraries and frameworks, Aurelia provides a true Dependency Injection layer which can be used throughout your applications and, keeping in line with the mantra of "easy to override", you can configure it whatever way you want to.
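To make the container-override idea concrete, here is a minimal sketch in plain Javascript. This is not Aurelia's actual DI implementation; Container, ApiClient and UserViewModel are illustrative names standing in for the real thing:

```javascript
// A toy DI container: resolves a class to a single instance,
// unless a replacement has been registered (e.g. a mock in a test).
class Container {
  constructor() { this.registrations = new Map(); }
  registerInstance(key, instance) { this.registrations.set(key, instance); }
  get(key) {
    if (!this.registrations.has(key)) {
      this.registrations.set(key, new key());
    }
    return this.registrations.get(key);
  }
}

class ApiClient {
  fetchUsers() { return ['real users from the network']; }
}

class UserViewModel {
  constructor(api) { this.api = api; }
  load() { return this.api.fetchUsers(); }
}

// In a unit test, override the container's value with a mocked version:
const container = new Container();
container.registerInstance(ApiClient, { fetchUsers: () => ['mocked user'] });

const vm = new UserViewModel(container.get(ApiClient));
console.log(vm.load()); // ["mocked user"]
```

The view-model never constructs its own dependency, which is exactly what makes swapping in a mock trivial.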

First-class TypeScript Support

I have been working with Aurelia since 2015 and using it with TypeScript since 2016, and Aurelia has always supported TypeScript without needing anything additional to be installed. The framework ships with strong typing support, and in Aurelia 2, TypeScript support is even stronger: the codebase of Aurelia 2 is completely written in TypeScript.

Long-term Support // Minimal Breaking Changes

Some of us Aurelia veterans have been running production applications since 2015 when it was an alpha release. As time went on and Aurelia reached beta, then release candidate stages before stable 1.0, while the framework was improved and changed, the core design stayed the same.

In its almost five years of existence, there has not really been one single breaking change. The core architecture and vision of Rob has remained untouched. There are not many frameworks and libraries that can boast continual improvement without some form of breaking change or convention.

Flexible Binding Approaches

Despite what some developers believe, there is no denying that two-way binding is amazing if used correctly. The ability to bind values from a form input to a view-model is immensely powerful, unlike something like React, which forces you to litter your code with change callbacks just to update values. Aurelia supports several binding modes out of the box:

  • One way
  • Two way
  • From view
  • To view
  • One time

Batteries Included

Why waste your time deciding which router library to use, which state management solution you should use, whether you want to write in some XML-like HTML syntax, or how you will deal with validating form inputs?

Aurelia is a glue-less framework that provides you with all of the modules you'll need to build web applications. Let's not get it twisted: it is quite rare that you need just the view component. Most web applications we build require routing, validation, state management and templating.

Do you really want to wake up to the sound of tears on Christmas day because you forgot to buy the batteries and all of the shops are closed?

How To Easily Mock Moment.js In Jest

Recently whilst writing some unit tests in Jest, I had to test some code that took ISO date strings and converted them to formatted date strings, then code that converts them back to ISO strings before it’s sent to the server.

My first attempt was to use jest.mock and mock each individual method. For some of the uses of moment where simple dates are being converted, it is easy enough to mock format and other methods, but once you start chaining Moment methods, things get tricky from a mocking perspective.

This is some code that would be a nightmare to mock in Jest:

moment.utc().add('1', 'years').format('YYYY')

It turns out there is a much easier way to "mock" moment, without actually mocking it at all. You get a fully functional (well, in my use case) version of Moment that actually converts dates and lets you use its chaining features.

jest.mock('moment', () => {
  const moment = jest.requireActual('moment');

  return { default: moment };
});
You use the jest.requireActual method to require the real Moment package, then you return it inside of an object. I am having to return it with default because moment is being included in my application like this:

import moment from 'moment';

It’s a surprisingly simple, functional and elegant solution. It requires no absurd nested mock functions and code. If for whatever reason you need to override certain Moment methods, you can do so either inside of the mock declaration or on a per-use basis.

Fixing The Webpack File Loader [object Module] Issue

Recently I updated to the latest version of the file-loader plugin for Webpack (5.0.2 at the time of writing this post). I use it for dealing with image files in my project, amongst other things.

To my surprise, after updating I noticed my SVG images had all broken without explanation. It turns out a recent change had set the esModule option's default value to true, which makes the loader generate Javascript modules that use ES module syntax.

This simple fix had some serious consequences in my application, all of my SVG image elements were showing [object Module] as the source (which clearly is not going to work).
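You can reproduce the stringification in plain Javascript. With esModule set to true, requiring the file yields a module namespace object rather than the URL string, and templating that object into an src attribute produces its string tag (a sketch, outside Webpack, with an illustrative path):

```javascript
// Simulate what requiring an SVG returns under esModule: true.
const esModuleResult = { default: '/images/logo.svg' };
Object.defineProperty(esModuleResult, Symbol.toStringTag, { value: 'Module' });

// Interpolating the module object into markup stringifies it:
console.log(`<img src="${esModuleResult}">`); // <img src="[object Module]">

// With esModule: false, the require resolves straight to the string:
const commonJsResult = '/images/logo.svg';
console.log(`<img src="${commonJsResult}">`); // <img src="/images/logo.svg">
```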

Now, it does not take a genius to see the problem here. If you are dealing with SVG files, this is going to break them. Maybe the file-loader plugin was never intended to be used with SVG images, but I and many others do, so it is a bit of a problem.

To cut a long story short, the fix is to set esModule to false:

    {
        test: /\.(ttf|eot|svg|otf)(\?v=[0-9]\.[0-9]\.[0-9])?$/i,
        loader: 'file-loader',
        options: {
            esModule: false,
        },
    },
This essentially reverts the behaviour back to the way file-loader has always worked, by using CommonJS syntax to resolve back to the default export of the file itself.

GraphQL Is Superior To REST

When it comes to APIs, you will traditionally opt for a RESTful API that returns JSON data. The REST approach has served us well for many years, but as web applications have evolved and grown in complexity, so too have the needs of what APIs do. Fortunately, the smart folks over at Evil Inc (Facebook) created something called GraphQL, which is one of the best things to happen to APIs since REST.

Instead of defining controllers in your API and making calls to your database via rigid actions, GraphQL allows the client to query for the data using a JSON-like format. It can do reads, writes and updates like a traditional REST API, only you don't have to bother your backend team to add new controllers or actions, or to touch existing ones and potentially break other parts of your application.

The true value in GraphQL is the ability to implement new features or update existing ones on the client-side without requiring further server-side work. If you have to make changes to your application, you can do so in a safer manner.

Couple this with the fact that GraphQL requires you to define and type your schema, and you know that the type of data you are querying is the type of data that will be returned.

If you opt for a decent server (I highly recommend Apollo Server) some of the painful aspects of GraphQL such as caching and query complexity have been solved and documented for your convenience. Some of the lighter and in my opinion, inferior options for implementing GraphQL on the server-side give it a bad name as they generally do not deal with these problems for you (at least not out-of-the-box).

In my experience, a properly implemented GraphQL solution should be easy to maintain and, equally, easy to understand and improve. In all of my uses of GraphQL, I make sure my resolvers are simple and easy to follow. A resolver is just a simple function that returns data and, in some instances, accepts one or more arguments for filtering and querying.
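That simplicity is easy to show. Here is a minimal, hypothetical schema and resolver pair in the style used by Apollo Server (the User type, field names and data are all illustrative, not from any real project):

```javascript
// A hypothetical schema: the shape you query is the shape you get back.
const typeDefs = `
  type User {
    id: ID!
    name: String!
  }

  type Query {
    user(id: ID!): User
    users: [User!]!
  }
`;

const users = [
  { id: '1', name: 'Dwayne' },
  { id: '2', name: 'Rob' },
];

const resolvers = {
  Query: {
    // A resolver is just a function: (parent, args) => data
    user: (_parent, { id }) => users.find(u => u.id === id),
    users: () => users,
  },
};

console.log(resolvers.Query.user(null, { id: '2' }).name); // "Rob"
```

In a real server these would be passed to something like `new ApolloServer({ typeDefs, resolvers })`, but the resolvers themselves remain plain functions you can call and test directly.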

On all of my side-projects, I have moved to GraphQL and I generally host it within a Firebase Cloud Function, with minimal work in my resolvers to communicate with my database and return data. Where I possibly can, I will always opt for GraphQL over REST because it is vastly superior.

As with anything, you should not default to choosing GraphQL for every single use case. As great as GraphQL is, it does introduce a little more complexity into your stack, it requires a little more configuration. For medium to large applications dealing with a lot of data, you will see the true value of GraphQL quite quickly. Small hobbyist projects, perhaps not so much.

My Experiences Using Apollo Client & Server With Blockchain

Some of you might know that I spend my time immersing myself in the latest and greatest technologies, and a couple of years ago I got actively involved in cryptocurrencies and blockchain.

The rise of GraphQL has become too big to ignore. Unlike traditional RESTful APIs, GraphQL uses an expressive query language to let you ask the server for exactly the pieces of data that you need, leaving the implementation details to resolver functions on the server.

At the start of 2019, I open sourced a GraphQL implementation over the top of the Steem blockchain, specifically a layer on top of the Steem blockchain called Steem Engine. I named the library Steem Engine QL. If you are not aware, the Steem blockchain is a fast blockchain with no fees and 3 second block times. It is perfect for content, decentralised applications and other use cases where you need a fast open source blockchain.

GraphQL is a perfect fit for blockchain

After creating my initial implementation, one thing that stood out immediately was how much easier it made querying the blockchain and returning data.

In some instances, I needed to combine data from multiple sources and return it in one request. On the client-side, this would have taken two API requests, then taking the results, filtering and combining them into the final structure. Now, this happens on the server and the API returns everything as one. A good example is the coinPairs type here.
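The server-side merge is a one-liner inside a resolver. This sketch is not the actual Steem Engine QL code; the two sources and their shapes are purely illustrative:

```javascript
// Two hypothetical data sources the client would otherwise have to
// fetch and combine itself with two separate requests.
const metrics = { BTC: { volume: 100 } };
const pairs = [{ symbol: 'BTC', price: 9000 }];

const resolvers = {
  Query: {
    // Merge both sources server-side and return one combined result.
    coinPairs: () =>
      pairs.map(pair => ({ ...pair, ...metrics[pair.symbol] })),
  },
};

console.log(resolvers.Query.coinPairs());
// [{ symbol: 'BTC', price: 9000, volume: 100 }]
```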

On the front-end, that coinPairs data is fetched like this (the code is located here):

query {
  coinPairs {
    # fields elided
  }
}

GraphQL === Fast Feature Iteration

Having used the above GraphQL server implementation in a large-scale open source project, one of the biggest benefits has been the ability to iterate and implement new features quickly.

Instead of having to write implementation logic on the server to return needed data, once all of the query types and resolvers have been implemented, you just query for what you need. If a REST API were being used, it would require continual development work to add in new endpoints and maintain existing ones.

Because GraphQL promotes typing your resolvers and return types, everything is self-documented, so you know what the server supports and what it returns, data-wise. This is an area where REST simply cannot compete, and as a huge TypeScript fan, it aligns with my code quite nicely.

Case in point: just yesterday, in a few short hours of work, an entire new feature was implemented into the codebase without requiring any API work whatsoever. Most of the work was simply UI, querying the API for the needed pieces of information.

Why Apollo?

I tried a few different GraphQL implementations before settling on Apollo for the server. The thing with Apollo is that the company seems committed to open source and is well-funded. It has great integration with numerous frameworks and libraries, and has sorted out some of the pain points in GraphQL: namely caching and request batching support using DataLoader.

On the other side of the stack, the Apollo Client library makes querying Apollo Server a breeze, with caching support including an in-memory cache which does a fantastic job of caching your GraphQL queries.

You can find the code for Steem Engine QL here and see it being used in an open-source application here.

Storing The Last Dispatched Action In Aurelia Store

Another day, another Aurelia Store tip/hack. Recently, I encountered an interesting situation where I wanted to know when a specific action had fired, and to be able to check for it within my state subscriptions.

Now, it is highly likely there is a better way to do this. RxJS is quite magical and has a seemingly bottomless trove of treasures and ways to work with observables and data.

The use case is simple. Inside of my subscriptions, I want to know if a specific action has fired and in some cases, what the parameters for said action were. I decided this was the perfect use case for an Aurelia Store middleware.

function lastCalledActionMiddleware(state: State, originalState: State, settings = {}, action: CallingAction) {
    state.$action = {
        name: action.name,
        params: action.params ?? {}
    };

    return state;
}
This is the middleware function. It sets a property defined in your state called $action which stores the name of the action currently passing through the middleware, as well as the parameters supplied. I prefix it with a $ to make the chances of it being overwritten elsewhere highly unlikely.

When registering the middleware, I only want to know after an action has fired, so I choose to place it after:

store.registerMiddleware(lastCalledActionMiddleware, MiddlewarePlacement.After);

As you can see below in Redux Developer Tools, my property is being stored and the parameters supplied to the dispatched action.

If you know of a better way to do this, I would love to hear about it. For my use cases, this works quite well. A middleware seems like the perfect fit for something like this.

Some Small Blog Changes Preempting Aurelia 2

As Aurelia 2 draws near, I have made some changes to my blog. Acknowledging that many of you rely on my blog as a source of information and help with Aurelia 1, I have created an Aurelia 2 category and renamed the existing category to Aurelia 1.

This will hopefully alleviate any confusion that you might have when Aurelia 2 is released. While many of my Aurelia 1 articles will still be relevant, they might not always apply to the new version of Aurelia.

Until Aurelia 2 is released, these changes do not currently mean much. But, after release, they will.

Another change you might have noticed is a new theme. The existing theme served me well for years, but now, it is time to try something newer and still easy to read. I am using the Less theme.

Getting Typescript 3.7 To Work With Webpack and ts-loader

At the time of writing this post, TypeScript 3.7 is in beta. Eventually, this post will become irrelevant. But, for the moment if you are trying to get TypeScript 3.7 Beta or any of the RC builds working with Webpack and ts-loader you might have encountered a bunch of red text in your console.

In my case, I had target: "esnext" set in my tsconfig.json file which the ts-loader plugin should read and set the appropriate settings. And yet, TypeScript 3.7 Beta was not working despite making sure everything was up to date.

It turns out at present, ts-loader does not seem to work with esnext as the target value (hopefully, this changes when TypeScript 3.7 is released). To get things working, all you need to do is change your target value in tsconfig.json to es2018 like this: "target": "es2018"

In my case, that fixed the issue and I could use the exciting new features TypeScript has to offer such as Nullish Coalescing and Optional Chaining. Happy days.
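Those two features are worth a quick illustration in plain Javascript (the config object here is made up for the example):

```javascript
const config = { retries: 0, server: { host: 'localhost' } };

// Nullish coalescing: only falls back on null/undefined,
// so a legitimate 0 is kept (unlike with ||).
const retries = config.retries ?? 3;      // 0

// Optional chaining: safe access on possibly-missing properties.
const port = config.server?.port ?? 8080; // 8080

console.log(retries, port); // 0 8080
```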