Aurelia vs. React.js: Based On Actual Use

There are tonnes of exciting things happening in the Javascript space in 2015. The two most exciting things for me are React.js by Facebook and Aurelia by Rob Eisenberg (of Durandal fame).

Even though Aurelia still has a few months before it hits beta status, developers will once again be at a crossroads as to what they should choose for their next projects when it nears release.

We already have Angular 1.x, which will still be supported for a while yet before 2.0 becomes the de-facto choice and people are forced to either upgrade or move on to something else. Personally, I don’t like the look of the new syntax, nor some of the other decisions made about its design.

As I mentioned in a prior article comparing Angular to Aurelia, I think Aurelia is more developer friendly and the way forward.

For me the two best choices right now are React.js and Aurelia (even in its alpha-like form). This won’t be a feature-by-feature comparison; I have used both Aurelia and React.js extensively enough that I feel I can form a fair opinion from my experience using the two in actual applications. You won’t find metrics, charts, performance benchmarks or any in-depth comparisons here.

Disclaimer: this is more of an opinion piece based on my experience using both on real applications than a traditional charts-and-graphs comparison. I won’t be debating the finer points; that’s your job.


When you have a product with Rob Eisenberg’s name attached to it, you know it is going to be good. He proved his genius on Durandal and other earlier projects, as well as during his brief stint on the Angular 2.0 core team before leaving to work on Aurelia full time.

Currently Aurelia is slightly beyond alpha but not quite beta. Big changes are still happening, meaning if you’re looking for something almost complete and battle-tested, Aurelia might not be your cup of tea at the moment.

Having said that, I have been writing a production application using Aurelia (not deployed yet, but will be soon) and keeping up with the changes as they are released (which so far has been quite easy to do).

This application I am writing is visualisation heavy (using Snap.svg and CSS animations), has a service layer for data and API interaction/manipulation, a few third party libraries (Underscore, jQuery and Select2) via JSPM, routing (including child and parameterised) and of course: authentication.

As changes are made to Aurelia, Rob has been very public about what has changed. Thanks in part to Aurelia’s strict commit guidelines, generating descriptive change-logs is a core part of the framework and release process. This means dealing with things that have been removed or changed is quite easy.

Some might not see ES6 support as an edge, but honestly it was the driving force behind my decision to use Aurelia. I previously came from an AngularJS-heavy background, and while Angular got the job done, it often did so in less-than-ideal ways that made you pull your hair out.

The use of class decoration for dependency injection and other nice parts of the framework means I spend less time configuring things and more time coding. It takes the fatigue out of building an application within a single page application framework.

The benefits of using such a relaxed framework are pretty obvious when you write your first few lines of Aurelia. You don’t have to extend anything, you don’t have to decorate your HTML with weird attributes, no strange scoping issues like you might encounter in Angular and you don’t have to organise your code in a specific way. As far as conventions go, Aurelia doesn’t make you do anything nor enforce strict conventions.

Don’t want to write ES6 code? The framework supports the likes of TypeScript and, if you want, you can even write apps using ES5 instead. If you’re happy with the defaults, you literally don’t have to do anything other than pull down the code and start developing.
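To give a feel for how little ceremony is involved, here is a minimal sketch of what an Aurelia view-model can look like as a plain ES6 class. The Welcome class and its properties are made up for illustration; in a real app you would export the class from a module and Aurelia would pair it with a matching view by convention.

```javascript
// welcome.js – a hypothetical Aurelia view-model: just a plain ES6 class.
// No base class to extend, no framework-specific boilerplate to register.
class Welcome {
    constructor() {
        this.heading = 'Welcome to Aurelia';
        this.firstName = 'John';
        this.lastName = 'Doe';
    }

    // A plain getter; the view can bind straight to this
    get fullName() {
        return this.firstName + ' ' + this.lastName;
    }
}
```

Compare that to the scaffolding some other frameworks require; the class above is perfectly testable on its own, with no framework loaded at all.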

After using Aurelia first-hand for a couple of months now I can honestly say this promising framework is a breeze to work with. There are crucial missing pieces like a proper bundling solution (although Rob says it’s 50% there) and performance optimisations still to be made, but for the most part it is a pretty functional framework.

Even though Rob cautions against going into production with Aurelia, given its lack of bundling and outstanding performance optimisations, there is no reason why you can’t start using it now. In a couple of releases’ time (within the next month or so), when large breaking changes are less frequent and things are more stable, you might be able to go into production (provided the bundling solution is done by then).

The limitations I have run into using Aurelia have mostly been related to JSPM, and most of those have been addressed in the latest release, JSPM 0.15 (at the time of writing). As always, exercise caution, but don’t let the early preview tag scare you; you would be surprised how complete Aurelia feels for something that isn’t anywhere near version 1.0.

The documentation is quite concise, although it could do with more explanation for some things. When I get the time, I will be contributing to various parts of Aurelia (including the documentation). I find myself not needing to reference the documentation as I have with Angular in the past, things just naturally make sense.


While on the surface it might not seem fair to compare Aurelia to React.js (and in part you are right), they’re both being used for the same things. I am not sure what the intent was behind React.js when it was first released, but it seems it has gone beyond just an alternative view layer to something bigger. The community have taken React and made it into a framework of sorts (once you cobble all of the needed pieces together).

Even though React.js is a fully-fledged, officially released product without the early preview alpha tag and Aurelia is not, in their current state they are surprisingly on-par with one another. You can achieve the same tasks in both, just in different ways.

I recently got to work on a React.js application that was strictly React coupled with the Flux architecture (a computer-sciencey pub/sub system). Even though React ships without anything else, it is remarkable what you can actually achieve with it. A router is crucial for real applications, and the react-router component does the job quite well.

The turnaround time for the application was short, and React didn’t get in the way. This allows fluid development without worrying about configuration and dependency management. It is worth pointing out this is the approach Aurelia takes as well, proving the two aren’t too far from one another.

I would equate React components to Aurelia’s ViewModels. They are both quite similar in that you’re essentially using a class to define properties and methods bound to a particular view. The point-of-difference between the two is React doesn’t separate the logic from the view, meaning in React the View and ViewModel are both within the same file.

However, that’s not to say that Aurelia doesn’t allow you to achieve the same thing by rendering the View from within the ViewModel as well and forgoing a traditional View.

I think the original (and still current) intent behind React.js was not to be a competitor to the likes of Angular or Aurelia, but rather be the library that everyone uses with their SPA framework like Angular to improve performance.

So this means you can easily use React.js within Aurelia which I have done without any trouble at all and wrote about it as well. Or as mentioned, you can use React.js exclusively as your framework component and couple it with Flux.

I really like how React encourages the use of components, making you think about your application in chunks rather than as a whole. Some developers are not fans of React’s inline styling. Personally I don’t have a problem with it, but I know some feel more strongly about it than others. Keep this in mind if you are considering using React for an entire project by itself or within an existing code-base.

Performance wise, React has saved my bacon. In a previous project built using Angular 1.x, React was used to overcome performance limitations within Angular and its $watch and transclusion mechanisms. Dropping React in resulted in unprecedented performance gains when all hope was thought to be lost.

Then there is Flux: the uni-directional data flow it promotes is a joy to use, but it does require a lot of code to set up properly if you want to implement it yourself instead of using a third-party library. Breaking out your views, actions, stores and constants can mean a lot of setup work.

This is where the plethora of Flux implementation libraries come into play, but we won’t go into that.
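To give a sense of the setup work just described, here is a stripped-down, hand-rolled sketch of the Flux pattern: a dispatcher, a store that registers with it, and an action creator as the only way data enters the system. All of the names here (TodoStore, addTodo and so on) are invented for illustration; they are not taken from Facebook’s Flux library.

```javascript
// A minimal hand-rolled Flux sketch. Real implementations add waitFor(),
// constants files, error handling and more – hence the boilerplate.
function Dispatcher() {
    this.callbacks = [];
}
Dispatcher.prototype.register = function(callback) {
    this.callbacks.push(callback);
};
Dispatcher.prototype.dispatch = function(action) {
    this.callbacks.forEach(function(cb) { cb(action); });
};

var dispatcher = new Dispatcher();

// Store: owns state, lets views subscribe to change events
var TodoStore = {
    todos: [],
    listeners: [],
    addChangeListener: function(fn) { this.listeners.push(fn); },
    emitChange: function() { this.listeners.forEach(function(fn) { fn(); }); }
};

// The store registers its interest in actions with the dispatcher
dispatcher.register(function(action) {
    if (action.type === 'ADD_TODO') {
        TodoStore.todos.push(action.text);
        TodoStore.emitChange();
    }
});

// Action creator: the single entry point for new data
function addTodo(text) {
    dispatcher.dispatch({ type: 'ADD_TODO', text: text });
}
```

A React view would call addTodo from an event handler and re-render itself inside the store’s change listener, which is the whole uni-directional loop in miniature.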


You’ve been somewhat deliberately trolled by this article (my apologies). You were expecting a feature-by-feature comparison or a definitive reason to use one or the other. I am telling you that it doesn’t have to be like that. We need to eradicate the us-vs-them mentality in the Javascript framework/library arms race.

Aurelia and React.js can be used together, and doing so provides you with a level of power other frameworks cannot match without added complexity and strict conventions, like EmberJS.

Even though Angular 1.x is stable and works, I personally wouldn’t consider it for a new application starting today or soon. Not just because Angular 2.0 is coming, but because Angular does things a little differently, and as a result bringing developers inexperienced with Angular into the fold can be a considerable time investment.

The same can be said for the other choices out there like Knockout and EmberJS, it feels like nothing comes close to React or Aurelia’s simplicity at the moment.

I would use Aurelia purely because its simplicity and power lie in its use of ES6 features like modules and classes, with polyfills added for browsers that don’t support particular features.

If you like React, then I would consider using it within Aurelia or perhaps even forgo it and see how far you get just using Aurelia, which has a pretty smart observation system of its own even when native Object.observe() isn’t supported.

Having used both Aurelia and React.js, I don’t think I will be considering anything else anytime soon. I don’t know about you, but I am tired of having to learn new frameworks and I am tired of working with Angular as well. If you do choose one over the other, know of their limitations and assess your needs first.

I like how Aurelia provides me with components like routing and API interaction in the form of XMLHttpRequests without needing to download or write anything. Sometimes you just want to get something done quickly, and the groundwork that React.js can make you perform can sometimes feel like a chore.

If performance limitations in Aurelia are currently a concern, dropping in React.js is a great intermediary step to addressing performance issues until we near a beta and version 1.0 of Aurelia. It’s great that React gives us options and workarounds for performance limitations (like those I mentioned encountering in Angular before).

Always appropriately benchmark your code and make decisions from there. Don’t choose to implement something if you don’t have any problems to begin with; you might save yourself some unnecessary work. The future is exciting, man.


Another Reason To Hate Foxtel

As if we needed any more reasons to hate Foxtel, Australians now have yet another to add to the list.

If you are like me and you are a fan of Last Week Tonight with John Oliver, you would have realised that up until recently you could watch the show on YouTube for free without any issue. Not anymore, thanks to The Comedy Channel (solely owned by Foxtel) purchasing the rights to show the series here in Australia. Maybe not a decision that affects you, if you are fortunate enough to have pockets deep enough to pay for Foxtel.

Ironically in my search for news about the purchase or some kind of statement I found this Gizmodo article.

My favourite take away from the article is:

Within four hours, everyone on the planet with an internet connection has access to 20 minutes of top-shelf satire. No BitTorrent client required. By watching it on YouTube, you’re giving HBO all the hits it wants, and still accessing it legally.

Followed by:

It’s interesting, really, to watch how HBO distributes Last Week Tonight when compared to other shows it distributes like Game Of Thrones. Anyone who wanted to keep abreast of events unfolding in Westeros was forced to subscribe to one of Foxtel’s less than ideal streaming packages just to get access to the show.

How ironic: another HBO show that we have to pay Foxtel an exorbitant amount of money per month, on a lock-in fixed-term contract, to watch legally. It looks like us Australians are backed into a corner once more and forced to use VPNs, the Hola extension for Chrome, or to torrent the show. Who really wins in the end here?

Thanks again Foxtel for making me hate you even more.


Creating Your Own React.js Mixins

The beautiful simplicity of React.js doesn’t only extend to components, but also to mixins. Essentially, mixins can extend components and all of the default lifecycle methods.

Through the use of mixins we can take repetitive tasks and break them out into their own standalone pieces of functionality that can be optionally included within one or more components.

In this post we are only going to be building a really simple React.js mixin. The purpose of this post is to get you familiar with how mixins are created so you can go on and explore them further.

The mixin we are going to be building is actually something I recently created for a personal project to conditionally include stylesheets. However, I have stripped away load wait events and other things to make it simpler for the purposes of this post (this code will insert a stylesheet but won’t tell you if there is an error or if it has loaded).

As you will notice, defining a React.js mixin can be as simple as defining a new Object and writing your code like you would any normal component. Because mixins are included within an existing component, we don’t need to worry about the context: React will handle this for us. We are effectively subclassing our component(s).

// stylesheet-mixin.js
var Stylesheet = {
    loadStylesheet: function(url) {
        var head = document.getElementsByTagName('head')[0];
        var link = document.createElement('link');

        link.setAttribute('rel', 'stylesheet');
        link.setAttribute('href', url);
        link.setAttribute('type', 'text/css');

        // Insert the stylesheet into the DOM
        head.appendChild(link);
    }
};
We have a simple enough mixin that will insert a stylesheet into the DOM for us. Now we create a simple made-up component and include our mixin.

var LoginComponent = React.createClass({
    mixins: [Stylesheet],
    componentDidMount: function() {
        // Call the mixin method to conditionally load our stylesheet
        this.loadStylesheet('css/login.css');
    },
    render: function() {
        return <span className="mySpanStyle">This is a styled span element</span>;
    }
});

React.renderComponent(<LoginComponent />, document.body);

For the sake of our example, if we were to create a file called login.css within our CSS folder and create a class called .mySpanStyle we would see our span become stylised with whatever we put on it.

As you can see, we created our mixin and called it “Stylesheet” – to use it we simply include it in the mixins array (as shown in our LoginComponent). That’s all it takes to include a mixin; it is easy (like React itself).

What can we do within a mixin?

Pretty much anything. Mixins allow us to subclass functionality within a component and, as discussed earlier, we can even implement React’s lifecycle methods so we can do things when a component is mounted, props change, or any other lifecycle event that React ships with fires.

Keep in mind that declaring a lifecycle method in a mixin does not override it on the component itself. If you declare componentDidMount within your mixin and your component defines the same lifecycle method, both will be called: mixin lifecycle methods always run first, followed by the component’s.

Yo dawg…

I heard you like mixins, so you can put mixins in your mixins, so you can mix while you mixin. In React, mixins can actually include other mixins, so whether you are including your own mixins or one of the bundled mixins that comes with React, you can. There is no limit to the depth; you can endlessly include mixins from within your mixins.

You can also include more than one mixin within a component or mixin at once, as evidenced by the fact that the mixins property expects an array.
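As a structural sketch of both ideas, here is a mixin that itself pulls in another mixin; the names below are invented for illustration. React walks the mixins array when the outer mixin is included in a component, merging the inner mixin’s methods in as well.

```javascript
// A hypothetical mixin that itself declares a `mixins` array.
// When StylesheetMixin is included in a component, React also
// merges in LoggingMixin – mixins within mixins.
var LoggingMixin = {
    log: function(message) {
        console.log('[component] ' + message);
    }
};

var StylesheetMixin = {
    mixins: [LoggingMixin],
    loadStylesheet: function(url) {
        // `this.log` exists here once React has merged LoggingMixin in
        this.log('loading stylesheet: ' + url);
    }
};
```

On their own these are just plain objects; it is React’s createClass merging step that wires the methods together onto the final component.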


You cannot declare render within your mixins; attempting to do so will throw an error. Defining render more than once makes no sense anyway, so you should hopefully never encounter this error.

You need to be careful when setting state values from within a mixin as well. If a component and mixin are changing the same state value, one will override the other. As a rule of thumb you should never modify the same state value as a component would. Consider namespacing state values if you need to access them from a component or vice-versa.
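One simple namespacing convention (illustrative only, not a React API) is to prefix the mixin’s state keys with the mixin’s name so they cannot collide with the component’s own state:

```javascript
// A hypothetical mixin namespacing its state under a unique prefix.
// React merges getInitialState() results from mixins and the component,
// and complains if the same key appears in both – a prefix avoids that.
var ThemeMixin = {
    getInitialState: function() {
        return {
            themeMixinLoaded: false,
            themeMixinName: 'default'
        };
    }
};
```

The component can still read these values via this.state.themeMixinName, but is unlikely to accidentally clobber them with its own keys.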

While lifecycle methods can be defined within your mixins, you cannot redeclare a method already declared on a component. If you create a method on the login component called login and then attempt to redeclare it within your mixin, an invariant violation will be thrown, as React doesn’t allow you to override and duplicate user-created component methods.


The benefit of creating React.js mixins is that they keep our components clean and promote reuse, as they are essentially components themselves. The approach I have taken above allows me to include stylesheets on a component-by-component basis, but you could easily use mixins for other things too.


My Soundwave Festival 2016 Wishlist

With Soundwave Festival 2015 now firmly behind us, our attention shifts to who will play Soundwave in 2016. There are too many bands I would love to see, and after my previous wishlist actually partially came true, I thought I would write up another in the hope it comes true as well.


Thrice

These guys are now back together. Before they went on hiatus they did some farewell shows, none of which came to Australia. It has been quite a while since we’ve seen Thrice in Australia; now that they are touring again in 2015, I would love to see them hit Australian shores for Soundwave 2016. They might even have a new album out around then, so the timing makes perfect sense.

Gary Clark Jr.

It most likely will not happen; he is playing Bluesfest 2015. But honestly, he would be a great act for SW: a larger audience and probably better money for him.

Rival Sons

These guys are awesome. I’ve never had a chance to see them live; I don’t even think they’ve toured Australia before. They were on the 2015 lineup, but had to pull out because one of their band members had a child and didn’t want to be away from his wife and newborn (understandable). Maddah has said he is hoping to get them here before the end of the year, so either way it will be a win.

Job For A Cowboy

They just released a new album and they’ve never played Soundwave even though they would be a perfect fit. Could 2016 be the year we finally see JFAC at Soundwave?


They’ll most likely have a new album out before Soundwave 2016 and, falling in line with previous lineup trends (they played SW 2014), the possibility of them playing is quite high, provided prior commitments or tours don’t conflict.


Another band I have always wanted to see live. A great rock band that would fit in with the vibe of Soundwave. They’ve also never played at Soundwave before and I think they’re big enough to be worthy of a spot on the festival.

Parkway Drive

Even though I’ve seen them plenty of times live, it has been almost 9 years since they played Soundwave. As AJ himself said, they are definitely due to play.

Billy Talent

They’ve supposedly already reached out to AJ to be a part of Soundwave 2016, so there is a good chance we’ll see them. A new album is due out before 2016, so that definitely fits in with the SW requirements.


They last played Soundwave 2013 (the same year as Metallica) and their new album(s) are awesome. They would be a great addition to Soundwave 2016.


Another SW13 band that are worthy of being on the lineup again.

The Sword

Yet another SW13 band due to play again. They were great the last time they played.

The Black Dahlia Murder

They played Soundwave 2014, they’re a great band and I would love to see them again.


They’ve never played Soundwave before. They’ve got a great heavy sound that would be in line with Soundwave.

The Ghost Inside

They played Soundwave 2014, the timing would be right for them to play again and AJ Maddah hasn’t ruled them out either.

I could go on, but these are some of the bands I would love to see at Soundwave Festival 2016. Let’s make it happen.


LOGIK Commercial Blender Review

My fiancée has been wanting a Blendtec or Vitamix for some time now; sadly, they are out of our budget for the moment because of the baby on the way and the expensive wedding we are having. Looking for a cheap, powerful blender, we considered a few options.


While the thought of owning a Vitamix is still in our sights, we figured if we can get something cheaper for the moment even if it lasts only the two year Australian warranty period, it would have served its purpose.

At first we were going to get some Breville Kinetix blender with a 1000 watt motor for $129 from Big W, but we opted to keep on looking. Our requirements go beyond just blending up fruit which cheap blenders can do, but the build quality is usually sub-par. Opting for quantity over quality it seems is the approach a lot of manufacturers take when producing budget family blenders.

We wanted a blender that could crush ice, blend frozen fruit into liquid, and handle things like frozen spinach and kale. We have had issues in the past with cheap blenders not being able to handle frozen spinach and really struggling with kale, which is a shame because one of the primary reasons for wanting a strong blender was to make green smoothies. This is why we set out to find a decent, strong blender.

After deciding to look on eBay, we found a relatively no-name blender that goes by the name LOGIK: an Australian company that appears to have its blenders made in China and then sells them in Australia. The blender was $150 including free shipping; being only a little more than the Breville, with over twice the wattage, we decided to take the leap and buy it.

Part of the reason we decided to purchase this blender was that Cold Rock Ice Creamery here in Australia supposedly use these blenders for their shakes. I had never paid any notice to be honest, but it’s nice to know that a company like Cold Rock are using these blenders instead of more expensive ones like Blendtecs or Vitamixes. Whether this is really the case, who knows.

It appears the only places you can get these blenders at the moment are eBay or the site Close The Deal, although the eBay seller in my case was the same operator as the site. Either way you’re buying from the same people.

Unboxing and plugging it in

From the moment we plugged this thing in, it exceeded our expectations. Weighing in at almost 5kg, this blender definitely feels commercial. If you can, maybe try and avoid moving it around, the weight really lends itself to a more permanent spot in your kitchen.

It is powerful even on 3/4 speed, turning the speed dial all of the way up practically rips a hole in the space time continuum and sucks you in. The 2200 peak watt motor was probably overkill for us (this thing can grind coffee beans and pulverise nuts), but it’s nice to have that power.

To put the power into perspective, Blendtec’s most expensive model, the Tom Dickson Extreme, comes in at 2400 watts and costs USD $1,034, which is just shy of AUD $1,350 at the current exchange rate. In other words, Blendtec’s most expensive version only offers 200 more peak watts.

If you are looking for a quiet blender, this is not for you. This blender is seriously loud, given how highly rated the motor is, you probably already expect it to be loud. Maybe close your doors and windows if you’re planning on using this early in the morning so you don’t wake the neighbours.

Let’s throw some things in

This isn’t a Will It Blend type situation where I tried blending iPhones and magnets (as fun as that would be); I’m not sure the blades in this cheap blender would be that strong. However, I threw various foods into the blender to see how it handled them.

Ice – The large chunks of ice I put in were decimated in a matter of seconds. The power dial wasn’t even turned all of the way up, probably around the 3/4 mark and the ice was destroyed.

Mixed nuts – A cheap bag of mixed nuts thrown in (the entire 500g bag), dialed the power all of the way up and turned it on. There was no struggle, the blender mowed down the nuts like they were soft butter, crushing them into a fine dust.

Coffee beans – Eh, why not? The description of the blender touted it could grind coffee beans, so I put it to the test. Seemed to work well, considering coffee beans would probably be easier to grind than nuts, I think a lot of blenders could do the same.

I think it’s safe to say the only limiting factor of this blender is your imagination. Just don’t go blending anything weird in it, alright?


Honestly, don’t expect much in the way of features. This LOGIK blender is literally a couple of switches and a dial with a jug. Unlike the Blendtec blenders, you don’t get automatic programmed modes or fancy blue-lit LCD screens. The LOGIK looks similar to a Vitamix, especially the professional series ones, however all these LOGIK blenders do is blend and allow you to change the speed.

Once you use this blender you realise that those other features while they might look nice and impressive, they don’t really serve any practical purpose. If you’re a busy mother with two young kids and you’re trying to make some baby food, auto modes might be nice as you don’t have to keep an eye on it while you do other things. But for the most part, you can forgo these niceties.

You also get a nice 2 litre BPA free plastic jug with this blender meaning it is large enough to make soups for dinner or milkshakes for multiple people at the same time. Have some friends around, you could use this to make a lot of cocktails as well.

I think the plastic jug is a nice touch. After owning a few cheaper blenders in the sub $100 mark and experiencing issues with the glass cracking, plastic will undeniably last a lot longer and is probably cheaper to produce (hence the low price of this blender).


Honestly, you don’t get a lot for $150 these days. The motor seems decent and the blades seem decent as well. How long this blender will last, who knows. There isn’t a lot of information out there about this blender; I think it could be quite new (but I am not sure).

All I know is we got a powerful blender for a few dollars more than a cheap Breville from Big W which probably wouldn’t last as long as this one. It doesn’t struggle one bit making green smoothies or soups. I have yet to make a milkshake in it, but I don’t doubt it is up to the task.

I’ll be sure to keep this post updated in case the blender only ends up lasting a couple of months. But I have a feeling this thing should last well beyond the required minimum two year Australian warranty.


How To Use React.js In Aurelia

I am rather smitten with Aurelia but I also just so happen to be a big React.js fan. Due to the fact that Aurelia itself does not ship with its own efficient virtual DOM diffing algorithm for rendering composed views and templates, we can use React.js for the heavier UI stuff.

For when the going gets tough in your Aurelia application or perhaps you just like React.js and want to use it: we can easily use React.js within Aurelia.

In Angular we would have done this using a custom directive. In Aurelia we are going to be doing the same, but using the Behavior functionality of Aurelia instead to create a custom element which achieves the same thing as an Angular directive, albeit in a more clean ES6 and Aurelia-like way.

In the below code example we are going to create a custom element, which will pass data through to React and also has the added advantage of re-rendering itself when the data changes. For the sake of sticking with the demo, I recommend saving this file as: “reactel.js”

import React from 'react';
import {Behavior} from 'aurelia-framework';

export class ReactEl {

    // Define our custom metadata Behavior
    static metadata() {
        return Behavior
            // Define the element (this is what a directive would look like in Angular)
            // Referenced in DOM like <react-el></react-el>
            .customElement('react-el')
            // Specify a valid property this element can have
            .withProperty('data');
    }

    // Via DI we will inject a reference to the current created Element
    static inject() {
        return [Element];
    }

    // Like we would any other DI component, we save a reference to this class
    // so we can access it from other methods
    constructor(element) {
        this.element = element;
    }

    // This is where we handle DOM/View stuff
    bind() {
        React.render(<YourReactElement data={this.data} />, this.element);
    }
}

Worth mentioning here is that the React component referenced above, “YourReactElement”, isn’t a real React component. I am assuming you will create and import your own component and substitute in the appropriate name.

Now let’s use our newfound React custom element by creating a ViewModel and View. Keep in mind this post assumes you saved your “reactel.js” file in the same directory as your ViewModel and View.

Create the ViewModel:

Save this file as whatever name you like. For the purposes of this example, I recommend choosing “react-example.js” – so everything stays in line with the rest of this post.

export class ReactExample {
    constructor() {
        this.data = [];
    }
}

Create the View:

Okay, now that we have our ViewModel, we need our matching view. The name of the View HTML file needs to match the name of the above ViewModel Javascript file, not the class name. So if you saved the above as “react-example.js” – we will save this View as “react-example.html”

<template>
    <import from="./reactel"></import>

    <react-el data.bind="data"></react-el>
</template>

This isn’t an Aurelia tutorial, but I will touch upon a couple of things happening in the View loosely. We import our created “reactel.js” file using the HTMLImport directive. We then reference the element we named in our class above, specifying our data attribute and binding its value to that of the ViewModel data value defined in “react-example.js”


Aurelia is amazing. React.js is amazing. Combine them together and you get more power than a president.


Getting Aurelia To Work With HTML5 pushState

Recently in Aurelia I ran into a peculiar issue using the bundled Browser Sync Gulp task for deploying a test server. When visiting a parameterised URL the paths to the System packages would break. It turns out by default the paths are set in a way they will add onto the current URL instead of referencing the root “/” as the base path.

In /config.js where all of the paths are setup, right up the top you will see a few paths set. Mine looks like the following (yours probably does as well):

    "paths": {
        "*": "*.js",
        "github:*": "jspm_packages/github/*.js",
        "npm:*": "jspm_packages/npm/*.js"
    }

In my situation, I had a parameterised URL pointing to /reports/:reportid; however, when visiting this URL, all include paths would try loading from /reports/jspm_packages/ instead of the expected /jspm_packages/.

I have no idea if this problem is isolated to the Browser Sync testing server or not, but it was very puzzling. Then I realised that the paths need to be set like this to work with parameterised URLs correctly:

    "paths": {
        "*": "/*.js",
        "github:*": "/jspm_packages/github/*.js",
        "npm:*": "/jspm_packages/npm/*.js"
    }

All we did was add a forward slash to the source values of the paths object, which forces the loader to always reference the root directory. I initially tried setting the baseUrl value on the config object instead, but for some reason it caused a few issues with the running Browser Sync testing server. This method is confirmed working.

I did not see any mention of this issue in the documentation, so I assume something weird with my environment could be to blame. I am running an Aurelia application on Windows, so it could come down to differences in how paths are resolved on Windows versus a Unix-based operating system like Mac OS.


Two Months (and counting) With Aurelia

Back in January I wrote an article about Aurelia, squaring it up with AngularJS and pointing out how its use of ES6 features and polyfills makes for a clean and easy to use Javascript framework.

Since that original post I have actually been using Aurelia quite a bit and have started writing a production application with it as well. Using Aurelia has been a breath of fresh air and I thought I would share some of my experiences with the framework.

I have yet to get around to writing a tutorial on using Aurelia, but I thought I would share my experiences working with Aurelia so far. Things I like, things I do not like and things I think that can be done better.


Nobody likes setting up things. Even though for most projects I have a set workflow of how I use Gulp for testing and running other tasks, I still find I spend time adapting it to work on a project-by-project basis. Not to mention all of the other things that sometimes need to be added or configured.

In Aurelia things are different. The setup process is so simple you could probably show your grandma the basics and she could set up an Aurelia application. The out-of-the-box defaults for JSHint, Gulp tasks, ignore files, and Karma and Protractor configuration files make it so much easier to get up and running.

When I started using Aurelia, the only thing I changed was the indentation level in the .editorconfig file that ships with Aurelia, which defaults to 2 spaces (I prefer 4). But that’s it. I literally started using it right away, and that is one area where Angular and other frameworks/libraries fail: giving you sensible defaults and letting you write code.


Surprisingly, I haven’t had to reference the documentation as much as I thought I would. The framework itself is so simple to use, you pick up its conventions and way of working so quickly. When I started learning AngularJS, I found it took a solid month before I understood how it expects things to work and all of its weird abstract syntax.

The only time I really had to read the documentation was when I ran into some issues with the routing system, caused by my own ignorance and not the framework itself. But honestly, you would be surprised how easy Aurelia is to pick up.

I am sure as I get progressively deeper into the framework I will need to reference certain parts of the documentation a little, but compared to my first-time experience with Angular, everything in Aurelia feels so much less stressful.


In Aurelia you write classes (ES6, TypeScript, whatever) which act as your controllers, aka ViewModels, and which in turn are immediately bound to an HTML file of the same name (your View). This means you spend less time configuring the framework and more time actually using it.

If you create a ViewModel called “users.js”, Aurelia assumes that you have a “users.html” file, which becomes the View. Referencing the Javascript ViewModel within your route definition will load up that file when you visit the route in your web browser.
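To make that concrete, here is a rough sketch of how a route definition might wire the “users” ViewModel/View pair to a URL. The exact router API was still shifting between alpha releases, so treat the method and property names here (configureRouter, moduleId, and so on) as approximate rather than gospel:

```javascript
// Hypothetical sketch of an Aurelia route definition (alpha-era API,
// names are approximate). In a real app this class would be exported.
class App {
    configureRouter(config, router) {
        config.title = 'Example App';
        config.map([
            // moduleId "users" resolves by convention to
            // users.js (ViewModel) + users.html (View)
            { route: ['', 'users'], moduleId: 'users', title: 'Users' }
        ]);
        this.router = router;
    }
}
```

The key point is the convention: the route maps to a module id, and the framework finds the matching View for you.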

Templating & Syntax

The templating system in Aurelia, in one word: AMAZING. While you’re still writing HTML, you can do more powerful things like utilise the Shadow DOM to create compartmentalised components complete with their own styles, logic and markup. Because the HTMLImports specification allows you to write your views as though they are components or just HTML files, you get a lot of flexibility.

The syntax for binding to elements and model values, and for writing things like loops, feels equally liberating. Want to show or hide an element? show.bind="expression". Want a repeater? Add repeat.for="item in items" to the element you want to repeat.
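Putting those two bindings together, a view fragment might look something like this (a hypothetical sketch using the syntax as written in this post; the property name items is an assumption):

```html
<template>
    <!-- show.bind toggles visibility based on the bound expression -->
    <p show.bind="items.length">You have items!</p>

    <!-- repeat.for stamps out the element once per entry in items -->
    <ul>
        <li repeat.for="item in items">${item}</li>
    </ul>
</template>
```

Compare that to Angular’s ng-show/ng-repeat attributes: the Aurelia form reads as plain property bindings rather than a framework-specific vocabulary.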


Rob has been quite vocal about how he thinks Aurelia shouldn’t be used in production quite yet: more tests need to be written, probably a few more features added, and potentially API-breaking changes made. Having said that, for what is effectively an alpha release, Aurelia feels like a stable and complete framework. So far I have not felt as though anything was missing, and it makes it easy to include third-party libraries like jQuery and Bootstrap.

If this is just a taste of better things to come, I think I will probably die of excitement when version 1 rolls around because Aurelia already wipes the floor with Angular 2.0 and any other SPA framework I have used (including Ember).


As I briefly touched upon, Aurelia feels complete. It doesn’t feel like an early preview or alpha, it feels like a stable release. While I wouldn’t recommend taking my word for it alone, the experience I have had so far is that Aurelia has been able to achieve everything I have thrown at it without any problem whatsoever.


I knew Aurelia was going to be fun and an entirely new experience, but I never thought I would enjoy Aurelia as much as I have. I have been moving away from SPA frameworks in favour of Facebook’s React.js library and Flux methodology, but I think Aurelia has made me a believer.

I used to think when Angular 2.0 rolled around we would see uniformity amongst the popular SPA frameworks, but it is evident that Angular as it heads towards its 2.0 release is going in an entirely different direction. Whether it is worse or better, who is to say.

I think Aurelia is definitely off to a great start and with the talented Rob Eisenberg behind the wheel, it is sure to be a smooth ride.


2016: The Year Of Virtual Reality

The VR space has been ramping up for a while now, thanks in part to Oculus Rift and the fact that Facebook bought them for a VERY large sum of money a little while ago.

Since Oculus originally debuted on Kickstarter and was subsequently purchased, a VR arms race has started and everyone is rushing to get their solution out there. Yet as the years Oculus has spent working on its headset show, the problem of creating a decent, functional headset is harder than it appears.

The reason it has taken so long is that when Oculus development started, the screen technology was not there yet. A crucial aspect of VR is low latency: when latency goes beyond a certain level and turning your head quickly produces any kind of noticeable lag, it is a recipe for VR-induced motion sickness.

Fortunately, thanks in part to great screens from JDI and Samsung’s own high-resolution AMOLED panels, Oculus and other manufacturers have been able to produce high-resolution, low-latency headsets (as witnessed by anyone who owns an Oculus DK2 or has tried Project Morpheus).

Companies working on VR headsets include (and these are the ones we know about):

  • Oculus Rift
  • Sony’s Project Morpheus
  • Microsoft HoloLens
  • HTC Vive (a collaboration between Valve Software and HTC)
  • Samsung Gear VR
  • Magic Leap (unannounced VR headset)

These headsets have one thing in common: with the exception of the Gear VR, none of them has shipped a consumer product just yet.

Not only are major companies working on virtual reality headsets, but tonnes of Kickstarter campaigns and smaller indie upstarts are taking their shot at the VR crown as well.

Even though you can buy an Oculus Rift developer kit, Oculus has yet to release a consumer-focused version of the headset aimed at gamers or general PC users. For now, that means you need a beefy PC with a decent graphics card and GPU power to even run it.

Microsoft only just announced HoloLens, but it has been in development for years. It takes a different approach, using stereoscopic 3D to augment reality rather than replace it. We will most likely see HoloLens in 2016, although no release date has been announced.

While we’re not really going to see any major VR headset make its debut in 2015, Oculus and Project Morpheus in particular are slated for release in the first half of 2016. I think it is safe to say the technology is definitely ready, but the content is not quite there yet.

It will be interesting to see what the virtual reality landscape looks like in 2016. Will we see fragmentation, or perhaps a combined effort to produce content that works on every commercially available headset? Who knows.

Every company will want to lock consumers into its monetised ecosystem, but without the content it becomes a chicken-and-egg scenario: consumers will only buy a headset if it has the content, and companies will only produce content if the consumer numbers are there.

I am excited about the potential of virtual reality; it has been a long time coming and I am definitely ready. While initial releases will undoubtedly be great, it probably won’t be until 2018 that we see headsets really come into their own (with leaps in screen tech and tracking).


What’s New In React.js 0.13

The latest and greatest version of React.js is here in the form of version 0.13. With the latest release comes a plethora of great and potentially application breaking changes. This isn’t a definitive guide, just more of a brief post explaining some of the biggest changes in React.js.

ES6 Support

During the beta phase of 0.13, which I have been using for some time now, I absolutely loved the ES6 support. Being able to write ES6 classes for React components just feels so damn right. The only downside is that using ES6 classes means you cannot currently use mixins. I rarely use mixins anyway, so this was not really an issue for me.

The eventual goal of the React.js development team is to deprecate and remove support for writing non-ES6 components, but for now React.createClass will remain a first-class citizen of React.js. Once the team works out a nice solution for supporting mixins in ES6 classes, it will open up new opportunities for developers to write cleaner and easier-to-read code. We will most likely see mixins added in the form of class decorators, where we extend our class object with our mixin.
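The decorator idea can be sketched in plain Javascript today. This is purely an illustration of the pattern (copying a mixin’s methods onto a class prototype), not React’s actual API, and the names TimerMixin and mixin are made up for the example:

```javascript
// A toy mixin: just an object of methods.
const TimerMixin = {
    startTimer() { this.running = true; },
    stopTimer() { this.running = false; }
};

// A hypothetical "mixin decorator": copies the mixin's methods onto the
// class prototype, which is roughly what a class decorator could do.
function mixin(...sources) {
    return function (target) {
        Object.assign(target.prototype, ...sources);
        return target;
    };
}

class Clock {}
mixin(TimerMixin)(Clock); // with decorator syntax: @mixin(TimerMixin) class Clock {}

const clock = new Clock();
clock.startTimer();
console.log(clock.running); // true
```

Whether React ends up with exactly this shape remains to be seen, but it shows why decorators are a natural fit for the mixin use case.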

It is worth noting that various API methods still supported via React.createClass are deprecated or removed when using ES6 classes: getDOMNode, replaceState, isMounted, setProps and replaceProps.

Immutable Props

With 0.13 come a few changes aimed at making React components faster. Mutating props in your React components is now deprecated, meaning if you attempt to change your props between the time the value is passed in and the point of render, a warning will be thrown.

Ensuring your props are immutable has been a recommended practice from the React team for some time now and those who have been using React.js for a while know this is crucial to performant applications using React. Going forward further performance optimisations can be added to the React core as it will be assumed your props are immutable.
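The principle is easy to illustrate in plain Javascript (this is not React code; render, props and items are made-up names for the example): derive new data from what was passed in rather than mutating it.

```javascript
// Illustration only (plain JS, not React): treat incoming props as read-only.
function render(props) {
    // BAD: mutating the passed-in data, e.g. props.items.push('extra');
    // GOOD: derive a new value instead of mutating what was passed in.
    const items = props.items.concat('extra');
    return items.join(', ');
}

// Freezing makes accidental mutation fail loudly instead of silently.
const props = Object.freeze({ items: Object.freeze(['a', 'b']) });

console.log(render(props));      // "a, b, extra"
console.log(props.items.length); // 2 -- the original is untouched
```

Once React can assume props never change between being passed in and being rendered, it can skip work it would otherwise have to redo defensively.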

Batched setState

Another VERY welcome addition to React is batched setState calls within React lifecycle methods. Previously, the first setState call after a component mounted was applied synchronously; now these calls are batched and applied asynchronously.
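To see why batching matters, here is a toy sketch of the semantics (an illustration of the idea only, nothing like React’s real implementation): multiple setState calls inside a batch are merged and trigger a single re-render.

```javascript
// Toy illustration of batched state updates (NOT React's implementation).
class Component {
    constructor() {
        this.state = { count: 0 };
        this.pending = [];  // queued partial state objects
        this.renders = 0;   // how many re-renders have happened
        this.batching = false;
    }
    setState(partial) {
        this.pending.push(partial);
        if (!this.batching) this.flush(); // outside a batch: apply immediately
    }
    flush() {
        // Merge every queued update in one pass, then "re-render" once.
        this.state = Object.assign({}, this.state, ...this.pending);
        this.pending = [];
        this.renders += 1;
    }
    // Run fn with batching on, then apply all queued updates in one flush --
    // similar in spirit to what React does inside lifecycle methods.
    batch(fn) {
        this.batching = true;
        fn();
        this.batching = false;
        this.flush();
    }
}

const c = new Component();
c.batch(() => {
    c.setState({ count: 1 });
    c.setState({ count: 2 });
});
console.log(c.state.count, c.renders); // 2 1
```

Two updates, one render: that is the win, and it is why you should never rely on this.state being updated synchronously after a setState call.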

Goodbye component.getDOMNode

To support ES6-based patterns going forward, the React team have added a new method, React.findDOMNode(component), which is to be used in place of component.getDOMNode(). The base class for ES6 React components will NOT have a getDOMNode method due to the way classes work.

New API method: React.cloneElement(el, props)

Added in 0.13 is a new API method for cloning React elements. This is once again a welcome addition, as it allows us to clone an element while preserving its ref, meaning children can no longer steal a reference from their ancestors (bad children). This was one of the biggest complaints developers had, especially when using a map to iterate over multiple components.
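The essence of the change can be sketched with plain objects. This hypothetical cloneElement models an element as a simple {type, ref, props} object (real React elements are more involved than this): the clone keeps the original owner’s ref while merging in new props.

```javascript
// Sketch of the idea behind cloneElement using plain objects
// (hypothetical element shape; real React elements differ).
function cloneElement(el, extraProps) {
    return {
        type: el.type,
        ref: el.ref, // the original owner's ref is preserved, not "stolen"
        props: Object.assign({}, el.props, extraProps)
    };
}

const original = { type: 'button', ref: 'submitBtn', props: { disabled: false } };
const clone = cloneElement(original, { disabled: true });

console.log(clone.ref);               // "submitBtn"
console.log(clone.props.disabled);    // true
console.log(original.props.disabled); // false -- the original is untouched
```

This is what makes patterns like mapping over this.props.children and decorating each child with extra props safe in 0.13.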

You can check out the full changelog here.