Mocking Default Imports In Jest With TypeScript

If you are writing tests using Jest and you use TypeScript, there is a good chance you have encountered an error along the lines of TypeError: defaultsDeep_1.default is not a function or TypeError: myClass.default is not a constructor when trying to test a file that is using a default import from a module.

You most likely have read countless StackOverflow questions, but none of the solutions solve the issue. You’ve read the Jest documentation (which is quite extensive), but found no mention of mocking default module imports with TypeScript.

In my case, I hit this error when importing the Lodash function defaultsDeep, and again when importing the Input Mask module. My imports looked like the following.

import defaultsDeep from 'lodash/defaultsDeep';
import Inputmask, { Options, Instance } from 'inputmask';

Inside of my test which will be testing this specific file, I use jest.mock to mock the specific modules and their implementations. The important thing to note here is I am returning default from within my mocks. This is because of how default imports are transpiled within TypeScript.
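To make that concrete, here is a rough sketch of why the default property matters. When TypeScript compiles to CommonJS, import defaultsDeep from 'lodash/defaultsDeep' becomes roughly var defaultsDeep_1 = require("lodash/defaultsDeep"), and every call site becomes defaultsDeep_1.default(...). The objects below are plain stand-ins for the module record, so the sketch runs without Jest:

```javascript
// Compiled TypeScript (roughly):
//   var defaultsDeep_1 = require("lodash/defaultsDeep");
//   defaultsDeep_1.default(target, source);
// Simulated module records standing in for what a mock factory returns:
const goodMock = { default: () => 'merged' }; // factory returned { default: ... }
const badMock = () => 'merged';               // factory returned a bare function

console.log(typeof goodMock.default); // 'function', so the compiled call works
console.log(typeof badMock.default);  // 'undefined', i.e. "default is not a function"
```

If the mock factory returns a bare function instead of an object with a default property, the compiled call site finds nothing at .default, which is precisely the TypeError mentioned above.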

The Lodash mock is the simpler of the two:

jest.mock('lodash/defaultsDeep', () => {
  return {
    default: jest.fn()
  };
});

In the case of Input Mask, I needed to mock an instance which has a method on it. The usage in the actual file highlights what we want to achieve: the Input Mask plugin is newable, and the instance exposes a mask method which we supply with an element: const inputmask = new Inputmask(options); inputmask.mask(element);

This is how we mock the above module and accommodate for the usage:

jest.mock('inputmask', () => {
  return {
    default: jest.fn().mockImplementation(() => {
      return {
        mask: jest.fn()
      };
    })
  };
});

The convenient thing about the solutions presented is that they will work for all default-imported modules. Have fun.

When To Use State Management In Front-end Applications?

As ubiquitous as state management has become in front-end development, it is still a confusing magical black box to most developers. Data goes in, data goes out, and nobody thinks about what happens in-between.

Some developers believe the answer to the question in my title is: always. Others don’t believe in using state management at all. And if you’re like me, the answer is: it depends.

When state management gets added to an application that meets the criteria for using it, a weight gets lifted off your shoulders, and things make sense. Prematurely introduce state management or use it in places where you shouldn’t, and your life becomes a tangled mess.

The complexity of state management starts to get even more confusing when questions arise around best practices for working with API endpoints or dealing with forms. The answers are primarily opinion-based once again.

Avoid state management for forms

I cannot stress this enough. I have seen developers implement hacky solutions to working with form inputs and state management, and it’s a clear case of the right tool for the right job. While Redux and other state management solutions have plugins for dealing with forms, why inflict pain on yourself unnecessarily?

You might not agree with me on this one, and that is okay. However, every single time I can recall seeing state management coupled with forms, it was unnecessary. You only have to Google to find a tonne of people asking for help getting state management to work with forms to see why you shouldn’t.

Forms are almost always ephemeral state, meaning the data only exists temporarily. An example might be a login form with a username and password, or a form for adding a new product to your store. You enter the data and dispatch an action, the form gets cleared, and that’s it.

Instead of replicating and nesting properties in a massive state tree for one specific part of your application that some users might not even use, use local state instead. If you’re working with React, this would be local state within a component (using something like the useState hook) and similar with Aurelia or Vue, local state within your view-model or component.
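As a framework-agnostic sketch of the distinction (the store here is a hypothetical stand-in for your state management library), the form’s fields live in local state and only the finished payload ever reaches the store:

```javascript
// Minimal sketch: form fields live in local state; only the submitted
// payload is dispatched to the (hypothetical) store.
const store = {
  actions: [],
  dispatch(action) { this.actions.push(action); }
};

function createLoginForm() {
  // Local, ephemeral state: it lives and dies with the component.
  const local = { username: '', password: '' };
  return {
    setField(name, value) { local[name] = value; },
    submit() {
      store.dispatch({ type: 'LOGIN', payload: { ...local } });
      local.username = '';
      local.password = ''; // cleared; nothing lingers in the store
    }
  };
}

const form = createLoginForm();
form.setField('username', 'dwayne');
form.setField('password', 'hunter2');
form.submit();
console.log(store.actions.length); // → 1
```

The state tree never learns about keystrokes or half-filled fields; it only sees the one action that matters.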

Just because you can doesn’t mean that you should.

Working with APIs

Depending on your state management solution of choice, the approach for working with APIs can vary depending on plugins and workflow. However, the principle is the same: your action(s) make an API request and update the state, or you make the request and dispatch an action with the response.

I know in Vue’s Vuex state management plugin, many in the community advocate for making your API requests inside of your actions. There isn’t anything wrong with that; however, in Aurelia’s state management library, Aurelia Store, I advocate for making the request and then notifying the store.
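Either way, the flow can be sketched like this (fakeFetch is a hypothetical stand-in for a real HTTP call, and the store is deliberately minimal):

```javascript
// "Make the request, then notify the store" sketch.
const store = {
  state: { users: [] },
  dispatch(reducer) { this.state = reducer(this.state); }
};

async function fakeFetch(url) {
  // Pretend this went over the network.
  return { json: async () => [{ id: 1, name: 'Dwayne' }] };
}

async function loadUsers() {
  const response = await fakeFetch('/api/users'); // request first...
  const users = await response.json();
  store.dispatch(state => ({ ...state, users })); // ...then update the store
}

loadUsers().then(() => console.log(store.state.users.length)); // → 1
```

Moving the request inside an action instead (the Vuex-style approach) only changes where loadUsers lives; the store still ends up with the same data.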

It doesn’t matter how your data gets into the state; what truly matters is the kind of data you are putting into it.

Do I need this data again, will I use it more than once?

State management is about recycling your data. Will you need that value again in other parts of your application? Use state management. Do you only need to store the value temporarily and reference it in a specific component, only for it to be discarded shortly after? Don’t use state management.

This question should be the litmus test you apply in your development workflow: will you need this value again, and will you need it in other parts of your application? Type it up, print it out and hang it on your wall.

The purpose of state management is not to play the role of “random kitchen drawer full of miscellaneous items”; it exists to make cross-component and cross-application data access easier, as well as ensuring the integrity and shape of the data remain intact (in part because JavaScript passes objects around by reference).

Using GraphQL?

You might not need state management at all. GraphQL offerings like Apollo offer an all-in-one package for working with data, including state management like functionality that makes syncing and working with your GraphQL server a breeze.

While there is nothing stopping you from using GraphQL with state management libraries, and some GraphQL clients might require them to meet your needs, in many cases you only need one or the other.

State management can introduce unnecessary complexity

If you have ever seen a React + Redux application, you know what I am talking about: a mess of folders and files scattered throughout your application. You have to open up seven files to change something, and it’s a tonne of cognitive overload.

Something I want to make very clear here: the complexity of using something should never be the deciding factor in whether to use it or not. The next time you start on a new application, don’t be so quick to add in state management but don’t leave it too late.

If you’re validating an idea or prototyping, it can slow you down having to write all of the boilerplate most state management libraries require. Sometimes you need to be “agile” and flexible, and state management can be quite rigid and the opposite of that.

When it comes to state management, do what works for you. Trust your intuition, and if something feels complicated and unnecessary, your gut instinct is probably right. Posts like these are great as a guide, but ultimately you should never take everything as gospel.

Default Exports = Bad

Hello humans. In JavaScript, the world’s most loved and the internet’s favourite client-side language, we have default and named exports, thanks to modern ECMAScript standards.

It’s simple: you have a file that exports something to be imported somewhere else. A named export is explicit and is only importable by its defined name. A default export is implicit, and you can import it and call it whatever you like.

Now, default exports came about in the CommonJS world of Node.js, where you would import a module using const MyModule = require('my-module') to account for uses where the export is a default module.exports = MyClass – although it is worth pointing out that CommonJS also supports named exports.

The most persuasive case for named exports

All modern code editors and IDEs provide autocompletion functionality. If you are using Visual Studio Code (chances are you are), then you get some nifty auto-complete functionality out-of-the-box, even if you are not using a superset like TypeScript.

A default export receives no such auto-completion hints because, being a default export, it could be anything: a class, a function, a constant. A named export explicitly tells your code editor what you’re exporting and importing.
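To make the difference concrete, here is a small simulation. The two module records below are hypothetical plain objects, since real import/export statements need separate files:

```javascript
// A named export: the consumer must use (and so the editor knows) the real name.
const namedModule = { parsePrice: (value) => Number(value.replace('$', '')) };

// A default export: the consumer binds it to any local name it likes,
// which is exactly why tooling has less to go on.
const defaultModule = { default: (value) => Number(value.replace('$', '')) };

const { parsePrice } = namedModule;            // name is checked and completable
const whateverYouLike = defaultModule.default; // name is whatever you invent

console.log(parsePrice('$5'));      // → 5
console.log(whateverYouLike('$7')); // → 7
```

Rename parsePrice at its definition and every named import breaks loudly; rename the default and every consumer carries on with its own invented name, none the wiser.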

Furthermore, default exports are difficult, if not impossible, for bundlers to tree-shake. A default export can mean that instead of keeping just the code you’re using, the entire file (or in some cases an entire npm package) gets bundled into your code, adding bloat.

There are a plethora of other interesting issues that have arisen for people, further highlighting the reasons for avoiding default exports. Rich Harris succinctly worded it in his response to an issue on the Rollup repository on GitHub in 2016.

We absolutely would have been. Default exports have caused no end of problems. People get desperately confused by all the different forms of import/export declaration – imagine if we could teach people that you either import { names } or * as namespace, and that you can export either names or declarations. As it stands, it feels like there’s a ton of different variations you have to understand.

Plus the confusion that arises over whether default exports are live or not. I’ve spent more time learning about ES modules than anyone should reasonably be expected to, and I had no idea that the situation is as you’ve described. (Marked this issue as a bug, btw.)

And then there’s the interop headaches. Ostensibly, privileged default exports were meant to make adoption easier for a community that’s familiar with Node modules, which is ironic as nonsense like module.exports.default has probably caused more friction than any other aspect of ES modules. I’m sure we could have come up with a better way of importing single-export CommonJS modules. (Though we shouldn’t really call them CommonJS modules – CommonJS modules can only have named exports!)

Unfortunately, we’re stuck with it.

Default exports are lazy

There is no reason to use a default export unless you’re lazy and cannot be bothered taking the extra 5 seconds to add curly braces around your import and make sure your export is named.

There are exceptions when you’re dealing with a third-party package and have no control over how the exports are defined. However, even so, in that situation, a pull request on the repo for the library you’re using might be worth considering.

There is no legitimate reasoning for default exports, but there is plenty of legitimate reasoning against them. Make your life easier and avoid them altogether.

The State of JS Survey Is A Farce: Part Two

Recently, I published a blog post titled The State of JS Survey Is A Farce, in which I expressed criticism that the State of JS survey is highly inaccurate, biased and dangerous.

I didn’t get a roaring response until Sasha Greif, one of the three developers running the survey, out of nowhere tweeted that I was unkind in my blog post, tagging me.

@AbolitionOf calling the State of JS a “farce” was pretty unkind. I hope you get better treatment if you ever launch your own projects

Admittedly, this Tweet took me by surprise. When I wrote the post, I couldn’t have told you if you asked me who runs the survey. And my intention wasn’t to put down someone else’s work, it was to call out what I saw was bias in a survey growing in popularity.

I was critical, but I never resorted to personal attacks or name-calling. It was strictly criticism and valid criticism (or so I thought). As someone who actively participates in open source myself, I know all too well what unconstructive criticism looks like, but this wasn’t one of those times (at least, not intentionally).

I responded to Sasha on Twitter with the following:

Sorry, you took it personally, Sasha. It was never personal and I apologise if you think otherwise. I just have a problem with biased data being used to turn front-end development into a schoolyard popularity contest by declaring winners and losers.

I apologised and clarified that my post wasn’t personal, it was a criticism of the survey itself and the fact it was trying to turn front-end development into a popularity contest. Sasha didn’t like my response and blocked me without responding.

A few hours later, Sasha unblocked me and sent me a few responses, one of which was the following:

Well in any case I can’t wait for part two of your post where you actually explain why you think the data is biased

I can be pretty blunt, sometimes brutally honest, but one thing I would never do is personally attack someone and their projects for no reason. I have no reason to pick fights or put down others online, I am not a bully, I am a developer too.

My blog post was only a criticism of the survey and its data, the data of 20,000 participants, not of the people collecting and sorting it. It’s like blaming the outcome of an election on the people counting the ballot papers.

I can understand that maybe Sasha and his team are proud of the survey, which explains why I was met with such hostility. But honestly, as I said in my previous blog post, it’s a good idea; it just needs better data.

I thought Sasha’s comment about a follow-up where I explain why I think the data is biased was fair, so here is the follow-up where I will do my best to explain why the data is biased and how it can be fixed.

At a glance: how does data become biased?

Before we proceed, I am not a statistics expert nor do I have professional experience in this field. However, just because this isn’t my realm of expertise doesn’t mean I am unqualified, because the bias is as clear as day in this survey.

Bias in data can come from a lot of things, but in the case of the State of JS survey, in particular, I believe it comes down to:

  • Survey questions that have been worded in a particular way to get a specific/inaccurate result
  • The data is heavily skewed towards specific countries and excludes a wide variety of demographics, particularly non-English speakers
  • Data has been grouped into misleading categories
  • The team behind the survey mostly all use ReactJS and have a vested interest in its success and market position

Language Bias

Let’s go from the top here. While participants in the survey came from a wide variety of countries, there is some obvious bias here: most of the survey participants came from the USA.

What American developers get to use is widely different from what developers in, say, India or South America get to use. China, one of the fastest-growing economies in the world, only had 75 participants, and India had 521.

I worked for a company in 2014 that was building a Netflix-style streaming video platform for the South American market. We were constrained by needing to support IE8, and AngularJS 1.3 dropped support for IE8, so we were forced to stay on the prior version. This meant we couldn’t use the latest and greatest; internet speeds were also slower and devices had lower specs.

Living in a first-world country, developers are spoiled for choice. Some of us only have to support IE11 minimum now; some of us don’t have to support IE at all. It’s easy to forget the entire world isn’t living in the future or has the latest technology that countries such as the USA are fortunate to have.

Region limitations aside, a huge piece of bias in the survey is that it is only available in one language: English. The lack of translation into other languages such as Mandarin, Spanish and Arabic is a huge barrier for participants, considering Mandarin is the world’s most spoken language and English is the third.

As you will see further down, the exclusion of certain countries (due to only being in English) yields interesting results from underrepresented countries.


Translate the survey into more languages. The survey excludes a very large portion of the world population by only being available in English.

Marketing and Reach: Selection bias

The survey is predominately marketed on Reddit, Twitter, Hacker News and Product Hunt. If you participated in surveys from previous years, you probably got an email. From the outside (because I don’t have the figures), most of the traffic appears to come from social media.

There is a huge problem here: countries like China are more strict in terms of what their citizens can see and do on the internet, social media is notoriously locked down in China. In fact, Twitter, Google, and Reddit are all banned in China.

This explains why China only had 75 participants; chances are, if you live in China, you don’t even know this survey exists. If you don’t speak English, you also probably never heard of the survey, or did and could not participate.


Don’t assume that everyone uses social media or can access it. Also, don’t assume that all developers visit Hacker News or other websites. This is a harder problem to crack, but one that maybe partnering with a larger company can solve (such as Google or StackOverflow). The reach and accessibility of the survey needs to be improved.

Angular v AngularJS (miscategorised and slanted questioning)

Unlike previous years (2016 and 2017), the 2018 survey really shit the bed (so to speak) in how it polled developers on questions about Angular.

Angular is the newer version (2+) and AngularJS is the older version (< 2). Previous years made the distinction between old Angular and new Angular, however in 2018, the distinction was not made and it essentially invalidated this entire portion of the survey.

While the newer version of Angular is the recommended choice for new projects, not everyone has the luxury of throwing out what they have and starting from scratch (because it can be expensive for starters).

The survey appears to have erroneously assumed that AngularJS has been deprecated and abandoned by Google, when AngularJS 1.7 has a long-term support (LTS) period of three years that only began July 1, 2018, and expires in 2021.

A lot of companies are still using AngularJS because their applications work; the wise proverb “If it ain’t broke, don’t fix it” comes into play here.

This appears to have caused confusion in the survey data. While some can discern the difference between Angular and AngularJS when presented with both options, when presented with just one, both appear to be lumped together, and this skews the data.

A popular video on YouTube titled State of JavaScript – Real Analysis of Angular, React, and Vue, which currently has almost 30,000 views, challenges the State of JS results on its treatment of Angular and its claims of Angular’s death. The video has 1.5k upvotes, but the real story is in the comments section.

But the backlash doesn’t stop there. Angular core team member Olivier Combe took to Twitter to dispute some of the data in the survey as well. In this Tweet exchange with Sasha, Olivier writes:

Why not make the distinction like the previous years? The complete analysis is worthless because of this. Of course a large number of people wouldn’t use AngularJS again, but that’s not necessarily the case for Angular. If you can’t make a non-biased analysis, don’t do it

In a further reply, Olivier goes on to say:

It’s just basic statistics: don’t compare things if you changed the referential between each data point. Being aware of it is even worse you’re admitting that the data is wrong and yet in the final conclusion about frameworks you say that it won’t be a top-end framework ever again

Once again, we have someone else calling out the bias (albeit a specific part of the survey) and one of the creators of the survey downplaying its significance like it doesn’t matter. This kind of thinking is dangerous and it’s wrong.

Continuing on…

The most telling sign of exclusion bias is shown in the section, Angular Usage by Country. The happiest Angular users are in the most underrepresented countries.

Romania at 58 users makes up 37.9% of the happy camp of Angular users. Egypt at 17 users makes up 35.4% of happy Angular users. New Zealand at 39 users equates to 26.7% of happy Angular users.

Where is this going, you ask? Go back to the Participation by Country section and count how many participants from those countries there were in the survey overall.

Romania, which had the highest percentage of happy Angular users, made up just 0.76% of the survey with a total of 153 participants. This means 36.64% of Romanian participants are using Angular and are happy with it.

Now Egypt: only 48 users participated in the survey, making up a tiny 0.24% of the overall participant count. Interestingly, its 17 happy Angular users, the second-highest count above, equate to 35.41% of Egyptian participants.

Finally, New Zealand had a total of 146 participants, making up 0.72% of the survey. New Zealand fares slightly lower, but out of all its participants, 26.71% are happy Angular users.

I know of large New Zealand companies that are big Angular users, amongst other New Zealand companies using it. It seems to be used a fair bit over there, which for a small country is quite impressive.

There are a lot more underrepresented countries who are using Angular and quite happy with it. I only picked a couple of them, but I recommend you go check out the data yourself.

But this seems to somewhat align with the StackOverflow developer survey results for 2018. Even though StackOverflow targets a broader audience and has a larger number of participants, we see developers still love working with Angular and are clearly using it (54.6%).


Questions about Angular and AngularJS should be separate until after the LTS for AngularJS 1.7 ends in 2021 at the very least. The data is also skewed because the participants who were the happiest with Angular were among the least represented in the survey, increasing representation would help address this.

The team behind the survey

For the record, I think this is worth including, but it’s not the primary reason I believe the data in the survey is heavily biased. All three people behind the State of JS survey work with React, so, naturally, anyone who follows them and what they’re working on probably falls into the React camp.

One of the people behind the survey, and the one who called me out on Twitter over the previous blog post, Sasha Greif, runs an open-source, self-described full-stack React+GraphQL framework.

Another of the State of JS members is Raphaël Benitte, who has a dashboard tool built with Node, React and D3 called Mozaïk, as well as another project of DataViz components built using D3 and React.

Finally, Michael Rambeau runs a site called bestofjs, which seems dominated heavily by React content. On the left-hand side under the popular tags, React has 189 tagged articles and Vue has 50.


The very fact that two of the three owners of the State of JS survey are heavily invested in React introduces bias, because their followers most likely lean towards React as well. The only solution here is to introduce more data into the survey so that eventually this is no longer an issue.


My initial blog post was not personal, and it was not intended to be an attack on Sasha or anyone who runs the State of JS survey.

Reiterating what I already said in my previous blog post, there is bias in the data and there is no doubt about that. I invite all criticism and feedback, so if I made a mistake or assumption in this post, please let me know so I can correct it.

If the team behind the survey simply acknowledged some of these biases when presenting the results, I would not have published my blog post in the first place.

When you take tainted data and use it to besmirch the name and reputation of frameworks, libraries and tools, telling people to avoid frameworks like Ember and that Angular is dying, that kind of schoolyard popularity contest bullshit is not needed in an already heavily politicised industry.

I think the State of JS survey is great and it’s the first of its kind, but the data needs to be more random and widespread. The language used also needs to be less about “us vs them” or “avoid using this”, instead focusing on displaying the data for what it is and letting people draw their own conclusions.

I hope in 2019 we see a more representative and less exclusionary survey that yields more truthful results than what we were given in 2018. I want to see this survey succeed.

The State of JS Survey Is A Farce

The State of JS is a survey that has been running for a few years now. It surveys front-end developers and aims to find out what they’re using, what they love, what they’re interested in learning and what they’re not interested in learning.

The survey sounds good in theory, it gives you insight into the state of front-end development and the various tools, libraries and frameworks people are using.

In practice, the survey is a farce. The 2018 version saw over 20,000 respondents complete the survey. While 20,000 respondents seems quite low given the number of developers out there who identify as front-end or JavaScript developers, the actual issue here is that the data is biased. When you use biased data, you get a biased result.

The survey on the front-end frameworks page makes a really bold and exaggerated claim:

The front-end remains the key battleground for JavaScript. But now that the dust has cleared, it’s starting to look like only two combatants are left standing…

Based on the extremely limited dataset it might look like that, but this is an erroneous and highly inaccurate statement to make. While many use React and Vue, this does not mean people have abandoned other choices in favour of them.

In the enterprise, choices like Angular and Ember still reign supreme because they’re more verbose, and verbosity is generally favoured in the enterprise because it more often than not results in less error-prone code. Angular’s enterprise popularity, in particular, is quite high.

I have also seen people using npm stats as a metric for determining popularity, but in the enterprise, more often than not, packages are not being installed through npm. The metrics are skewed here once again because they don’t take into account that not everyone who downloads a package through npm is building something with it (they could just be curious).

In the conclusion for the front-end libraries section, the survey then doubles-down on the erroneous statement of React and Vue being the only choices:

The other story of those past couple years is the fall of Angular. While it still ranks very high in terms of raw usage, it has a fairly disappointing 41% satisfaction ratio. So while it probably isn’t going anywhere thanks to its large user base, it’s hard to see how it will ever regain its place atop the front-end throne.

Why does this matter?

It’s only a silly survey, and a little over 20,000 respondents filled it out, but it’s dangerous.

The issue here is that managers, CTOs and CEOs are going to potentially see this survey and use it as justification to abandon other solid choices in a desperate attempt to be seen as modern and relevant.

This isn’t about being angry that React or Vue are increasing in popularity. I think Vue is great and I have worked with it before. I also worked for a company building a Netflix-like product for the South American market which used React, and that was great as well.

I am sure you have seen what happens when you send developers to conferences: they come back excited and giddy, wanting to change the world and use all of the new libraries they saw at the conference. This survey is the same; it’s hype fodder.

The crux of the matter here is the State of JS survey is perpetuating false claims based on seemingly biased information and in the process turning front-end development into a schoolyard popularity contest by declaring winners and losers.

The survey is a good idea, but it needs more data

StackOverflow runs an annual developer survey, and their 2018 edition got around 100,000 respondents. The downside is that they cover a broader spectrum that isn’t just front-end development like the State of JS is.

If the State of JS wants more accurate results, they need a tonne more data. And not just more data; they need to work to eliminate the bias from their survey. Getting more people to complete the survey would clearly help. But even so, the people who run the survey seem to be users of React, so there would always be an element of bias.

Going back to the StackOverflow survey for 2018, the section for frameworks, libraries and tools is rather interesting to look at. Even though it includes non-front-end choices, Node.js is the top result, followed by Angular and then React.

Out of 100,000 respondents, 36.9% are using Angular and 27.8% are using React, with the following footnote:

Node.js and AngularJS continue to be the most commonly used technologies in this category, with React and .Net Core also important to many developers.

Then, under the Most Loved, Dreaded and Wanted Frameworks, Libraries and Tools section, we see Angular get a 54.6% score. Worth noting that React is second here, with 69.4% saying they love it.

As you can see, the two surveys produce two different results. One is focused on a specific area of development, the other is focused more broadly. I would love to see StackOverflow run a more targeted survey to see what the results are.

The State of JS survey clearly has a marketing problem, and hopefully, over time, the number of people who participate goes up. It’s a great idea, but at present it feels like it is only being marketed to React and Vue developers, creating a confirmation bubble in which React and Vue seem like the only choices (they’re not).

I found the gender disparity in the results (over 90% male) to be quite concerning. It really highlights that we need to do more to get women into front-end development, but it also highlights once again that the limited number of people this survey is being marketed to is a huge problem in terms of skewing the data.

One thing that is clear from this survey is that developers are choosing frameworks, libraries and tools based on their popularity, and not on whether their choices actually align with the requirements of the business or customers they’re developing for.

The front-end space in the last four years has honestly turned to shit, with people flinging mud and swinging their proverbial digital dicks around claiming that React is the king and that Vue is the new messiah.

There is an entire ecosystem of great frameworks and libraries to choose from, and in 2018 there is very little difference between them (except how you build apps). While people buy into false claims like the virtual DOM being faster than the real DOM, regardless of whether you choose Angular, Aurelia, Ember, React or Vue, you’re going to be making a great choice for building modern web applications.

It is an exciting time to be a developer if you like picking sides and criticising people for the choices they make (especially if they’re less popular options).

The TL;DR here is to take the results of this survey with a grain of salt. It is not indicative of the industry whatsoever and is highly inaccurate; it’s interesting to see what 20,000 front-end developers think, but that’s about it.

Make decisions based on the results of this survey at your own peril.

Computed Object Keys and Function Names In Javascript

For years, I wanted the ability to use variables as object keys in Javascript. Thanks to ES2015, we got the ability to have computed object keys from within the object definition itself.

This isn’t a new or cutting-edge addition, we’ve had it in Javascript for a while now and it is well-supported. The reason for talking about them is a lot of developers do not know about these features or simply forget about them.

In ES5, this wasn’t impossible, but you had to do something messier, like this:

var variableValue = 'A VALUE FROM A VARIABLE.';

var myObject = {};

myObject['This is just: ' + variableValue] = 'But do not worry, it is just a test';

When ES2015 hit the scene, the above could be written like this:

const variableValue = 'A VALUE FROM A VARIABLE.';

let myObject = {
    ['This is just: ' + variableValue]: 'But do not worry, it is just a test'
};

But, we can make it a bit cleaner. Using template literal backticks, we can remove the string concatenation and do the following instead:

const variableValue = 'A VALUE FROM A VARIABLE.';

let myObject = {
    [`This is just: ${variableValue}`]: 'But do not worry, it is just a test'
};

I particularly find dynamic object keys useful when working with Aurelia or Vue.js class binding (for dynamic template classes on elements), or when I am working with Firebase and dynamic values.

And one of my favourite features of all is the ability to use this syntax with function shorthand, allowing you to have named functions:

const ADD_USER_FUNCTION = 'addUser';
const REMOVE_USER_FUNCTION = 'removeUser';

let methods = {
    [ADD_USER_FUNCTION](user) {
        // logic for adding a user
    },

    [REMOVE_USER_FUNCTION](user) {
        // logic for removing a user
    }
};
This is the kind of syntax that I use when working with state management libraries, as it allows me to name my getters, mutations and actions using constants for consistency.
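As a quick sketch of that naming behaviour (repeating one of the constants so the snippet stands alone): a shorthand function defined via a computed key takes the evaluated key as its name, which is handy when reading stack traces.

```javascript
const ADD_USER_FUNCTION = 'addUser';

const methods = {
    // Shorthand function with a computed key
    [ADD_USER_FUNCTION]() {
        return 'user added';
    }
};

// The evaluated key becomes the function's name
console.log(methods[ADD_USER_FUNCTION].name); // "addUser"
```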

How To Convert FormData To JSON Object

Recently, whilst working on a project, I needed to take HTML FormData and convert it to JSON to be sent off to an API. By default, the FormData object does not serialise to JSON in the way you would want it to.

Using the for..of syntax introduced in ECMAScript 2015, we can access the form data entries and iterate over them to get key and value pairs.

const formData = new FormData(SomeFormElement);
let jsonObject = {};

for (const [key, value] of formData.entries()) {
    jsonObject[key] = value;
}

Calling entries on our form data object returns an iterable we can access in the form of key and value pairs. In the example above, we store each pair in a plain object.

Fortunately, this isn’t a complicated problem to solve. If we had to do this without a for..of loop, then it wouldn’t be nearly as clean as the above solution is (which can still be improved).
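One such improvement, assuming your target runtimes support Object.fromEntries (added in ES2019): it accepts any iterable of key/value pairs, including the iterator returned by formData.entries(). A sketch, using a plain array of entries to stand in for a form:

```javascript
// With a real form this would be:
// const jsonObject = Object.fromEntries(new FormData(SomeFormElement).entries());

// A plain array of [key, value] pairs stands in for FormData here:
const entries = [['name', 'Dwayne'], ['framework', 'Aurelia']];
const jsonObject = Object.fromEntries(entries);

console.log(JSON.stringify(jsonObject)); // {"name":"Dwayne","framework":"Aurelia"}
```

Note that, like the loop, this keeps only the last value for duplicate keys (multi-select fields and checkbox groups), so it suits simple, flat forms.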

Module ES2015 and TypeScript 2.4 Dynamic Imports

Introduced in TypeScript 2.4 is support for the ECMAScript dynamic imports feature. If you didn’t see the announcement or read it properly, you’re probably here because you’re getting the following error.

In my case I use Webpack and I was trying to add in some dynamic import goodness and getting this error: Dynamic import cannot be used when targeting ECMAScript 2015 modules.

[Image: TypeScript 2.4 dynamic imports error]

You’re probably thinking this is crazy, considering dynamic imports are an ECMAScript feature, not a TypeScript one. The tell is in the error: if your module is set to es2015, you’re targeting ES6 module output, and at the time of writing dynamic imports were still a proposal that had not yet landed in a released ECMAScript standard (they eventually shipped in ES2020).

Funnily enough, the TypeScript team did mention this in their official announcement, but if you’re like me, you missed it the first time and hit this issue anyway.

The fix is simply setting the module value in your tsconfig.json file to esnext, like this: "module": "esnext". If you’re using Visual Studio Code, you might get a squiggly in your tsconfig.json file telling you it’s not a valid value, but ignore it because it is.
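For reference, the relevant part of a tsconfig.json ends up looking something like this (the options other than module are illustrative placeholders, not requirements):

```json
{
  "compilerOptions": {
    "module": "esnext",
    "target": "es5",
    "moduleResolution": "node"
  }
}
```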

Efficiently Looping A Javascript Array Backwards

Most of the time when you’re looping over an array of elements, you’ll do it sequentially. However, I recently needed to iterate through an array of objects in reverse just using a plain old for loop.

This will not be a highly informative or groundbreaking post, but I thought I’d share it in case someone needs to solve the same problem and is confused by the many different ways you can loop over an array in reverse.

I had an idea in mind: I knew Array.reverse would be the ideal candidate, but I still Googled to see if developers smarter than me had figured out something better.

Turns out, there are a lot of scary alternatives to looping an array in reverse (mostly on StackOverflow). Some people are proponents of decrementing and using while loops, others had different ideas. Why not just use a function that’s existed since the dawn of Javascript?

var myItems = [
    {name: 'Dwayne'},
    {name: 'Rob'},
    {name: 'Marie'},
    {name: 'Sarah'},
    {name: 'Emma'},
    {name: 'James'}
];
var itemsToIterate = myItems.slice(0).reverse();

for (var i = 0, len = itemsToIterate.length; i < len; i++) {
    var item = itemsToIterate[i];
    // do something with each item here
}
In our example, we take an array of items and then we use slice to make a copy of our array starting at its first index (zero). Then we call reverse on the cloned array.

I said efficient in the title, but I haven’t benchmarked anything. However, we are using a barebones for loop and you don’t really get much faster than that. Sometimes common sense and readability beat micro-optimisation.

The reason we copy the array is so we don’t modify the original. Using slice effectively clones the array and gives us a new instance; there are other ways of doing the same thing, but I find this way is the cleanest.

Without slice, you’ll be modifying the value provided to our function and might produce an unintended result. I tend to keep my functions pure for this kind of thing: nothing that comes in as input should be modified.

Thanks to reverse flipping our array upside down, we iterate like we would normally. Breaking out the reverse functionality into a function might also be a great idea. This would allow us to easily test our functionality from within a unit test.

function reverseArray(array) {
    return array.slice(0).reverse();
}
One thing to keep in mind: slice worked for my use-case, but if you’re dealing with arrays containing object references, nested arrays or complex objects, slice does not do a deep copy.

Lodash has some great methods for doing recursive and deep cloning of arrays and collections if you need that kind of power. Post your thoughts and suggestions in the comments below.
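To make that shallow copy caveat concrete, here is a small sketch: slice gives you a new array, but the objects inside it are still the same references.

```javascript
const original = [{name: 'Dwayne'}];
const copy = original.slice(0).reverse();

// The arrays themselves are distinct...
console.log(copy === original); // false

// ...but the elements are shared references, so mutating an object
// through the copy is visible through the original as well:
copy[0].name = 'Rob';
console.log(original[0].name); // "Rob"
```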

Configuring Git Pre Commit Hook And TSLint (automatically)

If you’re a TypeScript user and you’re reading this, then you’re most likely using TSLint. Recently, a situation arose at work where, even though TSLint warnings were being thrown in the editor as well as in terminal output, some developers (tsk tsk) were still committing code with these warnings.

Naturally, a pre-commit Git hook is the right candidate for this. Running TSLint in a pre-commit hook ensures that only valid code conforming to the tslint.json file can be committed, let alone pushed.

This poses another problem. You can’t automatically add pre-commit hooks into the repository and have everyone pull them down. This is for security reasons: could you imagine if someone committed a hook that deleted a bunch of files and folders?

If you’re using a task runner like Gulp or Grunt, then you can create a clever task that copies a file to the .git/hooks directory for you.

Firstly, let’s create a pre-commit hook. In the root of your application create a new folder called hooks and a new file called pre-commit (with no file extension):


#!/bin/sh
TSLINT="$(git rev-parse --show-toplevel)/node_modules/.bin/tslint"

for file in $(git diff --cached --name-only | grep -E '\.ts$'); do
    git show ":$file" | "$TSLINT" "$file"
    if [ $? -ne 0 ]; then
        exit 1
    fi
done
Git hooks are actually shell scripts and can be quite powerful. We are building a path to TSLint inside our local application (some Git clients like GitHub for Windows require this) and using it to call TSLint on our staged files.

Secondly, let’s create our task. I personally use Gulp, but you can easily adapt the following to any task runner:

var gulp = require('gulp');

gulp.task('install-pre-commit-hook', function() {
    // copy our hook file into the local .git/hooks directory
    return gulp.src('hooks/pre-commit')
        .pipe(gulp.dest('.git/hooks/'));
});

gulp.task('default', ['install-pre-commit-hook']);

Running gulp or gulp install-pre-commit-hook will copy our pre-commit hook into the .git/hooks directory. On Unix-based operating systems, you might also need to adjust the file permissions using chmod, which the fs module offers a method for, though often this isn’t necessary.

Now, throw that task into your npm build script and anyone else who has the latest changes will get the pre-commit hook every time the task runs. No more warnings from code written by others clogging up your terminal or editor.