What to expect when you decide to migrate from Javascript to Typescript

21 first impressions of Typescript


Toward a typed Javascript

I finally made the leap. I always felt Javascript’s dynamic typing was a compromise in the wrong direction. Complex software needs all the tools available to prevent and detect errors, and a sound type system is a powerful one. Coming from Java to Javascript, one of my main concerns was the lack of this safety net.

As a result, I paid close attention (and attached much hope) to the alternatives and solutions for bringing static typing to the web. Dart (formerly Dash), JSDoc, Flow, and then Typescript each offered a solution, but with an entirely different approach.

The hardest part of web development is that the goal is not simply to find a useful piece of technology. Rather, it is to identify the next big thing that will be the de-facto standard a year from now. If you choose wrongly, you need to refactor and rewrite over and over again.

I liked Dart’s approach, but it failed to become mainstream. And while there are some advocates, I don’t see Flow emerging as the winner either.

As for Typescript, I see more and more projects embracing it. I feel confident that this trend will continue and that I won’t need to migrate to the next big thing that adds types to Javascript.

While the migration went smoother than expected, it’s still a whole different world. There were pleasant and not so pleasant surprises which required extensive Googling and some serious head scratching.

Below is a summary of the experience so that you know in advance what the ups and downs are. I hope it helps you avoid the issues I faced, or at least gives you an overview.


1. Tuple types


Typescript has a type called a tuple: a fixed-length Array with a fixed type at each position. When you know a value has exactly two elements, the first being a number and the second a string, it provides better type checking:

(params: [number, string]) => {
	params[0] // number
	params[1] // string
};

Despite their apparent usefulness, I was surprised to see that type inference does not use them at all:

const t = [1, 2]; // number[]

const t = [1, "2"]; // (string | number)[]

Why not a tuple of [number, number] and [number, string], respectively?

As it turns out, Typescript plays it safe here and automatically assigns the broader type.

To use tuples, declare the types explicitly:

const t = [1, 2] as [number, number] // [number, number]

Another surprise was that even when a tuple type is used, over-indexing does not produce an error:

console.log(t[2]); // No error

t[2] = 5; // No error

In my experience, tuples bring somewhat stricter type checking, but there are edge cases (newer compiler versions have since made over-indexing an error). Also, they require writing out the types explicitly, which can be cumbersome.
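Where tuples shine is in functions that return multiple values. A minimal sketch (divmod is a hypothetical example function):

```typescript
// A function returning a tuple gives callers position-specific types
// without defining a class or interface for the pair.
const divmod = (a: number, b: number): [number, number] =>
	[Math.floor(a / b), a % b];

// Destructuring preserves the per-position types: both are numbers here.
const [quotient, remainder] = divmod(7, 2);
console.log(quotient, remainder); // 3 1
```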

2. (-) Non-indexable objects


Element implicitly has an 'any' type because type '...' has no index signature.

This error was the first bump in the road that needed some head-scratching to solve.

As it turns out, objects are not indexable by default, which forbids the [...] property access:

const f = (p: string) => {
	const o = {
		a: "a"
	};
	console.log(o[p]); // Error
};

An index signature can be added using [index: string]: string (or any other value type). But to add it, you also need to write out the entire type, which is a lot of extra code, especially for larger or nested objects:

const f = (p: string) => {
	const o = {
		a: "a"
	} as {
		a: string,
		[index: string]: string
	};
	console.log(o[p]); // OK
};
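When the object is homogeneous, a lighter alternative (a sketch; note it drops the literal key names from the type) is the built-in Record mapped type, which declares the index signature without repeating the shape:

```typescript
// Record<string, string> is equivalent to { [index: string]: string },
// so any string key can be used for property access.
const f = (p: string) => {
	const o: Record<string, string> = {
		a: "a"
	};
	return o[p]; // OK, typed as string
};

console.log(f("a")); // "a"
```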

3. (-) Cast to non-undefined


I use ImmutableJs extensively and found it strange that the iteratee of Seq’s filter gets the value as optional. I think this is fixed in 4.0.0, but at the time of writing, that version was still a release candidate.

The iteratee function has the type (e?: T). With strictNullChecks enabled, this code raises an error, as e’s type is number | undefined:

new Seq([1, 2])
    .filter((e) => e > 1);

I know it can’t be undefined, but how do I tell Typescript that?

Cast it explicitly to a number:

new Seq([1, 2])
	.filter((e) => (e as number) > 1);

While this solution works, it’s verbose, and what if the stored type changes? In that case, all the casts have to be modified too.

Later I found the Typescript-specific non-null assertion operator, which eliminates undefined and null from the type:

new Seq([1, 2])
	.filter((e) => e! > 1);

I’m still not perfectly happy with it, as I need to add the assertion to every usage, and I still cannot declare the parameter itself as non-undefined.

Later still, I found that I can amend the libraries themselves by adding new overloads to functions. That let me fix the filter signature, which in turn made the casts and non-null assertions unnecessary.

4. (-) Function returns as types


Typescript embraces classes, and using them as types is easy:

class C {
    public func() {
        return 2;
    }
}

const f = (c: C) => {
    return c.func();
};

console.log(f(new C())); // 2

All the types are inferred, and all the usages are type-checked.

But what if I don’t want a class but a function that returns an object? In that case, I need to write an interface by hand: while the function’s return type is inferred, I cannot use it as a type elsewhere.

interface C {
	func(): number;
}

const c = () => {
	return {
		func: () => {
			return 2;
		}
	};
};

const f = (c: C) => {
	return c.func();
};

console.log(f(c())); // 2

This forces me to maintain two parallel structures: the implementation and the interface.
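Since this was written, Typescript 2.8 added the ReturnType helper, which can derive the type from the factory function instead of maintaining a hand-written interface. A sketch:

```typescript
const c = () => {
	return {
		func: () => 2
	};
};

// Derive the type from the function itself instead of a parallel interface.
type C = ReturnType<typeof c>; // { func: () => number }

const f = (x: C) => x.func();

console.log(f(c())); // 2
```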

5. (-) Object -> pairs and back


I need to transform objects into Arrays and back in several places, and Typescript has a hard time figuring out what happens with the types. Depending on the library and the steps, the inferred types are considerably different.

When converting an object to pairs and back without using any library, no types are inferred:

Object.assign({}, ...Object.entries({a: "a"}).map(([k, v]) => ({[k]: v}))); // any

Doing the same with Underscore, the result is an empty object type, which is just a small step in the right direction:

underscore.object(underscore.pairs({a: "a"})) // {}

Lodash figures it out best:

lodash.fromPairs(lodash.toPairs({a: "a"})); // {[index: string]: string}

But throw a map into the mix, and the inference becomes less helpful:

lodash.fromPairs(lodash.map(lodash.toPairs({a: "a"}), ([k, v]) => [k, v])); // {[index: string]: any}
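One workaround (a sketch with a hypothetical roundTrip helper) is to carry the value type in a generic, so the pairs-and-back transformation preserves it without any library:

```typescript
// Convert an object to [key, value] pairs and back, threading the value
// type V through a generic so the result is not `any`.
const roundTrip = <V>(obj: { [k: string]: V }): { [k: string]: V } => {
	const pairs = Object.keys(obj).map((k): [string, V] => [k, obj[k]]);
	return pairs.reduce((acc, [k, v]) => {
		acc[k] = v;
		return acc;
	}, {} as { [k: string]: V });
};

console.log(roundTrip({ a: "a" })); // { a: "a" }, typed { [k: string]: string }
```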

6. (-) Default imports

I got errors complaining about import _ from "underscore" and a few other libraries. Even though these were entirely valid imports, Typescript thought otherwise.

Googling around, I found some docs saying that the import = require(...) construct would solve the issue. Unfortunately, with the "module": "es2015" setting, this does not work, and with Webpack supporting ES6 modules I didn’t want to go back to UMD.

Another solution is to use import * as _ from "underscore". This works in most cases, but in the not-so-rare case when the default export is a function, it does not.

Searching further, I finally found a suitable solution: setting allowSyntheticDefaultImports to true. This took care of the missing default exports, and everything went fine from there.
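For reference, the relevant part of tsconfig.json would look something like this (a sketch, keeping the ES2015 module setting mentioned above):

```json
{
  "compilerOptions": {
    "module": "es2015",
    "allowSyntheticDefaultImports": true
  }
}
```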

7. (-) No compatible call signatures


On several occasions, Typescript managed to surprise me with how accurate its type inference is. But then it struggled in some other cases. I ran into a typing issue that sparked a question, which was answered quickly afterward.

For this perfectly fine code, the compiler reports an error:

class C1 {
	public f: (params: { a: string }) => void;
}

class C2 {
	public f: (params: { b: string }) => void;
}

const fx = (c: C1 | C2) => {
	const params = { a: "a", b: "b" };
	c.f(params); // Error
};

The solution is to make both classes implement an interface whose function accepts both sets of expected parameters. Check out the linked answer for the code sample.

8. Force cast


On rare occasions, I needed to cast a value to an entirely different type. This also required some searching.

For example, when I have a variable that Typescript considers a number, but I know is actually a string, casting it directly reports an error:

const f = (a: number) => {
	// const b: string = a as string; // Error
};

To cast to anything, cast to any first, then to the target type:

const f = (a: number) => {
	const b: string = a as any as string;
};
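Since this was written, Typescript 3.0 added the unknown type, which is the recommended intermediate step for such a force cast. A sketch:

```typescript
// `unknown` is the type-safe alternative to `any` as the intermediate
// step of a force cast.
const f = (a: number): string => a as unknown as string;

// Note: the assertion only changes the static type; the runtime value
// is untouched and is still a number.
const b = f(42);
console.log(typeof b); // "number"
```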

9. (-) Custom types


You can create a new type using type <name> = .... This construct is great if you want to reuse the same definition in multiple places. For example, type userId = string; declares that the userId type is a string.

From there on, you can use it just like any other type:

const isActive = (userId: userId): boolean => {
	// ...
};

This is great, but to my surprise, it is just an alias. It does not prevent passing any string as a userId:

isActive("c"); // No error

I could not find a way to prevent that without adding too much code and type assertions. In Typescript’s terminology, this stricter custom type checking is called nominal typing. There is an ongoing ticket to support it.

It would be great if there were a way to declare enforced types. That way, if you have userId and itemId, and both are strings, type checking would make sure you don’t accidentally mix them.
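One common workaround, short of true nominal typing, is "branding" the alias with a phantom property so the compiler treats the two string types as distinct. A sketch (UserId, ItemId, and asUserId are hypothetical names):

```typescript
// The __brand property exists only at the type level; at runtime these
// values are still plain strings.
type UserId = string & { readonly __brand: "UserId" };
type ItemId = string & { readonly __brand: "ItemId" };

// A small constructor function is the only place where a cast is needed.
const asUserId = (s: string): UserId => s as UserId;

const isActive = (id: UserId): boolean => id.length > 0;

console.log(isActive(asUserId("u-1"))); // true
// isActive("c") -> compile-time error: a plain string is not a UserId
```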

10. Casting a destructured statement


The next lesson was how to use typecasting along with destructuring. I used UnderscoreJs’s partition function to separate objects with null properties from non-null ones:

const objs = [
	{ k: "1", v: "v1" },
	{ k: "2", v: "v2" },
	{ k: "3", v: null }
];

const [nonNulls, nulls] = _.partition(objs, (e) => e.v !== null);

nonNulls is guaranteed not to have any null values, while nulls has only nulls. Yet Typescript’s inference assigned v: string | null to the elements of both.

How could I cast them?

First, I tried annotating the destructured variables, the way I do for variable declarations: const [nonNulls: ..., nulls: ...], but with little success.

Then I figured casting the result of the function might work:

const [nonNulls, nulls] = _.partition(objs, (e) => e.v !== null) as [{ k: string, v: string }[], { k: string, v: null }[]];

It worked. As a second surprise, type checking does not apply to the type written on the right of the as.
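An alternative that avoids the cast altogether is a user-defined type guard: filtering with a predicate of the form e is T lets the compiler narrow the element type on its own. A sketch using Array.prototype.filter instead of Underscore's partition:

```typescript
type Entry = { k: string; v: string | null };

const objs: Entry[] = [
	{ k: "1", v: "v1" },
	{ k: "2", v: "v2" },
	{ k: "3", v: null }
];

// The `e is ...` return type tells the compiler what a `true` result means.
const hasValue = (e: Entry): e is { k: string; v: string } => e.v !== null;

const nonNulls = objs.filter(hasValue);         // { k: string; v: string }[]
const nulls = objs.filter((e) => e.v === null); // still Entry[]

console.log(nonNulls.length, nulls.length); // 2 1
```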

11. (-) Generated function signatures


Many libraries provide functions that need to work with many different sets of arguments. One such function is partial from lodash, which needs to work with an arbitrary number of prefilled and placeholder parameters. While the types cover several permutations, they stop at some point, leading to seemingly awkward inferred types.

With a function taking three arguments:

const f = (a: number, b: number, c: number) => `${a} => ${b} => ${c}`;

This partial application infers the types correctly:

const p = _.partial(f, _, _, 5);
console.log(p(3, 4)); // (number, number) => string

But shuffling the arguments around easily falls back to the catch-all case:

const p = _.partial(f, _, 4, _);
console.log(p(3, 5)); // (...any) => any

12. (-) Lodash flow with only one function


Using lodash, this is valid JS:

	_.flow(
		(a) => a + 1
	);

But Typescript reports an error, as the types for the 1-ary flow are missing.

Granted, the flow call can be removed without any change in functionality, so it was just a minor inconvenience.

But it was surprising at first that valid Javascript produced a type error.

General impressions

13. (+) Getting started was easy

My setup was a fairly typical Webpack + Babel combo, and throwing Typescript to the mix was surprisingly straightforward.

There are two loaders to choose from: awesome-typescript-loader and ts-loader. I opted for the latter and had zero problems with it; the build did not slow down perceptibly either.

The integration into the build pipeline as a whole took only a few minutes.

15. (+) Caught a bunch of bugs (or potential bugs)

On several occasions in the process of adding types, Typescript reported an error that was indeed a potential bug hidden in the codebase. Bugs like these usually follow a similar pattern: the system works, and nobody knows about the flaw. Then one day the bug gets revealed, and it becomes a mystery how on earth it could have worked in the first place.

The type errors were like this. It just so happened that some functions never got called with certain parameters, and that hid the potential problems. While fragile, the code worked.

This was when I saw the power of static typing and how fragile Javascript’s way is. It revealed edge cases that could wreak havoc on a function if it were called in a specific way. It was only a coincidence that they did not manifest as bugs. But with types, even the possibility can be avoided.

16. (-) Strict, stricter, strictest

Typescript’s strict modes are strange, to say the least. I’m sure that had I followed its evolution, they would make perfect sense, but jumping in later, they are a mess.

By default, Typescript tries to infer the types, and if it is unable to do so, it assigns the type any, which essentially disables checking.

Then there is null/undefined handling. By default, if something is potentially null or undefined, it does not raise an error. This makes it easier to migrate to Typescript, but TypeErrors that could be detected slip through.

For both of these problems, there are configuration options (noImplicitAny and strictNullChecks) that force checking. They are disabled by default so as not to break existing programs during an update, but that fragments the ecosystem: code written with more lenient settings produces errors when moved to a codebase with stricter ones.

Adhering to the strictest possible settings should be the goal. This not only catches the most bugs but also makes the code as portable as possible.

But having to turn them on came as a surprise. On the other hand, maintaining backward compatibility is a challenge for every evolving technology, and making it optional with a setting seems like a good compromise. I’m sure I’ll be grateful for this when I have a sizeable collection of .ts files.
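For reference, a minimal sketch of the strictest setup: since Typescript 2.3, the strict umbrella flag turns on noImplicitAny, strictNullChecks, and the rest of the strict-family options in one place:

```json
{
  "compilerOptions": {
    "strict": true
  }
}
```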

17. (-) Need to convert most of the codebase at once

With all the strict flags enabled, migrating to Typescript is like pulling on a spider web: you keep finding files you need to convert to get rid of all the errors. I’m sure a phased migration is possible with more lenient settings, but in the long run, aiming for the strictest checks pays off.

This is because a .ts file cannot import a .js file without type definitions. So if you rewrite one file, you also need to rewrite all its dependencies. And so on, until every connected file is converted.

This made the first step of the migration a rather large one, a bit larger than I felt comfortable with.

As an alternative, you can write .d.ts files for some of the .js files, drawing demarcation lines between the two worlds. This works and adds only a little overhead, so opt for this approach if you feel overwhelmed by the amount of work.

18. Add types to js files


Some .js files were too complicated to rewrite in the first pass, but I’d read that I could write types for them without converting the files themselves. It wasn’t immediately apparent how to do this, and it required some searching and trial and error, but I finally figured it out.

All you have to do is create a .d.ts file next to the .js file with the same name (for filename.js, create filename.d.ts) and declare the types of its exports. Typescript will pick it up automatically, and you don’t need to convert the file to .ts.
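As a minimal sketch (mathUtils and its double export are hypothetical names), the declaration file simply lists the typed exports:

```typescript
// mathUtils.d.ts, placed next to the untyped mathUtils.js:
// Typescript reads these declarations instead of inspecting the Javascript.
export function double(n: number): number;
```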

19. Types for 3rd party libs


To get types for a library that does not bring its own (ImmutableJs, for example, comes with types included), installing @types/libname usually works. @types is an organization with community-supported type definitions and, as such, covers many popular libraries.

But even when there is no definition file from the community, it’s surprisingly easy to write one and use it in your project.

20. (+) Easy to change the return of a function

One of the selling points of static typing is that a function’s usages can be tracked. If the return type changes, all non-conforming usages are reported as errors.

Unlike in standard Javascript, if you change the result, for example from an Array to an ImmutableJs List, all usages are immediately reported as errors.

Say goodbye to grepping the function name and hoping to find all the usages this way. Just change the result, fix all the reported errors, and it’s done.
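A minimal sketch (getIds is a hypothetical function): once its return type changes from an Array to, say, a Set, the Array-specific usage below stops compiling, pointing straight at the call site to fix:

```typescript
const getIds = (): number[] => [1, 2, 3];
// Changing it to `(): Set<number> => new Set([1, 2, 3])` would make the
// Array-specific .map below a compile-time error at this exact usage.
const doubled = getIds().map((id) => id * 2);

console.log(doubled); // [2, 4, 6]
```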

21. (+) Add types to lib


While the types for the libraries I used were mostly OK, there were some problems. But as it turned out, amending a library is not hard either.

As with Seq, the parameter of List.filter’s iteratee is declared as optional:

List.of(1, 2)
	.filter((e) => e > 1) // Error: number | undefined

To fix the signature, add a .d.ts file:

import {List} from "immutable";

declare module "immutable" {
	interface List<T> {
		filter(iteratee: (v: T) => boolean): List<T>;
	}
}

This fixes all usages.


After all these first impressions and many hours of searching for solutions to the problems above, I still feel Typescript is worth it. Most of the issues can be traced back to the dynamic nature of Javascript, and I hope that in the future, more and more libraries will embrace a safer way of coding with fewer moving parts.

Do you have a more straightforward solution to any of the above problems? Let me know, and I’ll update the post with it. Thanks!

27 February 2018
