Node Summit 2015

NodeSummit panel with Foundation members.

What a difference a year makes. Astute readers of this blog may remember my sitting on the Node.js fence in the fall of 2013. I eventually jumped off the fence thanks to the wave of presentations at last year’s NodeSummit indicating that Node.js was ready. During the ensuing year I have become a vocal supporter of Node.js-based micro-services for fun and profit. It is therefore with great anticipation that I flew to San Francisco earlier this week to attend this year’s NodeSummit.

One of the immediate delights of my visit was that, unlike last year, IBM’s presence was huge (‘platinum sponsor’ big enough for you?). There have been projects in IBM dabbling in Node.js for a while, and the IBM PaaS Bluemix has first-class support for running Node apps. More recently, though, Node.js has become a go-to technology for every new IBM project I have seen, particularly cloud-hosted ones (and most new projects fall into that category these days).

Last year’s theme was the claim that ‘Node.js is ready for the enterprise’ (once the Walmart memory leak was fixed). The mood was one of hope and excitement, with a tinge of whistling in the dark. During the ensuing year, the pilot projects have wrapped up, risk-averse enterprises have convinced themselves that Node.js is just fine for production, and there is no need to convince anybody any more. This year, we moved on to the details and logistics of doing it at massive scale.

Of course, the highlight of the conference was the announcement of the formation of the Node.js Foundation, which takes over stewardship of Node from its sole corporate sponsor Joyent (Joyent continues as a member of the Foundation). It remains to be seen whether this move will heal the rift in the community caused by the fork of Node into io.js, but it points to the maturation of the platform and addresses some of the key complaints about control over its evolution.

Another long-awaited piece of news was the shipping of Node.js 0.12. Exactly a year ago, when I attended Node Day organized by PayPal, then (and still current) Node.js team lead TJ announced that the shipping of 0.12 was ‘imminent’. Well, this was the longest ‘imminent’ anything in recent history, if that word is to retain any meaning. On the positive side, 0.12 shipped only after all the test suites passed on all the supported platforms, turning that into the new quality benchmark to satisfy for all future releases (and the coveted 1.0). It also underlines the difficult position the Node.js platform now finds itself in: built into the core of more and more production deployments, and under a lot of pressure to evolve while maintaining quality.

One of my favourite talks was by Fred Schott from Box about the realities of running Node.js in a real-world situation. It was heart-warming to follow performance tweaks all the way from mediocre to fast. It is a cautionary tale that, while Node.js is a wonderful platform with a potential for great performance, careless out-of-the-box apps may disappoint. Like any other platform, it requires tuning, profiling and (what a concept) a certain level of mastery. You cannot just copy a ‘Hello, World’ app from Stack Overflow and expect server-melting speeds.

Another personal highlight was that the old ‘server vs client’ rendering debate is alive and well. It was amusing for me to hear two young PayPal team members talk about ditching Node/Dust combo (a normal PayPal staple) for client side rendering using Angular. In my chat with them they revealed that they managed to wrestle decent performance from Angular on mobile only after a lot of work (see my comment about Box’s experience above). Funnily enough, in a later talk Peter Marton shared tricks on isomorphic Node.js apps, where the same template is reused on both sides of the divide (my preferred technique as well). It just shows that there is no right or wrong approach – whatever works in your particular situation.

If I had one complaint about the two days I spent at the conference, it was that it was somewhat light on hard-core technical content. The focus was on high-level panels, ‘this is how we migrated from the monolith to the Node micro-services’ talks and ‘look at all the boxes in our architecture’. Now that we have all congratulated ourselves for the foresight to support Node.js and turn it into such a world-class platform to write modern systems, we should probably take off our celebratory hats and dig into the details of hard core, production Node.

All in all, I enjoyed the summit but left with mixed emotions. Last year I was fully aware of doing familiar things in a new platform, and the most mundane tasks (‘look ma, I am setting a cookie using Node.js!’) felt new and exciting. I think we are leaving that ‘new car smell’ age and entering the period where Node.js will just dissolve into the background – become like air or water, and we will focus on ‘what’ we are doing, rather than the fact we are doing it in Node.js. It may become like (gasp) Java at some point in the future.

For some developers (see my post on another TJ) there is no fun in that (look, ma, I am setting a cookie using Go!), and restless Node.js committers eager to move fast and break things are already busy in the io.js repository. I on the other hand just want to build awesome apps. Node.js is already a great platform for me and as long as it remains stable, reasonably bug free and supported, I am happy. I am sure I share this with many enterprise Node.js enthusiasts.

© Dejan Glozic, 2015

Isomorphic Apps Part 1: Node, Dust, and

Two-headed turquoise serpent. Mixtec-Aztec, 1400-1521. Held at British Museum. Credits: Wikimedia Commons.

On the heels of last week’s post on the futility of the server vs client side debate, here comes an example. I have wanted to do this for a long time, and now the progress of the Bluemix project I am working on has made it important to figure out isomorphic apps in earnest.

For those who have not read the often cited blog post by the Airbnb team, isomorphic apps blur the line between the client and the server by sharing code and templates. For this to work, it helps that both sides of the divide are similar in nature (read: can run JavaScript). Not that it is impossible to do in other stacks, but using Node.js on the server is the most direct path to isomorphism. JavaScript libraries are now written with the implicit assumption that people will want to run them in a Node app as well as in the browser.

Do I need isomorphic?

Why would we want to render on the server to begin with?

  1. We want to send HTML to the browser on the first request, showing some content to the user immediately. This helps perceived performance, since browsers are amazing at quickly rendering raw HTML, and we don’t have to wait for all the client side JavaScript to load before we can see anything.
  2. This HTML will also give something to the search engine crawlers to chew on and give you a decent SEO without a lot of effort.
  3. Your app will fit nicely into the Web as it was designed (a collection of linked pages).

Why would you want to do something on the client then?

  1. You want to provide a nice interactive experience to the user – static documents (even those dynamically rendered on the server) are not a lot of fun beyond the actual content.
  2. You want your page to respond to changes on the server (other users making changes that affect the content of your page) using Web Sockets.
  3. You want to provide features that involve a number of panels that need to flow like a native app, and don’t want to reload full pages for that.

How to skin this particular cat

Today we are not hurting for choices when it comes to libraries and frameworks for Web development. As a result, I decided to write a multi-part article covering some of those options, and allow you to choose what works in your particular situation.

We will start with the simplest way to go isomorphic – by simply exploiting the fact that many JavaScript templating libraries run on both sides of the network divide. In our current projects in IBM, dustjs-linkedin is our trusted choice – a solid library used by many companies and a pleasure to work with. It can be used for rendering views of Node/Express applications, but if you compile the template down to JavaScript, you can load it and render partials on the client as well.

The app

For this exercise, we will write a rudimentary Todo app, which is really just a collection of records we want to keep. There is already a proverbial TodoMVC app designed to test all the client side MVC frameworks known to man, but we want our app to store data on the server, and render the initial Todo list using Node.js, Express and Dust.js. Once the list arrives at the client, we want to be able to react to changes on the server, and to add new Todos by entering them on the client. In both cases, we want to render the new entries on the client using the same templates we used to render the initial list.

Since we will want to use the REST API (folded into the same app for simplicity) as the single source of truth, we will use the library to build an MVC-CV app (full MVC on the server, only the controller and the view on the client). The lack of a client model means that when we make changes to the server model through the REST API, we will rely on to communicate with the client side controller and update the client view. With a full client side MV*, the client side model would be updated immediately, followed by asynchronous reconciliation with the server. That approach provides immediacy and makes the application feel snappy, at the expense of the possibility that a seemingly successful operation eventually fails on the server. Mobile app developers prefer this tradeoff.
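To make the MVC-CV flow concrete, here is a plain JavaScript sketch of the event contract: the server pushes a { type, state, user } message over Web Sockets, and the client side controller applies it to the list the view renders. The function names here are illustrative, not taken from the actual app:

```javascript
// Build the event the server would push over Web Sockets
// (mirrors the { type, state, user } shape used later in the post).
function buildEvent(type, state, user) {
  return { type: type, state: state, user: { name: user.displayName } };
}

// Client side controller: apply a server event to the list the view renders.
// There is no client model -- the server stays the single source of truth.
function applyEvent(todos, event) {
  if (event.type === 'add') {
    return [event.state].concat(todos); // new cards are prepended
  }
  if (event.type === 'delete') {
    return []; // 'Delete All' empties the list
  }
  return todos;
}

var added = applyEvent([], buildEvent('add', { text: 'write part 2' }, { displayName: 'Dejan' }));
console.log(added.length); // 1
console.log(applyEvent(added, buildEvent('delete', {}, { displayName: 'Dejan' })).length); // 0
```

Note that the client never mutates state on its own; it only reacts to what the server confirms, which is exactly the tradeoff discussed above.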

In order to make Todos a bit more fun, we will toss in Facebook-based authentication so that we can have a user profile and store Todos for each user separately. We will use the Passport module for this. For now, we will use jQuery and Bootstrap to round out the app. In future instalments, we will get progressively fancier with the choices.

Less talking, more coding

We will start by creating a Dust.js partial to render a single Todo card:

<div class="todo">
   <div class="todo-image">
      <img src="{imageUrl}">
   </div>
   <div class="todo-content">
      <div class="todo-first-row">
         <span class="todo-user">{userName}</span>
         <span class="todo-when" data-when="{when}">{whenText}</span>
      </div>
      <div class="todo-second-row">
         <span class="todo-text">{text}</span>
      </div>
   </div>
</div>

As long as the variables it needs are passed in as a dictionary, the Dust core library can render this template in Node.js or the browser. In order to be able to load it on the client, we need to compile it down to JavaScript and place it in the ‘public/js’ directory:

dustc -name=todo views/todo.dust public/js/todo.js
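Conceptually, what the compiled template does at render time is interpolate dictionary values into placeholders. A toy sketch of that idea in plain JavaScript (this is not Dust’s actual implementation – Dust compiles templates into real JavaScript functions – just the concept):

```javascript
// Toy {name}-style interpolation, illustrating what a template engine does
// with the dictionary we pass in: context in, HTML out. Missing keys
// render as empty strings.
function renderToy(template, context) {
  return template.replace(/\{(\w+)\}/g, function (match, key) {
    return context.hasOwnProperty(key) ? String(context[key]) : '';
  });
}

var out = renderToy('<span class="todo-user">{userName}</span>', { userName: 'Dejan' });
console.log(out); // <span class="todo-user">Dejan</span>
```

Because the real contract is just as simple (context in, markup out), the same compiled template can run in Node.js or in the browser unchanged.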

We can now create a page where todos are rendered on the server as a list, with a text area to enter a new todo, and a ‘Delete’ button to delete them all:

<h1>Using Dust.js for View</h1>
<div class="new">
  <textarea id="new-todo-text" placeholder="New todo"></textarea>
</div>
<div class="delete">
  <button type="button" id="delete-all" class="btn btn-primary">Delete All</button>
</div>
<div class="todos">
  {#todos}
    {>todo todo=./}
  {/todos}
</div>

You will notice in the snippet above that we are now inlining the partial we have defined before (todo). The collection ‘todos’ is passed to the view by the server side controller, which obtained it from the server side model.

The key for interactivity of this code lies in the JavaScript for this page:

<script src=""></script>
<script src="/js/todo.js"></script>

<script>
  var socket = io.connect('/');
  socket.on('todos', function (message) {
    if (message.type == 'add') {
      dust.render("todo", message.state, function (err, out) {
        $('.todos').prepend(out);
      });
    }
    else if (message.type == 'delete') {
      $('.todos').empty();
    }
  });

  $('#delete-all').on('click', function (e) {
    $.ajax({ url: "/todos", type: "DELETE" });
  });

  $('#new-todo-text').keyup(function (e) {
    var code = (e.keyCode ? e.keyCode : e.which);
    if (code == 13) {
      $.post("/todos", { text: $('#new-todo-text').val() });
    }
  });
</script>

New todos are created by capturing the Enter key in the text area and posting the todo using the POST /todos endpoint. Similarly, deleting all todos is done by executing a DELETE /todos Ajax call.

Notice how we don’t do anything else here. We let the REST endpoint execute the operation on the server and send an event using Web Sockets. When we receive the message on the client, we update the view. This is the CV part of MVC-CV architecture that we just executed. The message sent via Web Sockets contains the state of the todo object that is passed to the Dust renderer. The output of the todo card rendering process is simply prepended to the todo list in the DOM.

REST endpoint and model

On the server, our REST endpoint is responsible for handling requests from the client. Since we are using Passport for authentication, the requests arrive at the endpoint with the user object attached, allowing us to execute the endpoints on behalf of the user (in fact, we will return a 401 if there is no user info).
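That 401 behaviour can be expressed as a small Express-style middleware placed in front of the routes. This is a sketch of the idea only (the function name is mine, not from the app’s code):

```javascript
// Express-style guard: reject requests that arrive without a user object
// attached by Passport; otherwise let the endpoint run on the user's behalf.
function ensureAuthenticated(req, res, next) {
  if (!req.user) {
    res.statusCode = 401; // no user info -- refuse to execute the endpoint
    return res.end();
  }
  next();
}

// Exercising the guard with stub request/response objects:
var res = { statusCode: 200, end: function () {} };
ensureAuthenticated({ /* no user */ }, res, function () {});
console.log(res.statusCode); // 401
```

In the real app the same check would sit in front of every /todos route, so the endpoint code below can assume req.user is always present.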

var model = require('../models/todos');

module.exports.get = function(req, res) {
  model.list(req.user, function (err, todos) {
    res.json(todos);
  });
}; = function(req, res) {
  var body = req.body;
  model.add(req.user, body.text, function (err, todo) {
    _pushEvent("add", req.user, todo);
    res.sendStatus(201);
  });
};

module.exports.delete = function(req, res) {
  model.deleteAll(req.user, function(err) {
    _pushEvent("delete", req.user, {});
    res.sendStatus(204);
  });
};

function _pushEvent(type, user, object) {
  var restrictedUser = {
    name: user.displayName
  };
  var message = {
    type: type,
    state: object,
    user: restrictedUser
  };
  // io is the Socket.IO server instance, initialized elsewhere in the app"todos", message);
}

We are more or less delegating the operations to the model object, and firing events for the verbs that change the data (POST and DELETE). The model is very simple – it uses lru-cache to store data (configured to hold 50,000 users, with a TTL of 1 hour before entries are evicted). This is good enough for a test – in the real world you would hook up a database here.

var LRU = require("lru-cache")
  , options = { max: 50000
              , length: function (n) { return 1 }
              , maxAge: 1000 * 60 * 60 }
  , cache = LRU(options);

module.exports.add = function (user, text, callback) {
  var todo = {
    text: text,
    imageUrl: "" + + "/picture?type=square",
    userName: user.displayName,
    when: new Date().getTime()
  };
  var model = cache.get(;
  if (!model) {
    model = { todos: [todo] };
  }
  else {
    model = JSON.parse(model);
    model.todos.splice(0, 0, todo);
  }
  cache.set(, JSON.stringify(model));
  callback(null, todo);
};

module.exports.list = function (user, callback) {
  var model = cache.get(;
  if (model)
    model = JSON.parse(model);
  var todos = model ? model.todos : [];
  callback(null, todos);
};

module.exports.deleteAll = function (user, callback) {
  cache.del(;
  callback(null);
};
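The least-recently-used eviction that lru-cache performs for us can be illustrated with a plain Map (a toy sketch of the idea only – the real module also handles the maxAge TTL and length options, which this omits):

```javascript
// Minimal LRU: a Map keeps insertion order, so the first key is the least
// recently used. Reads re-insert the key to mark it fresh; writes evict
// the oldest entry once capacity is exceeded.
function TinyLRU(max) {
  this.max = max;
  this.map = new Map();
}
TinyLRU.prototype.get = function (key) {
  if (!this.map.has(key)) return undefined;
  var value = this.map.get(key);
  this.map.delete(key); // move to the "most recent" end
  this.map.set(key, value);
  return value;
};
TinyLRU.prototype.set = function (key, value) {
  if (this.map.has(key)) this.map.delete(key);
  this.map.set(key, value);
  if (this.map.size > this.max) {
    // evict the least recently used (first key in iteration order)
    this.map.delete(this.map.keys().next().value);
  }
};

var lru = new TinyLRU(2);
lru.set('a', 1);
lru.set('b', 2);
lru.get('a');        // touch 'a' so 'b' becomes the oldest
lru.set('c', 3);     // over capacity: evicts 'b'
console.log(lru.get('b')); // undefined
console.log(lru.get('a')); // 1
```

This is why a cold user simply gets an empty list back from the model: their entry was evicted, not deleted.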

The entire example is available as a public project on IBM DevOps Services. You can clone the Git repository and play on your machine, or just click on Code and inspect it in the Web IDE directly.

The app is currently running on Bluemix – log in using your Facebook account and give it a spin.

Commentary and next steps

This was the simplest way to achieve isomorphism. It has its downsides, among them the lack of immediacy caused by the missing client side model, but it is blessed by complete freedom from client side frameworks (jQuery and Bootstrap notwithstanding). In part 2 of the post, I will insert Backbone on the client. Since it has support for models, collections and views, it is a particularly good choice for gradually evolving our application (AngularJS would require a complete rewrite, whereas Backbone can reuse our Dust.js template for the View). Also, as frameworks go, it is tiny (~9K minified and gzipped).

Finally, in part 3, we will swap Dust.js for React.js in the Backbone View implementation, just to see what all the fuss is about. Now you realize why I need to do this in three parts – so many frameworks, so little time.

© Dejan Glozic, 2015