Node.js Gzipping

Yesterday I lied when I said it would be the last Node.js post for a while! Oh well.

So today I was looking to make my project site a little faster, particularly on the mobile side. In truth, this post is the result of the last three days of figuring stuff out. Node.js has plenty of compression library add-ons (modules), but the most standard compression tool out there is gzip (and gunzip). The browser tells you whether or not it can handle gzip via the Accept-Encoding request header. Most can...

This seemed like an obvious mechanism to employ to decrease page load times... not that the site is so slow, but when the traffic gets up there and the site starts bogging down, at least the network will be less of a bottleneck. Some browsers don't support gzip, though, so you always have to be able to send uncompressed content in those cases.
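
For illustration, the negotiation looks something like this (a representative exchange, not a capture from my server):

    GET /index.html HTTP/1.1
    Accept-Encoding: gzip, deflate

    HTTP/1.1 200 OK
    Content-Encoding: gzip
    Content-Type: text/html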

So I found a good Node.js compression module that supported bzip2 as well as gzip. The problem was, it was only meant to work with npm (Node's package manager), which, for whatever reason, I've stayed away from. I like to keep my modules organized myself, I guess! My usual routine is to pull the source from github, build the package if it requires it, and then make sure I can use it by just calling require("package-name"). That's worked in every case except the first gzip library I found... doh!

Luckily, github is a very social place, and lots of developers will just fork a project and fix it. That was where the magic started. I found a fork of node-compress that fixed these issues, installed the package correctly by just calling ./build.sh (which calls node-waf, which I'm fine with using!), and copied the binary to the correct location within the module directory. So all I had to do was modify my code to require("node-compress/lib/compress"). I'm fine with that too.

Code - gzip.js

    var compress = require("node-compress/lib/compress");

    // Only these extensions (and extensionless URLs) are worth encoding.
    var encodeTypes = { "js": 1, "css": 1, "html": 1 };

    // Does this request accept gzip, and is the resource a text type?
    function acceptsGzip(req){
        var url = req.url;
        var ext = url.indexOf(".") == -1 ? "" : url.substring(url.lastIndexOf(".") + 1);
        return (ext in encodeTypes || ext == "")
            && req.headers["accept-encoding"] != null
            && req.headers["accept-encoding"].indexOf("gzip") != -1;
    }

    function createBuffer(str, enc){
        enc = enc || "utf8";
        var len = Buffer.byteLength(str, enc);
        var buf = new Buffer(len);
        buf.write(str, enc, 0);
        return buf;
    }

    this.gzipData = function(req, res, data, callback){
        if (data != null && acceptsGzip(req)){
            var gzip = new compress.Gzip();
            var encoded = null;
            var headers = null;
            var buf = Buffer.isBuffer(data) ? data : createBuffer(data, "utf8");
            // Compress the body, then flush the trailing bytes with close().
            gzip.write(buf, function(err, data1){
                encoded = data1.toString("binary");
                gzip.close(function(err, data2){
                    encoded = encoded + data2.toString("binary");
                    headers = { "Content-Encoding": "gzip" };
                    callback(encoded, "binary", headers);
                });
            });
        }
        else callback(data);
    };
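
For context, here's roughly how it gets used from a request handler. This is a hypothetical sketch (sendResponse and the Content-Type handling are illustrative, not my actual handler code):

    // Hand the response body to gzipData; the callback receives the
    // (possibly encoded) data, its encoding, and any extra headers.
    var gzip = require("./gzip");

    function sendResponse(req, res, html){
        gzip.gzipData(req, res, html, function(data, encoding, headers){
            var responseHeaders = { "Content-Type": "text/html" };
            if (headers && headers["Content-Encoding"])
                responseHeaders["Content-Encoding"] = headers["Content-Encoding"];
            res.writeHead(200, responseHeaders);
            res.end(data, encoding || "utf8");
        });
    }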

So it's awesome. Here's a picture of the code working on this page:

[screenshot: the response for this page, gzipped]

Quite the improvement, at less than 30% of the original size! Soon I'm going to work in a static file handler, so that it doesn't have to re-gzip js and css files on every request. I already use caching extensively, so it won't re-gzip the same content for you ten times in a row; it will, however, re-gzip it once for each of ten different users. I can see that being a problem in the long run, but for now it's still fast as a mofo!

Thread Safety in Node.js

Probably my last post about Node.js for a while. My original implementation of the webserver used page objects in the following way:

index.html -> /site/pages/index.js

Meaning that when index.html was requested, index.js was located and executed. A side effect of this, because the Node.js construct require caches modules, is that the js page is only loaded and executed once; every request shares the same module object. That was bad, because I had my code structured in the following way:

    //index.js
    this.title = "My page title";

    this.load = function(..., finishedCallback){
        this.title = figureOutDynamicTitle(parameters);
        finishedCallback();
    }

Granted, in this example the window during which someone hitting index.html might get the wrong title was very small. But for things like setting an array on the page to values loaded from the database, or a branch in the load function where the array doesn't get set at all, there's a very good chance that one user will see something that only someone else was supposed to see.
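
To make the sharing concrete, here's a tiny illustration (the path is hypothetical) of why the page modules behave this way: require caches modules, so every caller gets the same object back.

    // require caches modules: both calls return the SAME object, so
    // state one request sets on it is visible to every other request.
    var a = require("./site/pages/index");
    var b = require("./site/pages/index");
    console.log(a === b); // true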

How I fixed this was to pass the page object back to finishedCallback. The page object is declared, built, and passed back all within the context of the load function, so it never has a chance to be wrong! This is how it looks now:

    //index.js
    this.load = function(..., finishedCallback){
        var page = {
            title: figureOutDynamicTitle(parameters)
        };
        finishedCallback({ page: page });
    }

This works. And it's still super fast.

Node.js Process

The Node.js process object is very helpful for cleanup purposes. You can imagine that, when writing a rudimentary web server, you might also have a mechanism for tracking sessions. That was definitely the case for me, so I can easily keep track of who's logged in without exposing much in the way of security.

Having not yet worked in a way to update all of my code without restarting the server, I have to restart the server whenever the code changes, which in turn deletes all of the sessions. Knowing about Node.js's process object, I was thinking of a way to handle this, but where to put the code was not immediately obvious. Once I started coding for this exact purpose, the answer was simple enough that it's a shame I didn't think of it right away, for the sheer fact that I cannot lie, so I don't get to say "Yeah, I thought of it right away" :)

Here's the code:

    function closeServer(errno){
        console.log("Server closing, serializing sessions");
        SessionManager.serialize(config.web.global.sessionStore, function(){
            server.close();
            process.exit(0);
        });
    }

    process.on("SIGTERM", closeServer);
    process.on("SIGINT", closeServer);

Luckily, killproc sends SIGTERM, and pressing CTRL+C sends SIGINT. I've noticed that plain kill also sends SIGTERM by default, with SIGKILL being a very explicit option (and one a process can't catch anyway), so listening for SIGTERM and SIGINT should cover it. Anyway.

sessionStore is the folder to store the sessions in; serialize clears it out each time before saving what's in memory, so we don't end up with months-old sessions in there.

I just serialize them using JSON.stringify and deserialize with its counterpart, JSON.parse. It works beautifully.
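
The round trip really is that simple. One caveat worth noting (my observation, and the field names here are just for illustration): anything in a session that isn't plain data, like a Date object, comes back from the round trip as a string.

    // A session survives the stringify/parse round trip as plain data.
    var session = { user: "someuser", lastSeen: new Date() };
    var restored = JSON.parse(JSON.stringify(session));

    console.log(restored.user);            // "someuser"
    console.log(typeof restored.lastSeen); // "string" -- no longer a Date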

The session ID is just a combination of the time, plus some user variables like user agent, hashed and hexed.
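
Something along these lines is what I mean (this sketch is my reconstruction from that description, not the actual implementation; the sha1 choice and the exact inputs are assumptions):

    var crypto = require("crypto");

    // Assumed approach: hash the current time plus some per-user
    // variables (like the user agent), then hex-encode the digest.
    function generateSessionId(req){
        var hash = crypto.createHash("sha1");
        hash.update(Date.now() + (req.headers["user-agent"] || ""), "utf8");
        return hash.digest("hex");
    }

Here's the code that serializes and deserializes: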

    SessionManager.prototype.serialize = function(path, callback){
        var self = this;
        fshelp.ensureFolder(path, function(err, exists){
            // Clear the folder first so stale sessions don't pile up.
            fshelp.clearFolder(path, function(err, exists){
                for (var id in self.sessions){
                    if (!self.isExpired(self.sessions[id]))
                        fs.writeFileSync(path + id, JSON.stringify(self.sessions[id]), "utf8");
                }
                delete self.sessions;
                callback();
            });
        });
    }

    SessionManager.prototype.deserialize = function(path, callback){
        var self = this;
        fshelp.ensureFolder(path, function(err, exists){
            fs.readdir(path, function(err, files){
                files.forEach(function(d){
                    var id = d.substring(d.lastIndexOf("/") + 1);
                    self.sessions[id] = JSON.parse(fs.readFileSync(path + d, "utf8"));
                });
                callback();
            });
        });
    }

Now no one gets logged out when I have to restart my server for any reason!

Since all of this takes place at shutdown or startup, I figured I'd save some brain power and just use the synchronous operations for writing and reading the files.

Website is Back Up

As you can see. Very simple, no images, but the bells and whistles are there. I wrote it up over a weekend, probably 6-7 hours total, converting the original MySQL database over to MongoDB and writing some quick code in Node.js.

There's only one table (well, collection now), posts, so it was easy. There is a mechanism for me to edit the posts and add new ones, but it's very crude!
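
For the curious, a post document in MongoDB presumably ends up looking something like this (the field names are my guess for illustration, not the actual schema):

    // Hypothetical post document -- field names are illustrative.
    {
        _id: ObjectId("..."),            // assigned by MongoDB
        title: "Website is Back Up",
        body: "<p>As you can see...</p>",
        posted: new Date()
    }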

Take it easy. Peace.