Moment of Truth #3 October 24, 2013
After adding some better logging to the webserver, I determined that no error was causing the server to close. I was starting the service with node webserver &, but I think I still needed nohup.
We shall see. My nohup'ed webserver is now running, and it should stay running from now on, even if my session dies. It's just the better way to start it.
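For reference, the difference is roughly this (a sketch; the log file name is mine, not necessarily what I use on the server):

```shell
# started this way, the server can die when the shell that launched it hangs up
node webserver &

# nohup detaches the process from the terminal's hangup signal,
# and the output survives in a log file too
nohup node webserver > webserver.log 2>&1 &
```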
Moment of Truth #2 October 23, 2013
After upgrading to Node.js 0.10.21 and implementing domains, I was still getting errors. Since then, I've added code to listen for the response 'close' event, as well as the server 'clientError' event, in an effort to track down the issue. I think I may be on to something. At least then I'll be able to go to Stack Overflow or GitHub with the issue in hopes that it will be fixed, or the act of listening for those events and logging the errors will lead me to the fix myself. Here's hoping.
The Moment of Truth October 17, 2013
Every morning I have an email from Pingdom saying my site went down overnight. I upgraded to Node.js 0.10.20 a few weeks ago to take advantage of some bug fixes and optimizations, and it shows in the Google Analytics page load times (they're all 0 seconds).
I have had no luck tracking down the cause, but I read about how to prevent a server error from bringing down the site. I've implemented the suggested solution using Node.js domains, and we'll see what happens tomorrow.
It might be the same issue, because when I log into the server, node is still running my webserver; it just appears the socket was destroyed... We shall see. Wish me luck!
Flickr Integration Complete! October 11, 2013
It really didn't take too long! What I outlined in the previous post is exactly what it does:
- Check Flickr for my photos tagged with "jtccom", and download the 2048-pixel-wide version of each
- Write the JS + Node part of my website which shows, in a queue-like manner, an image that hasn't been processed
- Clicking on the image shows a box around where I clicked, where it would be cropped, according to the sizes defined for the header
- Send a request to the server, which uses GraphicsMagick to crop it and then create differently scaled images based on the sizes defined for the site
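The click-to-crop step amounts to centering a fixed-size box on the click and clamping it to the image bounds, something like this (the function name is mine for illustration, not the actual site code):

```javascript
// given a click point and a fixed crop size, return the crop rectangle,
// shifted as needed so it stays inside the image bounds
function cropBoxAround(clickX, clickY, cropW, cropH, imgW, imgH){
    var x1 = Math.round(clickX - cropW / 2);
    var y1 = Math.round(clickY - cropH / 2);
    x1 = Math.max(0, Math.min(x1, imgW - cropW));
    y1 = Math.max(0, Math.min(y1, imgH - cropH));
    return { x1: x1, y1: y1, x2: x1 + cropW, y2: y1 + cropH };
}
```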
Here are some screenshots:
At first, you are prompted with the next image that hasn't been processed.
Next, you select the point where you want to crop. Sizes are pre-determined, so there's no dragging and resizing of a bounding box; it knows all the sizes and the size of the image, so it just does it for you.
Click the process button when you've made your crop selection.
Wait a second or two while Node.js and gm (GraphicsMagick) process your photos.
And here's the GraphicsMagick code in Node.js. The gm module is really helpful, and I was able to get it to work on Windows:
this.handlePost = function(site, query, finishedCallback){
    var tmpdir = path.normalize(site.path + site.config.tempDownloadFolder);
    var processeddir = path.normalize(site.path + site.config.processedFolder);
    var form = query.form;
    var sizes = site.config.imageWidths;
    var heights = site.config.imageHeights;
    var filename = form.image.substring(form.image.lastIndexOf("/") + 1);
    var fileParts = filename.split(".");

    // sort widths and heights largest first, so index 0 is the base crop
    sizes.sort(function(a, b){ return b - a; });
    heights.sort(function(a, b){ return b - a; });

    var x1 = parseInt(form.x1, 10),
        x2 = parseInt(form.x2, 10),
        y1 = parseInt(form.y1, 10),
        y2 = parseInt(form.y2, 10);
    var w = x2 - x1, h = y2 - y1;

    // process the first (largest) size, and use that as the base for the resizes
    var croppedPath = processeddir + fileParts[0] + "-" + sizes[0] + "." + fileParts[1];
    gm(tmpdir + filename).crop(w, h, x1, y1).write(croppedPath, function(err){
        var sync = new SyncArray(sizes);
        sync.forEach(function(size, index, array, finishedOne){
            if (index > 0){
                var scaled = processeddir + fileParts[0] + "-" + size + "." + fileParts[1];
                gm(croppedPath).resize(size, heights[index]).write(scaled, function(err){ finishedOne(); });
            }
            else finishedOne();
        },
        function(){
            finishedCallback({ content: JSON.stringify({ success: true }), headers: { "Content-Type": "application/json" } });
        });
    });
};
Again, it uses my custom-built webserver and the SyncArray object that I also wrote.
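SyncArray isn't published anywhere, but from how it's used here, a minimal version of the idea (iterate an array one item at a time, calling a done callback once every item has finished) could look like this. This is my sketch of the concept, not the actual implementation:

```javascript
// minimal sketch of the SyncArray idea: sequential async iteration
function SyncArray(array){
    this.array = array;
}

SyncArray.prototype.forEach = function(eachFn, doneFn){
    var self = this, index = 0;
    function next(){
        if (index >= self.array.length) return doneFn();
        var i = index++;
        // eachFn calls its fourth argument when this item is finished
        eachFn(self.array[i], i, self.array, next);
    }
    next();
};
```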
The Flickr code was pretty simple too. Here it is, accessing the Flickr API (no auth required) with Node.js:
var http = require("http"),
    querystring = require("querystring"),
    SyncArray = require("syncarray").SyncArray;

this.getPhotosByTag = function(apiKey, user, tag, callback){
    var self = this;
    var method = "flickr.photos.search";
    var qs = { method: method, api_key: apiKey, user_id: user, tags: tag, format: "json", nojsoncallback: 1 };
    var req = { host: "api.flickr.com", path: "/services/rest/?" + querystring.stringify(qs) };
    http.get(req, function(res){
        var json = "";
        res.on("data", function(d){
            json += d;
        }).on("end", function(){
            var photos = JSON.parse(json).photos.photo;
            if (photos.length > 0){
                // look up the "Large 2048" url for each photo, one at a time
                var sync = new SyncArray(photos);
                sync.forEach(function(photo, index, array, finishedOne){
                    self.getPhotoSizes(apiKey, photo.id, function(sizes){
                        photo.url = sizes.filter(function(s){ return s.label == "Large 2048"; })[0].source;
                        finishedOne();
                    });
                }, function(){
                    callback(photos);
                });
            }
            else callback([]);
        });
    });
};

this.getPhotoSizes = function(apiKey, photoId, callback){
    var method = "flickr.photos.getSizes";
    var qs = { method: method, api_key: apiKey, photo_id: photoId, format: "json", nojsoncallback: 1 };
    var req = { host: "api.flickr.com", path: "/services/rest/?" + querystring.stringify(qs) };
    http.get(req, function(res){
        res.setEncoding("utf8");
        var json = "";
        res.on("data", function(d){
            json += d;
        }).on("end", function(){
            var sizes = JSON.parse(json).sizes.size;
            callback(sizes);
        });
    });
};
The next step is to update the front-end CSS to include all sizes of each image and switch between them using respond.js and media queries. That should be simple, but it's late and I'm going to bed!!
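The plan is essentially one background image rule per breakpoint (a sketch; the breakpoint widths, class name, and file paths here are made up, not the site's actual values):

```css
/* smallest size by default */
.header { background-image: url(/images/header-640.jpg); }

/* swap in the larger crops as the viewport grows; respond.js makes
   min-width media queries work in older IE */
@media (min-width: 641px) {
    .header { background-image: url(/images/header-1024.jpg); }
}
@media (min-width: 1025px) {
    .header { background-image: url(/images/header-2048.jpg); }
}
```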
Enjoy! Leave a comment.
Prettier Site October 10, 2013
I updated the site by incorporating a random picture that I've taken, pre-cropped, into the header.
As a developer and an overall lazy person, finding and cropping 10 images to the size I want was way too much work. I will have to rectify this. I got a Flickr API key. Here are my plans:
- Obtain Flickr API Key - Done
- Take Pictures
- Upload them to flickr the normal way
- Add a tag to them specifying that they are suitable for the website, like jtccom
- Write a service that checks for new photos of mine with that tag, download them, flag them as new
- Write an admin interface to show new images and, for now, let me click the important part of the image, so it can crop to the sizes I need around the specified point. Sizes will be pre-determined (3 sizes for the 3 different breakpoints I have defined in my responsive design; not much to it)
- Continuously have an inflow of beautiful headers that will display on my web page
It shouldn't be that bad. Nothing I've mentioned above has me too concerned. It should be fun! For now, though, I have 10 canned images that don't populate directly from Flickr. They are random, so you may have to refresh more than 10 times to see them all. Enjoy!
Tag List Added October 8, 2013
Here's the code that builds the tag cloud from the tags on all my posts:
var db = require("../db");

this.tagCloud = [];

this.initialize = function(site, callback){
    var self = this;
    // empty the tag cloud in place (pop until the array is empty)
    while (self.tagCloud.pop());
    db.getPosts(site.db, {}, -1, -1, function(posts){
        var tags = {};
        posts.forEach(function(post){
            if (post == null) return;
            for (var i = 0; i < post.tags.length; i++){
                if (tags[post.tags[i]] == null)
                    tags[post.tags[i]] = { tag: post.tags[i], count: 0 };
                tags[post.tags[i]].count++;
            }
        });
        for (var tag in tags){
            if (tags[tag].count > 8) // arbitrary limit so we don't list like 200 tags with 1 post each
                self.tagCloud.push(tags[tag]);
        }
        self.tagCloud.sort(function(a, b){ return b.count - a.count; });
        callback();
    });
};
Google Keep is not fast enough October 7, 2013
Responsive Design October 6, 2013
A Chat with a Coworker October 4, 2013
Me:: their property names are their querystring keys
Me:: QS.rdb = 1
Me:: javascript man, it's awesome :)
Mark Coworker: yeah... that's what i avoid with those strongly typed querystring objects of mine.
Mark Coworker: too many query string keys that don't make any sense.
Me:: strongly typed is weakly handwritten
Me:: :P
Me:: just tried to come up with something that you couldn't possibly have a comeback for, and which was cleverly punned
Mark Coworker: i don't understand how i'm the only person here who seems to have an issue with the hard-coding of non-sensical query string keys all over the place.
Me:: personally i depend on url rewriting so that the client doesn't see the querystring names... if the technology allows it easily
Me:: so i don't use querystring in my node.js web apps
Me:: i have a very nice helper method that will look for a unique key in the database... so if you passed it the text, "I dislike Mark Coworker's Strongly Typed Querystrings", with the table and the field (mongodb doesn't know of such things by those names), it will take the whole string, lowercase it, remove non-characters, replace spaces with hyphens, then look to see if that's unique. if not, it will add an incremented value to the end and find
i-dislike-mark-Coworkers-strongly-typed-querystrings-32
Mark Coworker: =P
Mark Coworker: sorry. ddin't see IM alert until the last message.
Mark Coworker: trying to get CLIENT_REPLACED build ready.
Me::
as the unique key to use in the URL for rewriting
http://www.stuffidislikeaboutmarkcorkwer.com/posts/post-1254/i-dislike-mark-coworkers-strongly-typed-querystrings-32
Me:: heh
Mark Coworker: linky-no-worky
Me:: post 1253 was about wearing sweatshirts on 80 degree days (EDITOR: side note, the day this chat took place, it was 80 degrees, early October, as we left for lunch and he had his sweatshirt on)
Me:: i mispelled your name in the url anyway
Mark Coworker: you did!
Mark Coworker: even after fixing it, the url still doesn't work.
Me:: yeah, it's down for maintenance, need more database space
Me:: too many posts
Mark Coworker: well, it's not his fault that there are some many things wrong with the world. such as the lack of database space on servers.
Mark Coworker: I'm through half my bottle of Purell
Me:: damn
Mark Coworker: it's a small traveller sized one though.
Me:: i've used half a bottle in my lifetime
Me:: post 1255, germophobe
Mark Coworker: post 1255: half of your office getting sick right before you're hosting a EVENT that took up TIME_SPAN of your life and DOLLAR_AMOUNT dollars to get ready for.
Me:: post 1256: wants editorial authority on site which talks badly about him
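For what it's worth, the unique-key helper described in that chat boils down to something like this (a sketch; the real version checks MongoDB for collisions, which is stubbed out here as an exists callback, and the function name is mine):

```javascript
// turn a title into a url slug: lowercase, strip non-word characters,
// spaces to hyphens; append -2, -3, ... until the slug is unused
function uniqueSlug(text, exists, callback){
    var base = text.toLowerCase().replace(/[^a-z0-9\s-]/g, "").replace(/\s+/g, "-");
    var n = 1;
    function tryNext(candidate){
        exists(candidate, function(taken){
            if (!taken) return callback(candidate);
            n++;
            tryNext(base + "-" + n);
        });
    }
    tryNext(base);
}
```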