CodeKit 2 - First look

CodeKit 2 was released today. Is this just an iterative release, or is it the harbinger of a brighter day for web-developers?

What is CodeKit?

CodeKit helps you build websites faster and better.

Bryan Jones, the developer of CodeKit, defines it as a tool for helping web-developers build websites faster. Or in his own humorous words: "It's like steroids for web-developers". He's actually correct: in most cases it will significantly improve your productivity when building things for the web.

CodeKit is a web-site build tool, kinda like Grunt, Gulp and others. What separates CodeKit from the other tools listed is that CodeKit is a visual tool, a proper Mac application. It does not require any knowledge of the terminal, writing JSON, installing Ruby gems or Node programming. It just works!

If you're a web-developer and haven't yet discovered CodeKit you have been missing out! Today, the next generation - CodeKit 2 is out. I have been beta-testing it for some time now, both in smaller and larger projects. These are my initial thoughts.

The updated UI

The UI of CodeKit is, in a word, new. It is also, in my opinion, significantly better.

The main file view of CodeKit now uses a tree view, meaning that all of your files are organized into folders. This makes for a much cleaner presentation compared to the old "show all files in a list" approach.

Tip: You can have CodeKit 2 ignore entire folders by right-clicking on them and choosing "skip this folder". Great for Node projects!

CodeKit 2 features a new project panel (see the top image), which you get to by clicking the big-ass button in the top left corner. Yep, you can customize how the button looks for individual projects. All the functionality from the first CodeKit is still there: you can add new projects, switch them on and off and handle your Compass stuff from the contextual menu button, and now you can add a Zurb Foundation project as well.

There's a lot of new UI stuff in CodeKit 2, all of it an improvement in my opinion.

Languages, features and frameworks

The first version of CodeKit supported quite a slew of languages, cross-compilers and external tools. You never needed any knowledge of how to install any of these when using CodeKit; they were just there, built in. This is also the case for CodeKit 2, with some welcome additions.

New languages

CodeKit 2 adds support for TypeScript and Markdown, as well as Autoprefixer, a tool that will add missing vendor prefixes to your CSS code.
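
As a hypothetical illustration (the exact output depends on which browsers you target), Autoprefixer can turn a plain declaration like this:

.box {
  transition: transform 1s;
}

into prefixed output along these lines:

.box {
  -webkit-transition: -webkit-transform 1s;
  transition: transform 1s;
}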

Zurb Foundation and Susy are now supported on the framework side of things. Great for building responsive websites.

Also worth mentioning is the addition of CoffeeLint, which will lint your CoffeeScript files before compiling them, making it that much harder to mess up.

Source maps ftw

CodeKit 2 has built-in support for source maps that simply works!

If you have ever tried to set up source maps, you know that it can be quite a pain in the behind to configure correctly. CodeKit 2 performs all that nasty setup for you, allowing you to concentrate on using source maps instead of pulling your hair out trying to make them work. Source maps are supported for CoffeeScript, SCSS, Less, TypeScript and Uglify.js.

The new assets panel

Handling external dependencies like jQuery, Modernizr and Bootstrap can be quite a chore when working on larger projects. Things get updated, and you have to hunt down each individual library on the web, manually check which version is the most up to date, and so on.

CodeKit 2 has built-in support for Bower, a package manager for the web. Opening the «Assets» panel allows you to browse, search for and install Bower packages. There is also an easy way to update your packages when a new version is available. I particularly like the feature which allows me to update all my installed components in one click.

Hooks

The Hooks feature is exactly what you think it is. It allows you to create a hook with a set of rules and have it execute a shell-script or AppleScript when the criteria of the rules are met. In the example above I tell my tests to run when any file in the "kit" folder has changed. This feature could, for instance, allow you to integrate with other command line tools.

Nifty server-refreshing

This is perhaps the flagship feature of CodeKit 2. It does live reload like before, but now it does so much more. CodeKit 2 will fire up a simple HTML server and give you a URL which you can navigate to on any device on your local network. When you change any of the files in your project, CodeKit 2 will automagically update the site on all of the devices, even remote ones like an iPhone.

Consider the following: you are creating a site which should run on mobile devices, desktop devices and a refrigerator (it could happen!). When you change something in your project, CodeKit 2 will compile it and refresh ALL of your connected devices, including the remote ones. This is not only awesome, it will save you a lot of time and frustration when testing on multiple devices.

But, what if your project needs more than a simple HTML server, like a local Node server running on port 3000?

Support for Node / WordPress / external servers

CodeKit 2 can expose external servers by piping their requests through its own server, maintaining the ability to update all your devices when something changes. You simply configure the project with a remote server, in my case a Node server running on port 3000, and when you change for instance an SCSS file in the Node project, the site automatically updates on your Mac, iPhone, iPad and refrigerator. Amazeballs!

Conclusion

There's much more new stuff in CodeKit 2: the «libsass» compiler, which makes compiling your SCSS files insanely faster, project-level settings, a new settings file format which works better for teams, and so on.

So, is CodeKit 2 great?

Let me put it to you this way. There is a class of applications where the care and attention to detail permeates every nook and cranny, applications that surprise you in positive ways and do stuff which makes you smile. CodeKit 2 is clearly such an application, in my opinion.

CodeKit 2 is targeted at a specific audience: web developers. If you find yourself being part of that audience, you should consider CodeKit 2; it might just make your day brighter.

CodeKit 2 is available from http://incident57.com/codekit/

Quick-tip: Web performance demos

A while back I wrote some performance demos for a presentation I was doing. The demos go into some depth for the following topics:

  • JavaScript download and thread blocking
  • JavaScript libraries and memory usage and execution times compared to pure DOM
  • Paints, invalidation and memory usage when creating layers

All of the demos include in-depth explanations on GitHub, so I won't bother rehashing them here. You can find the demos here: https://github.com/jornki/performance

Performance demos on GitHub

A simple rich-text editor

Chances are you have tried some rich-text editors embedded in webpages, like TinyMCE and others. In the past it used to be quite the task to create tools like these. This changed with the introduction of the HTML5 «contentEditable» attribute. Now you can, with little effort, create your own basic rich-text editor.

The «contentEditable» attribute can be set to true or false on any element and will enable or disable design mode for the element. In short, design mode means that the user can edit the content of the element.

In this example I've created a simple text-editor. There are buttons to toggle «bold» and «italic» styles on any selected text. When the user clicks one of the buttons or changes the text, the content gets saved.

To create this, the first step is to enable design mode on an element. To do this we add the «contenteditable="true"» attribute to, in this case, a section element.

<section id="editbox" contenteditable="true">
..
</section>

Then, to apply the styling, we need a bit of CoffeeScript. Since this demo has quite a bit of script, I'll draw your attention to the important bit: «document.execCommand». This method takes one of several commands, in our case "bold" or "italic", and applies it to the selection. For a complete reference of the available commands, see this article over at Mozilla.

To set the selected text to bold you only need one simple command.

document.execCommand 'bold'

Take a look at the demo and the full CoffeeScript code for a reference on how to create the complete demo.
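
For illustration, here is a minimal plain-JavaScript version of that wiring. The demo itself is written in CoffeeScript, and the button ids and the localStorage persistence below are my own hypothetical stand-ins, not the demo's actual code:

var editor = document.getElementById('editbox');

// Hypothetical buttons for toggling styles on the current selection
document.getElementById('boldButton').addEventListener('click', function () {
    document.execCommand('bold');
    save();
});
document.getElementById('italicButton').addEventListener('click', function () {
    document.execCommand('italic');
    save();
});

// Save whenever the user edits the text
editor.addEventListener('input', save);

function save() {
    // localStorage stands in for whatever storage the real demo uses
    localStorage.setItem('editorContent', editor.innerHTML);
}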

Retina images with «img srcset»

Handling High-DPI or Retina™ images is probably one of the most discussed topics in the front-end community today. The reason for this might be that front-end developers are working with technologies that haven't quite caught up with the progress in display technology.

To give you an idea of what I'm talking about: a lot of mobile devices have High-DPI displays, the first and most notable being the iPhone 4, with which Apple coined the term Retina™. This kind of display has a 2x pixel density (vertically and horizontally), which means that there are 4 pixels for every point.

The challenge

For developers and designers this pixel doubling means that artwork like images will need to be double the size it is displayed at (horizontally and vertically). This by itself is quite easy to fix: you just set the width and the height of the image in the «img» tag and then refer to an image twice the defined width and height. The problem this brings is that non-retina devices, which often have less available memory, will need to download a file far larger than needed, and hence consume much more memory and bandwidth than necessary. The real solution is to serve high-DPI images only to those devices which can support them.

My approach

For my demo-site I'm using the W3C «srcset» attribute to handle High-DPI images. This allows you to set multiple sources for one image while maintaining backwards compatibility, since older browsers will simply ignore the «srcset» attribute. You define it like so:

<img src="normal-image.jpg" srcset="retina-image.jpg 2x">

Notice the "2x" part of the «srcset» URL. This is where you define the resolution of the image. The «srcset» attribute can take multiple comma separated image values with different parameters. If you need to display different images for different screen sizes and so on you could set multiple images here. For my demo-site I'm only using the 2x option and specifying only one alternative image.

Browser support and polyfilling

At the time of writing, only WebKit nightly builds and Chromium support the «srcset» attribute, but the good news is that it is coming. However, we need a solution now; enter the polyfill.

There are several complete polyfills for «srcset», but they implement the whole specification and I was really just looking for the 2x part, so I decided to create my own simple polyfill.

As you can see, this is a tiny script. In short, it tests whether «srcset» isn't supported and whether «devicePixelRatio» is more than one. If so, it scoops up all the images with a «srcset» attribute and swaps the image source for the one in the «srcset» attribute.
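
A minimal sketch of that approach (not my exact script) could look like this:

(function () {
    // Bail out if the browser supports «srcset» natively, or the display isn't Hi-DPI
    var probe = document.createElement('img');
    if ('srcset' in probe || !(window.devicePixelRatio > 1)) {
        return;
    }
    // Swap each image source for the first URL listed in its «srcset» attribute
    var images = document.querySelectorAll('img[srcset]');
    for (var i = 0; i < images.length; i++) {
        var url = images[i].getAttribute('srcset').split(',')[0].trim().split(' ')[0];
        if (url) {
            images[i].setAttribute('src', url);
        }
    }
}());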

Note that this script assumes that all you want is 2x images, it is not a complete «srcset» polyfill!

So that is how I handle Hi-DPI or Retina™ images on my test-site. Feel free to use it if you need it.

Making a faster website

Since I'm a nerd, and slightly more than averagely preoccupied with performance in apps and websites, I decided to put my knowledge at the time to the test by attempting to create a highly performant website, and maybe learn something in the process. The goal: build a website which loads insanely fast and scores a perfect 100 on both mobile and desktop in the Google Page Speed benchmark. In the end I did it, and you can see the resulting website here: http://kinderasweb.azurewebsites.net.

Note that this is my take on creating a fast website, with the specific goal of scoring 100/100 in the Google Page Speed benchmark. This might not be your goal and your site might have other considerations, so this is no definitive answer.

The server setup

My server setup includes a Node.js server running an Express.js application hosted on a Windows Azure website.

My experience with Node.js and Express was somewhat limited before this project; I had mainly used Node with Socket.IO and had never really written a "full" Express application. So I learned a few things, and I'd like to share some of them now.

Firstly, Windows Azure. Man, I'm not exactly what you would call a Microsoft fan, but Azure is pretty much a joy to work with and I highly recommend it if you're building a website or any kind of service for web or native applications. It's that good!

The Express application

The application running the site is pretty straightforward. There are a bunch of templates written in Jade and some routes rendering the content into the templates. The content comes from a heap of markdown files which are parsed and then rendered. Pretty standard really, but there were some challenges, though.
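
To give a rough idea of the shape of such an app, here is a minimal sketch. The file layout, template names and the «marked» markdown parser are assumptions for the example, not necessarily what the site actually uses:

var express = require('express');
var fs = require('fs');
var marked = require('marked'); // a common markdown parser, assumed here

var app = express();
app.set('view engine', 'jade');

app.get('/posts/:slug', function (req, res) {
    // Read a markdown file and render its HTML into a Jade template
    fs.readFile('content/' + req.params.slug + '.md', 'utf8', function (err, raw) {
        if (err) {
            return res.send(404);
        }
        res.render('post', { content: marked(raw) });
    });
});

app.listen(3000);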

Server response time

Just to make it clear, this is not an Azure ad and I'm not sponsored in any way, shape or form. However, since Azure websites use a CDN infrastructure to distribute content, response times for Azure websites are pretty good. As long as you make sure that your actual server application responds quickly, Azure will handle the delivery pretty flawlessly. So my only advice on server response time is to stick your application on properly configured infrastructure. I do have some experience with this and it really matters!

Low latency means that the server responds quickly and the users of your site will start to see the content sooner

Compression

Compression means that the server will compress the content (using «gzip» in most cases), resulting in the files transferred being much smaller.

Enabling compression in an Express application is easy peasy lemon squeezy. You enable it through the Connect middleware, like so:

app.use(express.compress());

Note that it is critical to add this before any other middleware in your Express application to ensure that everything is transferred in a compressed state.

Browser caching

For me, this was the trickiest part. Browser caching is, in short, the server setting some HTTP headers on the content, allowing the browser to cache it for some pre-defined amount of time. Having most servers set these headers is quite easy; in Express you can do it when rendering the content, like so:

res.header("Cache-Control", "public, max-age=" + cacheTime);

The tricky part is deciding for how long you are going to allow the browser to cache content. According to Google you should set the cache time for "static content" like JavaScript and CSS up to a year. However, for "dynamic" content like HTML pages, browser caching is not recommended. Now, this depends of course on what kind of content you are serving and how important it is to get new content out quickly.

Static content

For my fictional website I decided on caching static files like CSS and JavaScript for one year, and then using «URL fingerprinting» to tell the browser when a file has changed. «URL fingerprinting» simply means that you append a value to the end of the URL, like «styles.css?cache=1.0.0», where the "1.0.0" part could be a version number or really anything that changes. In my app I simply exposed the «package version» to the main template and used that to bust any cache.

// in app.js
// Get the app version (falling back to a timestamp)
var version = pkg.version || (new Date().getTime());
// Expose the version number to the templates
app.use(function (req, res, next) {
    res.locals.version = version;
    next();
});

// in layout.jade
link(rel='stylesheet', href='/stylesheets/style.css?v=#{version}')

This, again, can be done in a multitude of ways depending on your server software. Setting cache headers for static content in Express is easy and is done once for all content:

app.use(express.static(path.join(__dirname, 'public'), { maxAge: oneYear }));

Dynamic content (HTML)

Google prefers to look at HTML as dynamic content, and in most commercial cases I totally agree. In my case, the need for the user to instantly see any changes was not that important, so I decided to set the cache for all HTML pages (templates) to one day. This is set via the cache control header mentioned above, as in the sketch below.
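
As a minimal sketch, with a hypothetical route (one day being 86400 seconds):

app.get('/', function (req, res) {
    // Let browsers cache the rendered HTML for one day
    res.header("Cache-Control", "public, max-age=" + 86400);
    res.render('index');
});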

Server caching

In contrast to browser caching, server caching simply means that templates and data are cached on the server, avoiding database lookups and I/O operations every time a user requests a page. Most CMS systems and web applications do this out of the box. In my case, the application needed to read a bunch of markdown files. These files were kept in their parsed form in memory until they changed or the server restarted; this way the content was read from memory most of the time and the application would respond immediately. How you do this depends heavily on what your server-application does, but a good goal is that the server should spend a maximum of 200 milliseconds before responding to any request.
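
As a rough sketch of the idea (the helper and file layout are hypothetical, and cache invalidation on file changes is left out):

var fs = require('fs');
var marked = require('marked'); // assumed markdown parser, as before

var cache = {};

function getPage(slug, callback) {
    if (cache[slug]) {
        // Already parsed once; serve straight from memory, no disk I/O
        return callback(null, cache[slug]);
    }
    fs.readFile('content/' + slug + '.md', 'utf8', function (err, raw) {
        if (err) {
            return callback(err);
        }
        cache[slug] = marked(raw); // parse once, keep the result around
        callback(null, cache[slug]);
    });
}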

Inlining styles

Apparently this is somewhat controversial; it is sometimes referred to as «critical path CSS». Basically it means that instead of linking to an external CSS file in the head of your HTML document, you inline the CSS styles needed to render parts of, or the entire, page. There are some tools to help you detect which parts of your stylesheet should be inlined.

What this gives you is fewer server requests. For users on mobile networks this is a big deal: on mobile networks, latency is often measured in seconds, which means that requesting additional files will add a new round trip of latency before your page can be rendered, since browsers generally do not start to paint a page before the styles have been downloaded and parsed.

You can read more about Google's reasoning and how Google Page Speed looks at this specific measurement.

Inlining CSS with Jade

This is kinda a front-end thing, but in my opinion it is most practical to have the server do it for you. Using Jade with Express, it is as simple as including your CSS file directly into the layout.jade file:

| <style type='text/css'>
include ../public/stylesheets/style.css
| </style>

Front-end

80-90% of the end-user response time is spent on the frontend. 
Start there.
- Steve Souders

Mr. Souders coined the quote above as the performance golden rule back in 2012. I do actually think the man knows what he is talking about, but I want to stress that it is completely possible to utterly fuck up your entire site by creating a crappy server-application. So my advice is to do both the server part and the front-end part properly.

There are many things you can do on the front-end to improve the performance of your site. Let's start with minification.

Minification

Minification is the process of "compiling" static files like JavaScript, HTML and CSS files in a way that reduces their file size before serving them to the browser. This can be done by the server or before the files are uploaded to the server.

For my site I use the built-in functionality of Jade to minify the templates (HTML) when serving them to the browser. These minified HTML files are automatically cached by Express when you go into production mode, so you don't have to worry about additional server response time for this.

I use CoffeeScript (which is awesome) for scripts and Uglify.js to minify the resulting JavaScript file(s). For styles I use SCSS (which also rocks), which has a built-in "compressed" mode for outputting minified CSS. The tool I use to handle these files is CodeKit, which I highly recommend for front-end designers and developers!

CodeKit is a tool to handle front-end projects (click the image to read more)

Minification and compression (gzip)

If you're clever you might be thinking something like: "Hmm, if I enable gzip on the server, then why do I need to minify the files as well?".

For JavaScript the answer is easy: minification does more than just compress the file, it also renames variables to shorter names and removes comments and so on. In other words, there is less content for the browser to read after the files have been downloaded. And therein lies the whole point: in transfer size you will not save that much by minifying and then compressing, compression will pretty much do the job. However, if you do not minify the file, the browser on the end-user device will need to spend more memory parsing it because, well, the file is bigger. So it is a good idea to both minify and compress static files.

Handling images

Images are static data and should be cached along the same lines as other static files like stylesheets. See the part about browser caching.

Also, images need to be optimized. There's a bunch of completely redundant metadata in image files which can be stripped away to achieve a smaller file size.

Images should be served to the browser at the size they will be rendered at. Don't scale images in HTML or CSS. The exception is Hi-DPI (retina) images, which will be twice the size.

Serve the images from a different domain than the one your application is hosted on. This allows the browser to download more stuff in parallel, resulting in shorter load times.

More about optimizing images from Google.

Enter Cloudinary

Cloudinary is an end-to-end image management solution for your Web and mobile applications.
- Cloudinary website

All of the things above, Cloudinary does them and more. It is also a CDN, allowing for faster, parallel downloads of image files.

This service is for images what Azure is for application hosting. It is that good! Learn it, use it!

Wrapping up

There are other performance-related things my test-site handles, like retina images, but more on that in a separate post.

I believe that the main thing is to realize how much performance affects the users of your site, and to act accordingly. There is a bunch of research on this, and in one famous case Amazon found that they could achieve a 1% increase in revenue for every 100 milliseconds of load time they shaved off their product pages.

Turns out, performance matters.