Serving JavaScript Fast

Cal Henderson on ThinkVitamin.com – The next generation of web apps make heavy use of JavaScript and CSS. We’ll show you how to make those apps responsive and quick.

With our so-called "Web 2.0" applications and their rich content and interaction, we expect our applications to make increasing use of CSS and JavaScript. To make sure these applications are nice and snappy to use, we need to optimize the size and nature of the content required to render the page. In practice, this means making our content as small and fast to download as possible, while avoiding unnecessary refetching of unmodified resources.

He talks about several different approaches, including:

  • Monolith – bundle everything into one big file; the bigger the chunks, the less overhead from loading multiple files for each page
  • Splintered approach – divide the code into multiple sub-files and load only what each page needs
  • Compression – gzip the content to reduce the file size sent to the browser
  • Caching – send the right headers so the browser correctly caches the JavaScript file(s)
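The article's code examples are in PHP; purely as an illustration of the compression approach, here is a rough Python sketch of gzipping a JavaScript payload and the response headers that would accompany it (the variable names are illustrative, not from the article):

```python
import gzip

def compress_js(source: str) -> bytes:
    """Gzip a JavaScript payload before sending it to the browser."""
    return gzip.compress(source.encode("utf-8"))

js = "function greet(name) { return 'Hello, ' + name; }"
body = compress_js(js)

# The response must declare the encoding so the browser knows to decompress.
headers = {
    "Content-Type": "application/javascript",
    "Content-Encoding": "gzip",
    "Content-Length": str(len(body)),
}

# Round-trip check: the browser recovers the original source.
assert gzip.decompress(body).decode("utf-8") == js
```

For real payloads (tens of kilobytes of repetitive JavaScript), gzip typically cuts the transfer size dramatically; the tiny snippet above is only demonstrating the mechanics.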

For each there’s a brief description, the advantages and disadvantages of the method, and a code example (in PHP). He focuses largely on the caching option, however, and gives a longer example of how to ensure that your files are cached by the browser as aggressively as possible, reducing load times for JavaScript-heavy pages.
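The general caching technique the article describes boils down to embedding a version marker in the file's URL and serving it with far-future cache headers, so the browser never refetches an unchanged file and a changed file gets a brand-new URL. A minimal Python sketch of that idea (the helper names and the use of a content hash rather than a revision number are assumptions, not the article's exact PHP code):

```python
import hashlib
import time
from email.utils import formatdate

ONE_YEAR = 365 * 24 * 60 * 60

def versioned_url(path, source):
    """Hypothetical helper: embed a short content hash in the URL so a
    changed file automatically gets a new, uncached URL."""
    digest = hashlib.md5(source).hexdigest()[:8]
    return f"{path}?v={digest}"

def cache_headers(now=None):
    """Far-future caching headers: the browser may reuse the cached file
    without refetching until the URL itself changes."""
    now = time.time() if now is None else now
    return {
        "Cache-Control": f"public, max-age={ONE_YEAR}",
        "Expires": formatdate(now + ONE_YEAR, usegmt=True),
    }

url = versioned_url("/js/app.js", b"function f() {}")
headers = cache_headers()
```

The trade-off is that the page's HTML must always reference the current versioned URL, which is why the article spends time on generating those URLs server-side.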