JavaScript the Web site performance killer, Google guru says

news
May 13, 2009

The scripting language is one of the largest of several culprits that slow Web site loading

Nowadays, even regular Web surfers know some of the things to avoid when designing a Web site for fast performance: Cut the number of requests to the Web server. Shrink JPEG sizes. Employ a content delivery network vendor like Akamai Technologies Inc. or Limelight Networks Inc.

Problem is, according to Steve Souders, steps like these aimed at optimizing the Web server make only a tiny impact.


“We used to tear apart the Apache [Web server] code to figure out what Yahoo was doing,” said Souders, who was Yahoo’s chief performance engineer for several years before moving to Google in the same role.

But after performing a detailed analysis, Souders discovered something startling: Only 10 to 20 percent of the time it took to load a Web site could be attributed to the Web server.

The vast majority was the result of code executing inside the Web browser, said Souders at a talk on Tuesday at Microsoft’s Tech Ed conference in Los Angeles.

In today’s AJAX-heavy Web sites, the offending code is usually JavaScript, Souders said. That’s not because JavaScript files on a Web page are large — they aren’t, he said — but because of the way Web browsers treat JavaScript.

“The first generation of Web browsers decided that because they had to execute all of the JavaScript files in order, we might as well execute one while stopping all other downloads,” he said, as well as preventing any other code from being executed or rendered.

That may have made sense a decade ago, but in today’s era of PCs powered by dual and quad-core CPUs, it doesn’t. And the cost of the delays created can be high.
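The blocking behavior Souders describes can be seen in markup like the following sketch (the file name is made up for illustration). A plain script tag stops other downloads and rendering until the file is fetched and executed; the `defer` attribute, which browsers of this era already supported, is one way around it:

```html
<!-- Illustrative only: analytics.js is a hypothetical file name. -->

<!-- Classic blocking form: the browser halts other downloads,
     parsing, and rendering until the script is fetched and run. -->
<script src="analytics.js"></script>

<!-- With defer, the file downloads in parallel and executes only
     after the document has been parsed, so rendering continues. -->
<script src="analytics.js" defer></script>
```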

Google has found that a 500-millisecond delay results in a 20 percent decrease in Web traffic, while Amazon.com has seen a 100-millisecond delay cutting its sales by 1 percent, Souders said.

Better browsers, better performance

New and upcoming Web browsers will be able to download JavaScript files while executing them. Internet Explorer 8, released last month, has this feature, Souders said, as do the upcoming Firefox 3.5 from Mozilla Corp. and Chrome 2.0 from Google.

But barring an overhaul of how a site’s JavaScript is structured, that boost will remain small, Souders said.

To address this, Souders first recommends a free tool he created called YSlow, which analyzes a Web page and grades how well it is designed for maximum speed. Originally developed for Internet Explorer, YSlow 2.0 is an add-on for Firefox integrated with the Firebug Web development tool.

Using YSlow, users can see how much JavaScript is loaded up front, creating a bottleneck. They can then split their JavaScript files, loading only the necessary JavaScript at the start and deferring the rest until after the text and images are already up, he said.
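The splitting technique Souders describes might be sketched as follows. All function names, file names, and the `critical` flag are illustrative assumptions, not code from YSlow or Google; the core idea is to separate must-have scripts from deferrable ones and inject the latter only after the page has rendered:

```javascript
// Separate scripts marked critical from those that can wait.
function partitionScripts(scripts) {
  const critical = scripts.filter(s => s.critical).map(s => s.src);
  const deferred = scripts.filter(s => !s.critical).map(s => s.src);
  return { critical, deferred };
}

// Dynamic script insertion: the browser fetches this file without
// blocking other downloads or page rendering.
function loadScriptAsync(src) {
  const el = document.createElement('script');
  el.src = src;
  document.head.appendChild(el);
}

// In a browser, load the non-critical files only after first render.
if (typeof window !== 'undefined' && typeof document !== 'undefined') {
  const { deferred } = partitionScripts([
    { src: 'core.js', critical: true },       // needed to render the page
    { src: 'analytics.js', critical: false }  // can wait until onload
  ]);
  window.addEventListener('load', () => deferred.forEach(loadScriptAsync));
}
```

Because the deferred files are requested from the `load` event, the text and images are already on screen before the extra JavaScript arrives.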

Doing so helped one Google site that Souders declined to name speed up its initial page rendering by 60 percent.

Besides JavaScript files, CSS can also drag down site performance. CSS files, which describe a Web page’s look and feel, have become more elaborate in recent years.

Also, users tend to stay on certain sites, such as their Web mail, all day. These sites will re-render constantly throughout the day, incurring a delay from over-elaborate CSS files each time, Souders said.

Besides JavaScript and CSS, YSlow analyzes 22 criteria in all. It is unsparing in its grading. Popular Web sites such as Apple.com, ESPN.com, and Wikipedia received a “C” from YSlow, while NYT.com, NBA.com, and Computerworld.com earned an even worse “E.”

“When I look at it, I feel like the teacher who hands out very severe grades,” he said. Search engines with minimal content on the page, such as Google.com and Microsoft’s Live.com, are among the rare sites that get an “A” from YSlow.

There are other tools besides YSlow for diagnosing performance bottlenecks. Microsoft offers the Visual Round Trip Analyzer, while AOL developed a tool, now open source, called PageTest.

All these tools judge Web site performance by a set of rules, though none of them matches YSlow’s 22 criteria.