Windows Server 2003 and SQL Server 2000 scale to new heights

analysis
Oct 17, 2003

As Microsoft’s 64-bit OS and database rise high above the 32-bit platform, pieces are still falling into place

It’s simple, really: Bigger is better. Compared to a 32-bit system, a 64-bit system offers a larger memory address space for applications, a wider bus into the underlying memory hardware, and superior floating point precision to better massage all those terabytes as they pass through the various “double-wide” registers of a 64-bit CPU. In real-world terms, 64 bits means bigger databases, faster transaction times, and more robust number crunching.

Veteran datacenter hardware vendors Sun Microsystems, IBM, and Hewlett-Packard have been doing 64-bit computing for years, providing highly scalable but inherently proprietary platforms complete with complex installation procedures and mainframe-like pricing. Now the barbarians are at the gate, threatening to cut into the Unix vendors' comfortable margins. The march toward 64-bit commoditization began with Linux systems built on AMD Opteron and Intel Itanium hardware. In late April of this year, Microsoft joined the fray with Windows Server 2003 Enterprise Edition and SQL Server 2000 Enterprise Edition for 64-bit Itanium systems.

Is the 64-bit Wintel era close at hand? To find out, I compared performance of 64-bit Windows Server 2003 and SQL Server 2000 to the 32-bit versions of the same software, using similarly configured Itanium 2 and Xeon dual-processor servers from IBM — the IBM eServer x450 and IBM eServer x335, respectively. I not only tested raw transaction performance in two- and three-tier client/server scenarios, where the results were promising, but also ran tests against a number of underlying components, trying to determine how far Microsoft has gone in optimizing all of the Windows and SQL Server code for 64 bits. Here I discovered that Redmond still has work to do.  

But that’s not all. The 64-bit versions of Windows and SQL Server are missing several features of their 32-bit counterparts, hindering their usefulness and manageability (see “64-bit Still Under Construction,” page 51). In particular, the lack of a 64-bit version of the .Net Framework makes the new platform unsuitable for virtually any application aside from massive in-memory databases and ultra-precise floating point calculations. Until Microsoft makes a 64-bit implementation of .Net available (a beta is scheduled for spring 2004), Windows for Itanium won’t be a viable platform for running Web applications or Web services.

Clouding the future further is the immaturity of Intel’s 64-bit IA-64 architecture, as embodied by the Intel Itanium and now Itanium 2 CPU platforms. IA-64 is hampered by a slow clock frequency (top-end is still only 1.5GHz) and a relatively complex development model, which requires programmers to pass their native IA-64 code through several stages of post-compile testing and tuning. Considering the dearth of native applications, and an unfamiliar development model that throws the quality of available applications into question, most IT shops would be wise to take a wait-and-see attitude toward Windows on Itanium.

Pipes and Protocols

Experience teaches us that the real showstopper flaws rarely pop up where you expect them. Rather, they tend to trip you up when you’re doing the less glamorous stuff, like managing data flow and scripting routine process automation. So, in addition to focusing on the massive scalability potential provided by the high memory bandwidth and multi-terabyte address space of the IA-64 architecture, my tests zeroed in on the software “plumbing” that glues the various pieces of the new platform together.

Some of these components are part of the core Windows OS image. They include the IIS and the ASP run time. Others are integral to specific applications, such as SQL Server and DTS (Data Transformation Services). Still others transcend OS and application boundaries, providing the kind of pervasive, performance-critical services that can make or break the entire platform.

A good example, and one that has always been a sore spot for Windows developers, is the MDAC (Microsoft Data Access Components) stack. This collection of protocols and drivers constitutes the all-important front end to any two-tier client/server application. It’s the fundamental enabler of transactional interactions under Windows, a lingua franca for all things database-related. A flaw here — whether due to inadequate optimization or simple programming error — could significantly degrade the performance of a ported application.

I was pleasantly surprised to find that not only is the 64-bit MDAC stack in Windows Server 2003 for Itanium on par with the 32-bit version (as tested in a Windows Server 2003 implementation running on a dual-Xeon 3.06GHz server), it actually outperforms its thinner cousin in many instances.

For example, during two-tier client/server transaction testing with SQL Server 2000, I found that Windows Server 2003 on Itanium was on average 5 percent faster than Windows Server 2003 on Xeon when running an identical ADO (ActiveX Data Objects) workload. This was true regardless of which data provider I selected (OLE DB or ODBC, over both TCP/IP and Named Pipes), and I verified the results across more than 30 combinations of workload, protocol, and transport.
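The shape of that sweep is easy to sketch. The following Python is a hypothetical reconstruction, not the actual test harness: the provider, transport, and workload names are illustrative, and the stubbed timings stand in for the real ADO runs against the two IBM servers.

```python
import itertools
import statistics

# Illustrative test matrix; the real sweep covered more than 30 combinations.
PROVIDERS = ["OLE DB", "ODBC"]
TRANSPORTS = ["TCP/IP", "Named Pipes"]
WORKLOADS = ["read-heavy", "write-heavy", "mixed"]

def percent_faster(baseline_secs, candidate_secs):
    """Positive result means the candidate finished faster than the baseline."""
    return (baseline_secs - candidate_secs) / baseline_secs * 100.0

def summarize(results):
    """results maps (provider, transport, workload) -> (xeon_secs, itanium_secs)."""
    deltas = [percent_faster(xeon, itanium) for xeon, itanium in results.values()]
    return statistics.mean(deltas)

if __name__ == "__main__":
    # Stub timings: a 10.0s Xeon run vs. a 9.5s Itanium run is a 5% speedup.
    fake_results = {
        combo: (10.0, 9.5)
        for combo in itertools.product(PROVIDERS, TRANSPORTS, WORKLOADS)
    }
    print(f"average Itanium speedup: {summarize(fake_results):.1f}%")
```

The point of averaging across the full matrix, rather than quoting a single run, is to rule out a result that holds for one provider or transport but not another.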

Given the importance of these plumbing components to the overall performance equation, it’s nice to see that Microsoft has indeed done its homework in optimizing MDAC for 64 bits. A performance discrepancy in this area would have been sufficient cause for delaying any impending client/server deployments.

Taking It to the Nth Tier

Encouraged by my two-tier scenario findings, I shifted my focus to three-tier client/server applications — aka Web services. Once again, I varied both the connection parameters and the transaction structure, scaling the workload from five to 50 clients and the data set size from 250 to more than 5,000 records per second. And again I was impressed by the results: an average 8 percent performance improvement over the Windows Server 2003 and SQL Server 2000 combination on 32-bit Xeon hardware. Clearly, raw transaction performance is not a problem area for the 64-bit Windows on Itanium 2 solution (see chart at right).
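A client-scaling load driver of this sort can be sketched in a few lines. This Python version is an assumed reconstruction for illustration only: `simulated_request` is a stand-in for the real HTTP round trips to the ASP/IIS tier, and the request counts are made up.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def simulated_request(record_count):
    # Stand-in for an HTTP round trip to the ASP/IIS middle tier;
    # the real test issued database-backed page requests, not a sleep.
    time.sleep(0.001)
    return record_count

def run_scenario(clients, records_per_request, requests_per_client=10):
    """Drive `clients` concurrent workers and report aggregate throughput."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=clients) as pool:
        futures = [
            pool.submit(simulated_request, records_per_request)
            for _ in range(clients * requests_per_client)
        ]
        total_records = sum(f.result() for f in futures)
    elapsed = time.perf_counter() - start
    return total_records / elapsed  # records per second

if __name__ == "__main__":
    for clients in (5, 10, 25, 50):
        rate = run_scenario(clients, records_per_request=250)
        print(f"{clients:>2} clients: {rate:,.0f} records/sec")
```

Stepping the client count while holding the per-request record size constant is what separates a concurrency bottleneck from a raw-throughput one.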

It’s important to note that neither of the aforementioned scenarios — the two-tier client/server or the three-tier Web app simulations — represents a particularly demanding workload for these powerhouse servers. In fact, neither box broke a sweat when executing the workloads. Rather, the purpose of these tests was to exercise the core infrastructure components — MDAC, ASP, IIS — that support client/server application deployment. Viewed in such a light, the IA-64 implementation passed with flying colors.

Also worth noting is that, even with the supposedly “state of the art” Windows Server 2003, I was forced to downgrade my testing environment to match the lowest common denominator: the IA-64 implementation. That’s because Windows Server 2003 for Itanium doesn’t support Microsoft’s .Net Framework, an omission that leaves application developers stuck with a programming model that’s nearly two generations out of date. So although the IA-64 implementation performed well as a “legacy” three-tier solution, it’s not clear how it will hold up when Microsoft brings the platform on par with its 32-bit offerings.

Buying in Bulk with DTS

Satisfied that the core protocol and transaction plumbing in Windows Server 2003 for Itanium was up to snuff, I shifted gears yet again, this time zeroing in on bulk data manipulation tasks with DTS under SQL Server 2000. DTS is the backbone of Microsoft’s SQL database workflow model. A kind of visual, object-oriented scripting environment, DTS permeates virtually every aspect of routine data set maintenance, from basic import/export functions to scheduled connectivity with even the most complex external data sources.

It was with DTS that I first observed subtle performance discrepancies between the 64-bit and 32-bit platforms. It began with a simple bulk import of 4.2 million records. Executed manually from the Microsoft SQL Enterprise Manager console running on my 32-bit control workstation, this decidedly unglamorous task — copying the contents of three large tables from a client PC running SQL Server Developer Edition — took a full 20 percent longer to complete when executed against the Itanium 2 server.

Puzzled by these results, I expanded my test matrix to incorporate an increasingly complex set of DTS packages. These included a direct object copy of the database structure; a reproduction of the aforementioned import task using the server as both source and target; and a query-based transformation that pulled a subset (1.6 million) of the full record set, writing the modified data to a new table. In each of these scenarios, the Itanium server took longer than Xeon to execute the package — as much as 59 percent longer for the internal “import” (bulk transform) scenario (see chart below).
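The comparison behind those figures reduces to a simple calculation. The Python below is a sketch with invented timings, not my measured data; only the shape of the "percent longer" comparison is the point, and the scenario names are paraphrased.

```python
def percent_longer(xeon_secs, itanium_secs):
    """How much longer the Itanium run took, relative to the Xeon baseline."""
    return (itanium_secs - xeon_secs) / xeon_secs * 100.0

# Hypothetical timings chosen to illustrate the calculation, not real results.
scenarios = {
    "object copy": (120.0, 150.0),
    "server-to-server import": (200.0, 318.0),  # works out to 59% longer
    "query-based transform": (90.0, 110.0),
}

if __name__ == "__main__":
    for name, (xeon, itanium) in scenarios.items():
        print(f"{name:<24} Itanium slower by {percent_longer(xeon, itanium):.0f}%")
```

Note that the baseline matters: a run that is 59 percent longer than Xeon is not the same as Xeon being 59 percent faster, which is why the comparison is anchored to the 32-bit timing.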

These results would seem to indicate that 64-bit SQL Server is not as adept at handling bulk data as it is at handling individual transactions. Whatever the reason, the poor performance I observed when executing the DTS scenarios should raise red flags for any IT shop that uses DTS (which is pretty much all of them).

Microsoft should be able to resolve these issues in the near future (all indications are that the problems are isolated to bulk data manipulation and are not systemic), but these findings simply reinforce what has long been the conventional wisdom regarding Microsoft software releases: It’s OK to prototype using the initial distribution, but you should wait for the first service pack before considering deployment into a production environment.

Promise and Pitfalls

A “V8 in a Yugo” may be one way to describe this first incarnation of the Windows Server 2003/SQL Server 2000 combination running on 64-bit Itanium. It’s got a great engine, and many of the transmission components (such as MDAC) seem to be falling into place. However, the overall package just isn’t there yet. The Yugo still needs a proper chassis (.Net Framework), and some of its suspension elements (DTS) need tuning, all of which can be worked out over time as Microsoft addresses these issues in the inevitable post-release patches and service packs.

Job one for Microsoft should be to walk through the entire 64-bit Windows and SQL Server code base to identify those areas that are still poorly optimized. A good place to start would be bulk data handling, but other trouble areas no doubt exist and should be addressed. (Another whopper: SQL Server still uses temporary tables to hold sort data larger than 4GB — a 32-bit holdover.)

For IT, the message is clear: Unless you have a compelling and immediate need for 64-bit memory addressing, you’re better off waiting until the first service pack. Microsoft’s newly minted Windows and SQL Server solution for Itanium is simply nowhere near a plug-and-play alternative to the status quo. It could easily take a year or more until the platform has been fully optimized and more pieces of the 64-bit puzzle have been put in place.

Although I can’t unconditionally recommend Windows on Itanium today, my tests suggest that the platform has a promising future. As the platform matures, IT shops should begin to find the 64-bit edition of the “Wintel” bundle an attractive option for applications that crave the terabyte wave.