The conversation is always the same. "This laptop (or desktop, or server) was so fast when I first got it," someone will say to me, "but it just seems to get slower and slower."

"Have you run disk defragmentation on it recently?" I ask.

"I thought that (Vista/Windows XP/Windows Server) did that automatically now..."

And there it is. On the surface, defrag is a simple system maintenance task. And in theory, it should be run daily on systems with a lot of storage activity, like Exchange and SharePoint servers, for example. But the ever-expanding amount of storage attached to even the most rudimentary of servers makes the window for running a full disk defragmentation daily on a schedule, and squeezing in a system backup as well, something of a challenge without some sort of software help beyond scripts.

For some network-attached storage (NAS), defragmentation is out of Windows' hands: the NAS server has to handle its own defragmentation. But fragmentation of logical Windows volumes on a SAN is just like fragmentation on local disk. Even if they are virtual volumes, the file system appears as a physical device to Windows, and Windows will reassign blocks just as it does on a local disk. Fail to defrag the SAN volumes, and suddenly that ever-expanding disk farm can start to cut into server read-write performance.

Even on desktops, where utilization is lower on average and defrag needs to be run less frequently, defragmentation often seems like an afterthought for many desktop support managers.
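For what it's worth, the built-in defrag.exe on Windows Server 2003 can at least be driven from the task scheduler, which is about as far as the "scripts" approach gets you. A minimal sketch, assuming a single C: volume; the task name and start time are placeholders, and the window would still have to be coordinated by hand with the backup job:

```bat
rem Analyze only: report whether the volume needs defragmenting
defrag C: -a -v

rem Force a full defragmentation, even if free space is low
defrag C: -f

rem Schedule it nightly at 2:00 AM under the SYSTEM account
rem (the /st time format varies between Windows versions)
schtasks /create /tn "NightlyDefrag" /tr "defrag.exe C: -f" /sc daily /st 02:00 /ru SYSTEM
```

That works, after a fashion, for one server and one volume. Multiply it across a disk farm of large SAN volumes, with no reporting and no way to pause if the backup runs long, and the limits of script-only scheduling become obvious.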
And you can't trust the users to do it: a survey in August found that 42% of PC users never run a defragmentation utility at all. If there was ever a case for automating a system task, defragmentation is it.

One of Vista's biggest selling points, in my mind, is that it comes with an automatic disk defragmentation utility: systems can run a scheduled defrag without user intervention. Yes, Vista users, it does do defrag automatically, but you have to turn it on. However, automated Windows Vista defragmentation might not be such a blessing for some users. The utility uses multipass defragmentation and, according to a Gartner research note, "characteristically fragments the remaining free space on the disk, which accelerates fragmentation later."

That isn't the case on Windows Server yet. And the standard disk defrag utility that comes with Windows Server 2003 may not be up to handling large logical volumes attached on a SAN.

Solutions like Diskeeper 2008 provide "real-time" defragmentation, correcting fragmentation as it happens on the server. But there's a school of thought that says continuous defragmentation is evil, in that it requires system overhead to determine when disks are fragmented, dropping server performance. That's the contention of folks like Raxco Software, which markets PerfectDisk, a "single-pass" defrag tool. A scheduled, high-performance disk defrag, they say, is better, because at least it won't take away from I/O performance when it's not running.

Since I'm not currently running several terabytes of Windows storage in an enterprise environment, I'm not in a position to really place any value on either the continuous or the scheduled defrag position. For servers with low processor utilization but a high volume of file I/O, it would seem the continuous route would make more sense. But I'd like a second, third, and fourth opinion on that first. Anyone?