Pass the buck. It’s a way of life for politicians and Linux distribution publishers. When something doesn’t work – as was the case with my recent ACPI headaches under Ubuntu 7.10 “Gutsy Gibbon” – the response is most often to point fingers. Assign blame. Pass the buck.

In the case of the ACPI debacle, where many nVidia-based laptops refuse to resume after a suspend-to-RAM cycle, the blame is being directed at nVidia and its buggy proprietary drivers. Or is it the Linux kernel team that’s to blame?

If you ask Canonical, they’re leaning towards nVidia. As Scott Remnant pointed out in an email response from the company: “..early indications really do suggest that it’s the nVidia card that’s causing the problem.” But in the same message he states that they (Canonical) hope that “the 2.6.24 kernel for hardy is better…”

So, is it the driver? Or is it the kernel? Is it the “evil, greedy, proprietary” (my words) nVidia that’s at fault? Or is it the “noble, angelic, gang-of-the-L-man” (again, my words) kernel development team that dropped the ball?

Several user accounts (as documented in the official tracking thread on the Ubuntu forums) seem to contradict the driver-as-culprit theory. Specifically, they show that *downgrading* the Linux kernel to a previous version – something closer to the Ubuntu “Feisty Fawn” 7.04 revision – alleviates the problem. This theory also seems to be supported by the myriad “Feisty” users who’ve had great success with suspend/resume in the past.
In other words, the nVidia drivers work fine – until you upgrade your kernel to the version that comes with “Gutsy” 7.10.

I have my own opinion as to where to lay blame (hint: their logo is black, white and yellow, not green and white). However, this entire experience simply reaffirms what I believe is a key barrier to the widespread desktop adoption of Linux: a lack of accountability. There’s simply no clear chain of responsibility when something like the aforementioned ACPI bug strikes. In the end, you have to trust in the “community” for a fix or, more likely, hunt down your own solution.

Contrast this with Microsoft Windows and you begin to appreciate just what a Herculean feat the Redmond giant has accomplished. Over the past dozen years, they’ve managed to ship several generations of increasingly complex operating system platforms while still maintaining support for the widest, most diverse hardware base imaginable. And while the process is not always perfect (Vista, in particular, required some significant driver re-tooling), it’s sufficiently robust to allow the company to retain its position as the leading OS platform worldwide and the “gold standard” for hardware compatibility.

Note, also, that when there have been issues – for example, the early Vista driver headaches – they’ve typically been the result of some major change in how the OS handles a particular device interface (e.g. video drivers in Vista). And with Vista, the company made developers aware of the changes well in advance and also provided copious documentation on how the new model would work. With “Gutsy,” nobody knows why the ACPI support broke or why the previously reliable (under “Feisty”) nVidia drivers stopped working.

To their credit, Microsoft has even gone so far as to provide developers with hands-on assistance in tuning and debugging their drivers.
Tens of millions of dollars have been poured into the Windows Hardware Quality Labs (WHQL) program to ensure that, when you plug in a new device or component, the drivers load and the device/component works. Period. It’s the kind of investment in QA that Linux distributors simply cannot match. And it’s also why Microsoft gets to charge for the software they deliver.

For the record, I’m not saying that I expect Canonical to deliver an equivalent to WHQL (though Novell might want to think about it). For starters, they don’t have control over all the bits – or for that matter, any of the bits. They’re just an aggregator, pulling together a mix of FOSS, drivers and kernel binaries to create their particular “distribution.” However, the current model of distribution-level bug isolation/management is clearly broken and needs to be fixed before Linux can be taken seriously as a replacement for Windows on the desktop.

Pass the buck. Great for political rallies and Linux love-ins. Not so hot for enterprise desktop computing.

P.S. – Despite the above rant, I’m still a huge Ubuntu fan. It is, by far, the most well-conceived of the desktop-centric distributions. And the people at Canonical seem to genuinely care about the quality of their “product” (a special “shout out” to Scott – thanks for the quotes!). If/when they get the ACPI issues worked out I’ll likely switch back to Ubuntu as my primary desktop environment. However, given my own experiences with “Gutsy,” I cannot in good conscience recommend it as an alternative to Windows, especially in mobile computing scenarios.