Multi-core chip rivals AMD and Intel have been beating their chests of late, but to what end, I wonder, as developers labor to keep up.

AMD, for one, has fixed the embarrassing flaw that delayed the quad-core Barcelona chip. As Terry Malloy put it in On the Waterfront, so what? Meanwhile, Intel and Microsoft pat themselves on the back because they’ve donated $20 million to UC Berkeley and the University of Illinois to found the Universal Parallel Computing Research Centers. Well, it’s about time.

Why so negative? The dirty little secret (and it’s not all that secret) is that the gap between hardware and software has never been greater. Today’s software can barely (if at all) take advantage of quad-core processors, yet Intel and AMD, giddy with rivalry, are rushing to push out chips with even more cores. Intel has already demonstrated an 80-core processor, and you can expect x86 servers with as many as 64 processor cores in 2009 and desktops with that many by 2012, says Forrester analyst James Staten.

That’s not to say that the IT industry is scoffing at the potential benefits of multi-core processing. But the mountain between IT and some future multi-core promised land — namely, the task of developing parallelized apps that keep pace with continual core advances — is huge, says David Patterson, the Pardee Professor of Computer Science at UC Berkeley and director of the parallel computing lab. “It’s the biggest challenge in 50 years of computing.
If we do this, it’s a chance to reset the foundation of computing.”

In the short run, Patterson says, we can parallelize legacy software and gamble on getting value out of eight cores. But that would be only an interim solution, as such apps would not scale to 32 or 64 cores, he adds.

What is frustrating is that this problem didn’t exactly sneak up on the industry. Chip development cycles are very long, and key software developers are well aware of what’s moving through the pipeline. Sure, software always lags hardware. Many of us complained that we didn’t have software that would take advantage of 500MHz back in the ’90s. But what Patterson and others call the multi-core revolution poses problems for developers that are qualitatively different from the problems of the past. Why wait so long to get serious about solving them?

Making sense of the multi-core muddle

The cynical explanation for this growing gap is that Intel and AMD are running on a treadmill that requires selling more and more transistors to support the cost of developing and building fabs. As long as buyers are willing to spend the money for cool new hardware, who cares if they don’t really need it?

Ray DePaul, president and CEO of RapidMind, which sells a multi-core software development platform, has a different take. “The first multi-core chips were dual core, and that lulled everyone into thinking this is OK,” DePaul says. Taking advantage of the second core was relatively easy with existing software. But four cores is another story. “It’s the classic disruptive technology,” DePaul says. “If the Microsofts and the Intels always got it right, you’d never see a Google or an AMD.”

RapidMind hopes to avoid following in the wake of companies such as Thinking Machines and nCUBE, which attempted to build businesses around solving the parallel computing problem without success.
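Patterson’s warning that software tuned for eight cores won’t scale to 32 or 64 follows directly from Amdahl’s law: the serial fraction of a program caps its speedup no matter how many cores you throw at it. A minimal sketch (the 80 percent parallel fraction is an illustrative assumption, not a figure from anyone quoted here):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup on N cores, given the fraction
    of the program's work that can run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A legacy app that is 80% parallelizable sees diminishing returns fast:
for cores in (2, 8, 32, 64):
    print(f"{cores:2d} cores -> {amdahl_speedup(0.8, cores):.2f}x speedup")
```

With that assumed 80 percent parallel fraction, eight cores yield roughly a 3.3x speedup but 64 cores only about 4.7x, against a hard ceiling of 5x, which is why merely parallelizing legacy code is an interim fix rather than a foundation.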
I’m not qualified to say whether the RapidMind solution, which includes an embedded API to allow legacy software to take advantage of multiple cores, is viable. But I agree with DePaul when he says, “The business opportunity is far more mainstream than it was because every desktop is shipped with a multi-core processor.”

RapidMind spun out of the University of Waterloo in Ontario, where co-founder Michael McCool studied the problems of parallel computing for years. A one-time competitor called PeakStream was purchased by Google last year. It’s unclear what the search giant intends to do with the technology, though it may well use it internally to bolster its already enormous computing resources.

In addition to the business opportunity, there’s an employment opportunity here as well. Developers who can handle parallel or concurrent processing are going to be in great demand. Indeed, UC’s Patterson says: “We feel a sense of allegiance to our undergrads but don’t know what to teach them. Course work is all focused on sequential [programming] problems.”

I don’t feel like doing the math, but I’ll bet Intel and Microsoft earn $20 million in a matter of hours. So, yeah, I congratulate them for funding some research, but they and other industry heavyweights need to do a lot more. If not, maybe we’ll wise up and stop buying what they’re selling.

I welcome your comments, tips, and suggestions. Reach me at bill_snyder@infoworld.com.