by Neil McAllister

Planning a development project? Bring your wallet

Opinion
Apr 14, 2011 | 7 mins

Companies are spending ever more of their IT budgets on software development, but are they really making progress or just padding vendors' bottom lines?

Good news for developers! According to a new report from Forrester, titled “The State of Application Development in Enterprises and SMBs,” North American and European companies are more likely to maintain or increase their level of spending on software development in the coming year than to decrease it. Also, software development will account for a larger portion of companies’ IT budgets in 2011 than in recent years.

Does this mean we’ve turned the corner? Has the long-promised economic recovery finally reached the IT sector? Not so fast.


The last time Forrester conducted a survey of this kind was in 2007. At that time, new software projects accounted for 33 percent of companies’ IT budgets, up five percentage points from the previous year. This year, the figure is 50 percent. Forrester isn’t alone in its assessment; in a separate survey conducted by SoftServe, 26 percent of respondents said their software development budgets had increased by more than 10 percent from 2008 to 2009 alone. Plot these data points and it becomes clear that Forrester’s results aren’t indicative of any real change in 2011. Rather, they’re further evidence of a continued upward trend in software development spending.
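To see what that trend implies, here is a back-of-the-envelope sketch. The 33 percent (2007) and 50 percent (2011) figures come from the surveys cited above; the straight-line interpolation between them is purely my assumption for illustration.

```python
# Back-of-the-envelope sketch of the development-spending trend.
# Known points (from the surveys cited): 33% of IT budget in 2007, 50% in 2011.
# The linear interpolation between them is an assumption for illustration only.

def interpolate_share(year, y0=2007, s0=33.0, y1=2011, s1=50.0):
    """Linearly interpolate development's share of the IT budget."""
    return s0 + (s1 - s0) * (year - y0) / (y1 - y0)

for year in range(2007, 2012):
    print(f"{year}: ~{interpolate_share(year):.2f}% of IT budget")
```

On this (assumed) straight line, the share grows by roughly 4.25 points a year, right through the recession years, which is what makes the figures so striking.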

But wait a minute — I seem to recall some sort of economic recession during those same years. If companies were cutting costs everywhere else, why did software development expenditures increase on a steady trajectory? Has software really become that essential to modern business?

Maybe. Or maybe the truth isn’t as rosy as Forrester and others would like to paint it. Maybe software development isn’t really a bigger priority for companies today than in previous years. Maybe it just costs more.

Out with the old, in with the new

One part of the story is that software development expenditures are being channeled not from other business areas but from other line items of the IT budget. Even as expenditures for new initiatives have increased, spending on “ongoing operations and maintenance” has fallen 20 percent since 2007, both for enterprises and for small to midsized businesses.

Forrester credits these cost cuts to process automation, productivity enhancements, infrastructure consolidation, and open source tools. But is that all there is to it? One thing that seems clear is that existing installations aren’t growing. New software license purchases now account for just 25 percent of overall expenditures, which makes sense in a down economy.

Some areas are even showing significant decline: Use of IBM’s zSeries and iSeries (née AS/400) platforms — traditional enterprise stalwarts, both — has declined by nearly 50 percent since 2007. It seems logical to assume that some of the “new initiatives and projects” Forrester cites include efforts to transition applications from these legacy platforms to more modern ones.

Funny thing about legacy platforms, though: They’re called that for a reason. Anyone who has been in IT long enough has a story about an AS/400 server that was kept running long past its prime, often into decades of constant use. And zSeries mainframes are nothing if not reliable; some of them have never known any true downtime.

This isn’t to say that keeping those systems running was free. It certainly wasn’t. But has ripping and replacing them really saved money? The numbers would seem to say otherwise. As IT managers have tackled a steady succession of much ballyhooed alternatives to legacy platforms — first middleware, then Java EE, then Web services, then SOA, and beyond — expenditures on software development have increased much more rapidly than operating costs have declined.

New technologies, same old applications

Let’s examine some of the more recent developments. Two of the latest trends that aim to cut IT costs are SaaS (software as a service) and cloud computing. Both are built around a similar idea: By shifting responsibility for some of the staffing, maintenance, infrastructure, and support requirements of IT from an in-house department to a service provider, CFOs can replace an unpredictable cost center with easy-to-digest monthly subscription pricing.
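The CFO’s trade-off described above comes down to simple arithmetic: an upfront license plus ongoing maintenance versus a flat monthly fee. The sketch below works through a hypothetical case; every dollar figure in it is an invented assumption, not a number from the Forrester report.

```python
# Hypothetical comparison of an upfront license purchase vs. a SaaS subscription.
# All dollar figures are illustrative assumptions, not numbers from any report.

def breakeven_months(upfront_cost, annual_maintenance, monthly_subscription):
    """Return the first month in which cumulative subscription fees
    exceed cumulative in-house costs (upfront purchase plus maintenance)."""
    month = 0
    while True:
        month += 1
        inhouse = upfront_cost + annual_maintenance * (month / 12)
        saas = monthly_subscription * month
        if saas > inhouse:
            return month

# Example: a $100,000 license with $20,000/year maintenance
# vs. a $4,000/month subscription.
print(breakeven_months(100_000, 20_000, 4_000))  # → 43
```

On these made-up numbers, the subscription costs more in total after about three and a half years — which is why the terms of the agreement, not just the sticker price, determine whether the switch actually saves money.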

SaaS was the first to come along, and it especially appeals to small and midsized enterprises, for which capital expenditures on software licenses are particularly burdensome. But SaaS is the very definition of rip-and-replace; for anyone with a significant investment in in-house software systems, the transition is likely to be nontrivial. Enterprise customers, in particular, often chafe at the idea of a one-size-fits-all software solution. They are likely to want heavy customization, which means further, ongoing software development expenditures — in effect, trading an in-house proprietary platform for an outsourced one.

That brings us to cloud computing, including infrastructure as a service and platform as a service, which offer many of the same advantages as SaaS without limiting a customer’s choice of applications. But transitioning to these solutions raises the burden of custom development even higher than does SaaS. For example, cloud-based Java platforms tend to favor the Spring framework rather than Java EE, meaning existing Java EE applications will have to be re-engineered. Even when everything goes smoothly, some experts claim that a badly worded service agreement can mean the cost of hosted applications actually exceeds the cost of in-house deployments.

Another growing concern for today’s businesses, according to the Forrester report, is mobility. More than half of the companies surveyed planned to build mobile apps or mobile-optimized Web pages to augment their existing online offerings — in other words, they want to take what they already have on the Web and get it working on smartphones. Yet only 16 percent of companies planned to engage outside development agencies with existing mobile expertise, while 80 percent planned to retrain in-house developers for mobile platforms. Meanwhile, mobile developers are confronted with a dizzying array of options. According to Forrester, “We expect shops to struggle with the idea (and costs) of supporting four or more smartphone operating systems.”

Rise of the monocultures

By now the theme should be clear: Much of the investment companies are making in “new projects and initiatives” doesn’t represent real progress at all. Instead, it’s reinventing the wheel or transitioning existing applications from one platform to another — and potentially to another and then another.

How do companies cope? They hunker down. For many of its findings, Forrester grouped developers into those who use Eclipse and those who use Visual Studio. Predictably, Visual Studio developers chose Microsoft products and platforms in far greater numbers than Eclipse developers did. That they used .NET and Windows Forms should be obvious, but they also overwhelmingly chose SQL Server as their database and Windows Azure as their cloud platform. Apparently, for them the efficiencies of a single platform and tools vendor outweigh the dangers of lock-in.

It’s easy to scoff at Windows developers’ single-mindedness, but Microsoft’s model is increasingly becoming the norm. Oracle is exerting ever greater influence over the Java platform, seemingly with the aim of creating a developer monoculture similar to what Microsoft enjoys. No less than the IEEE has launched a standardization effort for cloud computing platforms, arguing that, “without a flexible, common framework for interoperability, innovation could become stifled, leaving [users] with a siloed ecosystem.”

From the vendors’ perspective, that only makes sense. The more tool vendors can hold developers in their respective silos, the more they can gradually raise the costs of their offerings. If you want to see it in action, you need only look at the numbers — and the trend isn’t slowing down, it’s accelerating.

This article, “Planning a development project? Bring your wallet,” originally appeared at InfoWorld.com. Read more of Neil McAllister’s Fatal Exception blog and follow the latest news in programming at InfoWorld.com. For the latest business technology news, follow InfoWorld.com on Twitter.