Simon Phipps
Columnist

The download deception

analysis
Nov 2, 2012 | 6 mins

Downloads are an interesting metric, but they reflect brand strength and market opportunity, not current success

As part of Microsoft’s attempt to convince us all that Windows 8 is not the dog’s dinner some claim it is, Steve Ballmer announced this week that the company had sold 4 million Windows 8 upgrades in the first three days of general availability. While that number (which must include some combination of downloads and discs) sounds impressive, it leaves me cold. After all, I remember the open source download metrics at Sun Microsystems.

Downloads are a popular metric because they are easy to measure, but relating them to something that’s useful to know is much harder. Those Windows 8 upgrades that were downloaded, for example, may have been driven by huge limited-time discounts (hurry, just $14.99 if you’ve bought a Windows 7 PC in the last 11 weeks!).

At best an imperfect measure of adoption, the number of downloads tells you nothing about the satisfaction of the downloader. Downloaded files may not be used at all or may be saved for later. Open source downloads in particular may be used multiple times. They may be evaluations; they may be in error. The downloaders may be new users or established users. These and other variables mean there’s a high margin of error for any conclusion derived from download numbers.

Downloads and business models

At Sun, a focus on downloads arose out of the switch to an open source strategy in 2005. It was clear from the start that open source was going to be a long-term strategy rather than a short-term tactic, bearing the most significant fruit on a five- to seven-year timescale. Indeed, that turned out to be correct; the successful communities we built back then, especially those around Java and identity management, have been real assets.

In many ways we were trailblazing. No significant software company had ever switched their entire business strategy to open source like this before. In the absence of a large working example, we had to extrapolate a methodology and invent new practices. Some were very effective — the analysis of copyright ownership was very thorough, for example, leading to strong principles and tools for managing the tracking of open source licensing as software flowed through the organization.

The business model most of the teams in the company decided to use involved monetizing mature deployment of the software. By promoting easy adoption of the software, a market would be created for services and support as the adoption lifecycle progressed. Individual adopters lead to enterprise evaluations, evaluations lead to pilots, pilots lead to deployments, and deployments lead to enterprise standardization. If that process could be seeded, significant, stable business could be expected in later stages as enterprises purchased subscriptions for easy updates to sustain their production systems.

As it turns out, that model was correct. I’m aware of a number of companies that have been able to pick up where Sun left off and carry use of Sun’s software through to the profitable latter stages. While it was ridiculed by some, long-term thinking about open source was working for Sun’s software business, and had Sun not been severely destabilized in 2008 by the Wall Street crash cutting off its legacy income, we might today see an open source powerhouse driven by that model.

Downloads and damn lies

The problem with a long-term strategy is how to measure progress in the short term. Because future success depends on broad adoption of the software, most teams at Sun chose to measure their progress by attempting to measure adoption. That’s hard to do. And the approach they took failed to recognize another crucial success factor for open source development: community health.

Measuring the health of the community around a project is difficult because open source is about four freedoms: to use, study, modify, and distribute the software without reference to anyone else. By its nature, proprietary software comes with a meter: The vendor is constantly in your hair, insisting you obtain and track licenses. There’s no such mechanism to measure open source software adoption. As a result, you can’t find reliable market numbers on the use of pure open source software; all the statistics available are measures of some first- or second-order derivative of adoption. Even these can be hard to obtain.

Many of Sun’s product groups took the easy way out and opted to count downloads as their adoption metric. As time progressed, things got out of hand. Marketing groups, seeing their bonuses would be paid based on downloads, started devising programs to inflate download numbers. With advertising, conference giveaways, developer program incentives, and more, they artificially drove the download numbers and won their bonuses. Gradually, that tactic was eliminated, and the metric switched to counting how many times downloaded software “called home” and checked for updates. That was a more reliable measure, but the focus never moved to measuring true community health and growth.

Community metrics

If we’d had tools like Ohloh in those days, I’m sure things would have been different. What matters about an open source project is its community. “Community” is a big word, used to describe both multiple layers of co-development as well as multiple layers of deployment, so really useful metrics for an open source project tell you about the health and development of those layers of community. How many core developers are there — people who actually understand the code well enough to make substantial changes? What are their motivations for being there, employment or volunteerism? How many companies are employing them? What sort of changes are being made: bug fixes, of course, but what about major new work? These are by far the most important metrics to track, because they predict the future of the code.

By comparison, adoption metrics in general and downloads in particular only tell you what happened in the past. For example, a project like Apache OpenOffice would be expected to have enormous downloads because of its huge global adoption — in the hundreds of millions. Indeed, that’s what has been happening despite the stasis in the project. All it tells us is the strength of the OpenOffice brand, built over a decade of work; it tells us nothing about the project itself. The opportunity is huge, and so is the responsibility to serve those hundreds of millions of users.

Watch out particularly for claims that large download numbers demonstrate “success.” Anyone who focuses on download numbers alone has something to hide. If it’s a commercial product, like Microsoft Windows 8, ask what proportion of the market is indicated, what this says about the brand, what techniques have been used to artificially push the numbers, and more. For an open source project, ask first and foremost about the developer community — its diversity, stability, contribution rate, and growth. These are better measures of the things that truly contribute to software freedom growing and spreading in the long term.

This article, “The download deception,” was originally published at InfoWorld.com. Read more of the Open Sources blog and follow the latest developments in open source at InfoWorld.com. For the latest business technology news, follow InfoWorld.com on Twitter.

Simon Phipps is a well-known and respected leader in the free software community, having been involved at a strategic level in some of the world's leading technology companies and open source communities. He worked with open standards in the 1980s and on the first commercial collaborative conferencing software in the 1990s, helped introduce both Java and XML at IBM, and, as head of open source at Sun Microsystems, opened that company's whole software portfolio, including Java. Today he's managing director of Meshed Insights Ltd, president of the Open Source Initiative, and a director of the Open Rights Group and the Document Foundation. All opinions expressed are his own.