For software developers, writing quality code is only the beginning of an effective vulnerability-mitigation strategy. Not every software company has to deal with bugs as critical as the ones believed to have contributed to accidents involving Toyota cars, but one thing is becoming increasingly clear: Every software company ships products with hidden security defects. There are virtually no exceptions.

According to software testing service provider Veracode, which issued a report to coincide with this week's RSA Conference in San Francisco, nearly 60 percent of the software submitted to its security testing suite in the last 18 months failed the first round of tests. As Veracode's senior vice president of marketing Roger Oberg notes, these were applications from vendors that cared enough about security to use Veracode's services in the first place.

Veracode's data is not unique. Last year, a study by WhiteHat Security found that 82 percent of enterprise websites had harbored at least one security flaw of "high, critical, or urgent" severity in recent history, and 63 percent still contained such bugs at the time of the survey.

Admittedly, studies by security consultancies are self-serving. Yet their findings should not be dismissed out of hand. You need only sift through the headlines to notice how frequently vulnerabilities are discovered in major software products from reputable vendors. Independent developers would be foolish to assume their own software is any different, if only because bugs are so difficult to avoid.

Developers play whack-a-mole

Don't assume that only the obscure, sophisticated bugs slip through the cracks, either. Each year, the SANS Institute and the Common Weakness Enumeration (CWE) project, a government-sponsored security watchdog, publish a list of the 25 most widespread and dangerous programming errors.
As in previous years, the 2010 list includes a few gotchas, such as unwittingly revealing security information in error messages or accepting unrestricted uploads of dangerous file types. But it's also chock-full of such rookie mistakes as race conditions, buffer overflows, and improper validation of array indices. These are timeless errors that date back to the dawn of programming; for them to still be widespread in 2010 is astounding.

And yet, evidence suggests that even acknowledged best practices can sometimes lead to bugs. In 2006, Google's Joshua Bloch blogged that he had discovered a bug in the binary search algorithm found in Jon Bentley's popular reference book, "Programming Pearls," first published in 1986. Bloch wasn't pointing the finger at Bentley, though; as it turned out, the binary search Bloch himself had written for the JDK contained the same bug, and his error had gone unnoticed for around nine years.

Can programmers do better? Software testing using services such as Veracode's can certainly help, but no such solution is perfect. In some cases, application architecture or choice of language can make thorough testing impractical. Open source developers like to tout "Linus' law," which says that "given enough eyeballs, all bugs are shallow." In other words, the transparency of the open source development process means bugs in open source software will be caught and resolved more quickly than those in proprietary software.

But Microsoft security program manager Shawn Hernan disputes that claim, and credibly. According to Hernan, just because programmers can review code for bugs doesn't mean they actually do; furthermore, the evidence suggests only full-time, paid programmers are motivated enough to spend time reviewing someone else's code. If that's true — and I think it's likely — then only the software vendors with the deepest pockets (and thus the largest staffs) will truly be able to benefit from Linus' law.
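The bug Bloch described was a one-line integer overflow in the midpoint calculation of binary search. The sketch below is my own illustration of that class of flaw, not the JDK source: the classic (low + high) / 2 midpoint silently wraps negative once the indices are large enough, while the subtraction form stays correct.

```java
public class MidpointOverflow {
    // The midpoint as written in countless binary searches (and, per
    // Bloch's account, in the JDK for roughly nine years): if low + high
    // exceeds Integer.MAX_VALUE, the sum wraps to a negative number,
    // producing a nonsense index.
    static int buggyMid(int low, int high) {
        return (low + high) / 2;
    }

    // The standard fix: when 0 <= low <= high, the subtraction
    // cannot overflow, so the midpoint is always in range.
    static int safeMid(int low, int high) {
        return low + (high - low) / 2;
    }

    public static void main(String[] args) {
        // Indices plausible for an array of about 1.6 billion elements.
        int low = 1_500_000_000;
        int high = 1_600_000_000;
        System.out.println(buggyMid(low, high)); // negative: the sum overflowed
        System.out.println(safeMid(low, high));  // 1550000000, as intended
    }
}
```

Bloch's post offered two repairs: the subtraction form above or, in Java, the unsigned shift (low + high) >>> 1, which treats the wrapped sum as an unsigned value.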
Keep the channels open

But none of this should be taken to imply that software security is a lost cause. It isn't, but the key lies in recognizing that only so much can be done with code. It is the responsibility of any developer to ship code of the highest possible quality — "highest possible" being the operative phrase. After that, the heart of any software vendor's security strategy lies not in its development process, but in how it deals with security incidents when they inevitably arise.

The days of shipping software patches on CDs and floppy disks are long gone. Today, customers expect prompt delivery of patches almost as soon as vulnerabilities are discovered. While this may often be impractical, vendors delay delivery of critical patches at their peril.

Mind you, how vendors deliver patches can occasionally be problematic as well. Once, Microsoft delivered patches as soon as they were available, throughout the month. But customers complained that this made it too difficult to evaluate patches before deployment, placing an undue burden on IT staffers. In response, Microsoft switched to its current model of shipping fixes on "Patch Tuesday," once a month. This method, too, has been criticized, particularly by those who claim that every Patch Tuesday leads to an "Exploit Wednesday," when hackers race to attack those who haven't applied the latest patches.

Customers will always gripe about security flaws and the need to patch them. The key is for developers to be as open and candid as possible about security issues related to their software, and to be forthright in offering assistance and advice to customers who may be affected, even before a patch is available.

The alternative is to foster a culture of silence and secrecy around security issues, and that's a recipe for failure. Toyota's situation is somewhat atypical.
But closer to home, Web developers are champing at the bit for HTML 5, which they hope will free them from the seemingly endless series of bugs that have cropped up in plug-ins such as Adobe Reader and Flash, and that often go unresolved for weeks or longer.

As more studies like those from Veracode and WhiteHat Security come to light, customers are beginning to understand that security flaws in software are a fact of life. As that perception takes root, customers will increasingly demand not just patches, but greater disclosure of software security issues as they arise. Soon, software companies that don't regularly disclose security bugs won't be seen as the vendors with the highest-quality apps; they'll just be the ones with something to hide.

This article, "Bug-free software? Dream on," was originally published at InfoWorld.com. Read more of Neil McAllister's Fatal Exception blog and follow the latest developments in software development at InfoWorld.com.