Let’s face it: HTML5 is no app dev panacea

Analysis
Jul 7, 2011 | 7 mins

Don't believe the hype: building serious applications still takes more than mere Web markup

Nothing frustrates a professional developer more than hearing someone describe themselves as “an HTML programmer.” Coding Web pages with markup has about as much to do with real programming as writing a menu has to do with cooking a meal. But you wouldn’t think so to hear platform vendors tell it. Lately, HTML has been made out to be a preferred development tool for everything from smartphone and tablet apps to full-blown desktop applications.

Palm launched its WebOS platform claiming developers needed nothing more than Web standards to build apps for it. Microsoft did the same with Windows Phone 7. Google’s Chrome browser has a “Web store” that lets you shop for desktop Web applications. But my jaw finally dropped when a recent demo showed how developers will be able to use HTML5 to write apps for Windows 8, leading panicked Windows developers to speculate that Microsoft was planning to drop support for Silverlight and even .Net itself.

Holy cow. Are we really so blinded by the HTML5 hype wagon that we’d believe Microsoft is ready to scupper core Windows APIs in favor of Web standards? It makes no sense. HTML5 is a fine tool and it will do great things for the Web, but lately it’s been pushed to such lofty heights that it’s plain ridiculous. As welcome as HTML5 is, there are plenty of reasons why nobody should consider it the universal development tool of choice. Here are some caveats to consider.

1. Good luck building anything with HTML alone

Anyone who suggests you can build apps in HTML is pulling your leg. What they really mean is that you can build apps using HTML and JavaScript, but even that doesn’t give you the whole picture. The minimum for any real Web application is HTML, JavaScript, and CSS — three separate languages, all at once. The W3C’s HTML5 effort has added still more APIs to the mix of Web standards, enabling such capabilities as multithreading (via Web Workers) and local storage. And all of this assumes your app won’t communicate with any kind of server-side component — for heavy computation or storage, perhaps — along with all the additional languages, APIs, and standards you’d need to confront then.
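To make the three-languages-at-once point concrete, here is a minimal, hypothetical page (the ids and text are invented for illustration): structure in HTML, presentation in CSS, behavior in JavaScript, with the HTML5 localStorage API thrown in for persistence. Even this trivial example juggles three languages plus a browser API.

```html
<!DOCTYPE html>
<html>
<head>
  <style>
    /* CSS: one language, just for presentation */
    #count { font-weight: bold; }
  </style>
</head>
<body>
  <!-- HTML: a second language, for structure -->
  <p>You have visited this page <span id="count">0</span> times.</p>
  <script>
    // JavaScript: a third language, for behavior,
    // using the HTML5 localStorage API to persist a visit counter
    var visits = parseInt(localStorage.getItem("visits") || "0", 10) + 1;
    localStorage.setItem("visits", String(visits));
    document.getElementById("count").textContent = visits;
  </script>
</body>
</html>
```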

When anyone suggests that building apps is “as easy as building Web apps,” what are they really saying? Web development has evolved into a complex, multitiered, multilanguage discipline. Often it’s no picnic. Is that really the model we want to foist off on the next generation of developers?

2. HTML wasn’t designed for applications

The buzz on HTML5 is that it’s HTML souped up with improvements to support Web applications. But better app support wasn’t always the direction of the HTML standard. Originally, the successor to XHTML 1.1 was going to be XHTML2, which would have emphasized semantic markup and integration with XML. True to its roots, XHTML2 was a document-centric markup language.

The XHTML2 effort foundered, however, and a splinter group called the Web Hypertext Application Technology Working Group (WHATWG) broke off from the W3C’s HTML activity to begin work on a different draft of the standard, one that emphasized elements useful for Web applications. It was this work that eventually became the basis of what we now know as HTML5.

But was HTML5 really the best direction to go? HTML5’s ballyhooed <canvas> tag, for example, essentially means “insert a bunch of programmatically generated graphical content that can’t be described by markup.” That’s a pretty strange use of a markup language. If we keep going down this road, are we not perhaps shoehorning Web standards into a role for which they were never really suited? It may be a necessity for the Web, but do we really want to put ourselves in the same bind everywhere else?
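To see why <canvas> sits oddly in a markup language, consider a minimal bar chart (a hypothetical example; the id and coordinates are invented): the tag itself describes nothing about its content, and every pixel has to be generated imperatively in script.

```html
<!-- The markup says nothing about what this element contains... -->
<canvas id="chart" width="200" height="100"></canvas>
<script>
  // ...all of the content is drawn programmatically instead
  var ctx = document.getElementById("chart").getContext("2d");
  ctx.fillStyle = "#369";
  ctx.fillRect(10, 60, 30, 40); // bars of a chart no markup can express
  ctx.fillRect(50, 30, 30, 70);
  ctx.fillRect(90, 45, 30, 55);
</script>
```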

3. HTML sucks for building UIs

One of Apple’s big innovations with the original Mac was publishing a detailed set of Human Interface Guidelines for developers. As a result, unlike DOS programs, Mac apps looked alike and behaved alike. They all used the same kind of menus, the same dialog boxes, and the same alerts. The resulting impression of coherence and consistency was a big reason why the Mac OS was so wildly successful, even when GUI desktops were still new and unfamiliar.

With Web apps, we’re back to the DOS days. Interface designers are free to create any kind of buttons they want, have menus that slide down or pop up from anywhere, and generally paint the entire window any way they see fit. Without a standard set of widgets, apps built with Web technologies feel inconsistent and sometimes downright alien. Even if you go out of your way to build a UI that looks dead-on like a native iPhone app, the same UI won’t fit in on an Android phone. Who’s going to take the time to build Web-based apps that feel “native” on every platform? Nobody, that’s who. (Let’s not get started on the screen-size issue.)

4. Building platform-specific HTML apps makes no sense

But let’s say you don’t care about targeting every device or platform. Let’s say you’re just building an app for iOS or for Windows 8 — fine. But why on Earth would you pick HTML to build an application for a single platform? The whole point of HTML and its related technologies is that they are open, cross-platform standards.

What’s more, both iOS and Windows already have SDKs that mitigate many of the drawbacks of building apps the HTML way. They do give you a standard set of widgets that allow you to build consistent UIs. They give you access to APIs that let you run algorithms at native processor speed. They allow you to integrate your app with core OS features, ones that aren’t present on other platforms (which are presumably why customers chose those platforms to begin with). And you’d give all that up, why? Because coding Web apps is “easier”? Even if that were true, try putting it on your résumé.

5. Limiting developers to Web technologies is wrong

There’s no surer way to start a catfight on a Web development forum than to ask what’s the “best” programming language. Developers can be passionate about their tools, and there certainly is a wide range to choose from.

The Web narrows that range, however. Building Web apps means coding in HTML, CSS, and JavaScript. We all learned them because we had to learn them. That doesn’t mean we have to love them.

But because everybody knows HTML, CSS, and JavaScript, those languages have a huge installed base of developers. That’s the real reason why vendors are so quick to claim that developing for their new platform is “as easy as coding in HTML5.” By doing so, they get to assert there are millions of developers who already know how to work with their platform — even though that’s never strictly true, because every OS and platform has its own idiosyncrasies.

So vendors will continue to tout how much you can do with Web technologies, and they’ll continue to bolt SDKs based on HTML and JavaScript to their existing operating systems — because it’s good marketing. I just wish they wouldn’t. Such tools are almost never as powerful as they’re cracked up to be, and they’re never really popular with professional developers (as opposed to casual “HTML programmers”). In the end, they’re merely a distraction from the many, many other tools that might be more powerful, more elegant, or better suited to the task at hand. Enough already!

This article, “Let’s face it: HTML5 is no app dev panacea,” originally appeared at InfoWorld.com. Read more of Neil McAllister’s Fatal Exception blog and follow the latest news in programming at InfoWorld.com. For the latest business technology news, follow InfoWorld.com on Twitter.