Galen Gruman
Executive Editor for Global Content

The lessons IT must learn from Apple

analysis
Apr 6, 2012 | 10 mins

Apple has wormed its way into the broad population, creating new expectations — and a model — for IT

For fanboys, it’s vindication. For old-school IT, it’s a nightmare. For those at neither extreme, it’s a further sign of the fundamental shift known as the consumerization of IT. The much-debated milestone? A recent CNBC survey shows that more than half of U.S. households now own at least one Apple product. The iPod leads the list, followed by iPhones, then iPads and Macs.

So what’s the big deal? For nearly a year, half of new cellphones sold in the United States have been smartphones, and a slight plurality of those run Android. Shouldn’t that be as significant as Apple’s penetration?

No — Apple’s reach into people’s everyday lives is not merely changing their expectations of what technology should do; ironically, it can also serve as a guide for IT on how to get what it wants. However, IT must understand the real lessons of this shift. The people most likely to be avid Apple product users are better-paid men, the survey shows — in other words, the businesspeople with authority who often call the shots and set the business’s expectations of technology in general and IT in particular.

Apple effect isn’t merely consumerization effect

Users are shifting to mobile devices, and the implications of that shift for computing are indeed profound. But we already know that, and we can see it manifest in everything from Microsoft’s attempt to reinvent Windows to the notion that we’re entering a post-PC era.

Certainly, the fact that the one remaining U.S. PC retail chain, Best Buy, keeps failing to make money and is now regrouping as more of a mobile phone store (sort of like RadioShack has done) tells you that “traditional” PCs, though still useful, are no longer important — sort of like toasters and microwave ovens.

Apple rides this trend, as does Google’s Android. But Apple lit the fuse with its iPhone, which redefined both mobile computing in particular and computing in general. The iPad lit the second fuse, breaking the separation between mobile and desktop computing. In some cases, an iPad is the primary computer already.

More critically, Apple is very much defining what the new computing means, as well as training users on what to expect computing to be. As the notions of user technology and personal technology continue to blend, Apple’s ideas are reshaping the expectations and requirements of corporate IT as well. Nobody, and I mean nobody, else is doing that. The traditional tech vendors are mainly copying the superficial form of Apple’s direction. Yet it’s clear that users don’t want inferior, superficial copies.

The entrancing Apple ecosystem

Many in IT don’t get it. They’ll say that iPods are irrelevant to computing technology, and that the fact that those are the majority of Apple products in use distorts any alleged Apple effect. The facts speak otherwise. That CNBC survey shows that the 51 percent of households with an Apple product own three Apple products each on average, and a quarter of them plan to buy an additional one this year.

What this signifies is the effect of the Apple ecosystem: It’s cliché to say that Apple products are easier to use than rivals’, but they almost always are. They also work well together, creating a virtuous cycle — a sort of user interface version of the network effect known as Metcalfe’s Law (named after the Ethernet inventor, venture capitalist, and former InfoWorld publisher Bob Metcalfe).
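(A quick aside on the math, which is the standard justification for the law rather than anything in the survey: Metcalfe’s Law says a network’s value grows with the number of possible pairwise connections among its n nodes, so each added device is worth more than the last.)

    connections(n) = n(n − 1) / 2
    connections(2) = 1   (iPhone plus Mac: one sync relationship)
    connections(3) = 3   (add an iPad: three device pairs now share data)
    connections(4) = 6   (add an Apple TV: six pairs, and so on)

That arithmetic helps explain why a household averaging three Apple products is so much stickier than one that owns a single iPod.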

You see this effect in the real world. The iPod or iPhone is a gateway drug to more Apple products. iTunes and now iCloud encourage the addition of more Apple products to share your digital goodies and — more important — your user experience. There’s truth to the joke that once you go Mac, you never go back. Regular readers know that, despite my decidedly PC origins, I followed this path to an Apple ecosystem. I see it regularly among not just my tech-savvy friends but my much larger circle of “regular” folks who aren’t in the tech business and don’t salivate over technology devices.

The Apple quotient continues to rise, and the only other platform that has evinced any comparable joy and loyalty is Android — but only briefly, as people discovered the benefits of a smartphone over a regular cellphone. Their next devices have invariably been iPhones, once they realized that Android’s limited interoperability with the rest of the technology world stands in sharp contrast to Apple’s ecosystem. (Except for one IT friend who’s never forgiven Apple for its chatty AppleTalk network protocol of the 1980s and won’t let Apple products darken his door to this day — the rest of us just smile knowingly.)

The lessons for corporate computing

When you’ve experienced positivity — whether that’s a great work environment, the taste of real food from local farmers, or computing that both just works and works naturally — it’s hard to accept less. But when you arrive at the office, that’s usually what you get: rigidly controlled systems poorly integrated with each other and with your actual work processes. Software that forces a mental shift as you move from one tool to another. Bewildering configurations or no configurability at all.

It’s often either chaotic or overly homogenized — and often inferior. The technology at work stands in stark, unflattering contrast to the technology you have at home, especially if it’s Apple technology.

Now, only a fool would believe Apple technology is flawless. There are bugs and gotchas in Apple’s products (such as OS X Lion Mail’s tendency to stop retrieving mail in multiaccount setups), as well as puzzling limitations (such as the lack of the sophisticated repeating-event options long available in Windows and BlackBerry calendars). Nonetheless, they’re significantly better products, and people really notice and appreciate that fact.

User experience rules

IT has been exhorted since the 1980s to become savvy about user experience. Heck, I edited a column called Human Factors for IEEE Software magazine in the mid-1980s advocating what still isn’t done nearly 30 years later. But most software and hardware remains poorly designed, if designed at all. Apple has shown that good design is not only possible but can be made innate to a broad product line over years and years.

Now that users have begun to assert control over the technology they use, they no longer have to wait for IT to get it, whether in its own software or in the software and hardware it acquires — they’ll get it themselves. Nor do they have to accept poorly designed tools from IT or from the vendors IT has chosen — they’ll find better ones elsewhere. After all, there are plenty of cloud service providers, social technology providers, and app store options — plus, of course, the computers, tablets, and smartphones themselves — that they can choose instead of IT’s offerings. And they will.

Benevolent dictatorship can work

Ironically, Apple’s highly controlled approach to its ecosystem mirrors that of many IT organizations. Apple’s ecosystem works well because Apple has decided how it should work, and it usually ignores anything outside of that worldview. As the Silicon Valley joke goes, it’s Steve Jobs’ world and we just live in it. Apple’s decisions usually involve more than a top exec’s whims and instead come from a heavily examined belief held by Apple’s leadership and those it hires. But at the end of the day, it is Apple’s ecosystem, and you accept it or go elsewhere. Most users who live in it not only accept it, but embrace it.

The difference between this “we know best” approach and the usual high-control IT organization is that Apple almost always makes the right choices, so users happily accept the covenants of their gated community. IT’s choices typically ignore, misunderstand, or disrespect the user. IT organizations whose desire to control is truly legitimate need to apply that control in a way users will gladly accept — to follow Apple’s approach rather than ignore or fight it. A strategy of control for its own sake, or of poorly executed control, means they’ll lose. Even if they win the battle, they lose the war, as over time the business’s staff dwindles to those willing to live in or build poor environments — not the most competitive or creative bunch.

Dying technology is euthanized

When Apple decides something needs to die, it kills it. That’s what happened to the floppy drive, then to all its proprietary ports, then to CDs, and most recently to Adobe Flash. PC users whine and point fingers, but their vendors eventually follow suit. Apple users simply deal and move on, perhaps after a brief complaint. That’s something else IT should learn: Stop mollycoddling old technology that slows the company and complicates its technology maintenance. The short-term cost of change is lower than the long-term cost of avoidance.

Case in point: Internet Explorer 6 and ActiveX, Microsoft’s proprietary, pre-AJAX method for delivering Web applications. When ActiveX was invented, it was a revelation that brought app know-how to the Internet. But it was tied to specific versions of the Microsoft browser and to the Windows platform. In the monoculture of the typical IT organization, that was great. But today, ActiveX introduces nightmarish complexity for IT, as different apps use different versions and require different IE releases — yet Windows can’t run different versions of IE on the same PC, short of running multiple virtual machines, which adds even more complexity.
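To see how that lock-in happens, here’s a minimal sketch, written in TypeScript for readability, of the object-creation dance that pre-AJAX Web apps actually performed. The ActiveXObject constructor and the MSXML ProgID strings are real IE-era artifacts; the wrapper function itself is hypothetical.

    // ActiveXObject exists only in Internet Explorer on Windows; declaring
    // it here just lets the sketch compile outside that environment.
    declare const ActiveXObject: new (progId: string) => any;

    function createHttpRequest(): any {
      // Standards path: every modern browser provides XMLHttpRequest.
      if (typeof XMLHttpRequest !== "undefined") {
        return new XMLHttpRequest();
      }
      // Legacy path: probe MSXML versions from newest to oldest. Which one
      // succeeds depends on the IE release and what's installed -- exactly
      // how apps end up pinned to specific IE versions.
      const progIds = ["Msxml2.XMLHTTP.6.0", "Msxml2.XMLHTTP.3.0", "Microsoft.XMLHTTP"];
      for (const id of progIds) {
        try {
          return new ActiveXObject(id);
        } catch {
          // This MSXML version isn't registered; try the next one.
        }
      }
      throw new Error("No HTTP request object available in this browser");
    }

Multiply that fragility by every custom control a line-of-business app embeds, and you get the version lock-in described above.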

Microsoft has been trying to kill ActiveX and older IE versions for some time now, but they’re too entrenched in custom IT apps and in specialty apps for dentists, government agencies, and the like from tiny vendors with few development resources. ActiveX will continue to be supported even in the forthcoming Windows 8, worsening the problem.

The Apple approach would be to say that ActiveX is dead as of the next version of IE or Windows — to deprecate it, in developer terms — and mean it. All legacy ActiveX apps would go away. Knowing that was the case, IT would not let such legacy buildup occur in the first place. Certainly, as Apple products get more entrenched in the enterprise, IT will have to make that adjustment. Because Apple routinely deprecates old technology — and rarely extends the transition period — it will force the issue (for your own good, of course).
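In code terms, that kill-off looks something like the following sketch. The API names are hypothetical, but the JSDoc @deprecated tag is a real mechanism that TypeScript-aware editors honor by flagging every call site.

    /**
     * @deprecated Gone in the next major release; use fetchReport() instead.
     * (Hypothetical legacy API, shown only to illustrate the mechanism.)
     */
    function fetchReportViaActiveX(reportId: string): void {
      // The old ActiveX-dependent implementation would live here.
    }

    // The standards-based replacement the deprecation notice points to.
    function fetchReport(reportId: string): Promise<Response> {
      return fetch("/reports/" + reportId);
    }

The Apple-style move is the second half of the bargain: actually deleting the deprecated symbol on schedule, so callers must migrate rather than linger indefinitely.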

Adapt or die

I’m sure at some point Apple will lose its way, and what has been a remarkable 15-plus years of innovation under its second Steve Jobs era will come to a close. We’ve seen other companies — Adobe Systems, Dell, Hewlett-Packard, IBM, and Microsoft — devolve into tired, dysfunctional companies with no real innovative spark or drive beyond making the numbers at any long-term cost. There’s a theory at MIT that this happens to all companies, though some can turn back the clock and reclaim the old magic if their leadership forces the issue. IBM and Apple are two recent tech industry examples; under its founders, HP had been one as well.

If that happens, it’s years down the line, and anyone in IT hoping Apple will go away is more likely to be the one who splits. A better approach is to figure out what Apple is doing right to serve and engage customers, and to replicate what’s possible within IT. Do so, and you won’t worry about shadow IT, disrespect, irrelevance, or consumerization — you’ll be co-captaining a better company.

This article, “The lessons IT must learn from Apple,” was originally published at InfoWorld.com. Read more of Galen Gruman’s Smart User blog at InfoWorld.com. For the latest business technology news, follow InfoWorld.com on Twitter.