Galen Gruman
Executive Editor for Global Content

Sorry, Apple: Why the stylus needs to be in the mobile mix

analysis
Jun 12, 2012 | 8 mins

As the world focuses on curved touchscreens and voice controls, we may be neglecting a key user experience mechanism

The vanguards of the first mobile revolution, the Palm Pilot and the Apple Newton MessagePad, both used a pen — aka a stylus — as an input device. The Newton, despite its leading-edge (for that era) handwriting recognition, was a flop, but the Pilot was a moderate hit for several years until the BlackBerry came out with its miniature keyboard; from there, the action moved away from electronic organizers to messaging devices. Windows tablets used pens, but almost no one used a Windows tablet.

More than a decade later, Samsung has reintroduced the pen in the Galaxy Note “phablet,” a 5-inch, tablet-like smartphone. It will also put a pen in a full-size (10-inch) Note tablet later this year, at least in parts of Europe. Is it time for the pen to be standard equipment or at least a standard option on tablets today?


The answer is a definite maybe.

A sea of glass and sensor technology

Last week, I attended the Society for Information Display’s annual conference in Boston — these are the vendors and researchers who create the monitors, touchscreens, and other sensor-laden surfaces you interact with. (Scientists and marketers make for an interesting combination!) There wasn’t much innovation, beyond the usual refinements in the basic display and touch technologies you expect from the tech industry. One exception: Corning’s Willow Glass, a bendable, very thin glass sheet that could bring curved surfaces to displays and bezels, though integrating LCDs and touchscreen sensors into the curved glass is a manufacturing and engineering challenge that may take a couple of years to sort out.

Even the visionary future was the same old future: Microsoft was showing its moldy videos of Microsoft Surface, the tabletop touchscreen interface it’s been hawking for years. There were also cool demos of multiuser interactive displays — another perpetual promise — from both academics and vendors, though the ones actually available to buy were created for very specialized applications. The “Minority Report” and “Iron Man” technologies are not here. We forget how much materials, electronics, and software engineering it takes to create real revolutions, even in high tech.

At the conference was Ntrig, pushing pens — a lonely standout among the glass and touch-sensor options that filled the rooms. You likely don’t know Ntrig, but you’ve seen its handiwork in many pen computing devices. (You probably do know its main competitor, Wacom, whose circuitry powers the Galaxy Note’s S Pen and whose Wacom tablets have a 25-year history with Mac-based designers.) Ntrig’s presentation focused on the less plausible pen future, the one where we’re all writing on screens as the basic, “natural” input method. Sorry, but typing is faster, even on an onscreen keyboard. Writing may be ancient, but it’s hardly natural — it’s very much a trained behavior. And voice dictation is getting much better — just ask Apple and Google. By contrast, a pen is a slow, messy way to enter lots of text.

Revisiting the “failed” pen

After seeing the Ntrig pitch, I decided to give the pen renewed attention on the Galaxy Note, though pen input may qualify as “ancient history,” the Palm Pilot having faded from the market a decade ago. After all, people said the iPhone would never be accepted because onscreen keyboards were too awkward to use compared to the BlackBerry’s physical keyboard. Today, the BlackBerry is in a death spiral, and Android devices with physical keyboards, such as the Motorola Droid 4, are in the minority. Touchscreens won, despite widespread initial rejection. Could the same be true for pens?

A pen is a great way to draw and to annotate, which is how it’s used on the Galaxy Note and on the Windows tablets hardly anyone bought. In fact, although I considered the Galaxy Note an awkward device due to its size and the ill fit of its smartphone OS to its tablet scale, the one thing I liked about it is its pen, which works well with the touchscreen. And its built-in holder gets rid of the question of where to keep the pen when you aren’t using it. You can add annotations easily to the contents of many apps. PDF markup, presentation markup, doodles and diagrams in meeting notes — these are useful and quite doable in apps designed for the Galaxy Note’s S Pen technology.

Traditional handwriting recognition for text input is harder. The tactile feedback of plastic pen on glass is fine for limited text input, but too awkward to do on a frequent basis. It’s amazing how much your handwriting — even block printing — degrades on a screen with a stylus. Not only is the tactile feedback wrong, but on a glass surface, there’s the depth-perception problem, where the eye focuses on the liquid crystals behind the glass but the hand and pen touch the surface, creating maybe a millimeter of disconnect between what you see and what you feel. That really confuses your brain, and it shows in the results.

Then there’s drawing, from simple annotations on a slide to actual painting on a digital canvas. You can’t do that with a keyboard or a microphone — but you can do it with your fingers. On an iPad, there are apps that let you annotate PDFs for onscreen presentation, such as Whiteboard Plus. Also, PDF markup tools such as PDF Expert and GoodReader let you carry out the full complement of lines, shapes, and highlighting with your fingers that typically require a mouse on a computer. Additionally, you can take advantage of drawing programs that appear to react to your finger pressure to enhance photos, such as Adobe’s Photoshop Touch and Apple’s iPhoto, as well as apps that let you create digital artwork through finger motions, such as Paper and Drawing Pad (whose Android version is nowhere near as good).

So who needs a pen?

Smartphone users, for sure — and phablet and mini-tablet users, too. Their smaller screens show the limits of how finely you can control finger movements. Activities you can do fairly well on an iPad get a lot harder on a 3.5-, 4-, 5-, or even 7-inch screen.

The larger, 10-inch screen of an iPad or Android tablet gives you more precision simply because your finger is smaller relative to the screen, making your fingertip more like a pencil than a crayon. But it’s still not as precise as a pen. Nor is pressure sensitivity there; iPad apps that seem to respond to finger pressure simulate it by tracking finger speed, assuming a slower speed means you want a thicker line, as with a fountain pen or paintbrush.
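The speed-for-pressure trick described above is simple to sketch. This is a minimal, hypothetical illustration (the function name and parameters are my own, not from any particular app): a capacitive touchscreen reports position but not force, so a drawing app can approximate pressure by mapping slower strokes to thicker lines.

```python
import math

# Hypothetical sketch of speed-based "pressure" simulation: slower
# finger movement yields a thicker stroke, as with a fountain pen.

def stroke_width(p1, p2, dt, min_width=1.0, max_width=8.0, speed_for_min=2000.0):
    """Return a line width for the segment from p1 to p2 drawn over dt seconds.

    A slow-moving finger yields a width near max_width; a fast flick
    approaches min_width. speed_for_min is the speed (in pixels/second)
    at or above which the thinnest line is used.
    """
    distance = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    speed = distance / dt if dt > 0 else speed_for_min
    # Normalize speed to [0, 1], then invert: slower movement => closer to 1.
    slowness = 1.0 - min(speed / speed_for_min, 1.0)
    return min_width + slowness * (max_width - min_width)

# A slow segment (100 px/s) gets a thick line; a fast flick (4,000 px/s) a thin one.
print(round(stroke_width((0, 0), (10, 0), 0.1), 2))   # thick
print(round(stroke_width((0, 0), (400, 0), 0.1), 2))  # thin
```

The heuristic is crude — it can't distinguish a deliberate slow stroke from a hesitant one — which is exactly why a pen with real pressure sensors gives better results.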

Also missing from your finger and the simple styli available for the iPad is context. A real pen, such as the one that comes with the Galaxy Note, has a button or two to let you indicate context, much as a modifier key like Ctrl or Alt on a PC does with a mouse. Combining its pressure sensors, its contextual buttons, and its fine tip, a real pen does for onscreen drawing what Apple’s Retina display technology does for onscreen viewing: takes it to a new level of realism.

Unfortunately, in my experimentation with a half-dozen pen-savvy apps for the Galaxy Note, I didn’t get these kinds of benefits. Most of the apps didn’t use pressure information or take advantage of the buttons beyond the basic UI capabilities native to the Galaxy Note (such as switching from drawing mode to gesture mode). Ironically, the iPad apps I mentioned did more with fingers than most of the Note apps did with a pen.

Developers are in a Catch-22: Only the Galaxy Note supports the S Pen, so the market is small and there’s little economic incentive to deliver pen-savvy apps, especially sophisticated ones. Without such apps, the market stays small, leaving developers without attractive profits.

Pens should join the user input parade

If pen capabilities were part of the Android OS or of iOS, it’d be a different story: The sophisticated annotation and drawing apps on the iPad mentioned here would be even more capable if they were pen-savvy. Windows 8 will support pen input, but it’s unclear whether that support will be sophisticated enough; the pen support in Windows XP, Vista, and 7 was rudimentary, used mainly for handwriting recognition and checking off boxes in list apps.

I believe Google, Apple, and/or Microsoft should make their OSes and SDKs as pen-savvy as they are gesture-, keyboard-, and voice-savvy. The pen deserves to be one of the standard input methods, not in place of the others but alongside them. For certain activities, nothing beats a pen. It would make sense for a pen to be standard issue in phablets and tablets, à la the Galaxy Note. I hope it happens.

This article, “Sorry, Apple: Why the stylus needs to be in the mobile mix,” was originally published at InfoWorld.com. Read more of Galen Gruman’s Mobile Edge blog and follow the latest developments in mobile technology at InfoWorld.com. Follow Galen’s mobile musings on Twitter at MobileGalen. For the latest business technology news, follow InfoWorld.com on Twitter.